Sample records for UNIQUAC model application

  1. Equilibrium study for ternary mixtures of biodiesel

    NASA Astrophysics Data System (ADS)

    Doungsri, S.; Sookkumnerd, T.; Wongkoblap, A.; Nuchitprasittichai, A.

    2017-11-01

    Liquid-liquid equilibrium (LLE) data for the ternary mixtures methanol + fatty acid methyl ester (FAME) + palm oil and FAME + palm oil + glycerol were measured at temperatures from 35 to 55°C; the tie lines and binodal curves were also determined and plotted on the equilibrium diagram. The experimental results showed that the binodal curves of methanol + FAME + palm oil depended significantly on temperature, while the binodal curves of FAME + palm oil + glycerol changed insignificantly with temperature. The interaction parameters between each liquid pair were obtained for the NRTL (Nonrandom Two-Liquid) and UNIQUAC (Universal Quasi-Chemical Theory) models from the experimental data. The correlated UNIQUAC parameters for the FAME + palm oil + glycerol system, denoted a13 and a31, were 580.42 K and -123.69 K, respectively, while those for the methanol + FAME + palm oil system, denoted a42 and a24, were 71.48 K and 965.57 K, respectively. The ternary LLE data reported here should help engineers and scientists predict the yield and purity of biodiesel in production. The UNIQUAC model agreed well with the experimental data for these ternary biodiesel mixtures.
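
The reported a13/a31 and a42/a24 values carry units of kelvin, which suggests the common UNIQUAC parameterization τij = exp(-aij/T). As a sketch under that assumption (the paper may use a different convention), the binary interaction terms at a given temperature could be computed as:

```python
import math

def uniquac_tau(a_ij_K: float, T_K: float) -> float:
    """Binary interaction term tau_ij = exp(-a_ij / T), with a_ij in kelvin.
    This is the common UNIQUAC parameterization; the paper's a13/a31 values
    are assumed (not confirmed) to follow it."""
    return math.exp(-a_ij_K / T_K)

# Reported FAME + palm oil + glycerol parameters (a13, a31) at, e.g., 45 degC:
T = 318.15  # K
tau_13 = uniquac_tau(580.42, T)
tau_31 = uniquac_tau(-123.69, T)
print(f"tau_13 = {tau_13:.4f}, tau_31 = {tau_31:.4f}")
```

A tau below 1 corresponds to a positive energy parameter (repulsive interaction), a tau above 1 to a negative one.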

  2. Liquid-liquid equilibria for the ternary systems sulfolane + octane + benzene, sulfolane + octane + toluene and sulfolane + octane + p-xylene

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, S.; Kim, H.

    1995-03-01

    Sulfolane is widely used as a solvent for the extraction of aromatic hydrocarbons. Ternary phase equilibrium data are essential for a proper understanding of the solvent extraction process. Liquid-liquid equilibrium data for the systems sulfolane + octane + benzene, sulfolane + octane + toluene, and sulfolane + octane + p-xylene were determined at 298.15, 308.15, and 318.15 K. Tie-line data were satisfactorily correlated by the Othmer-Tobias method. The experimental data were compared with the values calculated by the UNIQUAC and NRTL models. Good quantitative agreement was obtained with these models. However, the calculated values based on the NRTL model were found to be better than those based on the UNIQUAC model.
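
The Othmer-Tobias correlation mentioned above is a linear quality check on tie-line data: ln((1 - w11)/w11) is regressed against ln((1 - w33)/w33) over the tie lines, and a correlation coefficient close to 1 indicates consistency. A minimal sketch, with the phase roles assumed rather than taken from the paper:

```python
import numpy as np

def othmer_tobias_fit(w11, w33):
    """Fit ln((1 - w11)/w11) = a + b * ln((1 - w33)/w33) over tie-line data.
    w11: solvent (sulfolane) mass fraction in the solvent-rich phase;
    w33: diluent (octane) mass fraction in the raffinate phase (assumed roles).
    Returns intercept a, slope b, and correlation coefficient r."""
    w11 = np.asarray(w11, dtype=float)
    w33 = np.asarray(w33, dtype=float)
    x = np.log((1.0 - w33) / w33)
    y = np.log((1.0 - w11) / w11)
    b, a = np.polyfit(x, y, 1)
    r = np.corrcoef(x, y)[0, 1]
    return a, b, r

# Illustrative (not experimental) tie-line mass fractions:
a, b, r = othmer_tobias_fit([0.90, 0.85, 0.80], [0.95, 0.88, 0.78])
print(f"a = {a:.3f}, b = {b:.3f}, r = {r:.4f}")
```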

  3. Isobaric vapor-liquid equilibria for binary systems α-phenylethylamine + toluene and α-phenylethylamine + cyclohexane at 100 kPa

    NASA Astrophysics Data System (ADS)

    Wu, Xiaoru; Gao, Yingyu; Ban, Chunlan; Huang, Qiang

    2016-09-01

    In this paper the results of a vapor-liquid equilibria study at 100 kPa are presented for two binary systems: α-phenylethylamine(1) + toluene(2) and α-phenylethylamine(1) + cyclohexane(2). The binary VLE data of the two systems were correlated by the Wilson, NRTL, and UNIQUAC models. For each binary system the deviations between the correlation results and the experimental data were calculated. For both binary systems the average relative deviations in temperature for the three models were lower than 0.99%. The average absolute deviations in vapor-phase composition (mole fractions) and in temperature T were lower than 0.0271 and 1.93 K, respectively. Thermodynamic consistency was tested for all vapor-liquid equilibrium data by the Herington method. The values calculated by the Wilson and NRTL equations satisfied the thermodynamic consistency test for both systems, while the values calculated by the UNIQUAC equation did not.
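
The Herington area test referenced above compares the positive and negative areas under ln(γ1/γ2) versus x1, with an empirical allowance for the boiling-temperature span of isobaric data. A sketch under the usual D and J definitions (the paper's exact threshold is not reproduced here):

```python
import numpy as np

def _trapz(y, x):
    """Trapezoidal integral of samples y over grid x."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

def herington_test(x1, ln_g1_over_g2, T):
    """Herington area test for isobaric VLE data.
    D = 100*|A+ - A-|/(A+ + A-), J = 150*(Tmax - Tmin)/Tmin;
    data are commonly judged consistent when |D - J| < 10 (assumed criterion)."""
    x1 = np.asarray(x1, dtype=float)
    f = np.asarray(ln_g1_over_g2, dtype=float)
    T = np.asarray(T, dtype=float)
    area = _trapz(f, x1)                 # signed area (A+ minus A-)
    abs_area = _trapz(np.abs(f), x1)     # total area (A+ plus A-)
    a_plus = 0.5 * (abs_area + area)
    a_minus = 0.5 * (abs_area - area)
    D = 100.0 * abs(a_plus - a_minus) / (a_plus + a_minus)
    J = 150.0 * (T.max() - T.min()) / T.min()
    return D, J, abs(D - J) < 10.0

# Illustrative antisymmetric system: areas above and below the axis cancel
x1 = np.linspace(0.0, 1.0, 21)
D, J, ok = herington_test(x1, 0.8 * (1.0 - 2.0 * x1), np.full_like(x1, 383.0))
print(D, J, ok)
```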

  4. Multiphase, multicomponent phase behavior prediction

    NASA Astrophysics Data System (ADS)

    Dadmohammadi, Younas

    Accurate prediction of the phase behavior of fluid mixtures in the chemical industry is essential for designing and operating a multitude of processes. Reliable generalized predictions of phase equilibrium properties, such as pressure, temperature, and phase compositions, offer an attractive alternative to costly and time-consuming experimental measurements. The main purpose of this work was to assess the efficacy of recently generalized activity coefficient models, based on binary experimental data, to (a) predict binary and ternary vapor-liquid equilibrium (VLE) systems and (b) characterize liquid-liquid equilibrium (LLE) systems. These studies were completed using a diverse binary VLE database consisting of 916 binary and 86 ternary systems involving 140 compounds belonging to 31 chemical classes. Specifically, the following tasks were undertaken: First, a comprehensive assessment of the two common approaches (gamma-phi (gamma-ϕ) and phi-phi (ϕ-ϕ)) used for determining the phase behavior of vapor-liquid equilibrium systems is presented. Both the representation and predictive capabilities of these two approaches were examined, as delineated from internal and external consistency tests of 916 binary systems. For this purpose, the universal quasi-chemical (UNIQUAC) model and the Peng-Robinson (PR) equation of state (EOS) were used. Second, the efficacy of the recently developed generalized UNIQUAC and nonrandom two-liquid (NRTL) models for predicting multicomponent VLE systems was investigated. Third, the abilities of the recently modified NRTL models (mNRTL2 and mNRTL1) to characterize liquid-liquid equilibria (LLE) phase conditions and attributes, including phase stability, miscibility, and consolute point coordinates, were assessed. The results of this work indicate that the ϕ-ϕ approach represents the binary VLE systems considered within three times the error of the gamma-ϕ approach. A similar trend was observed for the generalized model predictions using quantitative structure-property relationship (QSPR) parameter generalizations. For ternary systems where all three constituent binary systems were available, the NRTL-QSPR, UNIQUAC-QSPR, and UNIFAC-6 models produce comparable accuracy. For systems where at least one constituent binary is missing, the UNIFAC-6 model produces larger errors than the QSPR-generalized models. In general, the LLE characterization results indicate the accuracy of the modified models in reproducing the findings of the original NRTL model.
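
In the gamma-phi approach discussed above, taking the vapor phase as ideal (all fugacity coefficients equal to 1) reduces the equilibrium condition to modified Raoult's law. A minimal illustration of that limit (not the study's actual implementation, which also used the PR EOS on the phi-phi side):

```python
def bubble_pressure(x, gamma, p_sat):
    """Gamma-phi bubble point with an ideal vapor phase (phi_i = 1):
    modified Raoult's law, P = sum_i x_i * gamma_i * P_i_sat.
    Returns the total pressure and the vapor-phase mole fractions."""
    P = sum(xi * gi * pi for xi, gi, pi in zip(x, gamma, p_sat))
    y = [xi * gi * pi / P for xi, gi, pi in zip(x, gamma, p_sat)]
    return P, y

# Illustrative binary: liquid composition, activity coefficients from any
# gE model (UNIQUAC, NRTL, ...), and pure-component vapor pressures in kPa
P, y = bubble_pressure([0.4, 0.6], [1.3, 1.1], [80.0, 45.0])
print(f"P = {P:.1f} kPa, y = {y}")
```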

  5. Description of Adsorption in Liquid Chromatography under Nonideal Conditions.

    PubMed

    Ortner, Franziska; Ruppli, Chantal; Mazzotti, Marco

    2018-05-15

    A thermodynamically consistent description of binary adsorption in reversed-phase chromatography is presented, accounting for thermodynamic nonidealities in the liquid and adsorbed phases. The investigated system involves the adsorbent Zorbax 300SB-C18, as well as phenetole and 4-tert-butylphenol as solutes and methanol and water as inert components forming the eluent. The description is based on adsorption isotherms, which are a function of the liquid-phase activities, to account for nonidealities in the liquid phase. Liquid-phase activities are calculated with a UNIQUAC model established in this work, based on experimental phase equilibrium data. The binary interaction in the adsorbed phase is described by the adsorbed solution theory, assuming an ideal (ideal adsorbed solution theory) or real (real adsorbed solution theory) adsorbed phase. Implementation of the established adsorption model in a chromatographic code achieves a quantitative description of experimental elution profiles, with feed compositions exploiting the entire miscible region, and involving a broad range of different eluent compositions (methanol/water). The quantitative agreement of the model and experimental data serves as a confirmation of the underlying physical (thermodynamic) concepts and of their applicability to a broad range of operating conditions.

  6. Protein solubility modeling

    NASA Technical Reports Server (NTRS)

    Agena, S. M.; Pusey, M. L.; Bogle, I. D.

    1999-01-01

    A thermodynamic framework (UNIQUAC model with temperature dependent parameters) is applied to model the salt-induced protein crystallization equilibrium, i.e., protein solubility. The framework introduces a term for the solubility product describing protein transfer between the liquid and solid phase and a term for the solution behavior describing deviation from ideal solution. Protein solubility is modeled as a function of salt concentration and temperature for a four-component system consisting of a protein, pseudo solvent (water and buffer), cation, and anion (salt). Two different systems, lysozyme with sodium chloride and concanavalin A with ammonium sulfate, are investigated. Comparison of the modeled and experimental protein solubility data results in an average root mean square deviation of 5.8%, demonstrating that the model closely follows the experimental behavior. Model calculations and model parameters are reviewed to examine the model and protein crystallization process. Copyright 1999 John Wiley & Sons, Inc.

  7. Phase diagrams for the system water/butyric acid/propylene carbonate at T = 293.2-313.2 K and p = 101.3 kPa

    NASA Astrophysics Data System (ADS)

    Shekarsaraee, Sina; Nahzomi, Hossein Taherpour; Nasiri-Touli, Elham

    2017-11-01

    Phase diagrams for the system water/butyric acid/propylene carbonate were plotted at T = 293.2, 303.2, and 313.2 K and p = 101.3 kPa. Acidimetric titration and refractive index methods were used to determine tie-line data. Solubility data revealed that the studied system exhibits type-1 liquid-liquid equilibrium behavior. The experimental data were regressed and acceptably correlated using the UNIQUAC and NRTL models. The results indicate that propylene carbonate is a suitable separating agent for aqueous mixtures of butyric acid.

  8. Liquid-liquid equilibria for water + ethanol + 2-methylpropyl ethanoate and water + ethanol + 1,2-dibromoethane at 298. 15 K

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Solimo, H.N.; Barnes de Arreguez, N.G.

    1994-01-01

    Liquid-liquid equilibrium, distribution coefficients, and selectivities of the systems water + ethanol + 2-methylpropyl ethanoate or + 1,2-dibromoethane have been determined at 298.15 K in order to evaluate their suitability for preferentially extracting ethanol from aqueous solution. Tie-line data were satisfactorily correlated by the Othmer-Tobias method, and the plait point coordinates for the two systems were estimated. The experimental data were compared with the values calculated by the NRTL and UNIQUAC models. The water + ethanol + 2-methylpropyl ethanoate system was also compared with the values predicted by the UNIFAC model. Poor qualitative agreement was obtained with these models. From the experimental results, it can be concluded that both solvents are inappropriate for ethanol extraction from aqueous solutions.

  9. Ternary liquid-liquid equilibrium for eugenol + tert-butanol + water system at 303.15 and 323.15K and atmospheric pressure

    NASA Astrophysics Data System (ADS)

    Sucipto, Retno Kumala Hesti; Kuswandi, Wibawa, Gede

    2017-05-01

    The objective of this study was to determine ternary liquid-liquid equilibrium for the eugenol + tert-butanol + water system at 303.15 and 323.15 K and atmospheric pressure. A 25 mL equilibrium cell, equipped with a water jacket connected to a water bath, was used to keep the equilibrium temperature constant. The experiment was conducted by charging a mixture of eugenol + tert-butanol + water at a given composition into the equilibrium cell. The solution was stirred for 4 hours and then allowed to settle for 20 hours so that the aqueous and organic phases separated completely. The temperature of the equilibrium cell and the atmospheric pressure were recorded as the equilibrium temperature and pressure for each measurement. The equilibrium compositions of each phase were analyzed using gas chromatography. The experimental data obtained in this work were correlated with the NRTL and UNIQUAC models, with root mean square deviations between experimental and calculated equilibrium compositions of 0.03% and 0.04%, respectively.
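
The root mean square deviation quoted at the end of the abstract is typically computed over all experimental and model-calculated tie-line compositions. A minimal sketch (illustrative numbers, not the paper's data):

```python
import math

def rmsd(x_exp, x_calc):
    """Root mean square deviation between experimental and model-calculated
    equilibrium mole fractions, as commonly reported for LLE correlations."""
    n = len(x_exp)
    return math.sqrt(sum((e - c) ** 2 for e, c in zip(x_exp, x_calc)) / n)

# Illustrative compositions for one tie-line end:
print(rmsd([0.10, 0.55, 0.35], [0.1004, 0.5497, 0.3499]))
```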

  10. Volumetric Properties, Viscosities, and Refractive Indices of the Binary Systems 1-Butanol + PEG 200, + PEG 400, and + TEGDME

    NASA Astrophysics Data System (ADS)

    Živković, N.; Šerbanović, S.; Kijevčanin, M.; Živković, E.

    2013-06-01

    Densities, viscosities, and refractive indices of three binary systems consisting of 1-butanol with polyethylene glycols of different molecular weights (PEG 200 and PEG 400) or tetraethylene glycol dimethyl ether (TEGDME) were measured at ten temperatures (288.15, 293.15, 298.15, 303.15, 308.15, 313.15, 318.15, 323.15, 328.15, and 333.15) K and atmospheric pressure. Densities of the selected binary mixtures were measured with an Anton Paar DMA 5000 digital vibrating U-tube densimeter, refractive indices were measured with an automatic Anton Paar RXA-156 refractometer, while for viscosity measurements, a digital Stabinger SVM 3000/G2 viscometer was used. From these data, excess molar volumes were calculated and fitted to the Redlich-Kister equation. The obtained results have been analyzed in terms of specific molecular interactions and mixing behavior between mixture components, as well as the influence of temperature on them. Viscosity data were also correlated by Grunberg-Nissan, Eyring-UNIQUAC, three-body McAlister, and Eyring-NRTL models.
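
The Redlich-Kister fit of excess molar volumes mentioned above is a linear least-squares problem once the polynomial basis is written out: V^E = x1(1 - x1) Σk Ak (2x1 - 1)^k. A sketch with synthetic (not measured) data:

```python
import numpy as np

def redlich_kister_fit(x1, v_excess, order=2):
    """Fit V^E = x1*(1 - x1) * sum_k A_k * (2*x1 - 1)**k by linear least
    squares. Returns the A_k coefficients (k = 0..order)."""
    x1 = np.asarray(x1, dtype=float)
    vE = np.asarray(v_excess, dtype=float)
    basis = np.column_stack([x1 * (1 - x1) * (2 * x1 - 1) ** k
                             for k in range(order + 1)])
    coeffs, *_ = np.linalg.lstsq(basis, vE, rcond=None)
    return coeffs

# Synthetic data generated from A = [-1.2, 0.3, 0.1] (illustrative values):
x = np.linspace(0.05, 0.95, 10)
vE = x * (1 - x) * (-1.2 + 0.3 * (2 * x - 1) + 0.1 * (2 * x - 1) ** 2)
A = redlich_kister_fit(x, vE)
print(np.round(A, 6))
```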

  11. Separating Iso-Propanol-Toluene mixture by azeotropic distillation

    NASA Astrophysics Data System (ADS)

    Iqbal, Asma; Ahmad, Syed Akhlaq

    2018-05-01

    The separation of the isopropanol-toluene azeotropic mixture using acetone as an entrainer has been simulated in the Aspen Plus software package using rigorous methods. Vapor-liquid equilibrium calculations for the binary system are done using the UNIQUAC-RK model, which gives good agreement with the experimental data reported in the literature. The effects of the reflux ratio (RR), distillate-to-feed molar ratio (D/F), feed stage, solvent feed stage, total number of stages, and solvent feed temperature on the product purities and recoveries are studied to obtain the optimum values that give the maximum purity and recovery of products. The configuration consists of 20 theoretical stages with an equimolar feed of the binary mixture. The desired separation of the binary mixture was achieved with feed and entrainer feed stages of 15 and 12, reflux ratios of 2.5 and 4.0, and D/F ratios of 0.75 and 0.54 in the two columns, respectively. The simulation results thus obtained are useful for setting up the optimal column configuration of the azeotropic distillation process.

  12. Modeling of the phase equilibria of polystyrene in methylcyclohexane with semi-empirical quantum mechanical methods I.

    PubMed

    Wilczura-Wachnik, Hanna; Jónsdóttir, Svava Osk

    2003-04-01

    A method for calculating interaction parameters traditionally used in phase-equilibrium computations for low-molecular-weight systems has been extended to the prediction of solvent activities of aromatic polymer solutions (polystyrene + methylcyclohexane). Using ethylbenzene as a model compound for the repeating unit of the polymer, the intermolecular interaction energies between the solvent molecule and the polymer were simulated, employing the semiempirical quantum chemical method AM1 together with a previously developed method for sampling relevant internal orientations of a pair of molecules. Interaction energies were determined for three molecular pairs (the solvent and the model molecule, two solvent molecules, and two model molecules) and used to calculate the UNIQUAC interaction parameters a(ij) and a(ji). Using these parameters, the solvent activities of the polystyrene (90,000 amu) + methylcyclohexane system and the total vapor pressures of the methylcyclohexane + ethylbenzene system were calculated. The latter system was compared to experimental data, giving qualitative agreement. [Figure caption: Solvent activities for the methylcyclohexane(1) + polystyrene(2) system at 316 K. Parameters a(ij) (blue line) obtained with the AM1 method; parameters a(ij) (pink line) from VLE data for the ethylbenzene + methylcyclohexane system. The abscissa is the polymer weight fraction, defined as y2(x1) = (1 - x1)M2/[x1M1 + (1 - x1)M2], where x1 is the solvent mole fraction and Mi are the molecular weights of the components.]
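
The weight-fraction definition from the figure caption can be written directly as a function. The molar mass of methylcyclohexane (about 98.19 g/mol) is assumed here for illustration:

```python
def polymer_weight_fraction(x1, M1, M2):
    """Weight fraction of polymer (2) from solvent mole fraction x1:
    y2 = (1 - x1)*M2 / (x1*M1 + (1 - x1)*M2)."""
    return (1 - x1) * M2 / (x1 * M1 + (1 - x1) * M2)

# Methylcyclohexane (M1 ~ 98.19 g/mol) + polystyrene of 90,000 amu:
print(polymer_weight_fraction(0.999, 98.19, 90000.0))
```

Because M2 >> M1, even a tiny polymer mole fraction corresponds to a large polymer weight fraction, which is why the caption plots against y2 rather than x2.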

  13. Mutual diffusion of binary liquid mixtures containing methanol, ethanol, acetone, benzene, cyclohexane, toluene, and carbon tetrachloride

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guevara-Carrion, Gabriela; Janzen, Tatjana; Muñoz-Muñoz, Y. Mauricio

    Mutual diffusion coefficients of all 20 binary liquid mixtures that can be formed out of methanol, ethanol, acetone, benzene, cyclohexane, toluene, and carbon tetrachloride without a miscibility gap are studied at ambient conditions of temperature and pressure in the entire composition range. The considered mixtures show varying mixing behavior from almost ideal to strongly non-ideal. Predictive molecular dynamics simulations employing the Green-Kubo formalism are carried out. Radial distribution functions are analyzed to gain an understanding of the liquid structure influencing the diffusion processes. It is shown that cluster formation in mixtures containing one alcoholic component has a significant impact on the diffusion process. The estimation of the thermodynamic factor from experimental vapor-liquid equilibrium data is investigated, considering three excess Gibbs energy models, i.e., Wilson, NRTL, and UNIQUAC. It is found that the Wilson model yields the thermodynamic factor that best suits the simulation results for the prediction of the Fick diffusion coefficient. Four semi-empirical methods for the prediction of the self-diffusion coefficients and nine predictive equations for the Fick diffusion coefficient are assessed, and it is found that methods based on local composition models are more reliable. Finally, the shear viscosity and thermal conductivity are predicted and in most cases compare favorably with experimental literature values.
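
The thermodynamic factor estimated from the gE models above is Γ = 1 + x1 ∂lnγ1/∂x1, which converts the Maxwell-Stefan diffusion coefficient into the Fick coefficient. A numerical sketch, using a one-parameter Margules expression as a stand-in for the Wilson/NRTL/UNIQUAC models actually compared in the paper:

```python
def thermodynamic_factor(ln_gamma1, x1, h=1e-6):
    """Gamma = 1 + x1 * d(ln gamma1)/dx1, evaluated by central difference
    from any activity coefficient function ln_gamma1(x1)."""
    d = (ln_gamma1(x1 + h) - ln_gamma1(x1 - h)) / (2.0 * h)
    return 1.0 + x1 * d

# One-parameter Margules model as an illustrative gE model: ln g1 = A*(1-x1)^2
A = 1.5
ln_g1 = lambda x1: A * (1.0 - x1) ** 2
x1 = 0.4
# Analytic result for this model: Gamma = 1 - 2*A*x1*(1 - x1)
print(thermodynamic_factor(ln_g1, x1))
```

For A > 2 this Γ goes negative somewhere in the composition range, signaling the liquid-liquid demixing that limits which of the 20 mixtures are fully miscible.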

  14. Ether bond effects in quaternary ammonium and phosphonium ionic liquid-propanol solutions

    NASA Astrophysics Data System (ADS)

    Kishimura, Hiroaki; Kohki, Erica; Nakada, Ayumu; Tamatani, Kentaro; Abe, Hiroshi

    2018-03-01

    The liquid-liquid equilibria (LLE) of quaternary ammonium and phosphonium ionic liquid (IL)-propanol solutions were examined. The ILs contained cations with or without ether bonds; the anion in all the ILs was bis(trifluoromethanesulfonyl)imide (TFSI-). The cations without ether groups are tributylmethyl ammonium (N4441+), triethylpentyl phosphonium (P2225+), triethyloctyl phosphonium (P2228+), and tributylmethyl phosphonium (P4441+). The cations containing ether groups are N,N-diethyl-N-methyl-N-(2-methoxyethyl) ammonium (N122(2O1)+), triethyl(methoxymethyl) phosphonium (P222(1O1)+), and triethyl(2-methoxyethyl) phosphonium (P222(2O1)+). A propanol isomer effect on the LLE was observed, reflecting the geometrical factors and hydrophobicities of 1-propanol and 2-propanol. According to Raman spectroscopy, the TFSI- anion conformers in the mixtures were altered in the presence of ether bonds in the cations. The universal quasichemical (UNIQUAC) interaction parameters are consistent with the significant factors affecting IL-propanol solutions, such as the type of cation (ammonium or phosphonium), ether bonds, TFSI- conformers, and propanol isomer effects.

  15. Physicochemical properties and solubility of alkyl-(2-hydroxyethyl)-dimethylammonium bromide.

    PubMed

    Domańska, Urszula; Bogel-Łukasik, Rafał

    2005-06-23

    Quaternary ammonium salts, which are precursors of ionic liquids, have been prepared from N,N-dimethylethanolamine as a substrate. The paper includes basic characterization of the synthesized compounds via the following procedures: nuclear magnetic resonance (NMR) and Fourier transform infrared (FTIR) spectra, water content, mass spectrometry (MS), decomposition temperatures, basic thermodynamic properties of the pure ionic liquids (melting point, enthalpy of fusion, enthalpy of solid-solid phase transition, glass transition), and the difference in solute heat capacity between the liquid and solid at the melting temperature, determined by differential scanning calorimetry (DSC). The (solid + liquid) phase equilibria of binary mixtures containing quaternary ammonium salt + water, or + 1-octanol, have been measured by a dynamic method over a wide range of temperatures, from 230 K to 560 K. These data were correlated by means of the UNIQUAC ASM and modified nonrandom two-liquid (NRTL1) equations utilizing parameters derived from the (solid + liquid) equilibrium. The partition coefficient of the ionic liquid in the 1-octanol/water binary system has been calculated from the solubility results. Experimental partition coefficients (log P) were negative at the three temperatures studied.

  16. Phase behaviour, interactions, and structural studies of (amines+ionic liquids) binary mixtures.

    PubMed

    Jacquemin, Johan; Bendová, Magdalena; Sedláková, Zuzana; Blesic, Marijana; Holbrey, John D; Mullan, Claire L; Youngs, Tristan G A; Pison, Laure; Wagner, Zdeněk; Aim, Karel; Costa Gomes, Margarida F; Hardacre, Christopher

    2012-05-14

    We present a study on the phase equilibrium behaviour of binary mixtures containing two 1-alkyl-3-methylimidazolium bis{(trifluoromethyl)sulfonyl}imide-based ionic liquids, [C(n)mim] [NTf(2)] (n=2 and 4), mixed with diethylamine or triethylamine as a function of temperature and composition using different experimental techniques. Based on this work, two systems showing an LCST and one system with a possible hourglass shape are measured. Their phase behaviours are then correlated and predicted by using Flory-Huggins equations and the UNIQUAC method implemented in Aspen. The potential of the COSMO-RS methodology to predict the phase equilibria was also tested for the binary systems studied. However, this methodology is unable to predict the trends obtained experimentally, limiting its use for systems involving amines in ionic liquids. The liquid-state structure of the binary mixture ([C(2)mim] [NTf(2)]+diethylamine) is also investigated by molecular dynamics simulation and neutron diffraction. Finally, the absorption of gaseous ethane by the ([C(2)mim][NTf(2)]+diethylamine) binary mixture is determined and compared with that observed in the pure solvents. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. Studying of drug solubility in water and alcohols using drug-ammonium ionic liquid-compounds.

    PubMed

    Halayqa, Mohammad; Pobudkowska, Aneta; Domańska, Urszula; Zawadzki, Maciej

    2018-01-01

    Synthesis of three mefenamic acid (MEF) derivatives - ionic liquid compounds composed of MEF in anionic form and an ammonium cation, namely choline (MEF1), di(2-hydroxyethyl)dimethylammonium (MEF2), or tri(2-hydroxyethyl)methylammonium (MEF3) - is presented. The basic thermal properties of the pure compounds, i.e., fusion temperatures and enthalpies of fusion, have been measured with differential scanning microcalorimetry (DSC). Molar volumes have been calculated with the Barton group contribution method. The solubilities of MEF1, MEF2, and MEF3 were measured by the dynamic method at constant pH over a temperature range from (290 to 370) K in three solvents: water, ethanol, and 1-octanol. The experimental solubility data have been correlated by means of three commonly used excess Gibbs energy (GE) equations, the Wilson, NRTL, and UNIQUAC, with the assumption that the systems studied here exhibit simple eutectic behaviour. The activity coefficients of the pharmaceuticals in saturated solution in each binary mixture were calculated from the experimental data. The formation of MEF-ionic liquid compounds greatly increases the solubility in water in comparison with pure MEF or complexes with 2-hydroxypropyl-β-cyclodextrin. The development of formulations of these compounds should assist medication design for oral solid or gel medicines. Copyright © 2017 Elsevier B.V. All rights reserved.
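
Solubility correlations of the kind described above usually start from the simplified solid-liquid equilibrium relation ln(x1γ1) = -(ΔHfus/R)(1/T - 1/Tfus), with γ1 supplied by the Wilson, NRTL, or UNIQUAC equation. A sketch of the ideal-solution limit (γ1 = 1), with illustrative rather than measured properties:

```python
import math

R = 8.314  # J/(mol K)

def ideal_solubility(T, T_fus, dH_fus):
    """Simplified solid-liquid equilibrium relation used to correlate
    solubility data: ln(x1*gamma1) = -(dH_fus/R)*(1/T - 1/T_fus).
    Returns the saturation mole fraction x1 assuming gamma1 = 1."""
    return math.exp(-(dH_fus / R) * (1.0 / T - 1.0 / T_fus))

# Illustrative solute (not MEF data): T_fus = 450 K, dH_fus = 25 kJ/mol
print(ideal_solubility(310.0, 450.0, 25000.0))
```

In the eutectic systems assumed in the paper, the correlation step fits the gE model parameters so that x1γ1 from this relation reproduces the measured solubility at each temperature.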

  18. Physico-Chemical Properties and Phase Behaviour of Pyrrolidinium-Based Ionic Liquids

    PubMed Central

    Domańska, Urszula

    2010-01-01

    A review of the relevant literature on 1-alkyl-1-methylpyrrolidinium-based ionic liquids has been presented. The phase diagrams for the binary systems of {1-ethyl-1-methylpyrrolidinium trifluoromethanesulfonate (triflate) [EMPYR][CF3SO3] + water, or + 1-butanol} and for the binary systems of {1-propyl-1-methylpyrrolidinium trifluoromethanesulfonate (triflate) [PMPYR][CF3SO3] + water, or + an alcohol (1-butanol, 1-hexanol, 1-octanol, 1-decanol)} have been determined at atmospheric pressure using a dynamic method. The influence of alcohol chain length was discussed for the [PMPYR][CF3SO3]. A systematic decrease in the solubility was observed with an increase of the alkyl chain length of an alcohol. (Solid + liquid) phase equilibria with complete miscibility in the liquid phase region were observed for the systems involving water and alcohols. The solubility of the ionic liquid increases as the alkyl chain length on the pyrrolidinium cation increases. The correlation of the experimental data has been carried out using the Wilson, UNIQUAC and the NRTL equations. The phase diagrams reported here have been compared to the systems published earlier with the 1-alkyl-1-methylpyrrolidinium-based ionic liquids. The influence of the cation and anion on the phase behaviour has been discussed. The basic thermal properties of pure ILs, i.e., melting temperature and the enthalpy of fusion, the solid-solid phase transition temperature and enthalpy have been measured using a differential scanning microcalorimetry technique. PMID:20480044

  19. Effects of initial saturation on properties modification and displacement of tetrachloroethene with aqueous isobutanol.

    PubMed

    Boyd, Glen R; Ocampo-Gómez, Ana M; Li, Minghua; Husserl, Johana

    2006-11-20

    Packed column experiments were conducted to study the effects of initial saturation of tetrachloroethene (PCE) in the range of 1.0-14% pore volume (PV) on mobilization and downward migration of the non-aqueous phase liquid (NAPL) product upon contact with aqueous isobutanol (approximately 10 vol.%). This study focused on the consequences of swelling beyond residual saturation. Columns were packed with mixtures of neat PCE, water, and glass beads and waterflooded to establish a desired homogeneous residual saturation, and then flooded with aqueous isobutanol under controlled hydraulic conditions. Results showed a critical saturation of approximately 8% PV for these packed column experimental conditions. At low initial PCE saturations (<8% PV), experimental results showed reduced risk of NAPL-product migration upon contact with aqueous isobutanol. At higher initial PCE saturations (>8% PV), results showed NAPL-product mobilization and downward migration, which was attributed to interfacial tension (IFT) reduction, swelling of the NAPL product, and reduced density modification. Packed column results were compared with good agreement to theoretical predictions of NAPL-product mobilization using the total trapping number, N(T). In addition to the packed column study, preliminary batch experiments were conducted to study the effects of PCE volumetric fraction in the range of 0.5-20% on density, viscosity, and IFT modification as a function of time following contact with aqueous isobutanol (approximately 10 vol.%). Modified NAPL-product fluid properties approached equilibrium within approximately 2 h of contact for density and viscosity. IFT reduction occurred immediately, as expected. Measured fluid properties were compared with good agreement to theoretical equilibrium predictions based on UNIQUAC. Overall, this study demonstrates the importance of initial DNAPL saturation, and the associated risk of downward NAPL-product migration, in applying alcohol flooding for remediation of DNAPL-contaminated groundwater sites.
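
The total trapping number N(T) mentioned above combines viscous and buoyancy forces against capillary trapping. The form below follows the commonly cited Pennell et al. definition, which is assumed here; the paper's exact variant and its experimental inputs are not reproduced:

```python
import math

def trapping_number(q, mu, sigma, k, k_rw, d_rho, alpha_deg, g=9.81):
    """Total trapping number (assumed Pennell et al. form):
    N_Ca = q*mu/sigma                (capillary number, Darcy flux q),
    N_B  = d_rho*g*k*k_rw/sigma     (Bond number),
    N_T  = sqrt(N_Ca^2 + 2*N_Ca*N_B*sin(alpha) + N_B^2),
    where alpha is the flow angle from horizontal (90 deg = vertical)."""
    n_ca = q * mu / sigma
    n_b = d_rho * g * k * k_rw / sigma
    a = math.radians(alpha_deg)
    return math.sqrt(n_ca**2 + 2.0 * n_ca * n_b * math.sin(a) + n_b**2)

# Illustrative SI values (hypothetical, not the paper's conditions):
nt = trapping_number(q=1e-5, mu=1e-3, sigma=5e-3, k=1e-11, k_rw=0.5,
                     d_rho=600.0, alpha_deg=90.0)
print(nt)
```

For vertical flow the expression collapses to N_T = N_Ca + N_B, which makes the reduced-IFT, increased-buoyancy mechanism described in the abstract explicit: lowering sigma raises both numbers at once.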

  20. 40 CFR 600.512-12 - Model year report.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... CFR parts 531 or 533 as applicable, and the applicable fleet average CO2 emission standards. Model... standards. Model year reports shall include a statement that the method of measuring vehicle track width... models and the applicable in-use CREE emission standard. The list of models shall include the applicable...


  3. Open data models for smart health interconnected applications: the example of openEHR.

    PubMed

    Demski, Hans; Garde, Sebastian; Hildebrand, Claudia

    2016-10-22

    Smart Health is known as a concept that enhances networking, intelligent data processing and combining patient data with other parameters. Open data models can play an important role in creating a framework for providing interoperable data services that support the development of innovative Smart Health applications profiting from data fusion and sharing. This article describes a model-driven engineering approach based on standardized clinical information models and explores its application for the development of interoperable electronic health record systems. The following possible model-driven procedures were considered: provision of data schemes for data exchange, automated generation of artefacts for application development and native platforms that directly execute the models. The applicability of the approach in practice was examined using the openEHR framework as an example. A comprehensive infrastructure for model-driven engineering of electronic health records is presented using the example of the openEHR framework. It is shown that data schema definitions to be used in common practice software development processes can be derived from domain models. The capabilities for automatic creation of implementation artefacts (e.g., data entry forms) are demonstrated. Complementary programming libraries and frameworks that foster the use of open data models are introduced. Several compatible health data platforms are listed. They provide standard based interfaces for interconnecting with further applications. Open data models help build a framework for interoperable data services that support the development of innovative Smart Health applications. Related tools for model-driven application development foster semantic interoperability and interconnected innovative applications.

  4. Review of applications for SIMDEUM, a stochastic drinking water demand model with a small temporal and spatial scale

    NASA Astrophysics Data System (ADS)

    Blokker, Mirjam; Agudelo-Vera, Claudia; Moerman, Andreas; van Thienen, Peter; Pieterse-Quirijns, Ilse

    2017-04-01

    Many researchers have developed drinking water demand models with various temporal and spatial scales. A limited number of models is available at a temporal scale of 1 s and a spatial scale of a single home. The reasons for building these models were described in the papers in which the models were introduced, along with a discussion on their potential applications. However, the predicted applications are seldom re-examined. SIMDEUM, a stochastic end-use model for drinking water demand, has often been applied in research and practice since it was developed. We are therefore re-examining its applications in this paper. SIMDEUM's original purpose was to calculate maximum demands in order to design self-cleaning networks. Yet, the model has been useful in many more applications. This paper gives an overview of the many fields of application for SIMDEUM and shows where this type of demand model is indispensable and where it has limited practical value. This overview also leads to an understanding of the requirements for demand models in various applications.

  5. Business model framework applications in health care: A systematic review.

    PubMed

    Fredriksson, Jens Jacob; Mazzocato, Pamela; Muhammed, Rafiq; Savage, Carl

    2017-11-01

    It has proven to be a challenge for health care organizations to achieve the Triple Aim. In the business literature, business model frameworks have been used to understand how organizations are aligned to achieve their goals. We conducted a systematic literature review with an explanatory synthesis approach to understand how business model frameworks have been applied in health care. We found a large increase in applications of business model frameworks during the last decade. E-health was the most common context of application. We identified six applications of business model frameworks: business model description, financial assessment, classification based on pre-defined typologies, business model analysis, development, and evaluation. Our synthesis suggests that the choice of business model framework and constituent elements should be informed by the intent and context of application. We see a need for harmonization in the choice of elements in order to increase generalizability, simplify application, and help organizations realize the Triple Aim.

  6. A model of cloud application assignments in software-defined storages

    NASA Astrophysics Data System (ADS)

    Bolodurina, Irina P.; Parfenov, Denis I.; Polezhaev, Petr N.; Shukhman, Alexander E.

    2017-01-01

    The aim of this study is to analyze the structure and mechanisms of interaction of typical cloud applications and to suggest approaches to optimize their placement in storage systems. In this paper, we describe a generalized model of cloud applications comprising three basic layers: a model of the application, a model of the service, and a model of the resource. The distinctive feature of the suggested model is that it analyzes cloud resources both from the user's point of view and from that of the software-defined infrastructure of the virtual data center (DC). The novelty of this model lies in simultaneously describing the placement of application data and the state of the virtual environment, taking the network topology into account. The model of software-defined storage has been developed as a submodel within the resource model. This model allows implementing an algorithm for controlling cloud application assignments in software-defined storages. Experiments showed that this algorithm decreases cloud application response time and improves performance in processing user requests. The use of software-defined data storages also reduces the number of physical storage devices required, which demonstrates the efficiency of our algorithm.
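    The placement idea described in this abstract can be illustrated with a small greedy sketch; the data structures, node names, and capacity-based heuristic below are our own illustrative assumptions, not the algorithm from the paper:

```python
def assign_applications(apps, storages):
    """Greedy placement sketch: put each application's data on the storage
    node with the most free capacity. The structures and heuristic are
    illustrative assumptions, not the algorithm from the paper."""
    free = dict(storages)                       # node -> free capacity
    placement = {}
    # place the largest data sets first to reduce fragmentation
    for app, size in sorted(apps.items(), key=lambda kv: -kv[1]):
        node = max(free, key=free.get)          # node with most free space
        if free[node] < size:
            raise ValueError(f"no node can hold {app}")
        placement[app] = node
        free[node] -= size
    return placement

placement = assign_applications(
    {"app_a": 40, "app_b": 30, "app_c": 30},    # app -> data size (GB)
    {"node1": 70, "node2": 50},                 # node -> capacity (GB)
)
```

    A real software-defined storage controller would also weigh network topology and virtual-machine state, as the abstract notes; this sketch covers capacity only.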

  7. Exponential Models of Legislative Turnover. [and] The Dynamics of Political Mobilization, I: A Model of the Mobilization Process, II: Deductive Consequences and Empirical Application of the Model. Applications of Calculus to American Politics. [and] Public Support for Presidents. Applications of Algebra to American Politics. Modules and Monographs in Undergraduate Mathematics and Its Applications Project. UMAP Units 296-300.

    ERIC Educational Resources Information Center

    Casstevens, Thomas W.; And Others

    This document consists of five units, all of which present applications of mathematics to American politics. The first three present calculus applications; the last two deal with applications of algebra. The first module is geared to teach a student how to: 1) compute estimates of the value of the parameters in negative exponential models; and draw…

  8. The Missing Stakeholder Group: Why Patients Should be Involved in Health Economic Modelling.

    PubMed

    van Voorn, George A K; Vemer, Pepijn; Hamerlijnck, Dominique; Ramos, Isaac Corro; Teunissen, Geertruida J; Al, Maiwenn; Feenstra, Talitha L

    2016-04-01

    Evaluations of healthcare interventions, e.g. new drugs or other new treatment strategies, commonly include a cost-effectiveness analysis (CEA) that is based on the application of health economic (HE) models. As end users, patients are important stakeholders regarding the outcomes of CEAs, yet their knowledge of HE model development and application, or their involvement therein, is absent. This paper considers possible benefits and risks of patient involvement in HE model development and application for modellers and patients. An exploratory review of the literature has been performed on stakeholder-involved modelling in various disciplines. In addition, Dutch patient experts have been interviewed about their experience in, and opinion about, the application of HE models. Patients have little to no knowledge of HE models and are seldom involved in HE model development and application. Benefits of becoming involved would include a greater understanding and possible acceptance by patients of HE model application, improved model validation, and a more direct infusion of patient expertise. Risks would include patient bias and increased costs of modelling. Patient involvement in HE modelling seems to carry several benefits as well as risks. We claim that the benefits may outweigh the risks and that patients should become involved.

  9. Modelface: an Application Programming Interface (API) for Homology Modeling Studies Using Modeller Software

    PubMed Central

    Sakhteman, Amirhossein; Zare, Bijan

    2016-01-01

    An interactive application, Modelface, was presented for the Modeller software on the Windows platform. The application is able to run all steps of homology modeling, including PDB-to-FASTA generation, running Clustal, model building, and loop refinement. Other modules of Modeller, including energy calculation, energy minimization, and the ability to make single-point mutations in PDB structures, are also implemented inside Modelface. The API is a simple batch-based application with minimal memory overhead and is free of charge for academic use. The application is also able to repair missing atom types in PDB structures, making it suitable for many molecular modeling studies such as docking and molecular dynamics simulation. Some successful instances of modeling studies using Modelface are also reported. PMID:28243276

  10. Approaches for the Application of Physiologically Based ...

    EPA Pesticide Factsheets

    EPA released the final report, Approaches for the Application of Physiologically Based Pharmacokinetic (PBPK) Models and Supporting Data in Risk Assessment, as announced in a September 22, 2006 Federal Register Notice. This final report addresses the application and evaluation of PBPK models for risk assessment purposes. These models represent an important class of dosimetry models that are useful for predicting internal dose at target organs for risk assessment applications.

  11. The use of tetrahedral mesh geometries in Monte Carlo simulation of applicator based brachytherapy dose distributions

    NASA Astrophysics Data System (ADS)

    Paiva Fonseca, Gabriel; Landry, Guillaume; White, Shane; D'Amours, Michel; Yoriyaz, Hélio; Beaulieu, Luc; Reniers, Brigitte; Verhaegen, Frank

    2014-10-01

    Accounting for brachytherapy applicator attenuation is part of the recommendations from the recent report of AAPM Task Group 186. To do so, model-based dose calculation algorithms require accurate modelling of the applicator geometry. This can be non-trivial in the case of irregularly shaped applicators such as the Fletcher Williamson gynaecological applicator or balloon applicators with possibly irregular shapes employed in accelerated partial breast irradiation (APBI) performed using electronic brachytherapy sources (EBS). While many of these applicators can be modelled using constructive solid geometry (CSG), the latter may be difficult and time-consuming. Alternatively, these complex geometries can be modelled using tessellated geometries such as tetrahedral meshes (mesh geometries (MG)). Recent versions of the Monte Carlo (MC) codes Geant4 and MCNP6 allow for the use of MG. The goal of this work was to model a series of applicators relevant to brachytherapy using MG. Applicators designed for 192Ir sources and 50 kV EBS were studied: a shielded vaginal applicator, a shielded Fletcher Williamson applicator and an APBI balloon applicator. All applicators were modelled in Geant4 and MCNP6 using MG and CSG for dose calculations. CSG-derived dose distributions were considered as reference and used to validate MG models by comparing dose distribution ratios. In general, agreement within 1% was observed for all applicators between MG and CSG and between codes when considering volumes inside the 25% isodose surface. When compared to CSG, MG required longer computation times by a factor of at least 2 for MC simulations using the same code. MCNP6 calculation times were more than ten times shorter than Geant4 in some cases. In conclusion, we presented methods allowing for high-fidelity modelling with results equivalent to CSG. To the best of our knowledge, MG offers the most accurate representation of an irregular APBI balloon applicator.
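    Dose scoring on a tetrahedral mesh ultimately relies on locating sample points inside individual tetrahedra. As a hedged illustration (this is the standard signed-volume test, not the Geant4 or MCNP6 implementation), a point-in-tetrahedron check can be written as:

```python
def _det3(u, v, w):
    """Determinant of the 3x3 matrix with rows u, v, w."""
    return (u[0] * (v[1] * w[2] - v[2] * w[1])
            - u[1] * (v[0] * w[2] - v[2] * w[0])
            + u[2] * (v[0] * w[1] - v[1] * w[0]))

def _sub(u, v):
    return (u[0] - v[0], u[1] - v[1], u[2] - v[2])

def signed_volume(a, b, c, d):
    """Signed volume of the tetrahedron with vertices a, b, c, d."""
    return _det3(_sub(b, a), _sub(c, a), _sub(d, a)) / 6.0

def point_in_tetrahedron(p, a, b, c, d):
    """p lies inside (or on the boundary of) the tetrahedron iff replacing
    each vertex in turn by p preserves the sign of the signed volume."""
    s = signed_volume(a, b, c, d)
    parts = [signed_volume(p, b, c, d), signed_volume(a, p, c, d),
             signed_volume(a, b, p, d), signed_volume(a, b, c, p)]
    return all(v * s >= 0 for v in parts)
```

    A mesh-based geometry then reduces "which cell is this point in" to repeating this test over candidate tetrahedra (typically accelerated with a spatial index).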

  12. Chemistry Teachers' Knowledge and Application of Models

    ERIC Educational Resources Information Center

    Wang, Zuhao; Chi, Shaohui; Hu, Kaiyan; Chen, Wenting

    2014-01-01

    Teachers' knowledge and application of models play an important role in students' development of modeling ability and scientific literacy. In this study, we investigated Chinese chemistry teachers' knowledge and application of models. Data were collected through a test questionnaire and analyzed quantitatively and qualitatively. The result indicated…

  13. DAVE: A plug and play model for distributed multimedia application development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mines, R.F.; Friesen, J.A.; Yang, C.L.

    1994-07-01

    This paper presents a model being used for the development of distributed multimedia applications. The Distributed Audio Video Environment (DAVE) was designed to support the development of a wide range of distributed applications. The implementation of this model is described. DAVE is unique in that it combines a simple "plug and play" programming interface, supports both centralized and fully distributed applications, provides device and media extensibility, promotes object reusability, and supports interoperability and network independence. This model enables application developers to easily develop distributed multimedia applications and create reusable multimedia toolkits. DAVE was designed for developing applications such as video conferencing, media archival, remote process control, and distance learning.

  14. An improved task-role-based access control model for G-CSCW applications

    NASA Astrophysics Data System (ADS)

    He, Chaoying; Chen, Jun; Jiang, Jie; Han, Gang

    2005-10-01

    Access control is an important and popular security mechanism for multi-user applications. GIS-based Computer Supported Cooperative Work (G-CSCW) application is one of such applications. This paper presents an improved Task-Role-Based Access Control (X-TRBAC) model for G-CSCW applications. The new model inherits the basic concepts of the old ones, such as role and task. Moreover, it has introduced two concepts, i.e. object hierarchy and operation hierarchy, and the corresponding rules to improve the efficiency of permission definition in access control models. The experiments show that the method can simplify the definition of permissions, and it is more applicable for G-CSCW applications.
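    The object and operation hierarchies described in this abstract can be sketched briefly; the role, task, object, and operation names below are hypothetical examples of ours, not taken from the paper:

```python
# Hypothetical hierarchies: a permission granted on an ancestor object or
# operation implies the permission on all of its descendants.
OBJ_PARENT = {"feature": "layer", "layer": "map"}            # child -> parent
OP_PARENT = {"edit_geometry": "edit", "edit_attributes": "edit"}

def _ancestors(node, parent):
    """Yield node and every ancestor up the hierarchy."""
    while node is not None:
        yield node
        node = parent.get(node)

# permissions granted to (role, task) pairs on (object, operation) pairs
PERMISSIONS = {("editor", "update_map"): {("map", "edit")}}

def allowed(role, task, obj, op):
    """Check a request against the hierarchy-aware permission set."""
    granted = PERMISSIONS.get((role, task), set())
    return any((o, p) in granted
               for o in _ancestors(obj, OBJ_PARENT)
               for p in _ancestors(op, OP_PARENT))
```

    This shows why the hierarchies simplify permission definition: one grant on ("map", "edit") covers every map sub-object and every edit sub-operation.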

  15. Syntheses of the current model applications for managing water and needs for experimental data and model improvements to enhance these applications

    USDA-ARS?s Scientific Manuscript database

    This volume of the Advances in Agricultural Systems Modeling series presents 14 different case studies of model applications to help make the best use of limited water in agriculture. These examples show that models have tremendous potential and value in enhancing site-specific water management for ...

  16. An Application of Unfolding and Cumulative Item Response Theory Models for Noncognitive Scaling: Examining the Assumptions and Applicability of the Generalized Graded Unfolding Model

    ERIC Educational Resources Information Center

    Sgammato, Adrienne N.

    2009-01-01

    This study examined the applicability of a relatively new unidimensional, unfolding item response theory (IRT) model called the generalized graded unfolding model (GGUM; Roberts, Donoghue, & Laughlin, 2000). A total of four scaling methods were applied. Two commonly used cumulative IRT models for polytomous data, the Partial Credit Model and…

  17. A Linguistic Model in Component Oriented Programming

    NASA Astrophysics Data System (ADS)

    Crăciunean, Daniel Cristian; Crăciunean, Vasile

    2016-12-01

    Component-oriented programming, when well organized, can bring a large increase in efficiency to the development of large software systems. This paper proposes a model for building software systems by assembling components that can operate independently of each other. The model is based on a computing environment that runs parallel and distributed applications. This paper introduces concepts such as the abstract aggregation scheme and the aggregation application. Basically, an aggregation application is an application that is obtained by combining corresponding components. In our model, an aggregation application is a word in a language.

  18. An overview of topic modeling and its current applications in bioinformatics.

    PubMed

    Liu, Lin; Tang, Lin; Dong, Wen; Yao, Shaowen; Zhou, Wei

    2016-01-01

    With the rapid accumulation of biological datasets, machine learning methods designed to automate data analysis are urgently needed. In recent years, so-called topic models that originated from the field of natural language processing have been receiving much attention in bioinformatics because of their interpretability. Our aim was to review the application and development of topic models for bioinformatics. This paper starts with the description of a topic model, with a focus on the understanding of topic modeling. A general outline is provided on how to build an application in a topic model and how to develop a topic model. Meanwhile, the literature on application of topic models to biological data was searched and analyzed in depth. According to the types of models and the analogy between the concept of document-topic-word and a biological object (as well as the tasks of a topic model), we categorized the related studies and provided an outlook on the use of topic models for the development of bioinformatics applications. Topic modeling is a useful method (in contrast to the traditional means of data reduction in bioinformatics) and enhances researchers' ability to interpret biological information. Nevertheless, due to the lack of topic models optimized for specific biological data, the studies on topic modeling in biological data still have a long and challenging road ahead. We believe that topic models are a promising method for various applications in bioinformatics research.
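    The document-topic-word analogy discussed in this review can be made concrete with a minimal topic model. The sketch below is a toy collapsed Gibbs sampler for LDA over tiny hand-made "documents"; it is illustrative only and not any specific package or biological pipeline surveyed in the paper:

```python
import random
from collections import defaultdict

def toy_lda(docs, n_topics=2, n_iter=200, alpha=0.1, beta=0.01, seed=0):
    """Minimal collapsed Gibbs sampler for LDA (illustrative, unoptimized).
    Returns the per-document topic proportions."""
    rng = random.Random(seed)
    V = len({w for d in docs for w in d})                     # vocabulary size
    z = [[rng.randrange(n_topics) for _ in d] for d in docs]  # token topics
    ndk = [[0] * n_topics for _ in docs]                      # doc-topic counts
    nkw = [defaultdict(int) for _ in range(n_topics)]         # topic-word counts
    nk = [0] * n_topics                                       # topic totals
    for di, d in enumerate(docs):
        for wi, w in enumerate(d):
            k = z[di][wi]
            ndk[di][k] += 1; nkw[k][w] += 1; nk[k] += 1
    for _ in range(n_iter):
        for di, d in enumerate(docs):
            for wi, w in enumerate(d):
                k = z[di][wi]   # remove the token, resample its topic, re-add
                ndk[di][k] -= 1; nkw[k][w] -= 1; nk[k] -= 1
                weights = [(ndk[di][t] + alpha) * (nkw[t][w] + beta)
                           / (nk[t] + V * beta) for t in range(n_topics)]
                k = rng.choices(range(n_topics), weights=weights)[0]
                z[di][wi] = k
                ndk[di][k] += 1; nkw[k][w] += 1; nk[k] += 1
    return [[(c + alpha) / (sum(row) + n_topics * alpha) for c in row]
            for row in ndk]

docs = [
    "gene protein expression gene".split(),
    "protein gene binding".split(),
    "patient drug trial patient".split(),
    "drug patient clinical trial".split(),
]
theta = toy_lda(docs)   # one topic-proportion row per document
```

    In a bioinformatics setting the "words" might be genes and the "documents" samples, which is exactly the analogy the review categorizes.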

  19. Modeling Answer Change Behavior: An Application of a Generalized Item Response Tree Model

    ERIC Educational Resources Information Center

    Jeon, Minjeong; De Boeck, Paul; van der Linden, Wim

    2017-01-01

    We present a novel application of a generalized item response tree model to investigate test takers' answer change behavior. The model allows us to simultaneously model the observed patterns of the initial and final responses after an answer change as a function of a set of latent traits and item parameters. The proposed application is illustrated…

  20. Dimensions for hearing-impaired mobile application usability model

    NASA Astrophysics Data System (ADS)

    Nathan, Shelena Soosay; Hussain, Azham; Hashim, Nor Laily; Omar, Mohd Adan

    2017-10-01

    This paper discusses the dimensions that have been derived for the hearing-impaired mobile application usability model. General usability models consist of generic dimensions for evaluating mobile applications; however, the requirements of the hearing-impaired are overlooked and often neglected. As a result, mobile applications developed for the hearing-impaired are left unused. It is also apparent that these usability models do not consider accessibility dimensions according to the requirements of special users. This complicates the work of usability practitioners, as well as academicians researching usability, when applications are developed for specific user needs. To overcome this issue, the dimensions chosen for the hearing-impaired were ensured to align with the real needs of hearing-impaired mobile applications. Besides literature studies, requirements for hearing-impaired mobile applications were identified through interviews conducted with hearing-impaired mobile application users, which were recorded as video and analyzed using NVivo. Finally, a total of 6 out of the 15 dimensions gathered were chosen for the proposed model and are presented.

  1. Hydrological Simulation Program - FORTRAN (HSPF) Data Formatting Tool (HDFT)

    EPA Science Inventory

    The HSPF data formatting and unit conversion tool has two separate applications: a web-based application and a desktop application. The tool was developed to aid users in formatting data for HSPF stormwater modeling applications. Unlike traditional HSPF modeling applications, sto...

  2. 75 FR 5355 - Notice of Extension of Comment Period for NUREG-1934, Nuclear Power Plant Fire Modeling...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-02

    ..., Nuclear Power Plant Fire Modeling Application Guide (NPP FIRE MAG), Draft Report for Comment AGENCY... 1019195), Nuclear Power Plant Fire Modeling Application Guide (NPP FIRE MAG), Draft Report for Comment... Plant Fire Modeling Application Guide (NPP FIRE MAG)'' is available electronically under ADAMS Accession...

  3. Application of triggered lightning numerical models to the F106B and extension to other aircraft

    NASA Technical Reports Server (NTRS)

    Ng, Poh H.; Dalke, Roger A.; Horembala, Jim; Rudolph, Terence; Perala, Rodney A.

    1988-01-01

    The goal of the F106B Thunderstorm Research Program is to characterize the lightning environment for aircraft in flight. This report describes the application of numerical electromagnetic models to this problem. Topics include: (1) Extensive application of linear triggered lightning to F106B data; (2) Electrostatic analysis of F106B field mill data; (3) Application of subgrid modeling to F106B nose region, including both static and nonlinear models; (4) Extension of F106B results to other aircraft of varying sizes and shapes; and (5) Application of nonlinear model to interaction of F106B with lightning leader-return stroke event.

  4. The Model Life-cycle: Training Module

    EPA Pesticide Factsheets

    Model Life-Cycle includes identification of problems & the subsequent development, evaluation, & application of the model. Objectives: define ‘model life-cycle’, explore stages of model life-cycle, & strategies for development, evaluation, & applications.

  5. Ontology for Life-Cycle Modeling of Water Distribution Systems: Application of Model View Definition Attributes

    DTIC Science & Technology

    2013-06-01

    ERDC/CERL CR-13-5, Ontology for Life-Cycle Modeling of Water Distribution Systems: Application of Model View Definition Attributes. Kristine K. Fallon, Robert A... interior plumbing systems and the information exchange requirements for every participant in the design. The findings were used to develop an

  6. FloorspaceJS - A New, Open Source, Web-Based Geometry Editor for Building Energy Modeling (BEM): Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Macumber, Daniel L; Horowitz, Scott G; Schott, Marjorie

    Across most industries, desktop applications are being rapidly migrated to web applications for a variety of reasons. Web applications are inherently cross-platform, mobile, and easier to distribute than desktop applications. Fueling this trend are a wide range of free, open source libraries and frameworks that make it incredibly easy to develop powerful web applications. The building energy modeling community is just beginning to pick up on these larger trends, with a small but growing number of building energy modeling applications starting on or moving to the web. This paper presents a new, open source, web-based geometry editor for Building Energy Modeling (BEM). The editor is written completely in JavaScript and runs in a modern web browser. The editor works on a custom JSON file format and is designed to be integrated into a variety of web and desktop applications. The web-based editor is available to use as a standalone web application at: https://nrel.github.io/openstudio-geometry-editor/. An example integration is demonstrated with the OpenStudio desktop application. Finally, the editor can be easily integrated with a wide range of possible building energy modeling web applications.

  7. Computing Models for FPGA-Based Accelerators

    PubMed Central

    Herbordt, Martin C.; Gu, Yongfeng; VanCourt, Tom; Model, Josh; Sukhwani, Bharat; Chiu, Matt

    2011-01-01

    Field-programmable gate arrays are widely considered as accelerators for compute-intensive applications. A critical phase of FPGA application development is finding and mapping to the appropriate computing model. FPGA computing enables models with highly flexible fine-grained parallelism and associative operations such as broadcast and collective response. Several case studies demonstrate the effectiveness of using these computing models in developing FPGA applications for molecular modeling. PMID:21603152

  8. Usability evaluation of mobile applications; where do we stand?

    NASA Astrophysics Data System (ADS)

    Zahra, Fatima; Hussain, Azham; Mohd, Haslina

    2017-10-01

    The range and availability of mobile applications is expanding rapidly. With the increased processing power available on portable devices, developers are expanding the range of services by embracing smartphones in their extensive and diverse practices. However, usability testing and evaluation of mobile applications have not yet reached the maturity achieved for web-based applications. The existing usability models do not adequately capture the complexities of interacting with applications on a mobile platform. Therefore, this study presents a review of existing usability models for mobile applications. These models are in their infancy, but with time and more research they may eventually be adopted. Moreover, different categories of mobile apps (medical, entertainment, education) possess different functional and non-functional requirements; thus, customized models are required for diverse mobile applications.

  9. The application of single particle hydrodynamics in continuum models of multiphase flow

    NASA Technical Reports Server (NTRS)

    Decker, Rand

    1988-01-01

    A review of the application of single particle hydrodynamics in models for the exchange of interphase momentum in continuum models of multiphase flow is presented. Considered are the equations of motion for a laminar, mechanical two phase flow. Inherent to this theory is a model for the interphase exchange of momentum due to drag between the dispersed particulate and continuous fluid phases. In addition, applications of two phase flow theory to de-mixing flows require the modeling of interphase momentum exchange due to lift forces. The applications of single particle analysis in deriving models for drag and lift are examined.

  10. Agent-based modeling: Methods and techniques for simulating human systems

    PubMed Central

    Bonabeau, Eric

    2002-01-01

    Agent-based modeling is a powerful simulation modeling technique that has seen a number of applications in the last few years, including applications to real-world business problems. After the basic principles of agent-based simulation are briefly introduced, its four areas of application are discussed by using real-world applications: flow simulation, organizational simulation, market simulation, and diffusion simulation. For each category, one or several business applications are described and analyzed. PMID:12011407
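    The diffusion-simulation category mentioned above can be illustrated with a minimal agent-based sketch: each agent is a potential adopter influenced by random contacts. All parameters and the contact rule are our own illustrative assumptions, not an application from the paper:

```python
import random

def simulate_diffusion(n_agents=200, seeds=5, p_adopt=0.05, steps=60, rng_seed=1):
    """Toy diffusion-of-innovation ABM: each step, every non-adopter contacts
    one random agent and adopts with probability p_adopt if that contact has
    already adopted. Returns the adopter count over time."""
    rng = random.Random(rng_seed)
    adopted = [False] * n_agents
    for i in rng.sample(range(n_agents), seeds):   # seed initial adopters
        adopted[i] = True
    history = [sum(adopted)]
    for _ in range(steps):
        snapshot = adopted[:]                      # synchronous update
        for i in range(n_agents):
            if not adopted[i]:
                contact = rng.randrange(n_agents)
                if snapshot[contact] and rng.random() < p_adopt:
                    adopted[i] = True
        history.append(sum(adopted))
    return history

history = simulate_diffusion()   # typically traces an S-shaped adoption curve
```

    The emergent S-curve from such simple local rules is the hallmark of agent-based diffusion models that the article describes.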

  11. JUPITER: Joint Universal Parameter IdenTification and Evaluation of Reliability - An Application Programming Interface (API) for Model Analysis

    USGS Publications Warehouse

    Banta, Edward R.; Poeter, Eileen P.; Doherty, John E.; Hill, Mary C.

    2006-01-01

    The Joint Universal Parameter IdenTification and Evaluation of Reliability Application Programming Interface (JUPITER API) improves the computer programming resources available to those developing applications (computer programs) for model analysis. The JUPITER API consists of eleven Fortran-90 modules that provide for encapsulation of data and operations on that data. Each module contains one or more entities: data, data types, subroutines, functions, and generic interfaces. The modules do not constitute computer programs themselves; instead, they are used to construct computer programs. Such computer programs are called applications of the API. The API provides common modeling operations for use by a variety of computer applications. The models being analyzed are referred to here as process models, and may, for example, represent the physics, chemistry, and(or) biology of a field or laboratory system. Process models commonly are constructed using published models such as MODFLOW (Harbaugh et al., 2000; Harbaugh, 2005), MT3DMS (Zheng and Wang, 1996), HSPF (Bicknell et al., 1997), PRMS (Leavesley and Stannard, 1995), and many others. The process model may be accessed by a JUPITER API application as an external program, or it may be implemented as a subroutine within a JUPITER API application. In either case, execution of the model takes place in a framework designed by the application programmer. This framework can be designed to take advantage of any parallel-processing capabilities possessed by the process model, as well as the parallel-processing capabilities of the JUPITER API. Model analyses for which the JUPITER API could be useful include, for example: comparing model results to observed values to determine how well the model reproduces system processes and characteristics; using sensitivity analysis to determine the information provided by observations to parameters and predictions of interest; determining the additional data needed to improve selected model predictions; using calibration methods to modify parameter values and other aspects of the model; comparing predictions to regulatory limits; quantifying the uncertainty of predictions based on the results of one or many simulations using inferential or Monte Carlo methods; and determining how to manage the system to achieve stated objectives. The capabilities provided by the JUPITER API include, for example, communication with process models, parallel computations, compressed storage of matrices, and flexible input capabilities. The input capabilities use input blocks suitable for lists or arrays of data. The input blocks needed for one application can be included within one data file or distributed among many files. Data exchange between different JUPITER API applications or between applications and other programs is supported by data-exchange files. The JUPITER API has already been used to construct a number of applications. Three simple example applications are presented in this report. More complicated applications include the universal inverse code UCODE_2005 (Poeter et al., 2005), the multi-model analysis MMA (Eileen P. Poeter, Mary C. Hill, E.R. Banta, S.W. Mehl, and Steen Christensen, written commun., 2006), and a code named OPR_PPR (Matthew J. Tonkin, Claire R. Tiedeman, Mary C. Hill, and D. Matthew Ely, written communication, 2006). This report describes a set of underlying organizational concepts and complete specifics about the JUPITER API. While understanding the organizational concepts presented is useful for understanding the modules, other organizational concepts can be used in applications constructed using the JUPITER API.

  12. Modeling and applications in microbial food safety

    USDA-ARS?s Scientific Manuscript database

    Mathematical modeling is a scientific and systematic approach to studying and describing recurrent events or phenomena, with a successful application track record spanning decades. When models are properly developed and validated, their applications may save costs and time. For the microbial food safety concerns, ...

  13. Predictive Microbiology and Food Safety Applications

    USDA-ARS?s Scientific Manuscript database

    Mathematical modeling is the science of systematic study of recurrent events or phenomena. When models are properly developed, their applications may save costs and time. For microbial food safety research and applications, predictive microbiology models may be developed based on the fact that most ...

  14. NASA's Earth Resources Laboratory - Seventeen years of using remotely sensed satellite data in land applications

    NASA Technical Reports Server (NTRS)

    Cashion, Kenneth D.; Whitehurst, Charles A.

    1987-01-01

    The activities of the Earth Resources Laboratory (ERL) for the past seventeen years are reviewed, with particular reference to four typical applications demonstrating the use of remotely sensed data in a geobased information system context. The applications discussed are: a fire control model for the Olympic National Park; wildlife habitat modeling; a resource inventory system including a potential soil erosion model; and a corridor analysis model for locating routes between geographical locations. Some future applications are also discussed.

  15. Application of queueing models to multiprogrammed computer systems operating in a time-critical environment

    NASA Technical Reports Server (NTRS)

    Eckhardt, D. E., Jr.

    1979-01-01

    A model of a central processor (CPU) which services background applications in the presence of time-critical activity is presented. The CPU is viewed as an M/M/1 queueing system subject to periodic interrupts by a deterministic, time-critical process. The Laplace transform of the distribution of service times for the background applications is developed. The use of state-of-the-art queueing models for studying the background processing capability of time-critical computer systems is discussed, and the results of a model validation study which support this application of queueing models are presented.
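    For background, the standard steady-state quantities of a plain M/M/1 queue have simple closed forms. The sketch below computes only this textbook baseline; the paper's periodic deterministic interrupts are not modeled here:

```python
def mm1_metrics(lam, mu):
    """Closed-form steady-state metrics for an M/M/1 queue with arrival
    rate lam and service rate mu (stable only when lam < mu)."""
    if lam >= mu:
        raise ValueError("queue is unstable unless lam < mu")
    rho = lam / mu                   # server utilization
    return {
        "rho": rho,
        "L": rho / (1 - rho),        # mean number in system
        "Lq": rho ** 2 / (1 - rho),  # mean number waiting
        "W": 1 / (mu - lam),         # mean time in system
        "Wq": rho / (mu - lam),      # mean waiting time
    }

metrics = mm1_metrics(lam=2.0, mu=5.0)   # example background load
```

    As a sanity check, the results satisfy Little's law, L = lam * W, which is a convenient invariant for validating any such queueing computation.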

  16. Aerospace Applications of Weibull and Monte Carlo Simulation with Importance Sampling

    NASA Technical Reports Server (NTRS)

    Bavuso, Salvatore J.

    1998-01-01

    Recent developments in reliability modeling and computer technology have made it practical to use the Weibull time to failure distribution to model the system reliability of complex fault-tolerant computer-based systems. These system models are becoming increasingly popular in space systems applications as a result of mounting data that support the decreasing Weibull failure distribution and the expectation of increased system reliability. This presentation introduces the new reliability modeling developments and demonstrates their application to a novel space system application. The application is a proposed guidance, navigation, and control (GN&C) system for use in a long duration manned spacecraft for a possible Mars mission. Comparisons to the constant failure rate model are presented and the ramifications of doing so are discussed.
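    The Weibull time-to-failure distribution referenced above has standard closed forms for reliability, hazard rate, and mean time to failure. This is a generic sketch of those textbook formulas, not the presentation's GN&C model; the example parameter values are arbitrary:

```python
import math

def weibull_reliability(t, beta, eta):
    """R(t) = exp(-(t/eta)^beta) for a two-parameter Weibull
    (beta: shape, eta: scale). beta < 1 gives a decreasing failure rate."""
    return math.exp(-((t / eta) ** beta))

def weibull_hazard(t, beta, eta):
    """Instantaneous failure rate h(t) = (beta/eta) * (t/eta)^(beta-1)."""
    return (beta / eta) * (t / eta) ** (beta - 1)

def weibull_mttf(beta, eta):
    """Mean time to failure: eta * Gamma(1 + 1/beta)."""
    return eta * math.gamma(1 + 1 / beta)

# arbitrary illustrative parameters: decreasing failure rate (beta < 1)
r_1000h = weibull_reliability(1000.0, 0.8, 5000.0)
```

    With beta = 1 these reduce to the constant-failure-rate exponential model, which is exactly the comparison the presentation draws.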

  17. Building model analysis applications with the Joint Universal Parameter IdenTification and Evaluation of Reliability (JUPITER) API

    USGS Publications Warehouse

    Banta, E.R.; Hill, M.C.; Poeter, E.; Doherty, J.E.; Babendreier, J.

    2008-01-01

    The open-source, public domain JUPITER (Joint Universal Parameter IdenTification and Evaluation of Reliability) API (Application Programming Interface) provides conventions and Fortran-90 modules to develop applications (computer programs) for analyzing process models. The input and output conventions allow application users to access various applications and the analysis methods they embody with a minimum of time and effort. Process models simulate, for example, physical, chemical, and (or) biological systems of interest using phenomenological, theoretical, or heuristic approaches. The types of model analyses supported by the JUPITER API include, but are not limited to, sensitivity analysis, data needs assessment, calibration, uncertainty analysis, model discrimination, and optimization. The advantages provided by the JUPITER API for users and programmers allow for rapid programming and testing of new ideas. Application-specific coding can be in languages other than the Fortran-90 of the API. This article briefly describes the capabilities and utility of the JUPITER API, lists existing applications, and uses UCODE_2005 as an example.

  18. Modeling the Office of Science Ten Year FacilitiesPlan: The PERI Architecture Tiger Team

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    de Supinski, B R; Alam, S R; Bailey, D H

    2009-05-27

    The Performance Engineering Institute (PERI) originally proposed a tiger team activity as a mechanism to target significant effort to the optimization of key Office of Science applications, a model that was successfully realized with the assistance of two JOULE metric teams. However, the Office of Science requested a new focus beginning in 2008: assistance in forming its ten year facilities plan. To meet this request, PERI formed the Architecture Tiger Team, which is modeling the performance of key science applications on future architectures, with S3D, FLASH and GTC chosen as the first application targets. In this activity, we have measured the performance of these applications on current systems in order to understand their baseline performance and to ensure that our modeling activity focuses on the right versions and inputs of the applications. We have applied a variety of modeling techniques to anticipate the performance of these applications on a range of anticipated systems. While our initial findings predict that Office of Science applications will continue to perform well on future machines from major hardware vendors, we have also encountered several areas in which we must extend our modeling techniques in order to fulfill our mission accurately and completely. In addition, we anticipate that models of a wider range of applications will reveal critical differences between expected future systems, thus providing guidance for future Office of Science procurement decisions, and will enable DOE applications to exploit machines in future facilities fully.

  19. Modeling the Office of Science Ten Year Facilities Plan: The PERI Architecture Tiger Team

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    de Supinski, Bronis R.; Alam, Sadaf; Bailey, David H.

    2009-06-26

    The Performance Engineering Institute (PERI) originally proposed a tiger team activity as a mechanism to target significant effort optimizing key Office of Science applications, a model that was successfully realized with the assistance of two JOULE metric teams. However, the Office of Science requested a new focus beginning in 2008: assistance in forming its ten year facilities plan. To meet this request, PERI formed the Architecture Tiger Team, which is modeling the performance of key science applications on future architectures, with S3D, FLASH and GTC chosen as the first application targets. In this activity, we have measured the performance of these applications on current systems in order to understand their baseline performance and to ensure that our modeling activity focuses on the right versions and inputs of the applications. We have applied a variety of modeling techniques to anticipate the performance of these applications on a range of anticipated systems. While our initial findings predict that Office of Science applications will continue to perform well on future machines from major hardware vendors, we have also encountered several areas in which we must extend our modeling techniques in order to fulfill our mission accurately and completely. In addition, we anticipate that models of a wider range of applications will reveal critical differences between expected future systems, thus providing guidance for future Office of Science procurement decisions, and will enable DOE applications to exploit machines in future facilities fully.

  20. Modeling the Office of Science Ten Year Facilities Plan: The PERI Architecture Team

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    de Supinski, Bronis R.; Alam, Sadaf R; Bailey, David

    2009-01-01

    The Performance Engineering Institute (PERI) originally proposed a tiger team activity as a mechanism to target significant effort optimizing key Office of Science applications, a model that was successfully realized with the assistance of two JOULE metric teams. However, the Office of Science requested a new focus beginning in 2008: assistance in forming its ten year facilities plan. To meet this request, PERI formed the Architecture Tiger Team, which is modeling the performance of key science applications on future architectures, with S3D, FLASH and GTC chosen as the first application targets. In this activity, we have measured the performance of these applications on current systems in order to understand their baseline performance and to ensure that our modeling activity focuses on the right versions and inputs of the applications. We have applied a variety of modeling techniques to anticipate the performance of these applications on a range of anticipated systems. While our initial findings predict that Office of Science applications will continue to perform well on future machines from major hardware vendors, we have also encountered several areas in which we must extend our modeling techniques in order to fulfill our mission accurately and completely. In addition, we anticipate that models of a wider range of applications will reveal critical differences between expected future systems, thus providing guidance for future Office of Science procurement decisions, and will enable DOE applications to exploit machines in future facilities fully.

  1. An Open Simulation System Model for Scientific Applications

    NASA Technical Reports Server (NTRS)

    Williams, Anthony D.

    1995-01-01

    A model for a generic and open environment for running multi-code or multi-application simulations, called the Open Simulation System Model (OSSM), is proposed and defined. This model attempts to meet the requirements of complex systems like the Numerical Propulsion Simulator System (NPSS). OSSM places no restrictions on the types of applications that can be integrated at any stage of its evolution, including applications of different disciplines, fidelities, etc. An implementation strategy is proposed that starts with a basic prototype and evolves over time to accommodate an increasing number of applications. Potential (standard) software that may aid in the design and implementation of the system is also identified.

  2. 12 CFR Appendix B to Part 202 - Model Application Forms

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 2 2014-01-01 2014-01-01 false Model Application Forms B Appendix B to Part... CREDIT OPPORTUNITY ACT (REGULATION B) Pt. 202, App. B Appendix B to Part 202—Model Application Forms 1... appear on the creditor's form. 3. If a creditor uses an appropriate Appendix B model form, or modifies a...

  3. Neural network model for growth of Salmonella serotypes in ground chicken subjected to temperature abuse during cold storage for application in HACCP and risk assessment

    USDA-ARS?s Scientific Manuscript database

    With the advent of commercial software applications, it is now easy to develop neural network models for predictive microbiology applications. However, different versions of the model may be required to meet the divergent needs of model users. In the current study, the commercial software applicat...

  4. Modeling of polymer networks for application to solid propellant formulating

    NASA Technical Reports Server (NTRS)

    Marsh, H. E.

    1979-01-01

    Methods for predicting the network structural characteristics formed by the curing of pourable elastomers are presented, along with the logic applied in the development of the mathematical models. A universal approach to modeling was developed and verified by comparison with other methods in application to a complex system. Several applications of network models to practical problems are described.

  5. Taking the mystery out of mathematical model applications to karst aquifers—A primer

    USGS Publications Warehouse

    Kuniansky, Eve L.

    2014-01-01

    Advances in mathematical model applications toward the understanding of the complex flow, characterization, and water-supply management issues for karst aquifers have occurred in recent years. Different types of mathematical models can be applied successfully if appropriate information is available and the problems are adequately identified. The mathematical approaches discussed in this paper are divided into three major categories: 1) distributed parameter models, 2) lumped parameter models, and 3) fitting models. The modeling approaches are described conceptually with examples (but without equations) to help non-mathematicians understand the applications.

  6. Performance and Architecture Lab Modeling Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2014-06-19

    Analytical application performance models are critical for diagnosing performance-limiting resources, optimizing systems, and designing machines. Creating models, however, is difficult. Furthermore, models are frequently expressed in forms that are hard to distribute and validate. The Performance and Architecture Lab Modeling tool, or Palm, is a modeling tool designed to make application modeling easier. Palm provides a source code modeling annotation language. Not only does the modeling language divide the modeling task into subproblems, it formally links an application's source code with its model. This link is important because a model's purpose is to capture application behavior. Furthermore, this link makes it possible to define rules for generating models according to source code organization. Palm generates hierarchical models according to well-defined rules. Given an application, a set of annotations, and a representative execution environment, Palm will generate the same model. A generated model is an executable program whose constituent parts directly correspond to the modeled application. Palm generates models by combining top-down (human-provided) semantic insight with bottom-up static and dynamic analysis. A model's hierarchy is defined by static and dynamic source code structure. Because Palm coordinates models and source code, Palm's models are 'first-class' and reproducible. Palm automates common modeling tasks. For instance, Palm incorporates measurements to focus attention, represent constant behavior, and validate models. Palm's workflow is as follows. The workflow's input is source code annotated with Palm modeling annotations. The most important annotation models an instance of a block of code. Given annotated source code, the Palm Compiler produces executables and the Palm Monitor collects a representative performance profile. The Palm Generator synthesizes a model based on the static and dynamic mapping of annotations to program behavior. The model -- an executable program -- is a hierarchical composition of annotation functions, synthesized functions, statistics for runtime values, and performance measurements.

  7. Development and application of air quality models at the U.S. EPA

    EPA Science Inventory

    Overview of the development and application of air quality models at the U.S. EPA, particularly focused on the development and application of the Community Multiscale Air Quality (CMAQ) model developed within the Computation Exposure Division (CED) of the National Exposure Resear...

  8. Know your community: Model applications in field research

    USDA-ARS?s Scientific Manuscript database

    The focus of this community is to promote the application of cropping or range system models in field research to help evaluate and develop optimum agricultural systems and management to achieve long-term economic and environmental sustainability under a changing climate. Model applications to a var...

  9. Approaches for the Application of Physiologically Based ...

    EPA Pesticide Factsheets

    This draft report of Approaches for the Application of Physiologically Based Pharmacokinetic (PBPK) Models and Supporting Data in Risk Assessment addresses the application and evaluation of PBPK models for risk assessment purposes. These models represent an important class of dosimetry models that are useful for predicting internal dose at target organs for risk assessment applications. Topics covered include: the types of data required for use of PBPK models in risk assessment, evaluation of PBPK models for use in risk assessment, and the application of these models to address uncertainties resulting from extrapolations (e.g., interspecies extrapolation) often used in risk assessment. In addition, appendices are provided that include a compilation of chemical partition coefficients and rate constants, algorithms for estimating chemical-specific parameters, and a list of publications relating to PBPK modeling. This report is primarily meant to serve as a learning tool for EPA scientists and risk assessors who may be less familiar with the field. In addition, this report can be informative to PBPK modelers within and outside the Agency, as it provides an assessment of the types of data and models that the EPA requires for consideration of a model for use in risk assessment.

  10. On various metrics used for validation of predictive QSAR models with applications in virtual screening and focused library design.

    PubMed

    Roy, Kunal; Mitra, Indrani

    2011-07-01

    Quantitative structure-activity relationships (QSARs) have important applications in drug discovery research, environmental fate modeling, property prediction, etc. Validation has been recognized as a very important step for QSAR model development. As one of the important objectives of QSAR modeling is to predict activity/property/toxicity of new chemicals falling within the domain of applicability of the developed models and QSARs are being used for regulatory decisions, checking reliability of the models and confidence of their predictions is a very important aspect, which can be judged during the validation process. One prime application of a statistically significant QSAR model is virtual screening for molecules with improved potency based on the pharmacophoric features and the descriptors appearing in the QSAR model. Validated QSAR models may also be utilized for design of focused libraries which may be subsequently screened for the selection of hits. The present review focuses on various metrics used for validation of predictive QSAR models together with an overview of the application of QSAR models in the fields of virtual screening and focused library design for diverse series of compounds with citation of some recent examples.
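    As a hedged illustration of the validation metrics the review discusses, one widely used external-validation statistic is the predictive R-squared (often written Q²ext), which compares prediction errors on a test set against the variance about the training-set mean. The activity values below are invented for illustration:

```python
# Illustrative sketch of an external-validation metric for QSAR models;
# the data values are made up, not taken from the review.
def q2_ext(y_obs, y_pred, y_train_mean):
    """Predictive R^2: 1 - PRESS / (sum of squares about the training mean)."""
    press = sum((o - p) ** 2 for o, p in zip(y_obs, y_pred))
    ss = sum((o - y_train_mean) ** 2 for o in y_obs)
    return 1.0 - press / ss

y_obs = [5.1, 6.3, 4.8, 7.0]    # observed activities of an external test set
y_pred = [5.0, 6.0, 5.1, 6.8]   # model predictions for the same compounds
q2 = q2_ext(y_obs, y_pred, y_train_mean=5.5)
```

    Values close to 1 indicate good external predictivity; a commonly quoted (though debated) acceptance threshold is around 0.5.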

  11. Application of Consider Covariance to the Extended Kalman Filter

    NASA Technical Reports Server (NTRS)

    Lundberg, John B.

    1996-01-01

    The extended Kalman filter (EKF) is the basis for many applications of filtering theory to real-time problems where estimates of the state of a dynamical system are to be computed based upon some set of observations. The form of the EKF may vary somewhat from one application to another, but the fundamental principles are typically unchanged among these various applications. As is the case in many filtering applications, models of the dynamical system (differential equations describing the state variables) and models of the relationship between the observations and the state variables are created. These models typically employ a set of constants whose values are established by means of theory or experimental procedure. Since the estimates of the state are formed assuming that the models are perfect, any modeling errors will affect the accuracy of the computed estimates. Note that the modeling errors may be errors of commission (errors in terms included in the model) or omission (errors in terms excluded from the model). Consequently, it becomes imperative when evaluating the performance of real-time filters to evaluate the effect of modeling errors on the estimates of the state.
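    The effect of an error of omission described above can be demonstrated with a toy scalar Kalman filter. In this hedged sketch (all values are invented, and the filter is linear rather than an EKF), the filter's model assumes a constant state while the true state slowly drifts, so the estimate develops a persistent lag:

```python
import random

# Toy demonstration of a modeling error of omission: the filter models a
# constant state, but the true state drifts. Values are illustrative only.
random.seed(0)
drift = 0.05          # true per-step drift that the filter's model omits
r = 0.5               # measurement noise variance
x_true = 0.0          # true state
x_hat, p = 0.0, 1.0   # filter estimate and its error variance

for _ in range(200):
    x_true += drift                      # true dynamics include the drift
    z = x_true + random.gauss(0.0, r ** 0.5)  # noisy measurement
    # prediction step: the (wrong) constant-state model leaves x_hat, p unchanged
    k = p / (p + r)                      # Kalman gain
    x_hat = x_hat + k * (z - x_hat)      # measurement update
    p = (1.0 - k) * p                    # variance update

lag = x_true - x_hat  # persistent bias caused by the omitted drift term
```

    Because the filter's variance shrinks as if its model were perfect, the gain decays and the estimate stops tracking the drift, which is precisely why consider-covariance techniques are used to account for such modeling errors.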

  12. 76 FR 46330 - NUREG-1934, Nuclear Power Plant Fire Modeling Application Guide (NPP FIRE MAG); Second Draft...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-02

    ... NUCLEAR REGULATORY COMMISSION [NRC-2009-0568] NUREG-1934, Nuclear Power Plant Fire Modeling Application Guide (NPP FIRE MAG); Second Draft Report for Comment AGENCY: Nuclear Regulatory Commission... 1023259), ``Nuclear Power Plant Fire Modeling Application Guide (NPP FIRE MAG), Second Draft Report for...

  13. Photochemical Modeling Applications

    EPA Pesticide Factsheets

    Provides access to modeling applications involving photochemical models, including modeling of ozone, particulate matter (PM), and mercury for national and regional EPA regulations such as the Clean Air Interstate Rule (CAIR) and the Clean Air Mercury Rule.

  14. Remote sensing applications to hydrologic modeling

    NASA Technical Reports Server (NTRS)

    Dozier, J.; Estes, J. E.; Simonett, D. S.; Davis, R.; Frew, J.; Marks, D.; Schiffman, K.; Souza, M.; Witebsky, E.

    1977-01-01

    An energy balance snowmelt model for rugged terrain was devised and coupled to a flow model. A literature review of remote sensing applications to hydrologic modeling was included along with a software development outline.

  15. Development and application of air quality models at the US ...

    EPA Pesticide Factsheets

    Overview of the development and application of air quality models at the U.S. EPA, particularly focused on the development and application of the Community Multiscale Air Quality (CMAQ) model developed within the Computation Exposure Division (CED) of the National Exposure Research Laboratory (NERL). This presentation will provide a simple overview of air quality model development and application geared toward a non-technical student audience. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.

  16. Update on Small Modular Reactors Dynamic System Modeling Tool: Web Application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hale, Richard Edward; Cetiner, Sacit M.; Fugate, David L.

    Previous reports focused on the development of component and system models as well as end-to-end system models using Modelica and Dymola for two advanced reactor architectures: (1) Advanced Liquid Metal Reactor and (2) fluoride high-temperature reactor (FHR). The focus of this report is the release of the first beta version of the web-based application for model use and collaboration, as well as an update on the FHR model. The web-based application allows novice users to configure end-to-end system models from preconfigured choices to investigate the instrumentation and controls implications of these designs and allows for the collaborative development of individual component models that can be benchmarked against test systems for potential inclusion in the model library. A description of this application is provided along with examples of its use and a listing and discussion of all the models that currently exist in the library.

  17. Applicability Analysis of Validation Evidence for Biomedical Computational Models

    DOE PAGES

    Pathmanathan, Pras; Gray, Richard A.; Romero, Vicente J.; ...

    2017-09-07

    Computational modeling has the potential to revolutionize medicine the way it transformed engineering. However, despite decades of work, there has only been limited progress to successfully translate modeling research to patient care. One major difficulty which often occurs with biomedical computational models is an inability to perform validation in a setting that closely resembles how the model will be used. For example, for a biomedical model that makes in vivo clinically relevant predictions, direct validation of predictions may be impossible for ethical, technological, or financial reasons. Unavoidable limitations inherent to the validation process lead to challenges in evaluating the credibility of biomedical model predictions. Therefore, when evaluating biomedical models, it is critical to rigorously assess applicability, that is, the relevance of the computational model, and its validation evidence to the proposed context of use (COU). However, there are no well-established methods for assessing applicability. In this paper, we present a novel framework for performing applicability analysis and demonstrate its use with a medical device computational model. The framework provides a systematic, step-by-step method for breaking down the broad question of applicability into a series of focused questions, which may be addressed using supporting evidence and subject matter expertise. The framework can be used for model justification, model assessment, and validation planning. While motivated by biomedical models, it is relevant to a broad range of disciplines and underlying physics. Finally, the proposed applicability framework could help overcome some of the barriers inherent to validation of, and aid clinical implementation of, biomedical models.

  18. Applicability Analysis of Validation Evidence for Biomedical Computational Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pathmanathan, Pras; Gray, Richard A.; Romero, Vicente J.

    Computational modeling has the potential to revolutionize medicine the way it transformed engineering. However, despite decades of work, there has only been limited progress to successfully translate modeling research to patient care. One major difficulty which often occurs with biomedical computational models is an inability to perform validation in a setting that closely resembles how the model will be used. For example, for a biomedical model that makes in vivo clinically relevant predictions, direct validation of predictions may be impossible for ethical, technological, or financial reasons. Unavoidable limitations inherent to the validation process lead to challenges in evaluating the credibility of biomedical model predictions. Therefore, when evaluating biomedical models, it is critical to rigorously assess applicability, that is, the relevance of the computational model, and its validation evidence to the proposed context of use (COU). However, there are no well-established methods for assessing applicability. In this paper, we present a novel framework for performing applicability analysis and demonstrate its use with a medical device computational model. The framework provides a systematic, step-by-step method for breaking down the broad question of applicability into a series of focused questions, which may be addressed using supporting evidence and subject matter expertise. The framework can be used for model justification, model assessment, and validation planning. While motivated by biomedical models, it is relevant to a broad range of disciplines and underlying physics. Finally, the proposed applicability framework could help overcome some of the barriers inherent to validation of, and aid clinical implementation of, biomedical models.

  19. On Structural Equation Model Equivalence.

    ERIC Educational Resources Information Center

    Raykov, Tenko; Penev, Spiridon

    1999-01-01

    Presents a necessary and sufficient condition for the equivalence of structural-equation models that is applicable to models with parameter restrictions and models that may or may not fulfill assumptions of the rules. Illustrates the application of the approach for studying model equivalence. (SLD)

  20. Reuse: A knowledge-based approach

    NASA Technical Reports Server (NTRS)

    Iscoe, Neil; Liu, Zheng-Yang; Feng, Guohui

    1992-01-01

    This paper describes our research in automating the reuse process through the use of application domain models. Application domain models are explicit formal representations of the application knowledge necessary to understand, specify, and generate application programs. Furthermore, they provide a unified repository for the operational structure, rules, policies, and constraints of a specific application area. In our approach, domain models are expressed in terms of a transaction-based meta-modeling language. This paper describes in detail the creation and maintenance of hierarchical structures. These structures are created through a process that includes reverse engineering of data models with supplementary enhancement from application experts. Source code is also reverse engineered but is not a major source of domain model instantiation at this time. In the second phase of the software synthesis process, program specifications are interactively synthesized from an instantiated domain model. These specifications are currently integrated into a manual programming process but will eventually be used to derive executable code with mechanically assisted transformations. This research is performed within the context of programming-in-the-large types of systems. Although our goals are ambitious, we are implementing the synthesis system in an incremental manner through which we can realize tangible results. The client/server architecture is capable of supporting 16 simultaneous X/Motif users and tens of thousands of attributes and classes. Domain models have been partially synthesized from five different application areas. As additional domain models are synthesized and additional knowledge is gathered, we will inevitably add to and modify our representation. However, our current experience indicates that it will scale and expand to meet our modeling needs.

  1. An Integrated Model of Application, Admission, Enrollment, and Financial Aid

    ERIC Educational Resources Information Center

    DesJardins, Stephen L.; Ahlburg, Dennis A.; McCall, Brian Patrick

    2006-01-01

    We jointly model the application, admission, financial aid determination, and enrollment decision process. We find that expectations of admission affect application probabilities, financial aid expectations affect enrollment and application behavior, and deviations from aid expectations are strongly related to enrollment. We also conduct…

  2. Review of Development Survey of Phase Change Material Models in Building Applications

    PubMed Central

    Akeiber, Hussein J.; Wahid, Mazlan A.; Hussen, Hasanen M.; Mohammad, Abdulrahman Th.

    2014-01-01

    The application of phase change materials (PCMs) in green buildings has been increasing rapidly. PCM applications in green buildings include several development models. This paper briefly surveys the recent research and development activities of PCM technology in building applications. Firstly, a basic description of phase change materials and their principles is provided, along with the classification and applications of PCMs. Secondly, PCM models in buildings are reviewed and discussed according to the wall, roof, floor, and cooling systems. Finally, conclusions are presented based on the collected data. PMID:25313367

  3. Modeling habitat for Marbled Murrelets on the Siuslaw National Forest, Oregon, using lidar data

    USGS Publications Warehouse

    Hagar, Joan C.; Aragon, Ramiro; Haggerty, Patricia; Hollenbeck, Jeff P.

    2018-03-28

    Habitat models using lidar-derived variables that quantify fine-scale variation in vegetation structure can improve the accuracy of occupancy estimates for canopy-dwelling species over models that use variables derived from other remote sensing techniques. However, the ability of models developed at such a fine spatial scale to maintain accuracy at regional or larger spatial scales has not been tested. We tested the transferability of a lidar-based habitat model for the threatened Marbled Murrelet (Brachyramphus marmoratus) between two management districts within a larger regional conservation zone in coastal western Oregon. We compared the performance of the transferred model against models developed with data from the application location. The transferred model had good discrimination (AUC = 0.73) at the application location, and model performance was further improved by fitting the original model with coefficients from the application location dataset (AUC = 0.79). However, the model selection procedure indicated that neither of these transferred models was competitive with a model trained on local data. The new model trained on data from the application location resulted in the selection of a slightly different set of lidar metrics from the original model, but both transferred and locally trained models consistently indicated positive relationships between the probability of occupancy and lidar measures of canopy structural complexity. We conclude that while the locally trained model had superior performance for local application, the transferred model could reasonably be applied to the entire conservation zone.

  4. Climate and atmospheric modeling studies

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The climate and atmosphere modeling research programs have concentrated on the development of appropriate atmospheric and upper ocean models, and preliminary applications of these models. Principal models are a one-dimensional radiative-convective model, a three-dimensional global model, and an upper ocean model. Principal applications were the study of the impact of CO2, aerosols, and the solar 'constant' on climate.

  5. Models in Science Education: Applications of Models in Learning and Teaching Science

    ERIC Educational Resources Information Center

    Ornek, Funda

    2008-01-01

    In this paper, I discuss different types of models in science education and applications of them in learning and teaching science, in particular physics. Based on the literature, I categorize models as conceptual and mental models according to their characteristics. In addition to these models, there is another model called "physics model" by the…

  6. Computer-Aided Geometry Modeling

    NASA Technical Reports Server (NTRS)

    Shoosmith, J. N. (Compiler); Fulton, R. E. (Compiler)

    1984-01-01

    Techniques in computer-aided geometry modeling and their application are addressed. Mathematical modeling, solid geometry models, management of geometric data, development of geometry standards, and interactive and graphic procedures are discussed. The applications include aeronautical and aerospace structures design, fluid flow modeling, and gas turbine design.

  7. SU-E-T-254: Development of a HDR-BT QA Tool for Verification of Source Position with Oncentra Applicator Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kumazaki, Y; Miyaura, K; Hirai, R

    2015-06-15

    Purpose: To develop a High Dose Rate Brachytherapy (HDR-BT) quality assurance (QA) tool for verification of source position with Oncentra applicator modeling, and to report the results of radiation source positions with this tool. Methods: We developed a HDR-BT QA phantom and automated analysis software for verification of source position with Oncentra applicator modeling for the Fletcher applicator used in the MicroSelectron HDR system. This tool is intended for end-to-end tests that mimic the clinical 3D image-guided brachytherapy (3D-IGBT) workflow. The phantom is a 30 × 30 × 3 cm cuboid phantom with radiopaque markers, which are inserted into the phantom to evaluate applicator tips and reference source positions; positions are laterally shifted 10 mm from the applicator axis. The markers are lead-based and scatter radiation to expose the films. Gafchromic RTQA2 films are placed on the applicators. The phantom includes spaces to embed the applicators. The source position is determined as the distance between the exposed source position and center position of two pairs of the first radiopaque markers. We generated a 3D-IGBT plan with applicator modeling. The first source position was 6 mm from the applicator tips, and the second source position was 10 mm from the first source position. Results: All source positions were consistent with the exposed positions within 1 mm for all Fletcher applicators using in-house software. Moreover, the distance between source positions was in good agreement with the reference distance. Applicator offset, determined as the distance from the applicator tips at the first source position in the treatment planning system, was accurate. Conclusion: Source position accuracy of applicator modeling used in 3D-IGBT was acceptable. This phantom and software will be useful as a HDR-BT QA tool for verification of source position with Oncentra applicator modeling.
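    The acceptance test described, comparing planned against exposed dwell positions with a 1 mm tolerance, reduces to a simple comparison. The sketch below is illustrative only; the positions and function name are hypothetical, not the authors' in-house software:

```python
import numpy as np

TOLERANCE_MM = 1.0  # acceptance criterion for source-position deviation

def check_source_positions(planned_mm, measured_mm, tol=TOLERANCE_MM):
    """Return per-dwell |measured - planned| deviations and a pass/fail flag."""
    deviations = np.abs(np.asarray(measured_mm) - np.asarray(planned_mm))
    return deviations, bool(np.all(deviations <= tol))

# Hypothetical dwell positions along one applicator, in mm from the tip:
planned = [6.0, 16.0, 26.0]    # first dwell 6 mm from tip, then 10 mm steps
measured = [6.4, 15.8, 26.3]   # positions read off the exposed film
devs, ok = check_source_positions(planned, measured)
print(round(float(devs.max()), 1), ok)  # → 0.4 True
```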

  8. An investigation of modelling and design for software service applications.

    PubMed

    Anjum, Maria; Budgen, David

    2017-01-01

    Software services offer the opportunity to use a component-based approach for the design of applications. However, this needs a deeper understanding of how to develop service-based applications in a systematic manner, and of the set of properties that need to be included in the 'design model'. We have used a realistic application to explore systematically how service-based designs can be created and described. We first identified the key properties of an SOA (service oriented architecture) and then undertook a single-case case study to explore its use in the development of a design for a large-scale application in energy engineering, modelling this with existing notations wherever possible. We evaluated the resulting design model using two walkthroughs with both domain and application experts. We were able to successfully develop a design model around the ten properties identified, and to describe it by adapting existing design notations. A component-based approach to designing such systems does appear to be feasible. However, it needs the assistance of a more integrated set of notations for describing the resulting design model.

  9. Web-based application for inverting one-dimensional magnetotelluric data using Python

    NASA Astrophysics Data System (ADS)

    Suryanto, Wiwit; Irnaka, Theodosius Marwan

    2016-11-01

    One-dimensional modeling of magnetotelluric (MT) data has been performed using an online application on a web-based virtual private server. The application was developed in Python using the Django framework with HTML and CSS components. The input data, including the apparent resistivity and phase as a function of period or frequency with standard deviation, can be entered through an interactive web page that can be freely accessed at https://komputasi.geofisika.ugm.ac.id. The subsurface models, represented by resistivity as a function of depth, are iteratively improved by changing the model parameters, such as the resistivity and the layer depth, based on the observed apparent resistivity and phase data. The output of the application displayed on the screen presents resistivity as a function of depth and includes the RMS error for each iteration. Synthetic and real data were used in comparative tests of the application's performance, and the application was shown to produce accurate subsurface resistivity models. Hence, this application can be used for practical one-dimensional modeling of MT data.
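    The abstract does not reproduce the forward calculation such an inversion iterates over. For orientation, the standard 1-D MT impedance recursion for a layered earth can be sketched as follows; the layering and values are illustrative, and this is not the authors' code:

```python
import numpy as np

MU0 = 4e-7 * np.pi  # vacuum permeability (H/m)

def mt1d_forward(resistivities, thicknesses, periods):
    """Apparent resistivity and phase for a 1-D layered earth (impedance recursion).

    resistivities: ohm-m, one per layer (last entry = basement halfspace)
    thicknesses:   m, one per layer except the halfspace
    periods:       s
    """
    rho_a, phase = [], []
    for T in periods:
        omega = 2 * np.pi / T
        # Intrinsic impedance of the bottom halfspace.
        Z = np.sqrt(1j * omega * MU0 * resistivities[-1])
        # Recurse upward through the finite layers.
        for rho, h in zip(resistivities[-2::-1], thicknesses[::-1]):
            k = np.sqrt(1j * omega * MU0 / rho)   # propagation constant
            Z0 = 1j * omega * MU0 / k             # intrinsic layer impedance
            t = np.tanh(k * h)
            Z = Z0 * (Z + Z0 * t) / (Z0 + Z * t)
        rho_a.append(abs(Z) ** 2 / (omega * MU0))
        phase.append(np.degrees(np.angle(Z)))
    return np.array(rho_a), np.array(phase)

# Sanity check: a uniform 100 ohm-m halfspace must return an apparent
# resistivity of 100 ohm-m and a phase of 45 degrees at every period.
ra, ph = mt1d_forward([100.0], [], [1.0, 10.0, 100.0])
print(ra.round(2), ph.round(2))
```

An inversion like the one described wraps this forward model in a loop that perturbs layer resistivities and depths until the RMS misfit to the observed curves stops improving.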

  10. Applicability of common stomatal conductance models in maize under varying soil moisture conditions.

    PubMed

    Wang, Qiuling; He, Qijin; Zhou, Guangsheng

    2018-07-01

    In the context of climate warming, the varying soil moisture caused by precipitation pattern change will affect the applicability of stomatal conductance models, thereby affecting the simulation accuracy of carbon-nitrogen-water cycles in ecosystems. We studied the applicability of four common stomatal conductance models, including the Jarvis, Ball-Woodrow-Berry (BWB), Ball-Berry-Leuning (BBL) and unified stomatal optimization (USO) models, based on summer maize leaf gas exchange data from a soil moisture consecutive decrease manipulation experiment. The results showed that the USO model performed best, followed by the BBL and BWB models, while the Jarvis model performed worst under varying soil moisture conditions. Soil moisture affected the relative performance of the models. By introducing a water response function, the performance of the Jarvis, BWB, and USO models improved, decreasing the normalized root mean square error (NRMSE) by 15.7%, 16.6% and 3.9%, respectively; however, the effect on the BBL model was negative, increasing the NRMSE by 5.3%. It was observed that the Jarvis, BWB, BBL and USO models were applicable within different ranges of soil relative water content (i.e., 55%-65%, 56%-67%, 37%-79% and 37%-95%, respectively) based on the 95% confidence limits. Moreover, after introducing the water response function, the applicability of the Jarvis and BWB models improved. The USO model performed best with or without the water response function and was applicable under varying soil moisture conditions. Our results provide a basis for selecting appropriate stomatal conductance models under drought conditions. Copyright © 2018 Elsevier B.V. All rights reserved.
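    The NRMSE statistic used to rank the models is easily computed. The sketch below uses hypothetical conductance values, not the study's measurements, to show how a better model variant lowers the NRMSE:

```python
import numpy as np

def nrmse(observed, predicted):
    """Root mean square error normalized by the mean of the observations."""
    observed = np.asarray(observed, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    rmse = np.sqrt(np.mean((predicted - observed) ** 2))
    return rmse / observed.mean()

# Hypothetical stomatal conductance data (mol m-2 s-1) and two model variants:
gs_obs   = [0.20, 0.25, 0.30, 0.18, 0.22]
gs_base  = [0.24, 0.22, 0.35, 0.15, 0.25]   # e.g. without a water response function
gs_water = [0.21, 0.24, 0.31, 0.17, 0.23]   # e.g. with a water response function
print(f"{nrmse(gs_obs, gs_base):.3f} -> {nrmse(gs_obs, gs_water):.3f}")  # → 0.160 -> 0.043
```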

  11. Damping Models for Shear-Deformable Beam with Applications to Spacecraft Wiring Harness

    DTIC Science & Technology

    2014-10-28

    AFRL-RV-PS-TR-2014-0189. Damping Models for Shear-Deformable Beam with Applications to Spacecraft Wiring Harness (Contract FA9453-12..., Feb 2012). The report addresses the damping behavior of wiring harnesses; the emphasis of the project is on the extension of the shear-beam damping model to the Timoshenko beam, a beam model

  12. Architecting a Simulation Framework for Model Rehosting

    NASA Technical Reports Server (NTRS)

    Madden, Michael M.

    2004-01-01

    The utility of vehicle math models extends beyond human-in-the-loop simulation. It is desirable to deploy a given model across a multitude of applications that target design, analysis, and research. However, the vehicle model alone represents an incomplete simulation. One must also replicate the environment models (e.g., atmosphere, gravity, terrain) to achieve identical vehicle behavior across all applications. Environment models are increasing in complexity and represent a substantial investment to re-engineer for a new application. A software component that can be rehosted in each application is one solution to the deployment problem. The component must encapsulate both the vehicle and environment models. The component must have a well-defined interface that abstracts the bulk of the logic to operate the models. This paper examines the characteristics of a rehostable modeling component from the perspective of a human-in-the-loop simulation framework. The Langley Standard Real-Time Simulation in C++ (LaSRS++) is used as an example. LaSRS++ was recently redesigned to transform its modeling package into a rehostable component.
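    The rehosting idea above, one component that bundles the vehicle and environment models behind a fixed operating interface so every host application reproduces the same behavior, can be sketched in a few lines. All class and method names below are hypothetical; this is not the LaSRS++ API:

```python
from abc import ABC, abstractmethod

class Model(ABC):
    """Common contract shared by vehicle and environment models."""
    @abstractmethod
    def initialize(self, state: dict) -> None: ...
    @abstractmethod
    def update(self, dt: float, state: dict) -> None: ...

class Vehicle(Model):
    """Toy vehicle model: constant 5 m/s climb."""
    def initialize(self, state):
        state.setdefault("altitude_m", 0.0)
    def update(self, dt, state):
        state["altitude_m"] += 5.0 * dt

class Atmosphere(Model):
    """Toy environment model: exponential density falloff with altitude."""
    def initialize(self, state):
        state["air_density"] = 1.225                       # kg/m^3 at sea level
    def update(self, dt, state):
        state["air_density"] = 1.225 * 2.718 ** (-state["altitude_m"] / 8500.0)

class RehostableComponent:
    """Bundles vehicle + environment models behind one operating interface,
    so every host application drives them identically."""
    def __init__(self):
        self.state = {}
        self.models = [Vehicle(), Atmosphere()]  # vehicle sets altitude first
        for m in self.models:
            m.initialize(self.state)
    def step(self, dt):
        for m in self.models:
            m.update(dt, self.state)
        return self.state

sim = RehostableComponent()
for _ in range(10):
    sim.step(1.0)
print(round(sim.state["altitude_m"], 1))  # → 50.0
```

Because the host only ever calls `step`, swapping the component into a new application cannot change the coupled vehicle/environment behavior, which is the point the paper makes about replicating environment models across applications.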

  13. Bayesian model reduction and empirical Bayes for group (DCM) studies

    PubMed Central

    Friston, Karl J.; Litvak, Vladimir; Oswal, Ashwini; Razi, Adeel; Stephan, Klaas E.; van Wijk, Bernadette C.M.; Ziegler, Gabriel; Zeidman, Peter

    2016-01-01

    This technical note describes some Bayesian procedures for the analysis of group studies that use nonlinear models at the first (within-subject) level – e.g., dynamic causal models – and linear models at subsequent (between-subject) levels. Its focus is on using Bayesian model reduction to finesse the inversion of multiple models of a single dataset or a single (hierarchical or empirical Bayes) model of multiple datasets. These applications of Bayesian model reduction allow one to consider parametric random effects and make inferences about group effects very efficiently (in a few seconds). We provide the relatively straightforward theoretical background to these procedures and illustrate their application using a worked example. This example uses a simulated mismatch negativity study of schizophrenia. We illustrate the robustness of Bayesian model reduction to violations of the (commonly used) Laplace assumption in dynamic causal modelling and show how its recursive application can facilitate both classical and Bayesian inference about group differences. Finally, we consider the application of these empirical Bayesian procedures to classification and prediction. PMID:26569570

  14. Design and implementation of space physics multi-model application integration based on web

    NASA Astrophysics Data System (ADS)

    Jiang, Wenping; Zou, Ziming

    With the development of research on the space environment and space science, providing an online computing environment for space weather, space environment, and space physics models to the Chinese scientific community has become increasingly important in recent years. There are currently two software modes for a space physics multi-model application integrated system (SPMAIS): C/S and B/S. The traditional, stand-alone C/S mode demands that a team or workshop drawn from many disciplines and specialties build its own multi-model application integrated system, and it requires the client to be deployed in different physical regions when users visit the integrated system. This requirement brings two shortcomings: it reduces the efficiency of researchers who use the models for computation, and it makes data access inconvenient. It is therefore necessary to create a shared network resource access environment that helps users quickly access the computing resources of space physics models from a terminal for space science research and space environment forecasting. The SPMAIS is developed in B/S mode around high-performance, first-principles computational models of the space environment and uses these models to predict "space weather", to interpret space mission data, and to further our understanding of the solar system. The main goal of the SPMAIS is to provide an easy and convenient user-driven online model operating environment. To date, the SPMAIS contains dozens of space environment models, including the international AP8/AE8, IGRF, and T96 models, as well as a solar proton prediction model, a geomagnetic transmission model, and other models developed by Chinese scientists. Another function of the SPMAIS is to integrate space observation data sets, which provide input data for high-speed online model computation.
In this paper, the service-oriented architecture (SOA) concept, which divides a system into independent modules according to different business needs, is applied to solve the problem of the physical independence between multiple models. The classic MVC (Model-View-Controller) software design pattern is used to build the architecture of the system, and JSP + servlet + JavaBean technology is used to integrate the web application programs of the space physics models. This solves the problem of multiple users requesting the same model-computing job and effectively balances computing tasks across servers. In addition, we completed the following tasks: establishing a standard graphical user interface based on a Java applet; designing the interface between model computation and the visualization of model results; realizing three-dimensional network visualization without plug-ins; using Java3D technology to achieve interaction with a three-dimensional network scene; and improving the ability to interact with web pages and dynamic execution capabilities, including rendering of three-dimensional graphics and control of fonts and color. Through the design and implementation of the web-based SPMAIS, we provide an online computing and application runtime environment for space physics models. Practical application shows that researchers can benefit from our system in space physics research and engineering applications.

  15. Application of the Human Activity Assistive Technology model for occupational therapy research.

    PubMed

    Giesbrecht, Ed

    2013-08-01

    Theoretical models provide a framework for describing practice and integrating evidence into systematic research. There are few models that relate specifically to the provision of assistive technology in occupational therapy practice. The Human Activity Assistive Technology model is an enduring example that has continued to develop by integrating a social model of disability, concepts from occupational therapy theory and principles of assistive technology adoption and abandonment. This study first describes the core concepts of the Human Activity Assistive Technology model and reviews its development over three successive published versions. A review of the research literature reflects application of the model to clinical practice, study design, outcome measure selection and interpretation of results, particularly among occupational therapists. An evaluative framework is used to critique the adequacy of the Human Activity Assistive Technology model for practice and research, exploring attributes of clarity, simplicity, generality, accessibility and importance. Finally, recommendations are proposed for continued development of the model and research applications. Most of the existing research literature employs the Human Activity Assistive Technology model for background and study design; there is emerging evidence to support the core concepts as predictive factors. Although the concepts are generally simple, clear and applicable to occupational therapy practice and research, evolving terminology and outcomes become more complex with the conflation of integrated theories. The development of the Human Activity Assistive Technology model offers enhanced access and application for occupational therapists, but poses challenges to clarity among concepts. Suggestions are made for further development and applications of the model. © 2013 Occupational Therapy Australia.

  16. Mathematical modeling and simulation in animal health - Part II: principles, methods, applications, and value of physiologically based pharmacokinetic modeling in veterinary medicine and food safety assessment.

    PubMed

    Lin, Z; Gehring, R; Mochel, J P; Lavé, T; Riviere, J E

    2016-10-01

    This review provides a tutorial for individuals interested in quantitative veterinary pharmacology and toxicology and offers a basis for establishing guidelines for physiologically based pharmacokinetic (PBPK) model development and application in veterinary medicine. This is important as the application of PBPK modeling in veterinary medicine has evolved over the past two decades. PBPK models can be used to predict drug tissue residues and withdrawal times in food-producing animals, to estimate chemical concentrations at the site of action and target organ toxicity to aid risk assessment of environmental contaminants and/or drugs in both domestic animals and wildlife, as well as to help design therapeutic regimens for veterinary drugs. This review provides a comprehensive summary of PBPK modeling principles, model development methodology, and the current applications in veterinary medicine, with a focus on predictions of drug tissue residues and withdrawal times in food-producing animals. The advantages and disadvantages of PBPK modeling compared to other pharmacokinetic modeling approaches (i.e., classical compartmental/noncompartmental modeling, nonlinear mixed-effects modeling, and interspecies allometric scaling) are further presented. The review finally discusses contemporary challenges and our perspectives on model documentation, evaluation criteria, quality improvement, and offers solutions to increase model acceptance and applications in veterinary pharmacology and toxicology. © 2016 John Wiley & Sons Ltd.
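    For readers unfamiliar with the approach, a flow-limited PBPK model is just a set of tissue mass-balance ODEs. The toy sketch below uses one blood and one tissue compartment with made-up parameter values; it is not from the review, and real veterinary PBPK models carry many more compartments and physiological parameters:

```python
# Toy flow-limited PBPK sketch: IV bolus into blood, one perfused tissue.
#   dC_t/dt = Q * (C_b - C_t/Kp) / V_t              tissue mass balance
#   dC_b/dt = (Q * (C_t/Kp - C_b) - CL * C_b) / V_b blood, with clearance CL
Q, V_T, V_B, KP, CL = 1.0, 2.0, 1.0, 4.0, 0.2  # L/h, L, L, -, L/h (hypothetical)

def simulate(dose_mg, hours, dt=0.001):
    c_b, c_t = dose_mg / V_B, 0.0         # bolus appears in blood; tissue empty
    for _ in range(int(round(hours / dt))):   # explicit Euler integration
        flux = Q * (c_b - c_t / KP)       # blood -> tissue transfer rate (mg/h)
        c_b += dt * (-flux - CL * c_b) / V_B
        c_t += dt * flux / V_T
    return c_b, c_t

c_b, c_t = simulate(dose_mg=10.0, hours=24.0)
total = c_b * V_B + c_t * V_T             # drug remaining in the body (mg)
# Clearance removes drug over time; with these parameters the tissue:blood
# ratio settles slightly above Kp because elimination acts on blood directly.
print(f"{total:.1f} mg remain, tissue:blood = {c_t / c_b:.2f}")
```

Tissue-residue and withdrawal-time predictions of the kind the review discusses amount to running such a system forward until the tissue concentration falls below a regulatory tolerance.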

  17. gPKPDSim: a SimBiology®-based GUI application for PKPD modeling in drug development.

    PubMed

    Hosseini, Iraj; Gajjala, Anita; Bumbaca Yadav, Daniela; Sukumaran, Siddharth; Ramanujan, Saroja; Paxson, Ricardo; Gadkar, Kapil

    2018-04-01

    Modeling and simulation (M&S) is increasingly used in drug development to characterize pharmacokinetic-pharmacodynamic (PKPD) relationships and support various efforts such as target feasibility assessment, molecule selection, human PK projection, and preclinical and clinical dose and schedule determination. While model development typically requires mathematical modeling expertise, model exploration and simulations could in many cases be performed by scientists in various disciplines to support the design, analysis and interpretation of experimental studies. To this end, we have developed a versatile graphical user interface (GUI) application to enable easy use of any model constructed in SimBiology® to execute various common PKPD analyses. The MATLAB®-based GUI application, called gPKPDSim, has a single-screen interface and provides functionalities including simulation, data fitting (parameter estimation), population simulation (exploring the impact of parameter variability on the outputs of interest), and non-compartmental PK analysis. Further, gPKPDSim is a user-friendly tool with capabilities including interactive visualization, exporting of results and generation of presentation-ready figures. gPKPDSim was designed primarily for use in preclinical and translational drug development, although broader applications exist. gPKPDSim is a MATLAB®-based open-source application and is publicly available to download from MATLAB® Central™. We illustrate the use and features of gPKPDSim using multiple PKPD models to demonstrate the wide applications of this tool in pharmaceutical sciences. Overall, gPKPDSim provides an integrated, multi-purpose, user-friendly GUI application to enable efficient use of PKPD models by scientists from various disciplines, regardless of their modeling expertise.

  18. Elastic Network Models For Biomolecular Dynamics: Theory and Application to Membrane Proteins and Viruses

    NASA Astrophysics Data System (ADS)

    Lezon, Timothy R.; Shrivastava, Indira H.; Yang, Zheng; Bahar, Ivet

    The following sections are included: * Introduction * Theory and Assumptions * Statistical mechanical foundations * Anisotropic network models * Gaussian network model * Rigid block models * Treatment of perturbations * Langevin dynamics * Applications * Membrane proteins * Viruses * Conclusion * References
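    As a concrete illustration of the Gaussian network model named in the contents, the sketch below builds the Kirchhoff (connectivity) matrix from a distance cutoff and extracts fluctuations from its nonzero modes. The toy chain coordinates are hypothetical; real applications use C-alpha positions and a cutoff near 7 Å:

```python
import numpy as np

def kirchhoff(coords, cutoff=7.0):
    """GNM Kirchhoff (connectivity) matrix from node coordinates."""
    coords = np.asarray(coords, dtype=float)
    d = np.linalg.norm(coords[:, None] - coords[None, :], axis=-1)
    g = -(d < cutoff).astype(float)      # -1 for each contacting pair
    np.fill_diagonal(g, 0.0)
    np.fill_diagonal(g, -g.sum(axis=1))  # diagonal = contact degree
    return g

# Toy 5-node chain with 4 A spacing, so only nearest neighbors are in contact.
coords = [(4.0 * i, 0.0, 0.0) for i in range(5)]
g = kirchhoff(coords)
vals, vecs = np.linalg.eigh(g)

# The first eigenvalue is zero (rigid-body mode); fluctuations come from the
# remaining modes, weighted by the inverse eigenvalues (soft modes dominate).
msf = (vecs[:, 1:] ** 2 / vals[1:]).sum(axis=1)   # ~ mean-square fluctuations
print(np.isclose(vals[0], 0.0), msf[0] > msf[2])  # → True True (ends most mobile)
```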

  19. [Watershed water environment pollution models and their applications: a review].

    PubMed

    Zhu, Yao; Liang, Zhi-Wei; Li, Wei; Yang, Yi; Yang, Mu-Yi; Mao, Wei; Xu, Han-Li; Wu, Wei-Xiang

    2013-10-01

    Watershed water environment pollution models are important tools for studying watershed environmental problems. Through the quantitative description of the complicated pollution processes of a whole watershed system and its parts, such models can identify the main sources and migration pathways of pollutants, estimate pollutant loadings, and evaluate their impacts on the water environment, providing a basis for watershed planning and management. This paper reviewed the watershed water environment models widely applied in China and abroad, with a focus on models of pollutant loading (GWLF and PLOAD), water quality of receiving water bodies (QUAL2E and WASP), and watershed models integrating pollutant loadings and water quality (HSPF, SWAT, AGNPS, AnnAGNPS, and SWMM), and introduced the structures, principles, and main characteristics as well as the limitations in practical applications of these models. Other water quality models (CE-QUAL-W2, EFDC, and AQUATOX) and watershed models (GLEAMS and MIKE SHE) were also briefly introduced. Through case analysis of the applications of single and integrated models, the development trend and application prospects of watershed water environment pollution models were discussed.

  20. Charge to Road Map Development Sessions

    NASA Technical Reports Server (NTRS)

    Barth, Janet

    2004-01-01

    Develop a road map for new standard radiation belt models. Model applications: spacecraft and instruments. Reduce risk. Reduce cost. Improve performance. Increase system lifetime. Reduce risk to astronauts.

  1. 40 CFR 86.1807-01 - Vehicle labeling.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... statement: “This Vehicle Conforms to U.S. EPA Regulations Applicable to XXX-Fueled 20XX Model Year New Motor... Applicable to XXX-Fueled 20XX Model Year New Light-Duty Trucks.” (C) For medium-duty passenger vehicles, the statement: “This Vehicle Conforms to U.S. EPA Regulations Applicable to XXX-fueled 20XX Model Year New...

  2. Applications of a thermal-based two-source energy balance model using Priestley-Taylor approach for surface temperature partitioning (TSEB_PTT) under advective conditions

    USDA-ARS?s Scientific Manuscript database

    Operational application of the two-source energy balance model (TSEB), which can estimate evapotranspiration (ET) and its component fluxes, evaporation (E) and transpiration (T), of the land surface in different climates, is very useful for many applications in hydrology and agriculture. The TSEB model uses an ...

  3. Population balance modeling: current status and future prospects.

    PubMed

    Ramkrishna, Doraiswami; Singh, Meenesh R

    2014-01-01

    Population balance modeling is undergoing phenomenal growth in its applications, and this growth is accompanied by multifarious reviews. This review aims to fortify the model's fundamental base, as well as point to a variety of new applications, including modeling of crystal morphology, cell growth and differentiation, gene regulatory processes, and transfer of drug resistance. This is accomplished by presenting the many faces of population balance equations that arise in the foregoing applications.

  4. Examination of various turbulence models for application in liquid rocket thrust chambers

    NASA Technical Reports Server (NTRS)

    Hung, R. J.

    1991-01-01

    There is a large variety of turbulence models available. These models include direct numerical simulation, large eddy simulation, the Reynolds stress/flux model, zero-equation models, one-equation models, the two-equation k-epsilon model, multiple-scale models, etc. Each turbulence model contains different physical assumptions and requirements. The nature of turbulence encompasses randomness, irregularity, diffusivity and dissipation. The capabilities of the turbulence models, including physical strengths, weaknesses, and limitations, as well as numerical and computational considerations, are reviewed. Recommendations are made for the potential application of a turbulence model in thrust chamber and performance prediction programs. The full Reynolds stress model is recommended. In a workshop specifically called for the assessment of turbulence models for applications in liquid rocket thrust chambers, most of the experts present were also in favor of recommending the Reynolds stress model.

  5. Application and enhancements of MOVIE.BYU

    NASA Technical Reports Server (NTRS)

    Gates, R. L.; Vonofenheim, W. H.

    1984-01-01

    MOVIE.BYU (MOVIE.BRIGHAM YOUNG UNIVERSITY) is a system of programs for the display and manipulation of data representing mathematical, architectural, and topological models in which the geometry may be described in terms of panel (n-sided polygon) and solid elements or contour lines. The MOVIE.BYU system has been used in a series of applications at LaRC. One application has been the display, creation, and manipulation of finite element models in aeronautic/aerospace research. These models have been displayed on both vector and color raster devices, and the user has the option to modify color and shading parameters on the color raster devices. Another application involves the display of scalar functions (temperature, pressure, etc.) over the surface of a given model. This capability gives the researcher added flexibility in the analysis of the model and its accompanying data. Limited animation (frame-by-frame creation) has been another application of MOVIE.BYU in the modeling of kinematic processes in antenna structures.

  6. MaMR: High-performance MapReduce programming model for material cloud applications

    NASA Astrophysics Data System (ADS)

    Jing, Weipeng; Tong, Danyu; Wang, Yangang; Wang, Jingyuan; Liu, Yaqiu; Zhao, Peng

    2017-02-01

    With the increasing data size in materials science, existing programming models no longer satisfy the application requirements. MapReduce is a programming model that enables the easy development of scalable parallel applications to process big data on cloud computing systems. However, this model does not directly support the processing of multiple related datasets, and the processing performance does not reflect the advantages of cloud computing. To enhance the capability of workflow applications in material data processing, we defined a programming model for material cloud applications, called MaMR, that supports multiple different Map and Reduce functions running concurrently on a hybrid shared-memory bulk synchronous parallel (BSP) foundation. An optimized data sharing strategy to supply the shared data to the different Map and Reduce stages was also designed. We added a new merge phase to MapReduce that can efficiently merge data from the map and reduce modules. Experiments showed that the model and framework achieve effective performance improvements compared to previous work.
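    The map/reduce/merge pipeline described above can be illustrated with a minimal in-process sketch. This is generic MapReduce plus a merge step in the spirit of MaMR's added phase, not the MaMR framework itself; the data and function names are hypothetical:

```python
from collections import defaultdict
from functools import reduce

def mapreduce(records, mapper, reducer):
    """Minimal in-process MapReduce: map, shuffle by key, reduce per key."""
    groups = defaultdict(list)
    for record in records:
        for key, value in mapper(record):          # map phase
            groups[key].append(value)              # shuffle/group by key
    return {k: reduce(reducer, vs) for k, vs in groups.items()}  # reduce phase

def merge(*job_outputs):
    """Extra merge phase: combine the outputs of several reduce jobs."""
    merged = {}
    for partial in job_outputs:
        for k, v in partial.items():
            merged[k] = merged.get(k, 0) + v
    return merged

count_words = lambda line: [(w, 1) for w in line.split()]
add = lambda a, b: a + b

# Two related datasets processed as separate concurrent jobs, then merged:
job1 = mapreduce(["Fe O", "Fe Fe"], count_words, add)
job2 = mapreduce(["O O Fe"], count_words, add)
print(merge(job1, job2))  # → {'Fe': 4, 'O': 3}
```

In a real framework the per-job outputs would live on different workers; the merge phase is what lets results from multiple related datasets be combined without an extra full MapReduce round.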

  7. nmsBuilder: Freeware to create subject-specific musculoskeletal models for OpenSim.

    PubMed

    Valente, Giordano; Crimi, Gianluigi; Vanella, Nicola; Schileo, Enrico; Taddei, Fulvia

    2017-12-01

    Musculoskeletal modeling and simulations of movement have been increasingly used in orthopedic and neurological scenarios, with increased attention to subject-specific applications. In general, musculoskeletal modeling applications have been facilitated by the development of dedicated software tools; however, subject-specific studies have also been limited by time-consuming modeling workflows and the high level of expertise required. In addition, no reference tools exist to standardize the process of musculoskeletal model creation and make it more efficient. Here we present a freely available software application, nmsBuilder 2.0, to create musculoskeletal models in the file format of OpenSim, a widely-used open-source platform for musculoskeletal modeling and simulation. nmsBuilder 2.0 is the result of a major refactoring of a previous implementation that moved a first step toward an efficient workflow for subject-specific model creation. nmsBuilder includes a graphical user interface that provides access to all functionalities, based on a framework for computer-aided medicine written in C++. The operations implemented can be used in a workflow to create OpenSim musculoskeletal models from 3D surfaces. A first step includes data processing to create the supporting objects necessary to create models, e.g. surfaces, anatomical landmarks, reference systems; a second step includes the creation of OpenSim objects, e.g. bodies, joints, muscles, and the corresponding model. We present a case study using nmsBuilder 2.0: the creation of an MRI-based musculoskeletal model of the lower limb. The model included four rigid bodies, five degrees of freedom and 43 musculotendon actuators, and was created from 3D surfaces of the segmented images of a healthy subject through the modeling workflow implemented in the software application.
We have presented nmsBuilder 2.0 for the creation of musculoskeletal OpenSim models from image-based data, and made it freely available via nmsbuilder.org. This application provides an efficient workflow for model creation and helps standardize the process. We hope this would help promote personalized applications in musculoskeletal biomechanics, including larger sample size studies, and might also represent a basis for future developments for specific applications. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. Macro Level Simulation Model Of Space Shuttle Processing

    NASA Technical Reports Server (NTRS)

    2000-01-01

    The contents include: 1) Space Shuttle Processing Simulation Model; 2) Knowledge Acquisition; 3) Simulation Input Analysis; 4) Model Applications in Current Shuttle Environment; and 5) Model Applications for Future Reusable Launch Vehicles (RLV's). This paper is presented in viewgraph form.

  9. DEVELOPMENT OF A LAND-SURFACE MODEL PART I: APPLICATION IN A MESOSCALE METEOROLOGY MODEL

    EPA Science Inventory

    Parameterization of land-surface processes and consideration of surface inhomogeneities are very important to mesoscale meteorological modeling applications, especially those that provide information for air quality modeling. To provide crucial, reliable information on the diurn...

  10. COLLABORATIONS AND SPECIALIZED CLIENT INTERACTIONS

    EPA Science Inventory

    The goal of this task is to improve our understanding of atmospheric modeling research applications through collaborations with the international air pollution community and to demonstrate the applicability of our AQ models for their utility through technical applications by clie...

  11. Understanding the Behaviors of Stealth Applicants in the College Search Process

    ERIC Educational Resources Information Center

    Dupaul, Stephanie

    2010-01-01

    Successful enrollment management uses predictive modeling to achieve specific goals for admission rates, yield rates, and class size. Many of these models rely on evaluating an applicant's interest in the institution through measures of pre-application engagement. Recent increases in the number of applicants who do not visibly interact with…

  12. Selection and Classification Using a Forecast Applicant Pool.

    ERIC Educational Resources Information Center

    Hendrix, William H.

    The document presents a forecast model of the future Air Force applicant pool. By forecasting applicants' quality (means and standard deviations of aptitude scores) and quantity (total number of applicants), a potential enlistee could be compared to the forecasted pool. The data used to develop the model consisted of means, standard deviation, and…

  13. Specification and Design of a Fault Recovery Model for the Reliable Multicast Protocol

    NASA Technical Reports Server (NTRS)

    Montgomery, Todd; Callahan, John R.; Whetten, Brian

    1996-01-01

    The Reliable Multicast Protocol (RMP) provides a unique, group-based model for distributed programs that need to handle reconfiguration events at the application layer. This model, called membership views, provides an abstraction in which events such as site failures, network partitions, and normal join-leave events are viewed as group reformations. RMP provides access to this model through an application programming interface (API) that notifies an application when a group is reformed as the result of some event. RMP provides applications with reliable delivery of messages using an underlying IP Multicast media to other group members in a distributed environment even in the case of reformations. A distributed application can use various Quality of Service (QoS) levels provided by RMP to tolerate group reformations. This paper explores the implementation details of the mechanisms in RMP that provide distributed applications with membership view information and fault recovery capabilities.

  14. Hybrid modeling for quality by design and PAT-benefits and challenges of applications in biopharmaceutical industry.

    PubMed

    von Stosch, Moritz; Davy, Steven; Francois, Kjell; Galvanauskas, Vytautas; Hamelink, Jan-Martijn; Luebbert, Andreas; Mayer, Martin; Oliveira, Rui; O'Kennedy, Ronan; Rice, Paul; Glassey, Jarka

    2014-06-01

    This report highlights the drivers, challenges, and enablers of the hybrid modeling applications in biopharmaceutical industry. It is a summary of an expert panel discussion of European academics and industrialists with relevant scientific and engineering backgrounds. Hybrid modeling is viewed in its broader sense, namely as the integration of different knowledge sources in form of parametric and nonparametric models into a hybrid semi-parametric model, for instance the integration of fundamental and data-driven models. A brief description of the current state-of-the-art and industrial uptake of the methodology is provided. The report concludes with a number of recommendations to facilitate further developments and a wider industrial application of this modeling approach. These recommendations are limited to further exploiting the benefits of this methodology within process analytical technology (PAT) applications in biopharmaceutical industry. Copyright © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. A hierarchical distributed control model for coordinating intelligent systems

    NASA Technical Reports Server (NTRS)

    Adler, Richard M.

    1991-01-01

    A hierarchical distributed control (HDC) model for coordinating cooperative problem-solving among intelligent systems is described. The model was implemented using SOCIAL, an innovative object-oriented tool for integrating heterogeneous, distributed software systems. SOCIAL embeds applications in 'wrapper' objects called Agents, which supply predefined capabilities for distributed communication, control, data specification, and translation. The HDC model is realized in SOCIAL as a 'Manager' Agent that coordinates interactions among application Agents. The HDC Manager: indexes the capabilities of application Agents; routes request messages to suitable server Agents; and stores results in a commonly accessible 'Bulletin-Board'. This centralized control model is illustrated in a fault diagnosis application for launch operations support of the Space Shuttle fleet at NASA, Kennedy Space Center.

  16. Model-Driven Approach for Body Area Network Application Development.

    PubMed

    Venčkauskas, Algimantas; Štuikys, Vytautas; Jusas, Nerijus; Burbaitė, Renata

    2016-05-12

    This paper introduces the sensor-networked IoT model as a prototype to support the design of Body Area Network (BAN) applications for healthcare. Using the model, we analyze the synergistic effect of the functional requirements (data collection from the human body and transferring it to the top level) and non-functional requirements (trade-offs between energy-security-environmental factors, treated as Quality-of-Service (QoS)). We use feature models to represent the requirements at the earliest stage for the analysis and describe a model-driven methodology to design the possible BAN applications. Firstly, we specify the requirements as the problem domain (PD) variability model for the BAN applications. Next, we introduce the generative technology (meta-programming as the solution domain (SD)) and the mapping procedure to map the PD feature-based variability model onto the SD feature model. Finally, we create an executable meta-specification that represents the BAN functionality to describe the variability of the problem domain through transformations. The meta-specification (along with the meta-language processor) is a software generator for multiple BAN-oriented applications. We validate the methodology with experiments and a case study to generate a family of programs for the BAN sensor controllers. This enables an adequate measure of QoS to be obtained efficiently through the interactive adjustment of the meta-parameter values and the re-generation process for the concrete BAN application.

  17. Model-Driven Approach for Body Area Network Application Development

    PubMed Central

    Venčkauskas, Algimantas; Štuikys, Vytautas; Jusas, Nerijus; Burbaitė, Renata

    2016-01-01

    This paper introduces the sensor-networked IoT model as a prototype to support the design of Body Area Network (BAN) applications for healthcare. Using the model, we analyze the synergistic effect of the functional requirements (data collection from the human body and transferring it to the top level) and non-functional requirements (trade-offs between energy-security-environmental factors, treated as Quality-of-Service (QoS)). We use feature models to represent the requirements at the earliest stage for the analysis and describe a model-driven methodology to design the possible BAN applications. Firstly, we specify the requirements as the problem domain (PD) variability model for the BAN applications. Next, we introduce the generative technology (meta-programming as the solution domain (SD)) and the mapping procedure to map the PD feature-based variability model onto the SD feature model. Finally, we create an executable meta-specification that represents the BAN functionality to describe the variability of the problem domain through transformations. The meta-specification (along with the meta-language processor) is a software generator for multiple BAN-oriented applications. We validate the methodology with experiments and a case study to generate a family of programs for the BAN sensor controllers. This enables an adequate measure of QoS to be obtained efficiently through the interactive adjustment of the meta-parameter values and the re-generation process for the concrete BAN application. PMID:27187394

  18. A brief overview of the theory and application of the optimal control model of the human operator

    NASA Technical Reports Server (NTRS)

    Sheldon, B.

    1979-01-01

    The underlying motivation and concepts are presented, along with a review of the development and application of the model. The structure of the model is described and results validating the model are presented.

  19. Scalability of Classical Terramechanics Models for Lightweight Vehicle Applications

    DTIC Science & Technology

    2013-08-01

    Paramsothy Jayakumar; Daniel Melanz; Jamie MacLennan (U.S. Army TARDEC, Warren, MI, USA); Carmine Senatore; Karl Iagnemma. [Only report-documentation metadata was extracted for this record; no abstract is available.]

  20. Revel8or: Model Driven Capacity Planning Tool Suite

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhu, Liming; Liu, Yan; Bui, Ngoc B.

    2007-05-31

    Designing complex multi-tier applications that must meet strict performance requirements is a challenging software engineering problem. Ideally, the application architect could derive accurate performance predictions early in the project life-cycle, leveraging initial application design-level models and a description of the target software and hardware platforms. To this end, we have developed a capacity planning tool suite for component-based applications, called Revel8tor. The tool adheres to the model driven development paradigm and supports benchmarking and performance prediction for J2EE, .Net and Web services platforms. The suite is composed of three different tools: MDAPerf, MDABench and DSLBench. MDAPerf allows annotation of design diagrams and derives performance analysis models. MDABench allows a customized benchmark application to be modeled in the UML 2.0 Testing Profile and automatically generates a deployable application, with measurement automatically conducted. DSLBench allows the same benchmark modeling and generation to be conducted using a simple performance engineering Domain Specific Language (DSL) in Microsoft Visual Studio. DSLBench integrates with Visual Studio and reuses its load testing infrastructure. Together, the tool suite can assist capacity planning across platforms in an automated fashion.

  1. Performance Evaluation Model for Application Layer Firewalls.

    PubMed

    Xuan, Shichang; Yang, Wu; Dong, Hui; Zhang, Jiangchuan

    2016-01-01

    Application layer firewalls protect the trusted area network against information security risks. However, firewall performance may affect user experience. Therefore, performance analysis plays a significant role in the evaluation of application layer firewalls. This paper presents an analytic model of the application layer firewall, based on a system analysis to evaluate the capability of the firewall. In order to enable users to improve the performance of the application layer firewall with limited resources, resource allocation was evaluated to obtain the optimal resource allocation scheme in terms of throughput, delay, and packet loss rate. The proposed model employs the Erlangian queuing model to analyze the performance parameters of the system with regard to the three layers (network, transport, and application layers). Then, the analysis results of all the layers are combined to obtain the overall system performance indicators. A discrete event simulation method was used to evaluate the proposed model. Finally, limited service desk resources were allocated to obtain the values of the performance indicators under different resource allocation scenarios in order to determine the optimal allocation scheme. Under limited resource allocation, this scheme enables users to maximize the performance of the application layer firewall.
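The layered Erlangian queuing analysis described above can be illustrated with a standard M/M/c (Erlang-C) calculation for a single layer. The arrival rate, service rate, and server counts below are illustrative assumptions, not values from the paper:

```python
import math

def erlang_c(c, rho):
    """Probability that an arriving request must wait in an M/M/c queue.

    c   -- number of service-desk resources (servers) allocated to a layer
    rho -- offered load in Erlangs (lambda / mu); requires rho < c
    """
    a = rho ** c / math.factorial(c)
    s = sum(rho ** k / math.factorial(k) for k in range(c))
    return a / (a + (1 - rho / c) * s)

def mean_delay(lam, mu, c):
    """Mean queuing delay (Erlang-C) for arrival rate lam, service rate mu."""
    rho = lam / mu
    return erlang_c(c, rho) / (c * mu - lam)

# Allocating more resources to a layer sharply reduces its mean delay:
d2 = mean_delay(lam=8.0, mu=5.0, c=2)
d4 = mean_delay(lam=8.0, mu=5.0, c=4)
```

Evaluating such per-layer delays under different server allocations is one simple way to compare resource allocation schemes by throughput and delay.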

  2. Environment Modeling Using Runtime Values for JPF-Android

    NASA Technical Reports Server (NTRS)

    van der Merwe, Heila; Tkachuk, Oksana; Nel, Seal; van der Merwe, Brink; Visser, Willem

    2015-01-01

    Software applications are developed to be executed in a specific environment. This environment includes external native libraries to add functionality to the application and drivers to fire the application execution. For testing and verification, the environment of an application is abstracted using simplified models or stubs. Empty stubs, returning default values, are simple to generate automatically, but they do not perform well when the application expects specific return values. Symbolic execution is used to find input parameters for drivers and return values for library stubs, but it struggles to detect the values of complex objects. In this work-in-progress paper, we explore an approach to generate drivers and stubs based on values collected during runtime instead of using default values. Entry-points and methods that need to be modeled are instrumented to log their parameters and return values. The instrumented applications are then executed using a driver and instrumented libraries. The values collected during runtime are used to generate driver and stub values on-the-fly that improve coverage during verification by enabling the execution of code that previously crashed or was missed. We are implementing this approach to improve the environment model of JPF-Android, our model checking and analysis tool for Android applications.

  3. Application of optimization technique for flood damage modeling in river system

    NASA Astrophysics Data System (ADS)

    Barman, Sangita Deb; Choudhury, Parthasarathi

    2018-04-01

    A river system is defined as a network of channels that drains different parts of a basin, uniting downstream to form a common outflow. Applying the various models found in the literature to a river system with multiple upstream inflows is not always straightforward and can involve a lengthy procedure; when data sets are unavailable, model calibration and application may become difficult. For a river system, flow modeling can be simplified to a large extent if the channel network is replaced by an equivalent single channel. The present work covers optimization model formulations based on equivalent flow and applies a mixed-integer-programming-based pre-emptive goal programming model to evaluate flood control alternatives for a real-life river system in India.

  4. Nanoparticle surface characterization and clustering through concentration-dependent surface adsorption modeling.

    PubMed

    Chen, Ran; Zhang, Yuntao; Sahneh, Faryad Darabi; Scoglio, Caterina M; Wohlleben, Wendel; Haase, Andrea; Monteiro-Riviere, Nancy A; Riviere, Jim E

    2014-09-23

    Quantitative characterization of nanoparticle interactions with their surrounding environment is vital for safe nanotechnological development and standardization. A recent quantitative measure, the biological surface adsorption index (BSAI), has demonstrated promising applications in nanomaterial surface characterization and biological/environmental prediction. This paper further advances the approach beyond the application of five descriptors in the original BSAI to address the concentration dependence of the descriptors, enabling better prediction of the adsorption profile and more accurate categorization of nanomaterials based on their surface properties. Statistical analysis on the obtained adsorption data was performed based on three different models: the original BSAI, a concentration-dependent polynomial model, and an infinite dilution model. These advancements in BSAI modeling showed a promising development in the application of quantitative predictive modeling in biological applications, nanomedicine, and environmental safety assessment of nanomaterials.

  5. Integrating in silico models to enhance predictivity for developmental toxicity.

    PubMed

    Marzo, Marco; Kulkarni, Sunil; Manganaro, Alberto; Roncaglioni, Alessandra; Wu, Shengde; Barton-Maclaren, Tara S; Lester, Cathy; Benfenati, Emilio

    2016-08-31

    Application of in silico models to predict developmental toxicity has demonstrated limited success particularly when employed as a single source of information. It is acknowledged that modelling the complex outcomes related to this endpoint is a challenge; however, such models have been developed and reported in the literature. The current study explored the possibility of integrating the selected public domain models (CAESAR, SARpy and P&G model) with the selected commercial modelling suites (Multicase, Leadscope and Derek Nexus) to assess if there is an increase in overall predictive performance. The results varied according to the data sets used to assess performance which improved upon model integration relative to individual models. Moreover, because different models are based on different specific developmental toxicity effects, integration of these models increased the applicable chemical and biological spaces. It is suggested that this approach reduces uncertainty associated with in silico predictions by achieving a consensus among a battery of models. The use of tools to assess the applicability domain also improves the interpretation of the predictions. This has been verified in the case of the software VEGA, which makes freely available QSAR models with a measurement of the applicability domain. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  6. Palm: Easing the Burden of Analytical Performance Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tallent, Nathan R.; Hoisie, Adolfy

    2014-06-01

    Analytical (predictive) application performance models are critical for diagnosing performance-limiting resources, optimizing systems, and designing machines. Creating models, however, is difficult because they must be both accurate and concise. To ease the burden of performance modeling, we developed Palm, a modeling tool that combines top-down (human-provided) semantic insight with bottom-up static and dynamic analysis. To express insight, Palm defines a source code modeling annotation language. By coordinating models and source code, Palm's models are `first-class' and reproducible. Unlike prior work, Palm formally links models, functions, and measurements. As a result, Palm (a) uses functions to either abstract or express complexity; (b) generates hierarchical models (representing an application's static and dynamic structure); and (c) automatically incorporates measurements to focus attention, represent constant behavior, and validate models. We discuss generating models for three different applications.

  7. Bayesian model reduction and empirical Bayes for group (DCM) studies.

    PubMed

    Friston, Karl J; Litvak, Vladimir; Oswal, Ashwini; Razi, Adeel; Stephan, Klaas E; van Wijk, Bernadette C M; Ziegler, Gabriel; Zeidman, Peter

    2016-03-01

    This technical note describes some Bayesian procedures for the analysis of group studies that use nonlinear models at the first (within-subject) level - e.g., dynamic causal models - and linear models at subsequent (between-subject) levels. Its focus is on using Bayesian model reduction to finesse the inversion of multiple models of a single dataset or a single (hierarchical or empirical Bayes) model of multiple datasets. These applications of Bayesian model reduction allow one to consider parametric random effects and make inferences about group effects very efficiently (in a few seconds). We provide the relatively straightforward theoretical background to these procedures and illustrate their application using a worked example. This example uses a simulated mismatch negativity study of schizophrenia. We illustrate the robustness of Bayesian model reduction to violations of the (commonly used) Laplace assumption in dynamic causal modelling and show how its recursive application can facilitate both classical and Bayesian inference about group differences. Finally, we consider the application of these empirical Bayesian procedures to classification and prediction. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  8. Advances and applications of occupancy models

    USGS Publications Warehouse

    Bailey, Larissa; MacKenzie, Darry I.; Nichols, James D.

    2013-01-01

    Summary: The past decade has seen an explosion in the development and application of models aimed at estimating species occurrence and occupancy dynamics while accounting for possible non-detection or species misidentification. We discuss some recent occupancy estimation methods and the biological systems that motivated their development. Collectively, these models offer tremendous flexibility, but simultaneously place added demands on the investigator. Unlike many mark–recapture scenarios, investigators utilizing occupancy models have the ability, and responsibility, to define their sample units (i.e. sites), replicate sampling occasions, time period over which species occurrence is assumed to be static and even the criteria that constitute ‘detection’ of a target species. Subsequent biological inference and interpretation of model parameters depend on these definitions and the ability to meet model assumptions. We demonstrate the relevance of these definitions by highlighting applications from a single biological system (an amphibian–pathogen system) and discuss situations where the use of occupancy models has been criticized. Finally, we use these applications to suggest future research and model development.
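The central device of these models, separating occupancy probability from detection probability so that non-detection is not mistaken for absence, can be sketched with the basic single-season occupancy likelihood. The parameter values below are illustrative, not from any cited study:

```python
import math

def site_likelihood(history, psi, p):
    """Likelihood of one site's detection history under the basic
    single-season occupancy model.

    history -- sequence of 0/1 detections over replicate survey occasions
    psi     -- probability that the site is occupied
    p       -- per-survey detection probability, given occupancy
    """
    k = len(history)
    d = sum(history)
    if d > 0:
        # At least one detection: the site is certainly occupied,
        # and only the detection outcomes are random.
        return psi * p ** d * (1 - p) ** (k - d)
    # No detections: either the site is unoccupied, or it is
    # occupied but the species was missed on every survey.
    return psi * (1 - p) ** k + (1 - psi)

def neg_log_likelihood(histories, psi, p):
    """Negative log-likelihood over all sites, to be minimized over (psi, p)."""
    return -sum(math.log(site_likelihood(h, psi, p)) for h in histories)
```

Minimizing `neg_log_likelihood` over (psi, p) yields the maximum-likelihood estimates; the definitions of "site" and "occasion" are left to the investigator, exactly as the abstract emphasizes.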

  9. An investigation of modelling and design for software service applications

    PubMed Central

    2017-01-01

    Software services offer the opportunity to use a component-based approach for the design of applications. However, this needs a deeper understanding of how to develop service-based applications in a systematic manner, and of the set of properties that need to be included in the ‘design model’. We have used a realistic application to explore systematically how service-based designs can be created and described. We first identified the key properties of an SOA (service oriented architecture) and then undertook a single-case case study to explore its use in the development of a design for a large-scale application in energy engineering, modelling this with existing notations wherever possible. We evaluated the resulting design model using two walkthroughs with both domain and application experts. We were able to successfully develop a design model around the ten properties identified, and to describe it by adapting existing design notations. A component-based approach to designing such systems does appear to be feasible. However, it needs the assistance of a more integrated set of notations for describing the resulting design model. PMID:28489905

  10. Fault Modeling of Extreme Scale Applications Using Machine Learning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vishnu, Abhinav; Dam, Hubertus van; Tallent, Nathan R.

    Faults are commonplace in large scale systems. These systems experience a variety of faults such as transient, permanent and intermittent. Multi-bit faults are typically not corrected by the hardware resulting in an error. Here, this paper attempts to answer an important question: Given a multi-bit fault in main memory, will it result in an application error — and hence a recovery algorithm should be invoked — or can it be safely ignored? We propose an application fault modeling methodology to answer this question. Given a fault signature (a set of attributes comprising of system and application state), we use machine learning to create a model which predicts whether a multibit permanent/transient main memory fault will likely result in error. We present the design elements such as the fault injection methodology for covering important data structures, the application and system attributes which should be used for learning the model, the supervised learning algorithms (and potentially ensembles), and important metrics. Lastly, we use three applications — NWChem, LULESH and SVM — as examples for demonstrating the effectiveness of the proposed fault modeling methodology.

  11. Fault Modeling of Extreme Scale Applications Using Machine Learning

    DOE PAGES

    Vishnu, Abhinav; Dam, Hubertus van; Tallent, Nathan R.; ...

    2016-05-01

    Faults are commonplace in large scale systems. These systems experience a variety of faults such as transient, permanent and intermittent. Multi-bit faults are typically not corrected by the hardware resulting in an error. Here, this paper attempts to answer an important question: Given a multi-bit fault in main memory, will it result in an application error — and hence a recovery algorithm should be invoked — or can it be safely ignored? We propose an application fault modeling methodology to answer this question. Given a fault signature (a set of attributes comprising of system and application state), we use machine learning to create a model which predicts whether a multibit permanent/transient main memory fault will likely result in error. We present the design elements such as the fault injection methodology for covering important data structures, the application and system attributes which should be used for learning the model, the supervised learning algorithms (and potentially ensembles), and important metrics. Lastly, we use three applications — NWChem, LULESH and SVM — as examples for demonstrating the effectiveness of the proposed fault modeling methodology.

  12. Review of Land Use Models: Theory and Application

    DOT National Transportation Integrated Search

    1997-01-01

    This paper discusses methodology in reviewing land use models and identifying desired attributes for recommending a model for application by the Delaware Valley Planning Commission (DVRPC). The need for land-use transportation interaction is explored...

  13. Theoretical models for application in school health education research.

    PubMed

    Parcel, G S

    1984-01-01

    Theoretical models that may be useful to research studies in school health education are reviewed. Selected, well-defined theories include social learning theory, problem-behavior theory, theory of reasoned action, communications theory, coping theory, social competence, and social and family theories. Also reviewed are multiple-theory models, including models of health-related behavior, the PRECEDE Framework, social-psychological approaches and the Activated Health Education Model. Two major reviews of teaching models are also discussed. The paper concludes with a brief outline of the general applications of theory to the field of school health education, including applications to basic research, development and design of interventions, program evaluation, and program utilization.

  14. A Component-based Programming Model for Composite, Distributed Applications

    NASA Technical Reports Server (NTRS)

    Eidson, Thomas M.; Bushnell, Dennis M. (Technical Monitor)

    2001-01-01

    The nature of scientific programming is evolving to larger, composite applications that are composed of smaller element applications. These composite applications are more frequently being targeted for distributed, heterogeneous networks of computers. They are most likely programmed by a group of developers. Software component technology and computational frameworks are being proposed and developed to meet the programming requirements of these new applications. Historically, programming systems have had a hard time being accepted by the scientific programming community. In this paper, a programming model is outlined that attempts to organize the software component concepts and fundamental programming entities into programming abstractions that will be better understood by the application developers. The programming model is designed to support computational frameworks that manage many of the tedious programming details, but also that allow sufficient programmer control to design an accurate, high-performance application.

  15. Hidden Markov Models and Neural Networks for Fault Detection in Dynamic Systems

    NASA Technical Reports Server (NTRS)

    Smyth, Padhraic

    1994-01-01

    None given. (From conclusion): Neural networks plus Hidden Markov Models (HMM) can provide excellent detection and false-alarm-rate performance in fault detection applications. Modified models allow for novelty detection. The paper also covers some key contributions of the neural network model and its application status.
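The HMM side of such a detector can be sketched with the standard forward-algorithm filter, which turns a stream of (e.g. neural-network-discretized) observations into a running posterior over hidden states such as "nominal" and "fault". All probabilities below are made-up illustrations, not values from the paper:

```python
def forward(obs, pi, A, B):
    """Forward-algorithm filter for a discrete HMM.

    obs -- sequence of observation symbols (integers)
    pi  -- initial state distribution, pi[i] = P(state i at t=0)
    A   -- transition matrix, A[i][j] = P(next state j | state i)
    B   -- emission matrix, B[i][o] = P(observation o | state i)
    Returns the normalized state distribution after the last observation.
    """
    n = len(pi)
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]
    for o in obs[1:]:
        alpha = [sum(alpha[i] * A[i][j] for i in range(n)) * B[j][o]
                 for j in range(n)]
    z = sum(alpha)
    return [a / z for a in alpha]

# Two hidden states: 0 = nominal, 1 = fault.  Sticky transitions model a
# persistent fault; repeated fault-like observations drive the posterior up.
posterior = forward(obs=[1, 1, 1],
                    pi=[0.99, 0.01],
                    A=[[0.95, 0.05], [0.10, 0.90]],
                    B=[[0.9, 0.1], [0.2, 0.8]])
```

Thresholding the fault-state posterior, rather than a single raw observation, is what gives the HMM its temporal smoothing and low false-alarm rate.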

  16. Supporting Collaborative Model and Data Service Development and Deployment with DevOps

    NASA Astrophysics Data System (ADS)

    David, O.

    2016-12-01

    Adopting DevOps practices for model service development and deployment enables a community to engage in service-oriented modeling and data management. The Cloud Services Integration Platform (CSIP), developed over the last 5 years at Colorado State University, provides for collaborative integration of environmental models into scalable model and data services as a micro-services platform with API and deployment infrastructure. Originally developed to support USDA natural resource applications, it proved suitable for a wider range of applications in the environmental modeling domain. As its scope and visibility grew, it became apparent that community integration and adequate workflow support through the full model development and application cycle drove successful outcomes. DevOps provide best practices, tools, and organizational structures to optimize the transition from model service development to deployment by minimizing (i) the operational burden and (ii) the turnaround time for modelers. We have developed and implemented a methodology to fully automate a suite of applications for application lifecycle management, version control, continuous integration, container management, and container scaling, enabling model and data service developers in various institutions to collaboratively build, run, deploy, test, and scale services within minutes. To date more than 160 model and data services are available for applications in hydrology (PRMS, Hydrotools, CFA, ESP), water and wind erosion prediction (WEPP, WEPS, RUSLE2), soil quality trends (SCI, STIR), water quality analysis (SWAT-CP, WQM, CFA, AgES-W), stream degradation assessment (SWAT-DEG), hydraulics (cross-section), and grazing management (GRAS). In addition, supporting data services include soil (SSURGO), ecological site (ESIS), climate (CLIGEN, WINDGEN), land management and crop rotations (LMOD), and pesticides (WQM), developed using this workflow automation and decentralized governance.

  17. Performance measurement and modeling of component applications in a high performance computing environment : a case study.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Armstrong, Robert C.; Ray, Jaideep; Malony, A.

    2003-11-01

    We present a case study of performance measurement and modeling of a CCA (Common Component Architecture) component-based application in a high performance computing environment. We explore issues peculiar to component-based HPC applications and propose a performance measurement infrastructure for HPC based loosely on recent work done for Grid environments. A prototypical implementation of the infrastructure is used to collect data for three components in a scientific application and construct performance models for two of them. Both computational and message-passing performance are addressed.

  18. A multiphase non-linear mixed effects model: An application to spirometry after lung transplantation.

    PubMed

    Rajeswaran, Jeevanantham; Blackstone, Eugene H

    2017-02-01

    In medical sciences, we often encounter longitudinal temporal relationships that are non-linear in nature. The influence of risk factors may also change across longitudinal follow-up. A system of multiphase non-linear mixed effects models is presented to model temporal patterns of longitudinal continuous measurements, with temporal decomposition to identify the phases and the risk factors within each phase. Application of this model is illustrated using spirometry data after lung transplantation, using readily available statistical software. This application illustrates the usefulness of our flexible model when dealing with complex non-linear patterns and time-varying coefficients.

  19. Domain and Specification Models for Software Engineering

    NASA Technical Reports Server (NTRS)

    Iscoe, Neil; Liu, Zheng-Yang; Feng, Guohui

    1992-01-01

    This paper discusses our approach to representing application domain knowledge for specific software engineering tasks. Application domain knowledge is embodied in a domain model. Domain models are used to assist in the creation of specification models. Although many different specification models can be created from any particular domain model, each specification model is consistent and correct with respect to the domain model. One aspect of the system-hierarchical organization is described in detail.

  20. GLOBAL REFERENCE ATMOSPHERIC MODELS FOR AEROASSIST APPLICATIONS

    NASA Technical Reports Server (NTRS)

    Duvall, Aleta; Justus, C. G.; Keller, Vernon W.

    2005-01-01

    Aeroassist is a broad category of advanced transportation technology encompassing aerocapture, aerobraking, aeroentry, precision landing, hazard detection and avoidance, and aerogravity assist. The eight destinations in the Solar System with sufficient atmosphere to enable aeroassist technology are Venus, Earth, Mars, Jupiter, Saturn, Uranus, Neptune, and Saturn's moon Titan. Engineering-level atmospheric models for five of these targets - Earth, Mars, Titan, Neptune, and Venus - have been developed at NASA's Marshall Space Flight Center. These models are useful as tools in mission planning and systems analysis studies associated with aeroassist applications. The series of models is collectively named the Global Reference Atmospheric Model or GRAM series. An important capability of all the models in the GRAM series is their ability to simulate quasi-random perturbations for Monte Carlo analysis in developing guidance, navigation and control algorithms, for aerothermal design, and for other applications sensitive to atmospheric variability. Recent example applications are discussed.

  1. Nurse practitioners in aged care: documentary analysis of successful project proposals.

    PubMed

    Clark, Shannon J; Parker, Rhian M; Davey, Rachel

    2014-11-01

    Meeting the primary health care needs of an aging population is an increasing challenge for many Western nations. In Australia, the federal government introduced a program to develop, test, and evaluate nurse practitioner models in aged care settings. In this article, we present a documentary analysis of 32 project proposals awarded funding under the Nurse Practitioner-Aged Care Models of Practice Program. Successfully funded models were diverse and were operated by a range of organizations across Australia. We identified three key priorities as underlying the proposed models: "The right care," "in the right place," and "at the right time." In this article, we explore how these priorities were presented by different applicants in different ways. Through the presentation of their models, the program's applicants identified and proposed to address current gaps in health services. Applicants contrasted their proposed models with available services to create persuasive and competitive applications for funding. © The Author(s) 2014.

  2. Praedicere Possumus: An Italian web-based application for predictive microbiology to ensure food safety.

    PubMed

    Polese, Pierluigi; Torre, Manuela Del; Stecchini, Mara Lucia

    2018-03-31

    The use of predictive modelling tools, which mainly describe the response of microorganisms to a particular set of environmental conditions, may contribute to a better understanding of microbial behaviour in foods. In this paper, a tertiary model, in the form of the readily available and user-friendly web-based application Praedicere Possumus (PP), is presented with research examples from our laboratories. Through the PP application, users have access to different modules, which apply a set of published models considered reliable for determining the compliance of a food product with EU safety criteria and for optimising processing through the identification of critical control points. The application pivots around a growth/no-growth boundary model, coupled with a growth model, and includes thermal and non-thermal inactivation models. Integrated functionalities, such as the fractional contribution of each inhibitory factor to growth probability (f) and the time evolution of the growth probability (Pt), have also been included. The PP application is expected to assist the food industry and food safety authorities in their common commitment towards the improvement of food safety.
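    The fractional-contribution idea above — each inhibitory factor multiplying down the growth rate — can be sketched with a common secondary model. This is not the PP implementation; it uses the Rosso cardinal parameter model as a stand-in, and the cardinal values and optimal growth rate below are illustrative, not taken from the paper.

```python
def gamma_cardinal(x, x_min, x_opt, x_max):
    """Rosso cardinal parameter model: a dimensionless factor in [0, 1]
    describing how far a condition is from the microbial optimum."""
    if x <= x_min or x >= x_max:
        return 0.0
    num = (x - x_max) * (x - x_min) ** 2
    den = (x_opt - x_min) * (
        (x_opt - x_min) * (x - x_opt) - (x_opt - x_max) * (x_opt + x_min - 2 * x)
    )
    return num / den

MU_OPT = 1.2  # illustrative optimal specific growth rate, 1/h

def growth_rate(temp_c, ph):
    """Gamma concept: inhibitory factors multiply, so each gamma value is the
    fractional contribution of that factor to slowing growth."""
    g_t = gamma_cardinal(temp_c, 4.0, 37.0, 45.0)   # hypothetical cardinal temperatures
    g_ph = gamma_cardinal(ph, 4.0, 7.0, 9.5)        # hypothetical cardinal pH values
    return MU_OPT * g_t * g_ph

mu_optimal = growth_rate(37.0, 7.0)   # at the optimum, every gamma equals 1
mu_chilled = growth_rate(8.0, 5.5)    # refrigeration plus acidity slow growth
```

    At the optimum the gammas collapse to 1 and the rate equals MU_OPT; away from it, each factor's gamma reports how much of the slowdown it is responsible for.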

  3. The application of CFD to the modelling of fires in complex geometries

    NASA Astrophysics Data System (ADS)

    Burns, A. D.; Clarke, D. S.; Guilbert, P.; Jones, I. P.; Simcox, S.; Wilkes, N. S.

    The application of Computational Fluid Dynamics (CFD) to industrial safety is a challenging activity. In particular it involves the interaction of several different physical processes, including turbulence, combustion, radiation, buoyancy, compressible flow and shock waves in complex three-dimensional geometries. In addition, there may be multi-phase effects arising, for example, from sprinkler systems for extinguishing fires. The FLOW3D software (1-3) from Computational Fluid Dynamics Services (CFDS) is in widespread use in industrial safety problems, both within AEA Technology, and also by CFDS's commercial customers, for example references (4-13). This paper discusses some other applications of FLOW3D to safety problems. These applications illustrate the coupling of the gas flows with radiation models and combustion models, particularly for complex geometries where simpler radiation models are not applicable.

  4. MRI segmentation by active contours model, 3D reconstruction, and visualization

    NASA Astrophysics Data System (ADS)

    Lopez-Hernandez, Juan M.; Velasquez-Aguilar, J. Guadalupe

    2005-02-01

    Advances in 3D data modelling methods are becoming increasingly popular in the areas of biology, chemistry and medical applications. The Nuclear Magnetic Resonance Imaging (NMRI) technique has progressed at a spectacular rate over the past few years, and its use has spread to many applications throughout the body in both anatomical and functional investigations. In this paper we present the application of Zernike polynomials to a 3D mesh model of the head, using contours acquired from cross-sectional slices by active contour model extraction, and we propose visualization with OpenGL 3D graphics of the 2D-3D (slice-surface) information for diagnostic aid in medical applications.

  5. A Systematic Review of Agent-Based Modelling and Simulation Applications in the Higher Education Domain

    ERIC Educational Resources Information Center

    Gu, X.; Blackmore, K. L.

    2015-01-01

    This paper presents the results of a systematic review of agent-based modelling and simulation (ABMS) applications in the higher education (HE) domain. Agent-based modelling is a "bottom-up" modelling paradigm in which system-level behaviour (macro) is modelled through the behaviour of individual local-level agent interactions (micro).…

  6. Uncertainty, ensembles and air quality dispersion modeling: applications and challenges

    NASA Astrophysics Data System (ADS)

    Dabberdt, Walter F.; Miller, Erik

    The past two decades have seen significant advances in mesoscale meteorological modeling research and applications, such as the development of sophisticated and now widely used advanced mesoscale prognostic models, large eddy simulation models, four-dimensional data assimilation, adjoint models, adaptive and targeted observational strategies, and ensemble and probabilistic forecasts. Some of these advances are now being applied to urban air quality modeling and applications. Looking forward, it is anticipated that the high-priority air quality issues for the near-to-intermediate future will likely include: (1) routine operational forecasting of adverse air quality episodes; (2) real-time high-level support to emergency response activities; and (3) quantification of model uncertainty. Special attention is focused here on the quantification of model uncertainty through the use of ensemble simulations. Application to emergency-response dispersion modeling is illustrated using an actual event that involved the accidental release of the toxic chemical oleum. Both surface footprints of mass concentration and the associated probability distributions at individual receptors are seen to provide valuable quantitative indicators of the range of expected concentrations and their associated uncertainty.
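    The ensemble-based quantification of uncertainty described above — percentile ranges and exceedance probabilities at individual receptors — can be sketched as follows. The ensemble values, receptor count, and threshold are purely illustrative stand-ins, not data from the oleum release case.

```python
import numpy as np

# Hypothetical ensemble: each row is one member's predicted surface
# concentration (arbitrary units) at a set of receptors.
rng = np.random.default_rng(42)
ensemble = rng.lognormal(mean=0.0, sigma=0.8, size=(50, 4))  # 50 members, 4 receptors

# Range of expected concentrations at each receptor
p10, p50, p90 = np.percentile(ensemble, [10, 50, 90], axis=0)

# Probability, across the ensemble, that each receptor exceeds a threshold
threshold = 2.0
exceed_prob = (ensemble > threshold).mean(axis=0)
```

    The p10–p90 band quantifies the range of expected concentrations, while the exceedance probability gives a single uncertainty-aware number per receptor for emergency-response decisions.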

  7. Mixed Membership Distributions with Applications to Modeling Multiple Strategy Usage

    ERIC Educational Resources Information Center

    Galyardt, April

    2012-01-01

    This dissertation examines two related questions. "How do mixed membership models work?" and "Can mixed membership be used to model how students use multiple strategies to solve problems?". Mixed membership models have been used in thousands of applications from text and image processing to genetic microarray analysis. Yet…

  8. Multilevel Modeling: A Review of Methodological Issues and Applications

    ERIC Educational Resources Information Center

    Dedrick, Robert F.; Ferron, John M.; Hess, Melinda R.; Hogarty, Kristine Y.; Kromrey, Jeffrey D.; Lang, Thomas R.; Niles, John D.; Lee, Reginald S.

    2009-01-01

    This study analyzed the reporting of multilevel modeling applications of a sample of 99 articles from 13 peer-reviewed journals in education and the social sciences. A checklist, derived from the methodological literature on multilevel modeling and focusing on the issues of model development and specification, data considerations, estimation, and…

  9. Watershed modeling applications in south Texas

    USGS Publications Warehouse

    Pedraza, Diana E.; Ockerman, Darwin J.

    2012-01-01

    This fact sheet presents an overview of six selected watershed modeling studies by the USGS and partners that address a variety of water-resource issues in south Texas. These studies provide examples of modeling applications and demonstrate the usefulness and versatility of watershed models in aiding the understanding of hydrologic systems.

  10. Integrating model behavior, optimization, and sensitivity/uncertainty analysis: overview and application of the MOUSE software toolbox

    USDA-ARS?s Scientific Manuscript database

    This paper provides an overview of the Model Optimization, Uncertainty, and SEnsitivity Analysis (MOUSE) software application, an open-source, Java-based toolbox of visual and numerical analysis components for the evaluation of environmental models. MOUSE is based on the OPTAS model calibration syst...

  11. Merging Applicability Domains for in Silico Assessment of Chemical Mutagenicity

    DTIC Science & Technology

    2014-02-04

    molecular fingerprints as descriptors for developing quantitative structure-activity relationship (QSAR) models and defining applicability domains with...used to define and quantify an applicability domain for either method. The importance of using applicability domains in QSAR modeling cannot be...domain from roughly 80% to 90%. These results indicated that the proposed QSAR protocol constituted a highly robust chemical mutagenicity prediction

  12. Differentiation and Exploration of Model MACP for HE VER 1.0 on Prototype Performance Measurement Application for Higher Education

    NASA Astrophysics Data System (ADS)

    El Akbar, R. Reza; Anshary, Muhammad Adi Khairul; Hariadi, Dennis

    2018-02-01

    Model MACP for HE Ver.1 is a model that describes how to perform measurement and monitoring of performance for higher education. Based on a review of the research related to the model, several components of the model remain to be developed in further research, so this research has four main objectives. The first objective is to differentiate the CSF (critical success factor) components in the previous model; the second is to explore the KPIs (key performance indicators) in the previous model; the third, building on the previous objectives, is to design a new and more detailed model. The fourth and final objective is to design a prototype application for performance measurement in higher education based on the new model. The methods used are exploratory research and application design using a prototype method. The results of this study are, first, a new and more detailed model for measurement and monitoring of performance in higher education, obtained by differentiation and exploration of the Model MACP for HE Ver.1. The second result is a dictionary of college performance measurement compiled by re-evaluating the existing indicators. The third result is the design of a prototype application for performance measurement in higher education.

  13. Evaluating the Power Consumption of Wireless Sensor Network Applications Using Models

    PubMed Central

    Dâmaso, Antônio; Freitas, Davi; Rosa, Nelson; Silva, Bruno; Maciel, Paulo

    2013-01-01

    Power consumption is the main concern in developing Wireless Sensor Network (WSN) applications. Consequently, several strategies have been proposed for investigating the power consumption of this kind of application. These strategies can help to predict the WSN lifetime, provide recommendations to application developers and may optimize the energy consumed by the WSN applications. While measurement is a known and precise strategy for power consumption evaluation, it is very costly, tedious and may be unfeasible considering the (usually) large number of WSN nodes. Furthermore, due to the inherent dynamism of WSNs, the instrumentation required by measurement techniques makes their use difficult in several different scenarios. In this context, this paper presents an approach for evaluating the power consumption of WSN applications by using simulation models, along with a set of tools to automate the proposed approach. Starting from a programming language code, we automatically generate consumption models used to predict the power consumption of WSN applications. In order to evaluate the proposed approach, we compare the results obtained by using the generated models against those obtained by measurement. PMID:23486217

  14. Evaluating the power consumption of wireless sensor network applications using models.

    PubMed

    Dâmaso, Antônio; Freitas, Davi; Rosa, Nelson; Silva, Bruno; Maciel, Paulo

    2013-03-13

    Power consumption is the main concern in developing Wireless Sensor Network (WSN) applications. Consequently, several strategies have been proposed for investigating the power consumption of this kind of application. These strategies can help to predict the WSN lifetime, provide recommendations to application developers and may optimize the energy consumed by the WSN applications. While measurement is a known and precise strategy for power consumption evaluation, it is very costly, tedious and may be unfeasible considering the (usually) large number of WSN nodes. Furthermore, due to the inherent dynamism of WSNs, the instrumentation required by measurement techniques makes their use difficult in several different scenarios. In this context, this paper presents an approach for evaluating the power consumption of WSN applications by using simulation models, along with a set of tools to automate the proposed approach. Starting from a programming language code, we automatically generate consumption models used to predict the power consumption of WSN applications. In order to evaluate the proposed approach, we compare the results obtained by using the generated models against those obtained by measurement.

  15. Artificial intelligence based models for stream-flow forecasting: 2000-2015

    NASA Astrophysics Data System (ADS)

    Yaseen, Zaher Mundher; El-shafie, Ahmed; Jaafar, Othman; Afan, Haitham Abdulmohsin; Sayl, Khamis Naba

    2015-11-01

    The use of Artificial Intelligence (AI) has increased since the middle of the 20th century, as seen in its application to a wide range of engineering and science problems. The last two decades, for example, have seen a dramatic increase in the development and application of various types of AI approaches for stream-flow forecasting. Generally speaking, AI has exhibited significant progress in forecasting and modeling non-linear hydrological applications and in capturing the noise complexity in the dataset. This paper explores the state-of-the-art application of AI in stream-flow forecasting, focusing on defining the data-driven methods of AI, the advantages of complementary models, the literature, and their possible future application in modeling and forecasting stream-flow. The review also identifies the major challenges and opportunities for prospective research, including a new scheme for modeling the inflow, a novel method for preprocessing time series frequency based on Fast Orthogonal Search (FOS) techniques, and Swarm Intelligence (SI) as an optimization approach.

  16. Developing a Fundamental Model for an Integrated GPS/INS State Estimation System with Kalman Filtering

    NASA Technical Reports Server (NTRS)

    Canfield, Stephen

    1999-01-01

    This work will demonstrate the integration of sensor and system dynamic data and their appropriate models using an optimal filter to create a robust, adaptable, easily reconfigurable state (motion) estimation system. This state estimation system will clearly show the application of fundamental modeling and filtering techniques. These techniques are presented at a general, first principles level, that can easily be adapted to specific applications. An example of such an application is demonstrated through the development of an integrated GPS/INS navigation system. This system acquires both global position data and inertial body data, to provide optimal estimates of current position and attitude states. The optimal states are estimated using a Kalman filter. The state estimation system will include appropriate error models for the measurement hardware. The results of this work will lead to the development of a "black-box" state estimation system that supplies current motion information (position and attitude states) that can be used to carry out guidance and control strategies. This black-box state estimation system is developed independent of the vehicle dynamics and therefore is directly applicable to a variety of vehicles. Issues in system modeling and application of Kalman filtering techniques are investigated and presented. These issues include linearized models of equations of state, models of the measurement sensors, and appropriate application and parameter setting (tuning) of the Kalman filter. The general model and subsequent algorithm are developed in Matlab for numerical testing. The results of this system are demonstrated through application to data from the X-33 Michael's 9A8 mission and are presented in plots and simple animations.
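    The predict/update cycle at the heart of a linear Kalman filter like the one the abstract describes can be sketched as follows. The constant-velocity state model and the noise matrices F, H, Q, R below are illustrative choices for a minimal 1D example, not the X-33 system's actual models.

```python
import numpy as np

def kalman_step(x, P, z, F, H, Q, R):
    """One predict/update cycle of a linear Kalman filter.
    x: state estimate, P: state covariance, z: measurement."""
    # Predict: propagate state and covariance through the dynamics model
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update: blend prediction and measurement via the Kalman gain
    S = H @ P_pred @ H.T + R              # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)   # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# Constant-velocity model: state = [position, velocity], only position measured
F = np.array([[1.0, 1.0], [0.0, 1.0]])
H = np.array([[1.0, 0.0]])
Q = np.eye(2) * 0.01    # process noise
R = np.array([[1.0]])   # measurement noise

x, P = np.array([0.0, 0.0]), np.eye(2) * 10.0
for z in [1.0, 2.0, 3.0, 4.0, 5.0]:     # measurements of a unit-velocity target
    x, P = kalman_step(x, P, np.array([z]), F, H, Q, R)
```

    After a few updates the filter infers the unmeasured velocity from the position sequence and the covariance shrinks, which is the tuning behavior the abstract refers to.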

  17. Construction of estimated flow- and load-duration curves for Kentucky using the Water Availability Tool for Environmental Resources (WATER)

    USGS Publications Warehouse

    Unthank, Michael D.; Newson, Jeremy K.; Williamson, Tanja N.; Nelson, Hugh L.

    2012-01-01

    Flow- and load-duration curves were constructed from the model outputs of the U.S. Geological Survey's Water Availability Tool for Environmental Resources (WATER) application for streams in Kentucky. The WATER application was designed to access multiple geospatial datasets to generate more than 60 years of statistically based streamflow data for Kentucky. The WATER application enables a user to graphically select a site on a stream and generate an estimated hydrograph and flow-duration curve for the watershed upstream of that point. The flow-duration curves are constructed by calculating the exceedance probability of the modeled daily streamflows. User-defined water-quality criteria and (or) sampling results can be loaded into the WATER application to construct load-duration curves that are based on the modeled streamflow results. Estimates of flow and streamflow statistics were derived from TOPographically Based Hydrological MODEL (TOPMODEL) simulations in the WATER application. A modified TOPMODEL code, SDP-TOPMODEL (Sinkhole Drainage Process-TOPMODEL), was used to simulate daily mean discharges over the period of record for 5 karst and 5 non-karst watersheds in Kentucky in order to verify the calibrated model. A statistical evaluation of the model's verification simulations shows that calibration criteria, established by previous WATER application reports, were met, thus ensuring the model's ability to provide acceptably accurate estimates of discharge at gaged and ungaged sites throughout Kentucky. Flow-duration curves are constructed in the WATER application by calculating the exceedance probability of the modeled daily flow values. The flow-duration intervals are expressed as a percentage, with zero corresponding to the highest stream discharge in the streamflow record. Load-duration curves are constructed by applying the loading equation (Load = Flow * Water-quality criterion) at each flow interval.
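    The duration-curve construction described above can be sketched directly: sort the modeled daily flows, assign each an exceedance percentage, and multiply by the criterion to get the allowable load. The daily flows and criterion value below are illustrative numbers, not Kentucky data, and the Weibull plotting position is one common convention.

```python
import numpy as np

def flow_duration_curve(daily_flows):
    """Return (exceedance_percent, flows): the percent of time each modeled
    daily flow is equaled or exceeded, flows sorted from highest to lowest."""
    flows = np.sort(np.asarray(daily_flows, dtype=float))[::-1]
    n = len(flows)
    # Weibull plotting position: rank / (n + 1), as a percentage; the lowest
    # percentage corresponds to the highest discharge in the record
    exceedance = 100.0 * np.arange(1, n + 1) / (n + 1)
    return exceedance, flows

def load_duration_curve(daily_flows, criterion):
    """Allowable load at each flow interval: Load = Flow * criterion."""
    exceedance, flows = flow_duration_curve(daily_flows)
    return exceedance, flows * criterion

# Illustrative daily mean discharges and water-quality criterion
flows = [120.0, 85.0, 60.0, 45.0, 30.0, 22.0, 15.0, 9.0, 5.0]
p, q = flow_duration_curve(flows)
p2, load = load_duration_curve(flows, criterion=0.5)
```

    Plotting observed loads from sampling results against this curve then shows, at a glance, under which flow conditions the water-quality criterion is exceeded.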

  18. Grid Application Meta-Repository System: Repository Interconnectivity and Cross-domain Application Usage in Distributed Computing Environments

    NASA Astrophysics Data System (ADS)

    Tudose, Alexandru; Terstyansky, Gabor; Kacsuk, Peter; Winter, Stephen

    Grid Application Repositories vary greatly in terms of access interface, security system, implementation technology, communication protocols and repository model. This diversity has become a significant limitation in terms of interoperability and inter-repository access. This paper presents the Grid Application Meta-Repository System (GAMRS) as a solution that offers better options for the management of Grid applications. GAMRS proposes a generic repository architecture, which allows any Grid Application Repository (GAR) to be connected to the system independent of their underlying technology. It also presents applications in a uniform manner and makes applications from all connected repositories visible to web search engines, OGSI/WSRF Grid Services and other OAI (Open Archive Initiative)-compliant repositories. GAMRS can also function as a repository in its own right and can store applications under a new repository model. With the help of this model, applications can be presented as embedded in virtual machines (VM) and therefore they can be run in their native environments and can easily be deployed on virtualized infrastructures allowing interoperability with new generation technologies such as cloud computing, application-on-demand, automatic service/application deployments and automatic VM generation.

  19. 40 CFR 89.210 - Maintenance of records.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... limits (FEL); (3) Power rating for each configuration tested; (4) Projected applicable production/sales volume for the model year; and (5) Actual applicable production/sales volume for the model year. (c) Any... actual quarterly and cumulative applicable production/sales volume; (3) The values required to calculate...

  20. 40 CFR 600.501-12 - General applicability.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Year 1978 Passenger Automobiles and for 1979 and Later Model Year Automobiles (Light Trucks and Passenger Automobiles)-Procedures for Determining Manufacturer's Average Fuel Economy § 600.501-12 General applicability. The provisions of this subpart are applicable to 2012 and later model year passenger automobiles...

  1. BUILDING MODEL ANALYSIS APPLICATIONS WITH THE JOINT UNIVERSAL PARAMETER IDENTIFICATION AND EVALUATION OF RELIABILITY (JUPITER) API

    EPA Science Inventory

    The open-source, public domain JUPITER (Joint Universal Parameter IdenTification and Evaluation of Reliability) API (Application Programming Interface) provides conventions and Fortran-90 modules to develop applications (computer programs) for analyzing process models. The input ...

  2. Maritime Platform Sleep and Performance Study: Evaluating the SAFTE Model for Maritime Workplace Application

    DTIC Science & Technology

    2012-06-01

    Master's thesis (June 2012) by Stephanie A. T. Brown, evaluating the SAFTE model for maritime workplace application.

  3. A Ceramic Fracture Model for High Velocity Impact

    DTIC Science & Technology

    1993-05-01

    employ damage concepts appear more relevant than crack growth models for this application. This research adopts existing fracture model concepts and...extends them through applications in an existing finite element continuum mechanics code (hydrocode) to the prediction of the damage and fracture processes...to be accurate in the lower velocity range of this work. Mescall and Tracy [15] investigated the selection of ceramic material for application in armors

  4. Developing a java android application of KMV-Merton default rate model

    NASA Astrophysics Data System (ADS)

    Yusof, Norliza Muhamad; Anuar, Aini Hayati; Isa, Norsyaheeda Natasha; Zulkafli, Sharifah Nursyuhada Syed; Sapini, Muhamad Luqman

    2017-11-01

    This paper presents a java android application developed for the KMV-Merton model in predicting the default rate of a firm. Predicting the default rate is essential in the risk management area, as default risk can be immediately transmitted from one entity to another; this is the reason default risk is known as a global risk. Although there are several efforts, instruments and methods used to manage the risk, they are said to be insufficient. To the best of our knowledge, there has been limited innovation in developing the default risk mathematical model into a mobile application. Therefore, through this study, default risk is predicted quantitatively using the KMV-Merton model. The KMV-Merton model has been integrated in the form of a java program using the Android Studio software. The developed java android application is tested by predicting the levels of default risk of three different rated companies. It is found that the levels of default risk are equivalent to the ratings of the respective companies. This shows that the default rate predicted by the KMV-Merton model using the developed java android application can be a significant tool in the risk management field. The developed java android application gives users an alternative way to predict the level of default risk with fewer procedures.
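    The core calculation behind the KMV-Merton model is the Merton distance to default. A minimal sketch is shown below (in Python rather than the paper's java); it assumes the firm's asset value and asset volatility are already known, whereas the full KMV-Merton procedure backs them out from equity data, and all input numbers are illustrative.

```python
from math import log, sqrt
from statistics import NormalDist

def distance_to_default(asset_value, debt, mu, sigma, horizon=1.0):
    """Merton distance to default:
    DD = [ln(V/D) + (mu - sigma^2/2) T] / (sigma sqrt(T))."""
    return (log(asset_value / debt) + (mu - 0.5 * sigma**2) * horizon) / (
        sigma * sqrt(horizon)
    )

def default_probability(asset_value, debt, mu, sigma, horizon=1.0):
    """Theoretical default probability under the Merton model: N(-DD)."""
    dd = distance_to_default(asset_value, debt, mu, sigma, horizon)
    return NormalDist().cdf(-dd)

# A highly leveraged firm should show a higher default probability
p_safe = default_probability(asset_value=150.0, debt=60.0, mu=0.05, sigma=0.25)
p_risky = default_probability(asset_value=150.0, debt=140.0, mu=0.05, sigma=0.25)
```

    Ranking firms by this probability is what lets the application compare its predictions against the companies' credit ratings.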

  5. Snow Micro-Structure Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Micah Johnson, Andrew Slaughter

    PIKA is a MOOSE-based application for modeling the micro-structure evolution of seasonal snow. The model will be useful for environmental, atmospheric, and climate scientists. Possible applications include energy balance models, ice sheet modeling, and avalanche forecasting. The model implements physics from published, peer-reviewed articles. The main purpose is to foster university and laboratory collaboration to build a larger multi-scale snow model using MOOSE. The main feature of the code is that it is implemented using the MOOSE framework, thus making features such as multiphysics coupling, adaptive mesh refinement, and parallel scalability native to the application. PIKA implements three equations: the phase-field equation for tracking the evolution of the ice-air interface within seasonal snow at the grain scale; the heat equation for computing the temperature of both the ice and air within the snow; and the mass transport equation for monitoring the diffusion of water vapor in the pore space of the snow.
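    In generic textbook form (an illustrative sketch, not necessarily the exact formulation PIKA implements), the three coupled equations read, with φ an ice-air phase indicator, T the temperature, and ρ_v the pore-space vapor density:

```latex
% Phase-field (Allen-Cahn type) evolution of the ice-air interface;
% \lambda\, s(T, \rho_v) stands for a generic coupling to temperature and vapor
\tau \frac{\partial \phi}{\partial t} = W^2 \nabla^2 \phi - f'(\phi) + \lambda\, s(T, \rho_v)

% Heat conduction through ice and pore air
\rho c_p \frac{\partial T}{\partial t} = \nabla \cdot \left( k \nabla T \right)

% Vapor diffusion in the pore space, with \dot{m} a sublimation/deposition source
\frac{\partial \rho_v}{\partial t} = \nabla \cdot \left( D \nabla \rho_v \right) + \dot{m}
```

    Solving these three equations on one mesh is exactly the kind of multiphysics coupling the MOOSE framework provides natively.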

  6. Operational Monitoring of Data Production at KNMI

    NASA Astrophysics Data System (ADS)

    van de Vegte, John; Kwidama, Anecita; van Moosel, Wim; Oosterhof, Rijk; de Wit, Ronny; Klein Ikkink, Henk Jan; Som de Cerff, Wim; Verhoef, Hans; Koutek, Michal; Duin, Frank; van der Neut, Ian; Verhagen, Robert; Wollerich, Rene

    2016-04-01

    Within KNMI a new fully automated system for monitoring the KNMI operational data production systems is being developed: PRISMA (PRocessflow Infrastructure Surveillance and Monitoring Application). Currently the KNMI operational (24/7) production systems consist of over 60 applications, running on different hardware systems and platforms. They are interlinked for the production of numerous data products, which are delivered to internal and external customers. Traditionally these applications are individually monitored by different applications or not at all, complicating root cause and impact analysis. Also, the underlying hardware and network is monitored via an isolated application. The goal of the PRISMA system is to enable production chain monitoring, which supports root cause analysis (what is the root cause of the disruption?) and impact analysis (what downstream products/customers will be affected?). The PRISMA system will make it possible to reduce existing monitoring applications and provides one interface for monitoring the data production. For modeling and storing the state of the production chains a graph database is used. The model is automatically updated by the applications and systems which are to be monitored. The graph model enables root cause and impact analysis. In the PRISMA web interface, interaction with the graph model is accomplished via a graphical representation. The presentation will focus on aspects of: • Modeling real world computers, applications, and products into a conceptual model; • Architecture of the system; • Configuration information and (real world) event handling of the objects to be monitored; • Implementation rules for root cause and impact analysis; • How PRISMA was developed (methodology, facts, results); • Presentation of the PRISMA system as it now looks and works.
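    On a production-chain graph, impact analysis is a reachability query downstream of a failing node and root-cause analysis is the same query on the reversed edges. A minimal sketch follows; the node names and chain topology are hypothetical, not KNMI's actual production systems, and a plain dict stands in for the graph database.

```python
# Hypothetical production chain: edges point from producer to consumer
chain = {
    "radar_ingest": ["precip_product"],
    "model_run": ["precip_product"],
    "precip_product": ["public_feed", "archive"],
    "public_feed": [],
    "archive": [],
}

def downstream(graph, node):
    """Impact analysis: every product/customer reachable from a failing node."""
    seen, stack = set(), [node]
    while stack:
        for nxt in graph[stack.pop()]:
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return seen

def upstream(graph, node):
    """Root-cause analysis: every node that feeds into a disrupted product."""
    reverse = {n: [] for n in graph}
    for src, dsts in graph.items():
        for dst in dsts:
            reverse[dst].append(src)
    return downstream(reverse, node)
```

    For example, a failure in `radar_ingest` impacts `precip_product` and everything beyond it, while a disrupted `precip_product` has both `radar_ingest` and `model_run` as candidate root causes.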

  7. Computational and mathematical methods in brain atlasing.

    PubMed

    Nowinski, Wieslaw L

    2017-12-01

    Brain atlases have a wide range of use from education to research to clinical applications. Mathematical methods as well as computational methods and tools play a major role in the process of brain atlas building and developing atlas-based applications. Computational methods and tools cover three areas: dedicated editors for brain model creation, brain navigators supporting multiple platforms, and atlas-assisted specific applications. Mathematical methods in atlas building and developing atlas-aided applications deal with problems in image segmentation, geometric body modelling, physical modelling, atlas-to-scan registration, visualisation, interaction and virtual reality. Here I overview computational and mathematical methods in atlas building and developing atlas-assisted applications, and share my contribution to and experience in this field.

  8. [Analysis of the model OPM3® application and results for health area].

    PubMed

    Augusto Dos Santos, Luis; de Fátima Marin, Heimar

    2011-01-01

    This research sought to analyze whether a questionnaire model created by an international project management community is applicable to health organizations. The model OPM3® (Organizational Project Management Maturity Model) was created so that organizations of any area or size can identify the presence or absence of good management practices. The aim of applying this model is always to evaluate the organization, not the interviewee. In this paper, we present the results of employing this model in an organization that provides information technology products and services applied to the health area. This study verified that the model is rapidly applicable and that the analyzed organization has an expressive number of good practices.

  9. Village power options

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lilienthal, P.

    1997-12-01

    This paper describes three different computer codes which have been written to model village power applications. The reasons which have driven the development of these codes include: the existence of only limited field data; diverse applications can be modeled; models allow cost and performance comparisons; simulations generate insights into cost structures. The models which are discussed are: Hybrid2, a public code which provides detailed engineering simulations to analyze the performance of a particular configuration; HOMER - the hybrid optimization model for electric renewables - which provides economic screening for sensitivity analyses; and VIPOR - the village power model - a network optimization model for comparing mini-grids to individual systems. Examples of the output of these codes are presented for specific applications.

  10. BioMOL: a computer-assisted biological modeling tool for complex chemical mixtures and biological processes at the molecular level.

    PubMed Central

    Klein, Michael T; Hou, Gang; Quann, Richard J; Wei, Wei; Liao, Kai H; Yang, Raymond S H; Campain, Julie A; Mazurek, Monica A; Broadbelt, Linda J

    2002-01-01

    A chemical engineering approach for the rigorous construction, solution, and optimization of detailed kinetic models for biological processes is described. This modeling capability addresses the required technical components of detailed kinetic modeling, namely, the modeling of reactant structure and composition, the building of the reaction network, the organization of model parameters, the solution of the kinetic model, and the optimization of the model. Even though this modeling approach has enjoyed successful application in the petroleum industry, its application to biomedical research has just begun. We propose to expand the horizons on classic pharmacokinetics and physiologically based pharmacokinetics (PBPK), where human or animal bodies were often described by a few compartments, by integrating PBPK with reaction network modeling described in this article. If one draws a parallel between an oil refinery, where the application of this modeling approach has been very successful, and a human body, the individual processing units in the oil refinery may be considered equivalent to the vital organs of the human body. Even though the cell or organ may be much more complicated, the complex biochemical reaction networks in each organ may be similarly modeled and linked in much the same way as the modeling of the entire oil refinery through linkage of the individual processing units. The integrated chemical engineering software package described in this article, BioMOL, denotes the biological application of molecular-oriented lumping. BioMOL can build a detailed model in 1-1,000 CPU sec using standard desktop hardware. The models solve and optimize using standard and widely available hardware and software and can be presented in the context of a user-friendly interface. We believe this is an engineering tool with great promise in its application to complex biological reaction networks. PMID:12634134
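    As an illustration of the kind of reaction-network kinetics that BioMOL constructs and solves at far larger scale, here is a minimal hand-built sketch. The two-step network A → B → C and its rate constants are invented for illustration and have nothing to do with BioMOL's actual model-building machinery.

```python
def simulate(k1, k2, a0=1.0, dt=1e-3, t_end=10.0):
    """Forward-Euler integration of the two-step network A -> B -> C
    with first-order rate constants k1 and k2."""
    a, b, c = a0, 0.0, 0.0
    t = 0.0
    while t < t_end:
        da = -k1 * a          # A is consumed
        db = k1 * a - k2 * b  # B is formed from A, consumed to C
        dc = k2 * b           # C accumulates
        a += da * dt
        b += db * dt
        c += dc * dt
        t += dt
    return a, b, c

a, b, c = simulate(k1=1.0, k2=0.5)
```

    The appeal of automated network construction is precisely that hand-writing such rate expressions becomes intractable once the network holds thousands of species, which is the regime the article addresses.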

  11. Analytical Performance Modeling and Validation of Intel’s Xeon Phi Architecture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chunduri, Sudheer; Balaprakash, Prasanna; Morozov, Vitali

    Modeling the performance of scientific applications on emerging hardware plays a central role in achieving extreme-scale computing goals. Analytical models that capture the interaction between applications and hardware characteristics are attractive because even a reasonably accurate model can be useful for performance tuning before the hardware is made available. In this paper, we develop a hardware model for Intel’s second-generation Xeon Phi architecture code-named Knights Landing (KNL) for the SKOPE framework. We validate the KNL hardware model by projecting the performance of mini-benchmarks and application kernels. The results show that our KNL model can project the performance with prediction errors of 10% to 20%. The hardware model also provides informative recommendations for code transformations and tuning.
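    The simplest analytical projection of this kind is a roofline-style bound: a kernel's time is limited by either compute throughput or memory traffic, whichever is slower. The sketch below captures that flavor only; the hardware numbers are illustrative assumptions, not KNL's actual specifications, and this is not the SKOPE model itself.

```python
def roofline_time(flops, bytes_moved, peak_flops, peak_bandwidth):
    """Analytical lower bound on kernel time: the kernel is bound by
    either compute (flops / peak_flops) or memory (bytes / bandwidth)."""
    return max(flops / peak_flops, bytes_moved / peak_bandwidth)

# Illustrative machine: 3 TFLOP/s peak compute, 400 GB/s memory bandwidth.
t = roofline_time(flops=6.0e9, bytes_moved=4.8e9,
                  peak_flops=3.0e12, peak_bandwidth=4.0e11)
# Here the memory term (4.8e9 / 4.0e11 = 0.012 s) dominates the
# compute term (6.0e9 / 3.0e12 = 0.002 s), so the kernel is memory-bound.
```

    Models like SKOPE's refine this picture with cache, vectorization, and instruction-mix effects, which is what narrows prediction error to the 10-20% range reported above.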

  12. Applicability of land use models for the Houston area test site

    NASA Technical Reports Server (NTRS)

    Petersburg, R. K.; Bradford, L. H.

    1973-01-01

    Descriptions of land use models are presented which were considered for their applicability to the Houston Area Test Site. These models are representative both of the prevailing theories of land use dynamics and of basic approaches to simulation. The models considered are: a model of metropolis, land use simulation model, EMPIRIC land use forecasting model, a probabilistic model for residential growth, and the regional environmental management allocation process. Sources of environmental/resource information are listed.

  13. Outcome of a Workshop on Applications of Protein Models in Biomedical Research

    PubMed Central

    Schwede, Torsten; Sali, Andrej; Honig, Barry; Levitt, Michael; Berman, Helen M.; Jones, David; Brenner, Steven E.; Burley, Stephen K.; Das, Rhiju; Dokholyan, Nikolay V.; Dunbrack, Roland L.; Fidelis, Krzysztof; Fiser, Andras; Godzik, Adam; Huang, Yuanpeng Janet; Humblet, Christine; Jacobson, Matthew P.; Joachimiak, Andrzej; Krystek, Stanley R.; Kortemme, Tanja; Kryshtafovych, Andriy; Montelione, Gaetano T.; Moult, John; Murray, Diana; Sanchez, Roberto; Sosnick, Tobin R.; Standley, Daron M.; Stouch, Terry; Vajda, Sandor; Vasquez, Max; Westbrook, John D.; Wilson, Ian A.

    2009-01-01

    We describe the proceedings and conclusions from a “Workshop on Applications of Protein Models in Biomedical Research” that was held at University of California at San Francisco on 11 and 12 July, 2008. At the workshop, international scientists involved with structure modeling explored (i) how models are currently used in biomedical research, (ii) what the requirements and challenges for different applications are, and (iii) how the interaction between the computational and experimental research communities could be strengthened to advance the field. PMID:19217386

  14. Knowledge sifters in MDA technologies

    NASA Astrophysics Data System (ADS)

    Kravchenko, Yuri; Kursitys, Ilona; Bova, Victoria

    2018-05-01

    The article considers a new approach to efficient management of information processes on the basis of object models. With the help of special design tools, a generic, platform-independent application model is created, and then the program is implemented in a specific development environment. At the same time, the development process is based entirely on a model that must contain all the information necessary for programming. The presence of a detailed model enables the automatic creation of the typical parts of the application whose development is amenable to automation.

  15. Application of the GERTS II simulator in the industrial environment.

    NASA Technical Reports Server (NTRS)

    Whitehouse, G. E.; Klein, K. I.

    1971-01-01

    GERT was originally developed to aid in the analysis of stochastic networks. GERT can be used to graphically model and analyze complex systems. Recently a simulation model, GERTS II, has been developed to solve GERT networks. The simulation language used in the development of this model was GASP II A. This paper discusses the possible application of GERTS II to model and analyze (1) assembly line operations, (2) project management networks, (3) conveyor systems and (4) inventory systems. Finally, an actual application dealing with a job shop loading problem is presented.
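    Solving a GERT network by simulation means repeatedly sampling activity durations and probabilistic branches, then averaging. A toy sketch of that idea follows; the two activities, the single rework branch, and all durations and probabilities are invented for illustration and are not from the paper.

```python
import random

def simulate_once(rng):
    """One pass through a toy stochastic network: activity A, a
    probabilistic GERT-style branch (rework taken with p = 0.3,
    at most once here), then activity B."""
    t = rng.uniform(2, 4)        # activity A duration
    if rng.random() < 0.3:       # probabilistic branch: rework loop
        t += rng.uniform(1, 2)
    t += rng.uniform(3, 5)       # activity B duration
    return t

rng = random.Random(42)
n = 20000
mean_time = sum(simulate_once(rng) for _ in range(n)) / n
# Analytically the mean is 3 + 0.3 * 1.5 + 4 = 7.45 time units.
```

    GERTS II plays this role for real networks: it estimates completion-time distributions that closed-form GERT analysis cannot always provide.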

  16. Computer modeling and simulation of human movement. Applications in sport and rehabilitation.

    PubMed

    Neptune, R R

    2000-05-01

    Computer modeling and simulation of human movement plays an increasingly important role in sport and rehabilitation, with applications ranging from sport equipment design to understanding pathologic gait. The complex dynamic interactions within the musculoskeletal and neuromuscular systems make analyzing human movement with existing experimental techniques difficult but computer modeling and simulation allows for the identification of these complex interactions and causal relationships between input and output variables. This article provides an overview of computer modeling and simulation and presents an example application in the field of rehabilitation.

  17. Development and application of a 3-D geometry/mass model for LDEF satellite ionizing radiation assessments

    NASA Technical Reports Server (NTRS)

    Colborn, B. L.; Armstrong, T. W.

    1993-01-01

    A three-dimensional geometry and mass model of the Long Duration Exposure Facility (LDEF) spacecraft and experiment trays was developed for use in predictions and data interpretation related to ionizing radiation measurements. The modeling approach, level of detail incorporated, example models for specific experiments and radiation dosimeters, and example applications of the model are described.

  18. Application of ''Earl's Assessment "as", Assessment "for", and Assessment "of" Learning Model'' with Orthopaedic Assessment Clinical Competence

    ERIC Educational Resources Information Center

    Lafave, Mark R.; Katz, Larry; Vaughn, Norman

    2013-01-01

    Context: In order to study the efficacy of assessment methods, a theoretical framework of Earl's model of assessment was introduced. Objective: (1) Introduce the predictive learning assessment model (PLAM) as an application of Earl's model of learning; (2) test Earl's model of learning through the use of the Standardized Orthopedic Assessment Tool…

  19. A connectionist model for dynamic control

    NASA Technical Reports Server (NTRS)

    Whitfield, Kevin C.; Goodall, Sharon M.; Reggia, James A.

    1989-01-01

    The application of a connectionist modeling method known as competition-based spreading activation to a camera tracking task is described. The potential is explored for automation of control and planning applications using connectionist technology. The emphasis is on applications suitable for use in the NASA Space Station and in related space activities. The results are quite general and could be applicable to other control systems.

  20. Basic Research on Adaptive Model Algorithmic Control

    DTIC Science & Technology

    1985-12-01

    Control Conference. Richalet, J., A. Rault, J. L. Testud and J. Papon (1978). Model predictive heuristic control: applications to industrial processes, pp. 977-982.

  1. Application of First Principles Ni-Cd and Ni-H2 Battery Models to Spacecraft Operations

    NASA Technical Reports Server (NTRS)

    Timmerman, Paul; Bugga, Ratnakumar; DiStefano, Salvador

    1997-01-01

    The conclusions of the application of first-principles models to spacecraft operations are that the bi-phasic electrode model presented provides an explanation for many behaviors, such as voltage fading during LEO cycling.

  2. Usefulness of Neuro-Fuzzy Models' Application for Tobacco Control

    NASA Astrophysics Data System (ADS)

    Petrovic-Lazarevic, Sonja; Zhang, Jian Ying

    2007-12-01

    The paper presents neuro-fuzzy models' applications appropriate for tobacco control: the fuzzy control model, Adaptive Network Based Fuzzy Inference System, Evolving Fuzzy Neural Network models, and EVOlving POLicies. We further propose the use of Fuzzy Causal Networks to help tobacco control decision makers develop policies and measure their impact on social regulation.

  3. Econometric Models of Education, Some Applications. Education and Development, Technical Reports.

    ERIC Educational Resources Information Center

    Tinbergen, Jan; And Others

    This report contains five papers which describe mathematical models of the educational system as it relates to economic growth. Experimental applications of the models to particular educational systems are discussed. Three papers, by L. J. Emmerij, J. Blum, and G. Williams, discuss planning models for the calculation of educational requirements…

  4. Modeling Students' Memory for Application in Adaptive Educational Systems

    ERIC Educational Resources Information Center

    Pelánek, Radek

    2015-01-01

    Human memory has been thoroughly studied and modeled in psychology, but mainly in laboratory settings under simplified conditions. For application in practical adaptive educational systems we need simple and robust models which can cope with aspects like varied prior knowledge or multiple-choice questions. We discuss and evaluate several models of…
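    A common starting point for such memory models is an exponential forgetting curve whose memory strength grows with practice. The sketch below is a generic textbook-style illustration under that assumption; the multiplicative strength update is an invented example, not the model evaluated in this paper.

```python
import math

def recall_probability(days_since_practice, strength):
    """Exponential forgetting curve: P(recall) = exp(-t / S),
    where S is the current memory strength (in days)."""
    return math.exp(-days_since_practice / strength)

def practice(strength, gain=2.0):
    """Illustrative update rule: each successful practice
    multiplies memory strength, flattening the forgetting curve."""
    return strength * gain

s = 1.0
p_before = recall_probability(1.0, s)  # recall chance one day later
s = practice(s)
p_after = recall_probability(1.0, s)   # higher after practice
```

    Practical adaptive systems extend this core with per-student priors and guessing corrections for multiple-choice items, the complications the abstract highlights.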

  5. Crop model application to soybean irrigation management in the mid-south USA

    USDA-ARS?s Scientific Manuscript database

    Since the mid-1990s, there has been rapid development and application of crop growth models such as APEX (the Agricultural Policy/Environmental eXtender) and RZWQM2 (Root Zone Water Quality Model). Such process-oriented models have been designed to study the interactions of genotypes, weather, soil, ...

  6. The Utility of IRT in Small-Sample Testing Applications.

    ERIC Educational Resources Information Center

    Sireci, Stephen G.

    The utility of modified item response theory (IRT) models in small sample testing applications was studied. The modified IRT models were modifications of the one- and two-parameter logistic models. One-, two-, and three-parameter models were also studied. Test data were from 4 years of a national certification examination for persons desiring…
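    The one- and two-parameter logistic models mentioned above have a compact closed form. A minimal sketch of the 2PL item response function follows; the parameter values are illustrative, not estimates from the certification examination studied.

```python
import math

def p_correct_2pl(theta, a, b):
    """2PL IRT model: probability that an examinee with ability theta
    answers correctly an item with discrimination a and difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# At theta == b the probability is exactly 0.5.
# Fixing a = 1 for all items reduces this to the one-parameter (Rasch) model;
# the 3PL adds a lower asymptote c for guessing.
p = p_correct_2pl(theta=1.0, a=1.2, b=0.5)
```

    The small-sample difficulty the abstract addresses is that estimating a separate a (and c) per item demands far more examinees than estimating b alone, which motivates the modified, constrained models.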

  7. Identifying influences on model uncertainty: an application using a forest carbon budget model

    Treesearch

    James E. Smith; Linda S. Heath

    2001-01-01

    Uncertainty is an important consideration for both developers and users of environmental simulation models. Establishing quantitative estimates of uncertainty for deterministic models can be difficult when the underlying bases for such information are scarce. We demonstrate an application of probabilistic uncertainty analysis that provides for refinements in...

  8. Development and application of damage assessment modeling: example assessment for the North Cape oil spill.

    PubMed

    McCay, Deborah French

    2003-01-01

    Natural resource damage assessment (NRDA) models for oil spills have been under development since 1984. Generally applicable (simplified) versions with built-in data sets are included in US government regulations for NRDAs in US waters. The most recent version of these models is SIMAP (Spill Impact Model Application Package), which contains oil fates and effects models that may be applied to any spill event and location in marine or freshwater environments. It is often not cost-effective or even possible to quantify spill impacts using field data collections. Modeling allows quantification of spill impacts using as much site-specific data as available, either as input or as validation of model results. SIMAP was used for the North Cape oil spill in Rhode Island (USA) in January 1996, for injury quantification in the first and largest NRDA case to be performed under the 1996 Oil Pollution Act NRDA regulations. The case was successfully settled in 1999. This paper, which contains a description of the model and application to the North Cape spill, delineates and demonstrates the approach.

  9. Counseling Model Application: A Student Career Development Guidance for Decision Maker and Consultation

    NASA Astrophysics Data System (ADS)

    Irwan; Gustientiedina; Sunarti; Desnelita, Yenny

    2017-12-01

    The purpose of this study is to design a counseling model application for a decision-making and consultation system. The application serves as an alternative guidance and individual career development tool for students, covering career knowledge, planning and alternative options; it is an expert tool based on knowledge and rules that provides solutions for students' career decisions. This research produces a counseling model application that obtains important information about student career development and facilitates individual student development through its service form, connecting each student's plan with a career according to their talent, interest, ability, knowledge, personality and other supporting factors. This application model can be used as a tool to get information faster and more flexibly for student guidance and counseling, helping students make selections and decisions appropriate to their choice of work.

  10. Computational Flow Analysis of Ultra High Pressure Firefighting Technology with Application to Long Range Nozzle Design

    DTIC Science & Technology

    2010-03-01

    release; distribution unlimited. Ref AFRL/RXQ Public Affairs Case # 10-100. Document contains color images. Although aqueous fire fighting agent...in conjunction with the standard Eulerian multiphase flow model. The two-equation k-ε model was selected due to its wide industrial application in...energy (k) and its dissipation rate (ε). Because of their heuristic development, RANS models have applicable limitations and in general must be

  11. An ocean scatter propagation model for aeronautical satellite communication applications

    NASA Technical Reports Server (NTRS)

    Moreland, K. W.

    1990-01-01

    In this paper an ocean scattering propagation model, developed for aircraft-to-satellite (aeronautical) applications, is described. The purpose of the propagation model is to characterize the behavior of sea reflected multipath as a function of physical propagation path parameters. An accurate validation against the theoretical far field solution for a perfectly conducting sinusoidal surface is provided. Simulation results for typical L band aeronautical applications with low complexity antennas are presented.

  12. A Multiphase Non-Linear Mixed Effects Model: An Application to Spirometry after Lung Transplantation

    PubMed Central

    Rajeswaran, Jeevanantham; Blackstone, Eugene H.

    2014-01-01

    In medical sciences, we often encounter longitudinal temporal relationships that are non-linear in nature. The influence of risk factors may also change across longitudinal follow-up. A system of multiphase non-linear mixed effects models is presented to model temporal patterns of longitudinal continuous measurements, with temporal decomposition to identify the phases and risk factors within each phase. Application of this model is illustrated using spirometry data after lung transplantation using readily available statistical software. This application illustrates the usefulness of our flexible model when dealing with complex non-linear patterns and time-varying coefficients. PMID:24919830

  13. Applications of bioenergetics models to fish ecology and management: where do we go from here?

    USGS Publications Warehouse

    Hansen, Michael J.; Boisclair, Daniel; Brandt, Stephen B.; Hewett, Steven W.; Kitchell, James F.; Lucas, Martyn C.; Ney, John J.

    1993-01-01

    Papers and panel discussions given during a 1992 symposium on bioenergetics models are summarized. Bioenergetics models have been applied to a variety of research and management questions related to fish stocks, populations, food webs, and ecosystems. Applications include estimates of the intensity and dynamics of predator-prey interactions, nutrient cycling within aquatic food webs of varying trophic structure, and food requirements of single animals, whole populations, and communities of fishes. As tools in food web and ecosystem applications, bioenergetics models have been used to compare forage consumption by salmonid predators across the Laurentian Great Lakes for single populations and whole communities, and to estimate the growth potential of pelagic predators in Chesapeake Bay and Lake Ontario. Some critics say that bioenergetics models lack sufficient detail to produce reliable results in such field applications, whereas others say that the models are too complex to be useful tools for fishery managers. Nevertheless, bioenergetics models have achieved notable predictive successes. Improved estimates are needed for model parameters such as metabolic costs of activity, and more complete studies are needed of the bioenergetics of larval and juvenile fishes. Future research on bioenergetics should include laboratory and field measurements of key model parameters such as weight-dependent maximum consumption, respiration and activity, and thermal habitats actually occupied by fish. Future applications of bioenergetics models to fish populations also depend on accurate estimates of population sizes and survival rates.
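    The energy balance underlying these ("Wisconsin-style") bioenergetics models allocates consumed energy among metabolic costs, waste losses, and growth. A minimal sketch follows; the fixed loss fractions are illustrative defaults, not species-specific parameters from the symposium papers.

```python
def daily_growth(C, R, egestion_frac=0.16, excretion_frac=0.10, sda_frac=0.17):
    """Wisconsin-style energy balance (all terms in J/day):
    growth = consumption - egestion - excretion - SDA - respiration.
    Loss fractions here are illustrative, not measured values."""
    F = egestion_frac * C              # egestion (unassimilated food)
    U = excretion_frac * (C - F)       # excretion of assimilated energy
    SDA = sda_frac * (C - F)           # specific dynamic action (digestion cost)
    return C - F - U - SDA - R

g = daily_growth(C=1000.0, R=300.0)   # J/day available for growth
```

    Field applications typically invert this balance: growth and temperature are measured, and the model solves for consumption, which is how the Great Lakes forage-demand estimates cited above are obtained.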

  14. Brachytherapy treatment simulation of strontium-90 and ruthenium-106 plaques on small size posterior uveal melanoma using MCNPX code

    NASA Astrophysics Data System (ADS)

    Barbosa, N. A.; da Rosa, L. A. R.; Facure, A.; Braz, D.

    2014-02-01

    Concave eye applicators with 90Sr/90Y and 106Ru/106Rh beta-ray sources are usually used in brachytherapy for the treatment of superficial intraocular tumors such as uveal melanoma with thickness up to 5 mm. The aim of this work was to use the Monte Carlo code MCNPX to calculate the 3D dose distribution on a mathematical model of the human eye, considering 90Sr/90Y and 106Ru/106Rh beta-ray eye applicators, in order to treat a posterior uveal melanoma with a thickness of 3.8 mm from the choroid surface. Mathematical models were developed for the two ophthalmic applicators, the CGD produced by the BEBIG Company and the SIA.6 produced by the Amersham Company, with activities of 1 mCi and 4.23 mCi, respectively. Both have a concave form. These applicators' mathematical models were attached to the eye model and the dose distributions were calculated using the MCNPX *F8 tally. The average dose rates were determined in all regions of the eye model. The *F8 tally results showed that the deposited energy due to the applicator with the radionuclide 106Ru/106Rh is higher in all eye regions, including the tumor. However, the average dose rate in the tumor region is higher for the applicator with 90Sr/90Y, due to its higher activity. Owing to the dosimetric characteristics of these applicators, the PDD value at 3 mm in water is 73% for the 106Ru/106Rh applicator and 60% for the 90Sr/90Y applicator. For a better choice of applicator type and radionuclide, it is important to know the thickness of the tumor and its location.

  15. CHILDREN'S RESIDENTIAL EXPOSURE TO CHLORPYRIFOS: APPLICATION OF CPPAES FIELD MEASUREMENTS OF CHLORPYRIFOS AND TCPY WITHIN MENTOR/SHEDS PESTICIDES MODEL

    EPA Science Inventory

    The comprehensive individual field-measurements on non-dietary exposure collected in the Children's-Post-Pesticide-Application-Exposure-Study (CPPAES) were used within MENTOR/SHEDS-Pesticides, a physically based stochastic human exposure and dose model. In this application, howev...

  16. Approaches for the Application of Physiologically Based Pharmacokinetic (PBPK) Models and Supporting Data in Risk Assessment (Final Report)

    EPA Science Inventory

    EPA released the final report, Approaches for the Application of Physiologically Based Pharmacokinetic (PBPK) Models and Supporting Data in Risk Assessment as announced in a September 22 2006 Federal Register Notice. This final report addresses the application and evaluati...

  17. Labyrinth, An Abstract Model for Hypermedia Applications. Description of its Static Components.

    ERIC Educational Resources Information Center

    Diaz, Paloma; Aedo, Ignacio; Panetsos, Fivos

    1997-01-01

    The model for hypermedia applications called Labyrinth allows: (1) the design of platform-independent hypermedia applications; (2) the categorization, generalization and abstraction of sparse unstructured heterogeneous information in multiple and interconnected levels; (3) the creation of personal views in multiuser hyperdocuments for both groups…

  18. Using Solution-Focused Applications for Transitional Coping of Workplace Survivors

    ERIC Educational Resources Information Center

    Germain, Marie-Line; Palamara, Sherry A.

    2007-01-01

    Solution-focused applications are proposed to assist survivor employees to return to workplace homeostasis after co-workers voluntarily or involuntarily leave the organization. A model for transitional coping is presented as well as a potential case study illustrating the application of the model. Implications for the theory, practice, and…

  19. AQMEII Phase 2: Overview and WRF/CMAQ Application over North America

    EPA Science Inventory

    In this study, we provide an overview of the second phase of the Air Quality Model Evaluation International Initiative (AQMEII). Activities in this phase are focused on the application and evaluation of coupled meteorology-chemistry models. Participating modeling systems are being...

  20. Storm Water Management Model Applications Manual

    EPA Science Inventory

    The EPA Storm Water Management Model (SWMM) is a dynamic rainfall-runoff simulation model that computes runoff quantity and quality from primarily urban areas. This manual is a practical application guide for new SWMM users who have already had some previous training in hydrolog...

  1. Dynamic Loading of Substation Distribution Transformers: An Application for use in a Production Grade Environment

    NASA Astrophysics Data System (ADS)

    Zhang, Ming

    Recent trends in the electric power industry have led to more attention to optimal operation of power transformers. In a deregulated environment, optimal operation means minimizing the maintenance and extending the life of this critical and costly equipment for the purpose of maximizing profits. Optimal utilization of a transformer can be achieved through the use of dynamic loading. A benefit of dynamic loading is that it allows better utilization of the transformer capacity, thus increasing the flexibility and reliability of the power system. This document presents the progress on a software application which can estimate the maximum time-varying loading capability of transformers. This information can be used to load devices closer to their limits without exceeding the manufacturer specified operating limits. The maximally efficient dynamic loading of transformers requires a model that can accurately predict both top-oil temperatures (TOTs) and hottest-spot temperatures (HSTs). In the previous work, two kinds of thermal TOT and HST models have been studied and used in the application: the IEEE TOT/HST models and the ASU TOT/HST models. And, several metrics have been applied to evaluate the model acceptability and determine the most appropriate models for using in the dynamic loading calculations. In this work, an investigation to improve the existing transformer thermal models performance is presented. Some factors that may affect the model performance such as improper fan status and the error caused by the poor performance of IEEE models are discussed. Additional methods to determine the reliability of transformer thermal models using metrics such as time constant and the model parameters are also provided. A new production grade application for real-time dynamic loading operating purpose is introduced. This application is developed by using an existing planning application, TTeMP, as a start point, which is designed for the dispatchers and load specialists. 
To overcome the limitations of TTeMP, the new application can perform dynamic loading under emergency conditions, such as loss-of-transformer loading. It also has the capability to determine the emergency rating of the transformers in real time.
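    The IEEE-style TOT model mentioned above is essentially a first-order exponential response of top-oil temperature rise to a load step. The sketch below is written in the spirit of the IEEE C57.91 Clause 7 equations; all parameter values (rated rise, loss ratio, time constant, oil exponent) are illustrative assumptions, not fitted to any transformer in this work.

```python
import math

def top_oil_rise(t_hours, load_pu, theta_init,
                 theta_rated=55.0, loss_ratio=5.0, tau=3.0, n=0.8):
    """Top-oil temperature rise over ambient (deg C) at time t after a
    load step to load_pu (per unit), starting from rise theta_init.
    First-order response toward the ultimate rise for the new load."""
    k2 = load_pu ** 2
    theta_ult = theta_rated * ((k2 * loss_ratio + 1) / (loss_ratio + 1)) ** n
    return theta_ult + (theta_init - theta_ult) * math.exp(-t_hours / tau)

# Cold start at rated load: the rise climbs toward theta_rated (55 C here).
rise_1h = top_oil_rise(1.0, 1.0, 0.0)
```

    Dynamic loading inverts such a model: given ambient temperature and the manufacturer's limit on top-oil (and hottest-spot) temperature, it solves for the largest time-varying load that keeps the predicted temperatures below the limit.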

  2. Regionalization of land use impact models for life cycle assessment: Recommendations for their use on the global scale and their applicability to Brazil

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pavan, Ana Laura Raymundo, E-mail: laurarpavan@gmail.com; Ometto, Aldo Roberto; Department of Production Engineering, São Carlos School of Engineering, University of São Paulo, Av. Trabalhador São-Carlense 400, São Carlos 13566-590, SP

    Life Cycle Assessment (LCA) is the main technique for evaluating the environmental impacts of product life cycles. A major challenge in the field of LCA is spatial and temporal differentiation in Life Cycle Impact Assessment (LCIA) methods, especially for impacts resulting from land occupation and land transformation. Land use characterization modeling has advanced considerably over the last two decades and many approaches have recently included crucial aspects such as geographic differentiation. Nevertheless, characterization models have so far not been systematically reviewed and evaluated to determine their applicability to South America. Given that Brazil is the largest country in South America, this paper analyzes the main international characterization models currently available in the literature, with a view to recommending regionalized models applicable on a global scale for land use life cycle impact assessments, and discusses their feasibility for regionalized assessment in Brazil. The analytical methodology involves classification based on the following criteria: midpoint/endpoint approach, scope of application, area of data collection, biogeographical differentiation, definition of recovery time and reference situation; followed by an evaluation of thirteen scientific robustness and environmental relevance subcriteria. Regarding scope of application, 25% of the models were developed for the European context and 50% have a global scope. There is no consensus in the literature about the definition of parameters such as biogeographical differentiation and reference situation, and our review indicates that 35% of the models use ecoregion division while 40% use the concept of potential natural vegetation. Four characterization models show high scores in terms of scientific robustness and environmental relevance.
These models are recommended for application in land use life cycle impact assessments, and also serve as references for the development or adaptation of regional methodological procedures for Brazil. - Highlights:
    • A discussion is made on performing regionalized impact assessments using spatial differentiation in LCA.
    • A review is made of 20 characterization models for land use impacts in Life Cycle Impact Assessment.
    • Four characterization models are recommended according to different land use impact pathways for application in Brazil.

  3. Challenges in Soft Computing: Case Study with Louisville MSD CSO Modeling

    NASA Astrophysics Data System (ADS)

    Ormsbee, L.; Tufail, M.

    2005-12-01

    The principal constituents of soft computing include fuzzy logic, neural computing, evolutionary computation, machine learning, and probabilistic reasoning. There are numerous applications of these constituents (both individually and in combinations of two or more) in the area of water resources and environmental systems. These range from the development of data-driven models to optimal control strategies that assist in a more informed and intelligent decision-making process. Availability of data is critical to such applications, and scarce data may lead to models that do not represent the response function over the entire domain. At the same time, too much data has a tendency to lead to over-constraining of the problem. This paper will describe the application of a subset of these soft computing techniques (neural computing and genetic algorithms) to the Beargrass Creek watershed in Louisville, Kentucky. The applications include the development of inductive models as substitutes for more complex process-based models to predict the water quality of key constituents (such as dissolved oxygen) and their use in an optimization framework for optimal load reductions. Such a process will facilitate the development of total maximum daily loads for the impaired water bodies in the watershed. Some of the challenges faced in this application include 1) uncertainty in data sets, 2) model application, and 3) development of cause-and-effect relationships between water quality constituents and watershed parameters through the use of inductive models. The paper will discuss these challenges and how they affect the desired goals of the project.

  4. Application of simulation models for the optimization of business processes

    NASA Astrophysics Data System (ADS)

    Jašek, Roman; Sedláček, Michal; Chramcov, Bronislav; Dvořák, Jiří

    2016-06-01

The paper deals with the application of modeling and simulation tools to the optimization of business processes, in particular to optimizing signal flow in a security company. Simul8 was selected as the modeling tool; it supports process modeling based on discrete-event simulation and enables the creation of visual models of production and distribution processes.

  5. Proceedings of the Conference on Toxicology: Applications of Advances in Toxicology to Risk Assessment. Held at Wright-Patterson AFB, Ohio on 19-21 May 1992

    DTIC Science & Technology

    1993-01-01

animals in toxicology research, the application of pharmacokinetics and physiologically based pharmacokinetic models in chemical risk assessment, selected...metaplasia Neurotoxicity Nonmutagenic carcinogens Ozone P450 PBPK modeling Perfluorohexane Peroxisome proliferators Pharmacokinetics Pharmacokinetic models...Physiological modeling Physiologically based pharmacokinetic modeling Polycyclic organic matter Quantitative risk assessment RAIRM model Rats

  6. Electro-thermal battery model identification for automotive applications

    NASA Astrophysics Data System (ADS)

    Hu, Y.; Yurkovich, S.; Guezennec, Y.; Yurkovich, B. J.

This paper describes a model identification procedure for identifying an electro-thermal model of lithium ion batteries used in automotive applications. The dynamic model structure adopted is based on an equivalent circuit model whose parameters are scheduled on the state-of-charge, temperature, and current direction. Linear spline functions are used as the functional form for the parametric dependence. The model identified in this way is valid over a wide range of temperatures and states of charge, so that the resulting model can be used for automotive applications such as on-board estimation of the state-of-charge and state-of-health. The model coefficients are identified using a multiple-step, genetic-algorithm-based optimization procedure designed for large-scale optimization problems. The validity of the procedure is demonstrated experimentally for an A123 lithium iron-phosphate battery.
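To make the model structure described above concrete, the sketch below simulates a single first-order RC equivalent-circuit branch with parameters scheduled on state-of-charge via linear interpolation (a one-dimensional stand-in for the paper's linear splines over SOC, temperature, and current direction). All parameter values, the one-branch simplification, and the function names are illustrative assumptions, not the identified A123 values.

```python
import numpy as np

# Hypothetical parameter tables: values on a SOC grid (the paper schedules
# parameters on SOC, temperature, and current direction; this sketch keeps
# only the SOC dependence for brevity).
soc_grid = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
ocv_tab  = np.array([3.20, 3.28, 3.30, 3.32, 3.45])      # open-circuit voltage [V]
r0_tab   = np.array([0.012, 0.010, 0.009, 0.009, 0.011]) # ohmic resistance [ohm]
r1_tab   = np.array([0.008, 0.006, 0.005, 0.005, 0.007]) # polarization resistance [ohm]
c1_tab   = np.array([1800., 2200., 2500., 2500., 2000.]) # polarization capacitance [F]

def simulate(current, dt, capacity_ah, soc0=0.5):
    """Simulate terminal voltage for a current profile (A, discharge positive)."""
    soc, v1, volts = soc0, 0.0, []
    for i in current:
        # linear-spline lookup of SOC-scheduled parameters
        r0  = np.interp(soc, soc_grid, r0_tab)
        r1  = np.interp(soc, soc_grid, r1_tab)
        c1  = np.interp(soc, soc_grid, c1_tab)
        ocv = np.interp(soc, soc_grid, ocv_tab)
        # first-order RC branch: dV1/dt = -V1/(R1*C1) + I/C1 (explicit Euler)
        v1 += dt * (-v1 / (r1 * c1) + i / c1)
        volts.append(ocv - i * r0 - v1)
        # coulomb counting for SOC
        soc -= i * dt / (capacity_ah * 3600.0)
    return np.array(volts)
```

In an identification procedure like the one described, the table entries themselves would be the decision variables optimized (e.g., by a genetic algorithm) to match measured voltage.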

  7. Assessment of municipal solid waste settlement models based on field-scale data analysis.

    PubMed

    Bareither, Christopher A; Kwak, Seungbok

    2015-08-01

An evaluation of municipal solid waste (MSW) settlement model performance and applicability was conducted based on analysis of two field-scale datasets: (1) Yolo and (2) the Deer Track Bioreactor Experiment (DTBE). Twelve MSW settlement models were considered that covered a range of compression behaviors (i.e., immediate compression, mechanical creep, and biocompression) and a range of total (2-22) and optimized (2-7) model parameters. A multi-layer immediate settlement analysis developed for Yolo provides a framework to estimate initial waste thickness and waste thickness at the end of immediate compression. Model application to the Yolo test cells (conventional and bioreactor landfills) via least-squares optimization yielded high coefficients of determination for all settlement models (R² > 0.83). However, empirical models (i.e., the power creep, logarithmic, and hyperbolic models) are not recommended for use in MSW settlement modeling due to potentially non-representative long-term MSW behavior, the limited physical significance of the model parameters, and the settlement data required for model parameterization. Settlement models that combine mechanical creep and biocompression into a single mathematical function constrain time-dependent settlement to a single process with finite magnitude, which limits model applicability. Overall, all models evaluated that couple multiple compression processes (immediate, creep, and biocompression) provided accurate representations of both the Yolo and DTBE datasets. The model presented in Gourc et al. (2010) had the lowest number of total and optimized model parameters and yielded high statistical performance for all model applications (R² ≥ 0.97). Copyright © 2015 Elsevier Ltd. All rights reserved.
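The empirical power creep model named above can be parameterized by the same least-squares approach the study uses. The sketch below fits strain(t) = a·t^b in log-log space and reports R²; the observations and resulting coefficients are made-up illustrations, not data from Yolo or DTBE.

```python
import numpy as np

# Hypothetical settlement observations: time since waste placement [days]
# and cumulative strain (settlement / initial thickness).
t = np.array([30., 90., 180., 365., 730., 1460.])
strain = np.array([0.02, 0.035, 0.05, 0.07, 0.10, 0.14])

# Power creep law: strain(t) = a * t**b.
# Taking logs turns the fit into a linear least-squares problem.
b, log_a = np.polyfit(np.log(t), np.log(strain), 1)
a = np.exp(log_a)

# Coefficient of determination for the fitted curve
pred = a * t**b
ss_res = np.sum((strain - pred) ** 2)
ss_tot = np.sum((strain - strain.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot
```

A high R² here illustrates the study's point: empirical models can fit field data well even though the fitted a and b carry little physical meaning and cannot be set without settlement data.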

  8. A matlab framework for estimation of NLME models using stochastic differential equations: applications for estimation of insulin secretion rates.

    PubMed

    Mortensen, Stig B; Klim, Søren; Dammann, Bernd; Kristensen, Niels R; Madsen, Henrik; Overgaard, Rune V

    2007-10-01

The non-linear mixed-effects model based on stochastic differential equations (SDEs) provides an attractive residual error model that is able to handle serially correlated residuals, which typically arise from structural mis-specification of the true underlying model. The use of SDEs also opens up new tools for model development and easily allows for tracking of unknown inputs and parameters over time. An algorithm for maximum likelihood estimation of the model has previously been proposed, and the present paper presents the first general implementation of this algorithm. The implementation is done in Matlab and also demonstrates the use of parallel computing for improved estimation times. The use of the implementation is illustrated by two example applications that focus on the ability of the model to estimate unknown inputs, facilitated by the extension to SDEs. The first application is a deconvolution-type estimation of the insulin secretion rate based on a linear two-compartment model for C-peptide measurements. In the second application the model is extended to also give an estimate of the time-varying liver extraction based on both C-peptide and insulin measurements.

  9. Model and system learners, optimal process constructors and kinetic theory-based goal-oriented design: A new paradigm in materials and processes informatics

    NASA Astrophysics Data System (ADS)

    Abisset-Chavanne, Emmanuelle; Duval, Jean Louis; Cueto, Elias; Chinesta, Francisco

    2018-05-01

Traditionally, Simulation-Based Engineering Sciences (SBES) has relied on the use of static data inputs (model parameters, initial or boundary conditions, … obtained from adequate experiments) to perform simulations. A new paradigm in the field of Applied Sciences and Engineering has emerged in the last decade. Dynamic Data-Driven Application Systems [9, 10, 11, 12, 22] allow the linkage of simulation tools with measurement devices for real-time control of simulations and applications, entailing the ability to dynamically incorporate additional data into an executing application and, in reverse, the ability of an application to dynamically steer the measurement process. It is in this context that traditional "digital twins" are giving rise to a new generation of goal-oriented data-driven application systems, also known as "hybrid twins", embracing models based on physics and models based exclusively on data adequately collected and assimilated, filling the gap between usual model predictions and measurements. Within this framework, new methodologies based on model learners, machine learning and kinetic goal-oriented design are defining a new paradigm in materials, processes and systems engineering.

  10. CAD-CAM database management at Bendix Kansas City

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Witte, D.R.

    1985-05-01

The Bendix Kansas City Division of Allied Corporation began integrating mechanical CAD-CAM capabilities into its operations in June 1980. The primary capabilities include a wireframe modeling application, a solid modeling application, and the Bendix Integrated Computer Aided Manufacturing (BICAM) System application, a set of software programs and procedures which provides user-friendly access to graphic applications and data, and user-friendly sharing of data between applications and users. BICAM also provides for enforcement of corporate/enterprise policies. Three access categories, private, local, and global, are realized through the implementation of data-management metaphors: the desk, reading rack, file cabinet, and library are for the storage, retrieval, and sharing of drawings and models. Access is provided through menu selections; searching for designs is done by a paging method or a search-by-attribute-value method. The sharing of designs between all users of Part Data is key. The BICAM System supports 375 unique users per quarter and manages over 7500 drawings and models. The BICAM System demonstrates the need for generalized models, a high-level system framework, prototyping, information-modeling methods, and an understanding of the entire enterprise. Future BICAM System implementations are planned to take advantage of this knowledge.

  11. Model Data Interoperability for the United States Integrated Ocean Observing System (IOOS)

    NASA Astrophysics Data System (ADS)

    Signell, Richard P.

    2010-05-01

Model data interoperability for the United States Integrated Ocean Observing System (IOOS) was initiated with a focused one-year project. The problem was that there were many regional and national providers of oceanographic model data; each had unique file conventions, distribution techniques and analysis tools that made it difficult to compare model results and observational data. To solve this problem, a distributed system was built utilizing a customized middleware layer and a common data model. This allowed each model data provider to keep their existing model and data files unchanged, yet deliver model data via web services in a common form. With standards-based applications that used these web services, end users then had a common way to access data from any of the models. These applications included: (1) 2D mapping and animation in a web browser application, (2) advanced 3D visualization and animation in a desktop application, and (3) a toolkit for a common scientific analysis environment. Due to the flexibility and low impact of the approach on providers, rapid progress was made. The system was implemented in all eleven US IOOS regions and at the NOAA National Coastal Data Development Center, allowing common delivery of regional and national oceanographic model forecast and archived results that cover all US waters. The system, based heavily on software technology from the NSF-sponsored Unidata Program Center, is applicable to any structured gridded data, not just oceanographic model data. There is a clear pathway to expand the system to include unstructured grid (e.g. triangular grid) data.

  12. Application Perspective of 2D+SCALE Dimension

    NASA Astrophysics Data System (ADS)

    Karim, H.; Rahman, A. Abdul

    2016-09-01

Different applications or users need different abstractions of spatial models, dimensionalities and specifications of their datasets due to variations in the required analysis and output. Various approaches, data models and data structures are now available to support most current application models in Geographic Information Systems (GIS). One of the focus trends in the GIS multi-dimensional research community is the implementation of a scale dimension with spatial datasets to suit various scale-dependent application needs. In this paper, 2D spatial datasets that have been scaled up with scale as the third dimension are addressed as 2D+scale (or 3D-scale) dimension datasets. Various data structures, data models, approaches, schemas, and formats have been proposed as the best approaches to support a variety of applications and dimensionalities in 3D topology. However, only a few of them consider the element of scale as their targeted dimension. Where the scale dimension is concerned, the implementation approach can be either multi-scale or vario-scale (with any available data structure and format) depending on the application requirements (topology, semantics and function). This paper attempts to discuss current and potential new applications which could be integrated with the 3D-scale dimension approach. Previous and current work on the scale dimension, the requirements to be preserved for any given application, implementation issues and potential future applications form the major discussion of this paper.

  13. Development and application of a technique for reducing airframe finite element models for dynamics analysis

    NASA Technical Reports Server (NTRS)

    Hashemi-Kia, Mostafa; Toossi, Mostafa

    1990-01-01

    A computational procedure for the reduction of large finite element models was developed. This procedure is used to obtain a significantly reduced model while retaining the essential global dynamic characteristics of the full-size model. This reduction procedure is applied to the airframe finite element model of AH-64A Attack Helicopter. The resulting reduced model is then validated by application to a vibration reduction study.

  14. Application of the Nelson model to four timelag fuel classes using Oklahoma field observations: Model evaluation and comparison with national Fire Danger Rating System algorithms

    Treesearch

    J. D. Carlson; Larry S. Bradshaw; Ralph M. Nelson; Randall R Bensch; Rafal Jabrzemski

    2007-01-01

    The application of a next-generation dead-fuel moisture model, the 'Nelson model', to four timelag fuel classes using an extensive 21-month dataset of dead-fuel moisture observations is described. Developed by Ralph Nelson in the 1990s, the Nelson model is a dead-fuel moisture model designed to take advantage of frequent automated weather observations....

  15. Applicability of central auditory processing disorder models.

    PubMed

    Jutras, Benoît; Loubert, Monique; Dupuis, Jean-Luc; Marcoux, Caroline; Dumont, Véronique; Baril, Michèle

    2007-12-01

    Central auditory processing disorder ([C]APD) is a relatively recent construct that has given rise to 2 theoretical models: the Buffalo Model and the Bellis/Ferre Model. These models describe 4 and 5 (C)APD categories, respectively. The present study examines the applicability of these models to clinical practice. Neither of these models was based on data from peer-reviewed sources. This is a retrospective study that reviewed 178 records of children diagnosed with (C)APD, of which 48 were retained for analysis. More than 80% of the children could be classified into one of the Buffalo Model categories, while more than 90% remained unclassified under the Bellis/Ferre Model. This discrepancy can be explained by the fact that the classification of the Buffalo Model is based primarily on a single central auditory test (Staggered Spondaic Word), whereas the Bellis/Ferre Model classification uses a combination of auditory test results. The 2 models provide a conceptual framework for (C)APD, but they must be further refined to be fully applicable in clinical settings.

  16. Model-based metabolism design: constraints for kinetic and stoichiometric models

    PubMed Central

    Stalidzans, Egils; Seiman, Andrus; Peebo, Karl; Komasilovs, Vitalijs; Pentjuss, Agris

    2018-01-01

    The implementation of model-based designs in metabolic engineering and synthetic biology may fail. One of the reasons for this failure is that only a part of the real-world complexity is included in models. Still, some knowledge can be simplified and taken into account in the form of optimization constraints to improve the feasibility of model-based designs of metabolic pathways in organisms. Some constraints (mass balance, energy balance, and steady-state assumption) serve as a basis for many modelling approaches. There are others (total enzyme activity constraint and homeostatic constraint) proposed decades ago, but which are frequently ignored in design development. Several new approaches of cellular analysis have made possible the application of constraints like cell size, surface, and resource balance. Constraints for kinetic and stoichiometric models are grouped according to their applicability preconditions in (1) general constraints, (2) organism-level constraints, and (3) experiment-level constraints. General constraints are universal and are applicable for any system. Organism-level constraints are applicable for biological systems and usually are organism-specific, but these constraints can be applied without information about experimental conditions. To apply experimental-level constraints, peculiarities of the organism and the experimental set-up have to be taken into account to calculate the values of constraints. The limitations of applicability of particular constraints for kinetic and stoichiometric models are addressed. PMID:29472367

  17. Analytical model for ion stopping power and range in the therapeutic energy interval for beams of hydrogen and heavier ions

    NASA Astrophysics Data System (ADS)

    Donahue, William; Newhauser, Wayne D.; Ziegler, James F.

    2016-09-01

Many different approaches exist to calculate stopping power and range of protons and heavy charged particles. These methods may be broadly categorized as physically complete theories (widely applicable and complex) or semi-empirical approaches (narrowly applicable and simple). However, little attention has been paid in the literature to approaches that are both widely applicable and simple. We developed simple analytical models of stopping power and range for ions of hydrogen, carbon, iron, and uranium that spanned intervals of ion energy from 351 keV u⁻¹ to 450 MeV u⁻¹ or wider. The analytical models typically reproduced the best-available evaluated stopping powers within 1% and ranges within 0.1 mm. The computational speed of the analytical stopping power model was 28% faster than a full-theoretical approach. The calculation of range using the analytic range model was 945 times faster than a widely-used numerical integration technique. The results of this study revealed that the new, simple analytical models are accurate, fast, and broadly applicable. The new models require just 6 parameters to calculate stopping power and range for a given ion and absorber. The proposed model may be useful as an alternative to traditional approaches, especially in applications that demand fast computation speed, small memory footprint, and simplicity.
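The abstract does not give the authors' 6-parameter functional form, so the sketch below illustrates the general idea with the classical Bragg-Kleeman power-law relation, R(E) = α·E^p, a simple analytical range model in the same spirit; stopping power then follows by inverting dR/dE. The coefficient values are rough illustrative fits for protons in water, not the paper's parameters.

```python
# Bragg-Kleeman power-law range model (illustrative stand-in for the paper's
# 6-parameter form). alpha and p are absorber/ion specific; the values below
# are approximate fits for protons in water.
alpha = 0.0022   # hypothetical fit coefficient [cm / MeV**p]
p = 1.77         # hypothetical fit exponent

def csda_range(energy_mev):
    """Approximate CSDA range [cm] from the power-law fit."""
    return alpha * energy_mev ** p

def stopping_power(energy_mev):
    """Stopping power -dE/dx [MeV/cm], obtained by inverting dR/dE."""
    # dR/dE = alpha * p * E**(p-1), so -dE/dx = 1 / (dR/dE)
    return 1.0 / (alpha * p * energy_mev ** (p - 1.0))
```

With these illustrative coefficients, `csda_range(100.0)` gives roughly 7.6 cm, close to the tabulated range of 100 MeV protons in water, which shows why such closed-form models can be both fast and usefully accurate.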

  18. Analytical model for ion stopping power and range in the therapeutic energy interval for beams of hydrogen and heavier ions.

    PubMed

    Donahue, William; Newhauser, Wayne D; Ziegler, James F

    2016-09-07

Many different approaches exist to calculate stopping power and range of protons and heavy charged particles. These methods may be broadly categorized as physically complete theories (widely applicable and complex) or semi-empirical approaches (narrowly applicable and simple). However, little attention has been paid in the literature to approaches that are both widely applicable and simple. We developed simple analytical models of stopping power and range for ions of hydrogen, carbon, iron, and uranium that spanned intervals of ion energy from 351 keV u⁻¹ to 450 MeV u⁻¹ or wider. The analytical models typically reproduced the best-available evaluated stopping powers within 1% and ranges within 0.1 mm. The computational speed of the analytical stopping power model was 28% faster than a full-theoretical approach. The calculation of range using the analytic range model was 945 times faster than a widely-used numerical integration technique. The results of this study revealed that the new, simple analytical models are accurate, fast, and broadly applicable. The new models require just 6 parameters to calculate stopping power and range for a given ion and absorber. The proposed model may be useful as an alternative to traditional approaches, especially in applications that demand fast computation speed, small memory footprint, and simplicity.

  19. The MVP Model: Overview and Application

    ERIC Educational Resources Information Center

    Keller, John M.

    2017-01-01

    This chapter contains an overview of the MVP model that is used as a basis for the other chapters in this issue. It also contains a description of key steps in the ARCS-V design process that is derived from the MVP model and a summary of a design-based research study illustrating the application of the ARCS-V model.

  20. Enhancements to the Water Erosion Prediction Project (WEPP) for modeling large snow-dominated mountainous forest watersheds

    Treesearch

    Anurag Srivastava; Joan Q. Wu; William J. Elliot; Erin S. Brooks

    2015-01-01

    The Water Erosion Prediction Project (WEPP) model, originally developed for hillslope and small watershed applications, simulates complex interactive processes influencing erosion. Recent incorporations to the model have improved the subsurface hydrology components for forest applications. Incorporation of channel routing has made the WEPP model well suited for large...

  1. Nursing Models: Application to practice Alan Pearson , Barbara Vaughan and Mary Fitzgerald Nursing Models: Application to Practice Quay Books £24.99 280pp [Formula: see text].

    PubMed

    2010-10-07

At a time when evidence-based practice is the predominant nursing model, the authors of this book want to interest academics and practitioners in models that were in vogue in the UK in the 1980s and 1990s.

  2. Nursing Models - Application to Practice Cutliffe John et al Nursing Models - Application to Practice 280pp Quay Books 9781856423793 1856423794 [Formula: see text].

    PubMed

    2010-09-22

    The authors set themselves the interesting challenge of reviving the interest of academics and practitioners in nursing models. Such models were in vogue in the UK in the 1980s and 1990s, at a time dominated by the evidence-based practice movement.

  3. Generic Business Model Types for Enterprise Mashup Intermediaries

    NASA Astrophysics Data System (ADS)

    Hoyer, Volker; Stanoevska-Slabeva, Katarina

The huge demand for situational and ad-hoc applications desired by the mass of business end users has led to a new kind of Web application, well known as Enterprise Mashups. Users with no or limited programming skills are empowered to leverage, in a collaborative manner, existing Mashup components by combining and reusing company-internal and external resources within minutes into new value-added applications. Thereby, Enterprise Mashup environments act as intermediaries to match the supply of providers and the demand of consumers. Following the design science approach, we propose an interaction phase model artefact based on market transaction phases to structure the required intermediary features. By means of five case studies, we demonstrate the application of the designed model and identify three generic business model types for Enterprise Mashup intermediaries (directory, broker, and marketplace). So far, intermediaries following a real marketplace business model do not exist in the context of Enterprise Mashups, and this emerging paradigm requires further research.

  4. Development of a unified oil droplet size distribution model with application to surface breaking waves and subsea blowout releases considering dispersant effects.

    PubMed

    Li, Zhengkai; Spaulding, Malcolm; French McCay, Deborah; Crowley, Deborah; Payne, James R

    2017-01-15

An oil droplet size model was developed for a variety of turbulent conditions based on non-dimensional analysis of disruptive and restorative forces; it is applicable to oil droplet formation under both surface breaking-wave and subsurface-blowout conditions, with or without dispersant application. This new model was calibrated and successfully validated with droplet size data obtained from controlled laboratory studies of dispersant-treated and non-treated oil, in subsea dispersant tank tests, and in field surveys, including the Deep Spill experimental release and the Deepwater Horizon blowout oil spill. This model is an advancement over prior models, as it explicitly addresses the effects of the dispersed-phase viscosity resulting from dispersant application, and it constrains the maximum stable droplet size based on the Rayleigh-Taylor instability, which is invoked for a release from a large aperture. Copyright © 2016 Elsevier Ltd. All rights reserved.
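The non-dimensional analysis described above can be sketched as a correlation between the median droplet size and two dimensionless groups: a Weber number (disruptive inertial forces vs. restorative interfacial tension) and a viscosity number capturing the dispersed-phase viscosity. The functional form and the coefficients r, p, q below are illustrative assumptions, not the published fit.

```python
def median_droplet_size(rho, U, D, sigma, mu_d, r=14.0, p=0.46, q=-0.52):
    """Median droplet diameter d50 [m] from non-dimensional groups (sketch).

    rho   - density of the continuous phase [kg/m^3]
    U     - characteristic (e.g. jet exit) velocity [m/s]
    D     - characteristic length, e.g. orifice diameter [m]
    sigma - oil-water interfacial tension [N/m]
    mu_d  - dynamic viscosity of the dispersed (oil) phase [Pa s]
    r,p,q - illustrative correlation coefficients (assumptions)
    """
    we = rho * U**2 * D / sigma   # Weber number: disruption vs interfacial tension
    vi = mu_d * U / sigma         # viscosity number: dispersed-phase resistance
    return D * r * (1.0 + 10.0 * vi)**p * we**q
```

The structure reproduces the qualitative behavior the abstract describes: dispersant application lowers sigma, raising the Weber number and shrinking d50, while a higher dispersed-phase viscosity partially counteracts breakup. A full implementation would additionally cap d50 at the Rayleigh-Taylor maximum stable size for large apertures.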

  5. 10 CFR 611.101 - Application.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... using industry standard model (need to add name and location of this open source model) to show... the project is based and applicant's financial model presenting project pro forma statements for the... Standards of Professional Appraisal Practice,” promulgated by the Appraisal Standards Board of the Appraisal...

  6. 10 CFR 611.101 - Application.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... using industry standard model (need to add name and location of this open source model) to show... the project is based and applicant's financial model presenting project pro forma statements for the... Standards of Professional Appraisal Practice,” promulgated by the Appraisal Standards Board of the Appraisal...

  7. 10 CFR 611.101 - Application.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... using industry standard model (need to add name and location of this open source model) to show... the project is based and applicant's financial model presenting project pro forma statements for the... Standards of Professional Appraisal Practice,” promulgated by the Appraisal Standards Board of the Appraisal...

  8. [Transmission dynamic model for echinococcosis granulosus: establishment and application].

    PubMed

    Yang, Shi-Jie; Wu, Wei-Ping

    2009-06-01

    A dynamic model of disease can be used to quantitatively describe the pattern and characteristics of disease transmission, predict the disease status and evaluate the efficacy of control strategy. This review summarizes the basic transmission dynamic models of echinococcosis granulosus and their application.

  9. Predictive microbiology for food packaging applications

    USDA-ARS?s Scientific Manuscript database

    Mathematical modeling has been applied to describe the microbial growth and inactivation in foods for decades and is also known as ‘Predictive microbiology’. When models are developed and validated, their applications may save cost and time. The Pathogen Modeling Program (PMP), a collection of mode...

  10. Systems in Science: Modeling Using Three Artificial Intelligence Concepts.

    ERIC Educational Resources Information Center

    Sunal, Cynthia Szymanski; Karr, Charles L.; Smith, Coralee; Sunal, Dennis W.

    2003-01-01

    Describes an interdisciplinary course focusing on modeling scientific systems. Investigates elementary education majors' applications of three artificial intelligence concepts used in modeling scientific systems before and after the course. Reveals a great increase in understanding of concepts presented but inconsistent application. (Author/KHR)

  11. Demonstration of the Web-based Interspecies Correlation Estimation (Web-ICE) modeling application

    EPA Science Inventory

    The Web-based Interspecies Correlation Estimation (Web-ICE) modeling application is available to the risk assessment community through a user-friendly internet platform (http://epa.gov/ceampubl/fchain/webice/). ICE models are log-linear least square regressions that predict acute...

  12. Toward a Model-Based Predictive Controller Design in Brain–Computer Interfaces

    PubMed Central

    Kamrunnahar, M.; Dias, N. S.; Schiff, S. J.

    2013-01-01

A first step in designing a robust and optimal model-based predictive controller (MPC) for brain–computer interface (BCI) applications is presented in this article. An MPC has the potential to achieve improved BCI performance compared to the performance achieved by current ad hoc, non-model-based filter applications. The parameters in designing the controller were extracted as model-based features from motor imagery task-related human scalp electroencephalography. Although the parameters can be generated from any model, linear or non-linear, we here adopted a simple autoregressive model that has well-established applications in BCI task discrimination. It was shown that the parameters generated for the controller design can as well be used for motor imagery task discrimination, with performance (8–23% task discrimination errors) comparable to the discrimination performance of commonly used features such as frequency-specific band powers and the AR model parameters used directly. An optimal MPC has significant implications for high-performance BCI applications. PMID:21267657

  13. Toward a model-based predictive controller design in brain-computer interfaces.

    PubMed

    Kamrunnahar, M; Dias, N S; Schiff, S J

    2011-05-01

A first step in designing a robust and optimal model-based predictive controller (MPC) for brain-computer interface (BCI) applications is presented in this article. An MPC has the potential to achieve improved BCI performance compared to the performance achieved by current ad hoc, non-model-based filter applications. The parameters in designing the controller were extracted as model-based features from motor imagery task-related human scalp electroencephalography. Although the parameters can be generated from any model, linear or non-linear, we here adopted a simple autoregressive model that has well-established applications in BCI task discrimination. It was shown that the parameters generated for the controller design can as well be used for motor imagery task discrimination, with performance (8-23% task discrimination errors) comparable to the discrimination performance of commonly used features such as frequency-specific band powers and the AR model parameters used directly. An optimal MPC has significant implications for high-performance BCI applications.

  14. Fertiliser management effects on dissolved inorganic nitrogen in runoff from Australian sugarcane farms.

    PubMed

    Fraser, Grant; Rohde, Ken; Silburn, Mark

    2017-08-01

Dissolved inorganic nitrogen (DIN) movement from Australian sugarcane farms is believed to be a major cause of crown-of-thorns starfish outbreaks, which have reduced Great Barrier Reef coral cover by ~21% (1985-2012). We develop a daily model of DIN concentration in runoff based on >200 field-monitored runoff events. Runoff DIN concentrations were related to nitrogen fertiliser application rates and decreased after application with time and cumulative rainfall. Runoff after liquid fertiliser applications had higher initial DIN concentrations, though these concentrations diminished more rapidly than after granular fertiliser applications. The model was validated using an independent field dataset and provided reasonable estimates of runoff DIN concentrations based on several model-efficiency scores. The runoff DIN concentration model was combined with a water-balance cropping model to investigate temporal aspects of sugarcane fertiliser management. Nitrogen fertiliser application in December (start of the wet season) had the highest risk of DIN movement, and this was further exacerbated in years with a climate forecast for 'wet' seasonal conditions. The potential utility of a climate forecasting system to predict forthcoming wet months, and hence DIN loss risk, is demonstrated. Earlier fertiliser application or reduced fertiliser application rates in seasons with a wet climate forecast may markedly reduce runoff DIN loads; however, it is recommended that these findings be tested at a broader scale.
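The relationships described above (initial concentration scaling with application rate, decay with time and cumulative rainfall since application) can be sketched as a simple exponential-decay model. The functional form and every coefficient below are illustrative assumptions, not the fitted values from the field study.

```python
import math

def runoff_din(n_rate_kg_ha, days_since, cum_rain_mm,
               c_per_kg=0.05, k_time=0.01, k_rain=0.004):
    """Runoff DIN concentration [mg N/L] (illustrative sketch).

    n_rate_kg_ha - nitrogen fertiliser application rate [kg N/ha]
    days_since   - days since fertiliser application
    cum_rain_mm  - cumulative rainfall since application [mm]
    c_per_kg, k_time, k_rain - hypothetical coefficients (assumptions)
    """
    c0 = c_per_kg * n_rate_kg_ha   # initial concentration scales with rate
    # concentration decays with both elapsed time and cumulative rainfall
    return c0 * math.exp(-k_time * days_since - k_rain * cum_rain_mm)
```

In this framing, the observed behavior of liquid versus granular fertiliser would correspond to a higher c_per_kg paired with larger decay constants for liquid formulations: higher initial concentrations that diminish more rapidly.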

  15. Application of nonlinear adaptive motion washout to transport ground-handling simulation

    NASA Technical Reports Server (NTRS)

    Parrish, R. V.; Martin, D. J., Jr.

    1983-01-01

    The application of a nonlinear coordinated adaptive motion washout to the transport ground-handling environment is documented. Additions to both the aircraft math model and the motion washout system are discussed. The additions to the simulated-aircraft math model provided improved modeling fidelity for braking and reverse-thrust application, and the additions to the motion-base washout system allowed transition from the desired flight parameters to the less restrictive ground parameters of the washout.

  16. Superhydrophobic Drag Reduction in Various Turbulent Flows

    NASA Astrophysics Data System (ADS)

    Gose, James W.; Tuteja, Anish; Perlin, Marc; Ceccio, Steven L.

    2017-11-01

    Superhydrophobic surfaces (SHSs) have been studied exhaustively in laminar flow applications, while interest in SHS drag reduction in turbulent flow applications has been increasing steadily. In this discussion, we will highlight recent advances of SHS applications in various high-Reynolds number flows. We will address the application of mechanically robust and scalable spray SHSs in three cases: fully-developed internal flow; a near-zero pressure gradient turbulent boundary layer; and an axisymmetric DARPA SUBOFF model. The model will be towed in the University of Michigan's Physical Model Basin. Experimental measurements of streamwise pressure drop and the near-wall flow via Particle Image Velocimetry and Laser Doppler Velocimetry will be discussed where applicable. Moreover, integral measurement of the total resistance of the SUBOFF model, with and without SHS application, will be examined. The SUBOFF model extends 2.6 m and is 0.3 m in diameter, and will be tested at water depths of three to six model diameters. Previous investigations of these SHSs have shown that skin-friction savings of 20% or more can be attained for friction Reynolds numbers greater than 1,000. This project was carried out as part of the U.S. Office of Naval Research (ONR) MURI (Multidisciplinary University Research Initiatives) program (Grant No. N00014-12-1-0874) managed by Dr. Ki-Han Kim and led by Dr. Steven L. Ceccio.

  17. On the accuracy of models for predicting sound propagation in fitted rooms.

    PubMed

    Hodgson, M

    1990-08-01

    The objective of this article is to make a contribution to the evaluation of the accuracy and applicability of models for predicting the sound propagation in fitted rooms such as factories, classrooms, and offices. The models studied are 1:50 scale models; the method-of-images models of Jovicic, Lindqvist, Hodgson, Kurze, and of Lemire and Nicolas; the empirical formula of Friberg; and Ondet and Barbry's ray-tracing model. Sound propagation predictions by the analytic models are compared with the results of sound propagation measurements in a 1:50 scale model and in a warehouse, both containing various densities of approximately isotropically distributed, rectangular-parallelepipedic fittings. The results indicate that the models of Friberg and of Lemire and Nicolas are fundamentally incorrect. While more generally applicable versions exist, the versions of the models of Jovicic and Kurze studied here are found to be of limited applicability since they ignore vertical-wall reflections. The Hodgson and Lindqvist models appear to be accurate in certain limited cases. This preliminary study found the ray-tracing model of Ondet and Barbry to be the most accurate in all the cases studied. Furthermore, it has the necessary flexibility with respect to room geometry, surface-absorption distribution, and fitting distribution. It appears to be the model with the greatest applicability to fitted-room sound propagation prediction.

  18. Enterprise application architecture development based on DoDAF and TOGAF

    NASA Astrophysics Data System (ADS)

    Tao, Zhi-Gang; Luo, Yun-Feng; Chen, Chang-Xin; Wang, Ming-Zhe; Ni, Feng

    2017-05-01

    For the purpose of supporting the design and analysis of enterprise application architecture, here, we report a tailored enterprise application architecture description framework and its corresponding design method. The presented framework can effectively support service-oriented architecting and cloud computing by creating the metadata model based on architecture content framework (ACF), DoDAF metamodel (DM2) and Cloud Computing Modelling Notation (CCMN). The framework also makes an effort to extend and improve the mapping between The Open Group Architecture Framework (TOGAF) application architectural inputs/outputs, deliverables and Department of Defence Architecture Framework (DoDAF)-described models. The roadmap of 52 DoDAF-described models is constructed by creating the metamodels of these described models and analysing the constraint relationship among metamodels. By combining the tailored framework and the roadmap, this article proposes a service-oriented enterprise application architecture development process. Finally, a case study is presented to illustrate the results of implementing the tailored framework in the Southern Base Management Support and Information Platform construction project using the development process proposed by the paper.

  19. The applicability of a computer model for predicting head injury incurred during actual motor vehicle collisions.

    PubMed

    Moran, Stephan G; Key, Jason S; McGwin, Gerald; Keeley, Jason W; Davidson, James S; Rue, Loring W

    2004-07-01

    Head injury is a significant cause of both morbidity and mortality. Motor vehicle collisions (MVCs) are the most common source of head injury in the United States. No studies have conclusively determined the applicability of computer models for accurate prediction of head injuries sustained in actual MVCs. This study sought to determine the applicability of such models for predicting head injuries sustained by MVC occupants. The Crash Injury Research and Engineering Network (CIREN) database was queried for restrained drivers who sustained a head injury. These collisions were modeled using occupant dynamic modeling (MADYMO) software, and head injury scores were generated. The computer-generated head injury scores then were evaluated with respect to the actual head injuries sustained by the occupants to determine the applicability of MADYMO computer modeling for predicting head injury. Five occupants meeting the selection criteria for the study were selected from the CIREN database. The head injury scores generated by MADYMO were lower than expected given the actual injuries sustained. In only one case did the computer analysis predict a head injury of a severity similar to that actually sustained by the occupant. Although computer modeling accurately simulates experimental crash tests, it may not be applicable for predicting head injury in actual MVCs. Many complicating factors surrounding actual MVCs make accurate computer modeling difficult. Future modeling efforts should consider variables such as age of the occupant and should account for a wider variety of crash scenarios.

  20. Computational Modeling for Enhancing Soft Tissue Image Guided Surgery: An Application in Neurosurgery.

    PubMed

    Miga, Michael I

    2016-01-01

    With the recent advances in computing, the opportunities to translate computational models to more integrated roles in patient treatment are expanding at an exciting rate. One area of considerable development has been directed towards correcting soft tissue deformation within image guided neurosurgery applications. This review captures the efforts that have been undertaken towards enhancing neuronavigation by the integration of soft tissue biomechanical models, imaging and sensing technologies, and algorithmic developments. In addition, the review speaks to the evolving role of modeling frameworks within surgery and concludes with some future directions beyond neurosurgical applications.

  1. Soft Tissue Structure Modelling for Use in Orthopaedic Applications and Musculoskeletal Biomechanics

    NASA Astrophysics Data System (ADS)

    Audenaert, E. A.; Mahieu, P.; van Hoof, T.; Pattyn, C.

    2009-12-01

    We present our methodology for the three-dimensional anatomical and geometrical description of soft tissues relevant for orthopaedic surgical applications and musculoskeletal biomechanics. The technique involves the segmentation and geometrical description of muscles and neurovascular structures from high-resolution computed tomography scanning for the reconstruction of generic anatomical models. These models can be used for quantitative interpretation of anatomical and biomechanical aspects of different soft tissue structures. This approach should allow the use of these data in other application fields, such as musculoskeletal modelling, simulations for radiation therapy, and databases for use in minimally invasive, navigated and robotic surgery.

  2. Cost Optimization Model for Business Applications in Virtualized Grid Environments

    NASA Astrophysics Data System (ADS)

    Strebel, Jörg

    The advent of Grid computing gives enterprises an ever-increasing choice of computing options, yet research has so far hardly addressed the problem of mixing the different computing options in a cost-minimal fashion. The following paper presents a comprehensive cost model and a mixed-integer optimization model that can be used to minimize the IT expenditures of an enterprise and help in deciding when to outsource certain business software applications. A sample scenario is analyzed, and promising cost savings are demonstrated. Possible applications of the model to future research questions are outlined.
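    The outsourcing decision that the paper formalises as a mixed integer program can be illustrated with a toy brute-force version: for each application choose in-house or Grid/outsourced provisioning to minimise total cost subject to in-house capacity. The application names, costs, and capacity below are invented:

```python
from itertools import product

# (in-house cost, outsourced cost, in-house servers needed) -- all invented
apps = {
    "crm":     (120, 100, 2),
    "erp":     (300, 350, 5),
    "mail":    (40, 30, 1),
    "reports": (80, 60, 2),
}
CAPACITY = 6  # in-house servers available

def best_plan():
    """Enumerate all in-house/outsource assignments; return (cost, plan)."""
    best = None
    names = list(apps)
    for choice in product([0, 1], repeat=len(names)):  # 1 = outsource
        used = sum(apps[n][2] for n, c in zip(names, choice) if c == 0)
        if used > CAPACITY:
            continue  # infeasible: exceeds in-house capacity
        cost = sum(apps[n][1] if c else apps[n][0]
                   for n, c in zip(names, choice))
        if best is None or cost < best[0]:
            best = (cost, dict(zip(names, choice)))
    return best

print(best_plan())
```

    A real instance of the paper's model would be handed to a MILP solver rather than enumerated, but the decision variables and capacity constraint have the same shape.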

  3. Biotechnology on the Battlefield: An Application of Agent-based Modelling for Emerging Technology Assessment

    DTIC Science & Technology

    2015-03-01

    UNCLASSIFIED Biotechnology on the Battlefield: An Application of Agent-based Modelling for Emerging Technology Assessment...wounds might be treatable using advanced biotechnologies to control haemorrhaging and reduce blood-loss until medical evacuation can be completed. This...APPROVED FOR PUBLIC RELEASE

  4. Analysis, modeling, and simulation (AMS) testbed development and evaluation to support dynamic mobility applications (DMA) and active transportation and demand management (ATDM) programs — evaluation summary for DMA program.

    DOT National Transportation Integrated Search

    2017-07-04

    The primary objective of this project is to develop multiple simulation testbeds/transportation models to evaluate the impacts of Dynamic Mobility Application (DMA) connected vehicle applications and Active Transportation and Demand management (ATDM)...

  5. Using a Hydrological Model to Determine Environmentally Safer Windows for Herbicide Application

    Treesearch

    J.L. Michael; M.C. Smith; W.G. Knisel; D.G. Neary; W.P. Fowler; D.J. Turton

    1996-01-01

    A modification of the GLEAMS model was used to determine application windows which would optimise efficacy and environmental safety for herbicide application to a forest site. Herbicide/soil partition coefficients were determined using soil samples collected from the study site for two herbicides (imazapyr, Koc=46, triclopyr ester, K

  6. Application of the Social Marketing Model to Unemployment Counseling: A Theoretical Perspective

    ERIC Educational Resources Information Center

    Englert, Paul; Sommerville, Susannah; Guenole, Nigel

    2009-01-01

    A. R. Andreasen's (1995) social marketing model (SMM) is applied to structure feedback counseling for individuals who are unemployed. The authors discuss techniques used in commercial marketing and how they are equally applicable to solving societal problems; SMM and its application to social interventions; and structured feedback that moves a…

  7. Analysis, modeling, and simulation (AMS) testbed development and evaluation to support dynamic mobility applications (DMA) and active transportation and demand management (ATDM) programs — evaluation summary for ATDM program.

    DOT National Transportation Integrated Search

    2017-07-04

    The primary objective of this project is to develop multiple simulation testbeds/transportation models to evaluate the impacts of Dynamic Mobility Application (DMA) connected vehicle applications and Active Transportation and Demand Management (ATDM...

  8. Sharing Research Models: Using Software Engineering Practices for Facilitation

    PubMed Central

    Bryant, Stephanie P.; Solano, Eric; Cantor, Susanna; Cooley, Philip C.; Wagener, Diane K.

    2011-01-01

    Increasingly, researchers are turning to computational models to understand the interplay of important variables on systems’ behaviors. Although researchers may develop models that meet the needs of their investigation, application limitations—such as nonintuitive user interface features and data input specifications—may limit the sharing of these tools with other research groups. By removing these barriers, other research groups that perform related work can leverage these work products to expedite their own investigations. The use of software engineering practices can enable managed application production and shared research artifacts among multiple research groups by promoting consistent models, reducing redundant effort, encouraging rigorous peer review, and facilitating research collaborations that are supported by a common toolset. This report discusses three established software engineering practices: the iterative software development process, object-oriented methodology, and the Unified Modeling Language—and the applicability of these practices to computational model development. Our efforts to modify the MIDAS TranStat application to make it more user-friendly are presented as an example of how computational models that are based on research and developed using software engineering practices can benefit a broader audience of researchers. PMID:21687780

  9. Introduction: Special issue on advances in topobathymetric mapping, models, and applications

    USGS Publications Warehouse

    Gesch, Dean B.; Brock, John C.; Parrish, Christopher E.; Rogers, Jeffrey N.; Wright, C. Wayne

    2016-01-01

    Detailed knowledge of near-shore topography and bathymetry is required for many geospatial data applications in the coastal environment. New data sources and processing methods are facilitating development of seamless, regional-scale topobathymetric digital elevation models. These elevation models integrate disparate multi-sensor, multi-temporal topographic and bathymetric datasets to provide a coherent base layer for coastal science applications such as wetlands mapping and monitoring, sea-level rise assessment, benthic habitat mapping, erosion monitoring, and storm impact assessment. The focus of this special issue is on recent advances in the source data, data processing and integration methods, and applications of topobathymetric datasets.

  10. Remote sensing sensors and applications in environmental resources mapping and modeling

    USGS Publications Warehouse

    Melesse, Assefa M.; Weng, Qihao; Thenkabail, Prasad S.; Senay, Gabriel B.

    2007-01-01

    The history of remote sensing and the development of different sensors for environmental and natural resources mapping and data acquisition are reviewed and reported. Application examples in urban studies, hydrological modeling such as land-cover and floodplain mapping, fractional vegetation cover and impervious surface area mapping, surface energy flux, and micro-topography correlation studies are discussed. The review also discusses the use of remotely sensed rainfall and potential evapotranspiration for estimating a crop water requirement satisfaction index, which provides early warning information for growers. The review is not an exhaustive survey of remote sensing techniques but rather a summary of some important applications in environmental studies and modeling.

  11. Evaluation Theory, Models, and Applications

    ERIC Educational Resources Information Center

    Stufflebeam, Daniel L.; Shinkfield, Anthony J.

    2007-01-01

    "Evaluation Theory, Models, and Applications" is designed for evaluators and students who need to develop a commanding knowledge of the evaluation field: its history, theory and standards, models and approaches, procedures, and inclusion of personnel as well as program evaluation. This important book shows how to choose from a growing…

  12. “AQMEII Phase 2: Overview and WRF/CMAQ Application over North America”.

    EPA Science Inventory

    This presentation provides an overview of the second phase of the Air Quality Model Evaluation International Initiative (AQMEII). Activities in this phase are focused on the application and evaluation of coupled meteorology-chemistry models to assess how well these models can simu...

  13. 78 FR 27198 - Applications for New Awards; National Institute on Disability and Rehabilitation Research...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-09

    ... Rehabilitation Research--Traumatic Brain Injury Model Systems Centers Collaborative Research Project AGENCY... Brain Injury Model Systems Centers Collaborative Research Projects; Notice inviting applications for new... competition. Priority 1, the DRRP Priority for the Traumatic Brain Injury Model Systems Centers Collaborative...

  14. Approaches for Increasing Acceptance of Physiologically Based Pharmacokinetic Models in Public Health Risk Assessment

    EPA Science Inventory

    Physiologically based pharmacokinetic (PBPK) models have great potential for application in regulatory and non-regulatory public health risk assessment. The development and application of PBPK models in chemical toxicology has grown steadily since their emergence in the 1980s. Ho...

  15. A Practical Model for Forecasting New Freshman Enrollment during the Application Period.

    ERIC Educational Resources Information Center

    Paulsen, Michael B.

    1989-01-01

    A simple and effective model for forecasting freshman enrollment during the application period is presented step by step. The model requires minimal and readily available information, uses a simple linear regression analysis on a personal computer, and provides updated monthly forecasts. (MSE)
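    A minimal sketch of the forecasting idea, assuming (as the abstract suggests) a simple linear regression of final enrollment on applications received to date; all numbers below are invented for illustration:

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit; returns (slope, intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Applications received by March 1 vs. final fall enrollment, past years
# (hypothetical data).
apps_by_march = [1200, 1350, 1100, 1420, 1280]
final_enrolled = [480, 530, 450, 560, 505]

slope, intercept = fit_line(apps_by_march, final_enrolled)
forecast = slope * 1300 + intercept  # apply to this year's March count
print(round(forecast))
```

    Refitting each month with that month's cumulative application counts yields the updated monthly forecasts the abstract mentions.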

  16. The Population Consequences of Disturbance Model Application to North Atlantic Right Whales (Eubalaena glacialis)

    DTIC Science & Technology

    2014-09-30

    from individuals to the population by way of changes in either behavior or physiology, and the revised approach is called PCOD (Population...include modeling fecundity, and exploring the feasibility of incorporating acoustic disturbance and prey variability into the PCOD model...the applicability of the model to assessing the effects of acoustics on the population. We have refined and applied the PCOD model developed for

  17. Long-term Simulation of Photo-oxidants and Particulate Matter Over Europe With The Eurad Modeling System

    NASA Astrophysics Data System (ADS)

    Memmesheimer, M.; Friese, E.; Jakobs, H. J.; Feldmann, H.; Ebel, A.; Kerschgens, M. J.

    During recent years, interest in long-term applications of air pollution modeling systems (AQMS) has strongly increased. Most of these models were developed during the last decade for application to photo-oxidant episodes. In this contribution, a long-term application of the EURAD modeling system to the year 1997 is presented. Atmospheric particles are included using the Modal Aerosol Dynamics Model for Europe (MADE). Meteorological fields are simulated by the mesoscale meteorological model MM5, and gas-phase chemistry is treated with the RACM mechanism. The nesting option is used to zoom in on areas of specific interest. Horizontal grid sizes are 125 km for the regional scale and 5 km for the local scale, covering the area of North Rhine-Westphalia (NRW). The results have been compared with observations from the air quality network of the environmental agency of NRW for the year 1997 and evaluated using the data quality objectives of EU directive 99/30. Further improvement for the application of regional-scale air quality models is needed with respect to emission databases, coupling to global models to improve boundary values, interaction between aerosols and clouds, and multiphase modeling.

  18. Models and applications for space weather forecasting and analysis at the Community Coordinated Modeling Center.

    NASA Astrophysics Data System (ADS)

    Kuznetsova, Maria

    The Community Coordinated Modeling Center (CCMC, http://ccmc.gsfc.nasa.gov) was established at the dawn of the new millennium as a long-term, flexible solution to the problem of transitioning progress in space environment modeling to operational space weather forecasting. CCMC hosts an expanding collection of state-of-the-art space weather models developed by the international space science community. Over the years CCMC has acquired unique experience in preparing complex models and model chains for operational environments and in developing and maintaining custom displays and powerful web-based systems and tools ready to be used by researchers, space weather service providers, and decision makers. In support of the space weather needs of NASA users, CCMC is developing highly tailored applications and services that target specific orbits or locations in space, and is partnering with NASA mission specialists on linking CCMC space environment modeling with impacts on biological and technological systems in space. Confidence assessment of model predictions is an essential element of space environment modeling. CCMC facilitates interaction between model owners and users in defining physical parameters and metrics formats relevant to specific applications, and leads community efforts to quantify models' ability to simulate and predict space environment events. Interactive online model validation systems developed at CCMC make validation a seamless part of the model development cycle. The talk will showcase innovative solutions for space weather research, validation, anomaly analysis, and forecasting, and review ongoing community-wide model validation initiatives enabled by CCMC applications.

  19. Fault recovery in the reliable multicast protocol

    NASA Technical Reports Server (NTRS)

    Callahan, John R.; Montgomery, Todd L.; Whetten, Brian

    1995-01-01

    The Reliable Multicast Protocol (RMP) provides a unique, group-based model for distributed programs that need to handle reconfiguration events at the application layer. This model, called membership views, provides an abstraction in which events such as site failures, network partitions, and normal join-leave events are viewed as group reformations. RMP provides access to this model through an application programming interface (API) that notifies an application when a group is reformed as the result of some event. RMP provides applications with reliable delivery of messages over an underlying IP Multicast (12, 5) medium to other group members in a distributed environment, even in the case of reformations. A distributed application can use various Quality of Service (QoS) levels provided by RMP to tolerate group reformations. This paper explores the implementation details of the mechanisms in RMP that provide distributed applications with membership view information and fault recovery capabilities.

  20. [Research Progress and Prospect of Applications of Finite Element Method in Lumbar Spine Biomechanics].

    PubMed

    Zhang, Zhenjun; Li, Yang; Liao, Zhenhua; Liu, Weiqiang

    2016-12-01

    Based on the application of finite element analysis in spine biomechanics, the research progress of the finite element method applied to lumbar spine mechanics is reviewed and its prospects are forecast. The related works, including lumbar ontology modeling, clinical application research, and occupational injury and protection, are summarized. The main research areas of the finite element method are as follows: new accurate modeling processes, optimized simulation methods, diversified clinical effect evaluation, and the clinical application of artificial lumbar discs. According to recent research progress, the application prospects of the finite element method, such as automation and individuation of the modeling process, evaluation and analysis of new operation methods, and simulation of mechanical damage and dynamic response, are discussed. The purpose of this paper is to provide theoretical reference and practical guidance for clinical lumbar problems by reviewing the application of the finite element method in the field of lumbar spine biomechanics.

  1. Towards a standard design model for quad-rotors: A review of current models, their accuracy and a novel simplified model

    NASA Astrophysics Data System (ADS)

    Amezquita-Brooks, Luis; Liceaga-Castro, Eduardo; Gonzalez-Sanchez, Mario; Garcia-Salazar, Octavio; Martinez-Vazquez, Daniel

    2017-11-01

    Applications based on quad-rotor vehicles (QRV) are becoming increasingly widespread. Many of these applications require accurate mathematical representations for control design, simulation and estimation. However, there is no consensus on a standardized model for these purposes. In this article a review of the most common elements included in QRV models reported in the literature is presented. This survey shows that some elements are recurrent for typical non-aerobatic QRV applications; in particular, for control design and high-performance simulation. By synthesising the common features of the reviewed models, a standard generic model (SGM) is proposed. The SGM is cast as a typical state-space model without memory-less transformations, a structure which is useful for simulation and controller design. The survey also shows that many QRV applications use simplified representations, which may be considered simplifications of the SGM here proposed. In order to assess the effectiveness of the simplified models, a comprehensive comparison based on digital simulations is presented. With this comparison, it is possible to determine the accuracy of each model under particular operating ranges. Such information is useful for the selection of a model according to a particular application. In addition to the models found in the literature, in this article a novel simplified model is derived. The main characteristics of this model are that its inner dynamics are linear, it has low complexity and it has a high level of accuracy in all the studied operating ranges, a characteristic found only in more complex representations. To complement the article, the main elements of the SGM are evaluated with the aid of experimental data, and the computational complexity of all surveyed models is briefly analysed. Finally, the article presents a discussion on how the structural characteristics of the models are useful to suggest particular QRV control structures.
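    As a generic illustration of the state-space form the abstract refers to (not the paper's SGM), a single quad-rotor channel can be written as x' = Ax + Bu and stepped with forward Euler. The double-integrator altitude channel used here is an assumption chosen for simplicity:

```python
def simulate(x, u, dt, steps):
    """Forward-Euler integration of x' = Ax + Bu for a 2-state channel."""
    A = [[0.0, 1.0],
         [0.0, 0.0]]  # double integrator: z' = v, v' = u
    B = [0.0, 1.0]
    for _ in range(steps):
        dx = [sum(a * s for a, s in zip(row, x)) + b * u
              for row, b in zip(A, B)]
        x = [s + dt * d for s, d in zip(x, dx)]
    return x

# 1 s of constant net thrust acceleration u = 1: analytically v = 1.0 and
# z = 0.5 (forward Euler with dt = 0.01 gives z = 0.495).
z, v = simulate([0.0, 0.0], u=1.0, dt=0.01, steps=100)
print(z, v)
```

    A full SGM would stack attitude, position and rotor states into one such model; the value of the state-space structure is exactly that simulation and controller design reuse the same A and B matrices.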

  2. Refinements in a viscoplastic model

    NASA Technical Reports Server (NTRS)

    Freed, A. D.; Walker, K. P.

    1989-01-01

    A thermodynamically admissible theory of viscoplasticity with two internal variables (a back stress and a drag strength) is presented. Six material functions characterize a specific viscoplastic model. In the pursuit of a compromise between accuracy and simplicity, a model is developed that is a hybrid of two existing viscoplastic models. A limited number of applications of the model to Al, Cu, and Ni are presented. A novel implicit integration method, used to obtain solutions with this model, is also discussed.

  3. Hirabayashi, Satoshi; Kroll, Charles N.; Nowak, David J. 2011. Component-based development and sensitivity analyses of an air pollutant dry deposition model. Environmental Modelling & Software. 26(6): 804-816.

    Treesearch

    Satoshi Hirabayashi; Chuck Kroll; David Nowak

    2011-01-01

    The Urban Forest Effects-Deposition model (UFORE-D) was developed with a component-based modeling approach. Functions of the model were separated into components that are responsible for user interface, data input/output, and core model functions. Taking advantage of the component-based approach, three UFORE-D applications were developed: a base application to estimate...

  4. Nomogram to predict successful placement in surgical subspecialty fellowships using applicant characteristics.

    PubMed

    Muffly, Tyler M; Barber, Matthew D; Karafa, Matthew T; Kattan, Michael W; Shniter, Abigail; Jelovsek, J Eric

    2012-01-01

    The purpose of the study was to develop a model that predicts an individual applicant's probability of successful placement into a surgical subspecialty fellowship program. Candidates who applied to surgical fellowships during a 3-year period were identified in a set of databases that included their electronic application materials. Of the 1281 applicants available for analysis, 951 (74%) successfully placed into a colon and rectal surgery, thoracic surgery, vascular surgery, or pediatric surgery fellowship. The optimal final prediction model, based on logistic regression, included 14 variables. This model, with a c-statistic of 0.74, provided a useful estimate of the probability of placement for an individual candidate. Of the factors available at the time of fellowship application, 14 were used to accurately predict the proportion of applicants who would successfully gain a fellowship position.
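    How a logistic-regression-based nomogram converts applicant characteristics into a placement probability can be sketched as follows. The coefficients, features, and intercept here are entirely hypothetical; the actual model used 14 variables and is not reproduced in the abstract:

```python
import math

def placement_probability(features, coefs, intercept):
    """Logistic model: probability = sigmoid(intercept + coefs . features)."""
    z = intercept + sum(c * f for c, f in zip(coefs, features))
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical features: [scaled exam score, publication count, away rotation]
p = placement_probability([1.2, 3, 1], [0.8, 0.15, 0.5], intercept=-1.0)
print(round(p, 3))
```

    A printed nomogram performs the same computation graphically: each variable contributes points (its coefficient times its value), and the total point score maps to a probability through the sigmoid.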

  5. Managing Algorithmic Skeleton Nesting Requirements in Realistic Image Processing Applications: The Case of the SKiPPER-II Parallel Programming Environment's Operating Model

    NASA Astrophysics Data System (ADS)

    Coudarcher, Rémi; Duculty, Florent; Serot, Jocelyn; Jurie, Frédéric; Derutin, Jean-Pierre; Dhome, Michel

    2005-12-01

    SKiPPER is a SKeleton-based Parallel Programming EnviRonment under development since 1996 at the LASMEA Laboratory, Blaise Pascal University, France. The main goal of the project was to demonstrate the applicability of skeleton-based parallel programming techniques to the fast prototyping of reactive vision applications. This paper deals with the special features embedded in the latest version of the project: algorithmic skeleton nesting capabilities and a fully dynamic operating model. Through the case study of a complete and realistic image processing application, for which we have pointed out the requirement for skeleton nesting, we present the operating model of this feature. The work described here is one of the few reported experiments showing the application of skeleton-nesting facilities to the parallelisation of a realistic application, especially in the area of image processing. The image processing application we have chosen is an appearance-based 3D face-tracking algorithm.

  6. Persistence of initial conditions in continental scale air quality simulations

    NASA Astrophysics Data System (ADS)

    Hogrefe, Christian; Roselle, Shawn J.; Bash, Jesse O.

    2017-07-01

    This study investigates the effect of initial conditions (IC) for pollutant concentrations in the atmosphere and soil on simulated air quality for two continental-scale Community Multiscale Air Quality (CMAQ) model applications. One of these applications was performed for springtime and the second for summertime. Results show that a spin-up period of ten days commonly used in regional-scale applications may not be sufficient to reduce the effects of initial conditions to less than 1% of seasonally-averaged surface ozone concentrations everywhere while 20 days were found to be sufficient for the entire domain for the spring case and almost the entire domain for the summer case. For the summer case, differences were found to persist longer aloft due to circulation of air masses and even a spin-up period of 30 days was not sufficient to reduce the effects of ICs to less than 1% of seasonally-averaged layer 34 ozone concentrations over the southwestern portion of the modeling domain. Analysis of the effect of soil initial conditions for the CMAQ bidirectional NH3 exchange model shows that during springtime they can have an important effect on simulated inorganic aerosols concentrations for time periods of one month or longer. The effects are less pronounced during other seasons. The results, while specific to the modeling domain and time periods simulated here, suggest that modeling protocols need to be scrutinized for a given application and that it cannot be assumed that commonly-used spin-up periods are necessarily sufficient to reduce the effects of initial conditions on model results to an acceptable level. What constitutes an acceptable level of difference cannot be generalized and will depend on the particular application, time period and species of interest. 
Moreover, as the application of air quality models is being expanded to cover larger geographical domains and as these models are increasingly being coupled with other modeling systems to better represent air-surface-water exchanges, the effects of model initialization in such applications need to be studied in future work.
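    The 1% spin-up criterion described above can be expressed as a simple diagnostic. The sketch below (a hypothetical illustration, not the authors' analysis code) finds the first day after which two runs differing only in initial conditions agree to within 1% everywhere in the domain:

```python
import numpy as np

def spin_up_days(run_a, run_b, threshold=0.01):
    """First day index after which the relative difference between two
    runs (differing only in initial conditions) stays below `threshold`
    everywhere in the domain. Arrays have shape (days, ny, nx)."""
    ref = 0.5 * (np.abs(run_a) + np.abs(run_b)) + 1e-12  # avoid divide-by-zero
    rel = np.abs(run_a - run_b) / ref
    daily_max = rel.reshape(rel.shape[0], -1).max(axis=1)
    below = daily_max < threshold
    for d in range(len(below)):
        if below[d:].all():
            return d
    return None  # never converged within the simulated period

# Two synthetic runs whose IC-induced difference decays exponentially
days, ny, nx = 40, 4, 5
base = np.full((days, ny, nx), 50.0)
pert = base + 5.0 * np.exp(-np.arange(days) / 5.0)[:, None, None]
d_spin = spin_up_days(base, pert)
print(d_spin)
```

    The same routine applied to identical runs returns day zero, which is a useful sanity check on the metric.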

  7. Targeting of sebaceous glands to treat acne by micro-insulated needles with radio frequency in a rabbit ear model.

    PubMed

    Kwon, Tae-Rin; Choi, Eun Ja; Oh, Chang Taek; Bak, Dong-Ho; Im, Song-I; Ko, Eun Jung; Hong, Hyuck Ki; Choi, Yeon Shik; Seok, Joon; Choi, Sun Young; Ahn, Gun Young; Kim, Beom Joon

    2017-04-01

    Many studies have investigated the application of micro-insulated needles with radio frequency (RF) to treat acne in humans; however, the use of a micro-insulated needle RF applicator has not yet been studied in an animal model. The purpose of this study was to evaluate the effectiveness of a micro-insulated needle RF applicator in a rabbit ear acne (REA) model. In this study, we investigated the effect of selectively destroying the sebaceous glands using a micro-insulated needle RF applicator on the formation of comedones induced by application of 50% oleic acid and intradermal injection of P. acnes in the orifices of the external auditory canals of rabbits. The effects of the micro-insulated needle RF applicator treatment were evaluated using regular digital photography in addition to 3D Primos imaging evaluation, Skin Visio Meter microscopic photography, and histologic analyses. Use of the micro-insulated needle RF applicator resulted in successful selective destruction of the sebaceous glands and attenuated TNF-alpha release in an REA model. The mechanisms by which micro-insulated needles with RF at 1 MHz exert their effects may involve inhibition of comedone formation, triggering of the wound healing process, and destruction of the sebaceous glands and papules. The use of micro-insulated needle RF applicators provides a safe and effective method for improving the appearance of symptoms in an REA model. The current in vivo study confirms that the micro-insulated needle RF applicator selectively destroys the sebaceous glands. Lasers Surg. Med. 49:395-401, 2017. © 2016 Wiley Periodicals, Inc.

  8. A Tutorial on RxODE: Simulating Differential Equation Pharmacometric Models in R.

    PubMed

    Wang, W; Hallow, K M; James, D A

    2016-01-01

    This tutorial presents the application of an R package, RxODE, that facilitates quick, efficient simulations of ordinary differential equation models completely within R. Its application is illustrated through simulation of design decision effects on an adaptive dosing regimen. The package provides an efficient, versatile way to specify dosing scenarios and to perform simulation with variability with minimal custom coding. Models can be directly translated to Rshiny applications to facilitate interactive, real-time evaluation/iteration on simulation scenarios.
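    RxODE itself is an R package, but the kind of differential-equation simulation it performs can be sketched in Python with SciPy. The one-compartment pharmacokinetic model below (hypothetical parameters, not taken from the tutorial) illustrates the general workflow of specifying and integrating an ODE dosing model:

```python
import numpy as np
from scipy.integrate import solve_ivp

# One-compartment model with first-order absorption (hypothetical
# parameters): dA_gut/dt = -ka*A_gut, dA_c/dt = ka*A_gut - ke*A_c
ka, ke, V = 1.0, 0.1, 20.0        # absorption 1/h, elimination 1/h, volume L

def pk(t, y):
    gut, central = y
    return [-ka * gut, ka * gut - ke * central]

sol = solve_ivp(pk, (0.0, 48.0), [100.0, 0.0],   # 100 mg oral dose at t=0
                t_eval=np.linspace(0.0, 48.0, 97), rtol=1e-8, atol=1e-10)
conc = sol.y[1] / V               # plasma concentration, mg/L
t_max = sol.t[conc.argmax()]
print(round(conc.max(), 2), t_max)
```

    Variability and dosing-scenario loops, which RxODE handles with dedicated syntax, would wrap this integration in an outer simulation loop over sampled parameter sets.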

  9. Applicator modeling for electromagnetic thermotherapy of cervix cancer.

    PubMed

    Rezaeealam, Behrooz

    2015-03-01

    This report proposes an induction heating coil design that can be used for producing strong magnetic fields around ferromagnetic implants located in the cervix of the uterus. The effect of coil design on the uniformity and extent of heat generation is examined. Also, a numerical model of the applicator is developed that includes the ferromagnetic implants and is coupled to a bioheat transfer model of the body tissue. The ability of the proposed applicator for electromagnetic thermotherapy is then investigated.

  10. Surface-Potential-Based Metal-Oxide-Silicon-Varactor Model for RF Applications

    NASA Astrophysics Data System (ADS)

    Miyake, Masataka; Sadachika, Norio; Navarro, Dondee; Mizukane, Yoshio; Matsumoto, Kenji; Ezaki, Tatsuya; Miura-Mattausch, Mitiko; Mattausch, Hans Juergen; Ohguro, Tatsuya; Iizuka, Takahiro; Taguchi, Masahiko; Kumashiro, Shigetaka; Miyamoto, Shunsuke

    2007-04-01

    We have developed a surface-potential-based metal-oxide-silicon (MOS)-varactor model valid for RF applications up to 200 GHz. The model enables the calculation of the MOS-varactor capacitance seamlessly from the depletion region to the accumulation region and explicitly considers the carrier-response delay causing a non-quasi-static (NQS) effect. It has been observed that capacitance reduction due to this non-quasi-static effect limits the MOS-varactor application to an RF regime.

  11. TRIGRS Application for landslide susceptibility mapping

    NASA Astrophysics Data System (ADS)

    Sugiarti, K.; Sukristiyanti, S.

    2018-02-01

    Research on landslide susceptibility has been carried out using several different methods. TRIGRS is a modeling program for landslide susceptibility that considers pore water pressure changes due to infiltration of rainfall. This paper aims to present the current state of the art in the development and application of TRIGRS. Limitations of TRIGRS, developments that improve its modeling capability, and example applications of its various versions to model the effect of rainfall variation on landslide susceptibility are reviewed and discussed.

  12. Nonlinear modeling of chaotic time series: Theory and applications

    NASA Astrophysics Data System (ADS)

    Casdagli, M.; Eubank, S.; Farmer, J. D.; Gibson, J.; Desjardins, D.; Hunter, N.; Theiler, J.

    We review recent developments in the modeling and prediction of nonlinear time series. In some cases, apparent randomness in time series may be due to chaotic behavior of a nonlinear but deterministic system. In such cases, it is possible to exploit the determinism to make short term forecasts that are much more accurate than one could make from a linear stochastic model. This is done by first reconstructing a state space, and then using nonlinear function approximation methods to create a dynamical model. Nonlinear models are valuable not only as short term forecasters, but also as diagnostic tools for identifying and quantifying low-dimensional chaotic behavior. During the past few years, methods for nonlinear modeling have developed rapidly, and have already led to several applications where nonlinear models motivated by chaotic dynamics provide superior predictions to linear models. These applications include prediction of fluid flows, sunspots, mechanical vibrations, ice ages, measles epidemics, and human speech.
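    The state-space reconstruction step mentioned above is typically done with delay coordinates. A minimal sketch (illustrative, not the authors' code) embeds a scalar series from a logistic map and makes a crude nearest-neighbor forecast:

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Reconstruct a state space from a scalar series via delay
    coordinates: row t is [x_t, x_{t+tau}, ..., x_{t+(dim-1)*tau}]."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

# Scalar observable of a simple nonlinear (logistic) map
x = np.empty(500)
x[0] = 0.4
for t in range(499):
    x[t + 1] = 3.9 * x[t] * (1.0 - x[t])

states = delay_embed(x, dim=3, tau=1)

# Crude analog forecast: the value that followed the nearest past neighbor
query = states[-1]
nn = np.linalg.norm(states[:-1] - query, axis=1).argmin()
prediction = x[nn + 3]
print(states.shape, round(prediction, 3))
```

    Practical nonlinear forecasters replace the single-neighbor lookup with local function approximation over several neighbors, but the embedding step is the same.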

  13. Skill Assessment for Coupled Biological/Physical Models of Marine Systems.

    PubMed

    Stow, Craig A; Jolliff, Jason; McGillicuddy, Dennis J; Doney, Scott C; Allen, J Icarus; Friedrichs, Marjorie A M; Rose, Kenneth A; Wallhead, Philip

    2009-02-20

    Coupled biological/physical models of marine systems serve many purposes including the synthesis of information, hypothesis generation, and as a tool for numerical experimentation. However, marine system models are increasingly used for prediction to support high-stakes decision-making. In such applications it is imperative that a rigorous model skill assessment is conducted so that the model's capabilities are tested and understood. Herein, we review several metrics and approaches useful to evaluate model skill. The definition of skill and the determination of the skill level necessary for a given application is context specific and no single metric is likely to reveal all aspects of model skill. Thus, we recommend the use of several metrics, in concert, to provide a more thorough appraisal. The routine application and presentation of rigorous skill assessment metrics will also serve the broader interests of the modeling community, ultimately resulting in improved forecasting abilities as well as helping us recognize our limitations.
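    A few of the commonly used skill metrics can be sketched as follows; this is an illustrative subset (RMSE, bias, correlation, and a skill score relative to climatology), not the full set reviewed in the paper:

```python
import numpy as np

def skill_metrics(obs, mod):
    """A small subset of model skill metrics: RMSE, bias, correlation,
    and a Murphy-style skill score relative to the observed mean."""
    obs, mod = np.asarray(obs, float), np.asarray(mod, float)
    err = mod - obs
    rmse = np.sqrt(np.mean(err ** 2))
    bias = err.mean()
    r = np.corrcoef(obs, mod)[0, 1]
    # skill = 1 means perfect; 0 means no better than climatology
    ss = 1.0 - np.mean(err ** 2) / np.mean((obs - obs.mean()) ** 2)
    return {"rmse": rmse, "bias": bias, "r": r, "skill": ss}

obs = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
mod = np.array([1.1, 1.9, 3.2, 3.8, 5.3])
m = skill_metrics(obs, mod)
print({k: round(v, 3) for k, v in m.items()})
```

    As the review argues, no single number here tells the whole story: a model can score well on correlation while carrying a large bias, which is why several metrics are reported in concert.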

  14. In vitro skin models and tissue engineering protocols for skin graft applications.

    PubMed

    Naves, Lucas B; Dhand, Chetna; Almeida, Luis; Rajamani, Lakshminarayanan; Ramakrishna, Seeram

    2016-11-30

    In this review, we present a brief introduction of the skin structure, a concise compilation of skin-related disorders, and a thorough discussion of different in vitro skin models, artificial skin substitutes, skin grafts, and dermal tissue engineering protocols. The advantages of the development of in vitro skin disorder models, such as UV radiation and the prototype model, melanoma model, wound healing model, psoriasis model, and full-thickness model are also discussed. Different types of skin grafts including allografts, autografts, allogeneic, and xenogeneic are described in detail with their associated applications. We also discuss different tissue engineering protocols for the design of various types of skin substitutes and their commercial outcomes. Brief highlights are given of the new generation three-dimensional printed scaffolds for tissue regeneration applications. © 2016 The Author(s). published by Portland Press Limited on behalf of the Biochemical Society.

  15. Open Source Molecular Modeling

    PubMed Central

    Pirhadi, Somayeh; Sunseri, Jocelyn; Koes, David Ryan

    2016-01-01

    The success of molecular modeling and computational chemistry efforts is, by definition, dependent on quality software applications. Open source software development provides many advantages to users of modeling applications, not the least of which is that the software is free and completely extendable. In this review we categorize, enumerate, and describe available open source software packages for molecular modeling and computational chemistry. PMID:27631126

  16. An Application of a Multidimensional Extension of the Two-Parameter Logistic Latent Trait Model.

    ERIC Educational Resources Information Center

    McKinley, Robert L.; Reckase, Mark D.

    A latent trait model is described that is appropriate for use with tests that measure more than one dimension, and its application to both real and simulated test data is demonstrated. Procedures for estimating the parameters of the model are presented. The research objectives are to determine whether the two-parameter logistic model more…

  17. Beyond Model Answers: Learners' Perceptions of Self-Assessment Materials in E-Learning Applications

    ERIC Educational Resources Information Center

    Handley, Karen; Cox, Benita

    2007-01-01

    The importance of feedback as an aid to self-assessment is widely acknowledged. A common form of feedback that is used widely in e-learning is the use of model answers. However, model answers are deficient in many respects. In particular, the notion of a "model" answer implies the existence of a single correct answer applicable across multiple…

  18. The DO ART Model: An Ethical Decision-Making Model Applicable to Art Therapy

    ERIC Educational Resources Information Center

    Hauck, Jessica; Ling, Thomson

    2016-01-01

    Although art therapists have discussed the importance of taking a positive stance in terms of ethical decision making (Hinz, 2011), an ethical decision-making model applicable for the field of art therapy has yet to emerge. As the field of art therapy continues to grow, an accessible, theoretically grounded, and logical decision-making model is…

  19. Crop modeling applications in agricultural water management

    USGS Publications Warehouse

    Kisekka, Isaya; DeJonge, Kendall C.; Ma, Liwang; Paz, Joel; Douglas-Mankin, Kyle R.

    2017-01-01

    This article introduces the fourteen articles that comprise the “Crop Modeling and Decision Support for Optimizing Use of Limited Water” collection. This collection was developed from a special session on crop modeling applications in agricultural water management held at the 2016 ASABE Annual International Meeting (AIM) in Orlando, Florida. In addition, other authors who were not able to attend the 2016 ASABE AIM were also invited to submit papers. The articles summarized in this introductory article demonstrate a wide array of applications in which crop models can be used to optimize agricultural water management. The following section titles indicate the topics covered in this collection: (1) evapotranspiration modeling (one article), (2) model development and parameterization (two articles), (3) application of crop models for irrigation scheduling (five articles), (4) coordinated water and nutrient management (one article), (5) soil water management (two articles), (6) risk assessment of water-limited irrigation management (one article), and (7) regional assessments of climate impact (two articles). Changing weather and climate, increasing population, and groundwater depletion will continue to stimulate innovations in agricultural water management, and crop models will play an important role in helping to optimize water use in agriculture.

  20. A distributed computing model for telemetry data processing

    NASA Astrophysics Data System (ADS)

    Barry, Matthew R.; Scott, Kevin L.; Weismuller, Steven P.

    1994-05-01

    We present a new approach to distributing processed telemetry data among spacecraft flight controllers within the control centers at NASA's Johnson Space Center. This approach facilitates the development of application programs which integrate spacecraft-telemetered data and ground-based synthesized data, then distributes this information to flight controllers for analysis and decision-making. The new approach combines various distributed computing models into one hybrid distributed computing model. The model employs both client-server and peer-to-peer distributed computing models cooperating to provide users with information throughout a diverse operations environment. Specifically, it provides an attractive foundation upon which we are building critical real-time monitoring and control applications, while simultaneously lending itself to peripheral applications in playback operations, mission preparations, flight controller training, and program development and verification. We have realized the hybrid distributed computing model through an information sharing protocol. We shall describe the motivations that inspired us to create this protocol, along with a brief conceptual description of the distributed computing models it employs. We describe the protocol design in more detail, discussing many of the program design considerations and techniques we have adopted. Finally, we describe how this model is especially suitable for supporting the implementation of distributed expert system applications.
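    The server side of such a distribution scheme can be sketched as a simple fan-out hub; the class and parameter names below are illustrative only, not the actual JSC information sharing protocol:

```python
import queue
import threading

# Server side of a telemetry fan-out: one publisher pushes processed
# parameters to every subscriber (names are illustrative, not the
# actual JSC information sharing protocol).
class TelemetryHub:
    def __init__(self):
        self._subs = []
        self._lock = threading.Lock()

    def subscribe(self):
        q = queue.Queue()
        with self._lock:
            self._subs.append(q)
        return q

    def publish(self, parameter, value):
        with self._lock:
            for q in self._subs:
                q.put((parameter, value))

hub = TelemetryHub()
display = hub.subscribe()   # e.g., a real-time flight-controller display
recorder = hub.subscribe()  # e.g., a playback/recording client
hub.publish("cabin_pressure", 14.7)
print(display.get(), recorder.get())
```

    The hybrid aspect of the paper's model would appear where a subscriber also acts as a peer publisher of its own synthesized parameters back into the hub.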

  1. A distributed computing model for telemetry data processing

    NASA Technical Reports Server (NTRS)

    Barry, Matthew R.; Scott, Kevin L.; Weismuller, Steven P.

    1994-01-01

    We present a new approach to distributing processed telemetry data among spacecraft flight controllers within the control centers at NASA's Johnson Space Center. This approach facilitates the development of application programs which integrate spacecraft-telemetered data and ground-based synthesized data, then distributes this information to flight controllers for analysis and decision-making. The new approach combines various distributed computing models into one hybrid distributed computing model. The model employs both client-server and peer-to-peer distributed computing models cooperating to provide users with information throughout a diverse operations environment. Specifically, it provides an attractive foundation upon which we are building critical real-time monitoring and control applications, while simultaneously lending itself to peripheral applications in playback operations, mission preparations, flight controller training, and program development and verification. We have realized the hybrid distributed computing model through an information sharing protocol. We shall describe the motivations that inspired us to create this protocol, along with a brief conceptual description of the distributed computing models it employs. We describe the protocol design in more detail, discussing many of the program design considerations and techniques we have adopted. Finally, we describe how this model is especially suitable for supporting the implementation of distributed expert system applications.

  2. The sound of friction: Real-time models, playability and musical applications

    NASA Astrophysics Data System (ADS)

    Serafin, Stefania

    Friction, the tangential force between objects in contact, is in most engineering applications a source of noise and instabilities that needs to be removed. In musical applications, friction is a desirable component, being the sound production mechanism of different musical instruments such as bowed strings, musical saws, rubbed bowls and any other sonority produced by interactions between rubbed dry surfaces. The goal of the dissertation is to simulate different instruments whose main excitation mechanism is friction. An efficient yet accurate model of a bowed string instrument, which combines the latest results in violin acoustics with the efficient digital waveguide approach, is provided. In particular, the bowed string physical model proposed uses a thermodynamic friction model in which the finite width of the bow is taken into account; this solution is compared to the recently developed elasto-plastic friction models used in haptics and robotics. Different solutions are also proposed to model the body of the instrument. Other less common instruments driven by friction are also modeled, and the elasto-plastic model is used to provide audio-visual simulations of everyday friction sounds such as squeaking doors and rubbed wine glasses. Finally, playability evaluations and musical applications in which the models have been used are discussed.

  3. Atmospheric Models for Aeroentry and Aeroassist

    NASA Technical Reports Server (NTRS)

    Justus, C. G.; Duvall, Aleta; Keller, Vernon W.

    2005-01-01

    Eight destinations in the Solar System have sufficient atmosphere for aeroentry, aeroassist, or aerobraking/aerocapture: Venus, Earth, Mars, Jupiter, Saturn, Uranus, and Neptune, plus Saturn's moon Titan. Engineering-level atmospheric models for Earth, Mars, Titan, and Neptune have been developed for use in NASA's systems analysis studies of aerocapture applications. Development has begun on a similar atmospheric model for Venus. An important capability of these models is simulation of quasi-random perturbations for Monte Carlo analyses in developing guidance, navigation and control algorithms, and for thermal systems design. Characteristics of these atmospheric models are compared, and example applications for aerocapture are presented. Recent Titan atmospheric model updates are discussed, in anticipation of applications to trajectory and atmosphere reconstruction of the Huygens probe entry at Titan. Recent and planned updates to the Mars atmospheric model, in support of future Mars aerocapture systems analysis studies, are also presented.

  4. A linear goal programming model for human resource allocation in a health-care organization.

    PubMed

    Kwak, N K; Lee, C

    1997-06-01

    This paper presents the development of a goal programming (GP) model as an aid to strategic planning and allocation of limited human resources in a health-care organization. The purpose of this study is to assign personnel to the proper shift hours so that management meets the objective of minimizing total payroll costs while satisfying patient needs. A GP model is illustrated using data provided by a health-care organization in the Midwest. The goals are identified and prioritized. The model result is examined and a sensitivity analysis is performed to improve the model's applicability. The GP model application adds insight to the planning functions of resource allocation in health-care organizations. The proposed model is easily applicable to other human resource planning processes.
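    A goal program of this kind can be sketched with deviation variables that measure under- and over-achievement of each goal. The toy formulation below (hypothetical staffing and cost figures, not the paper's data) prioritizes meeting shift coverage over staying within a payroll budget:

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical data: three shifts with required staff, cost per
# staff-shift, and a payroll budget goal (not the paper's figures).
demand = np.array([10, 8, 6])
cost = np.array([160.0, 180.0, 200.0])
budget = 4000.0

# Variables: x[0:3] staff levels, x[3:6] coverage under-achievement,
# x[6:9] coverage over-achievement, x[9]/x[10] budget under/over-run.
n = 11
A_eq = np.zeros((4, n))
b_eq = np.zeros(4)
for i in range(3):                # x_i + under_i - over_i = demand_i
    A_eq[i, i], A_eq[i, 3 + i], A_eq[i, 6 + i] = 1, 1, -1
    b_eq[i] = demand[i]
A_eq[3, :3] = cost                # cost.x + under - over = budget
A_eq[3, 9], A_eq[3, 10] = 1, -1
b_eq[3] = budget

c = np.zeros(n)
c[3:6] = 1000.0                   # priority 1: do not understaff
c[10] = 1.0                       # priority 2: do not exceed budget
res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * n)
print(res.x[:3], res.x[10])       # staffing levels and budget overrun
```

    With these weights the solver staffs every shift to demand and reports the unavoidable budget overrun, which is exactly the trade-off information a prioritized goal program is meant to expose.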

  5. Thermoviscoplastic model with application to copper

    NASA Technical Reports Server (NTRS)

    Freed, Alan D.

    1988-01-01

    A viscoplastic model is developed which is applicable to anisothermal, cyclic, and multiaxial loading conditions. Three internal state variables are used in the model; one to account for kinematic effects, and the other two to account for isotropic effects. One of the isotropic variables is a measure of yield strength, while the other is a measure of limit strength. Each internal state variable evolves through a process of competition between strain hardening and recovery. There is no explicit coupling between dynamic and thermal recovery in any evolutionary equation, which is a useful simplification in the development of the model. The thermodynamic condition of intrinsic dissipation constrains the thermal recovery function of the model. Application of the model is made to copper, and cyclic experiments under isothermal, thermomechanical, and nonproportional loading conditions are considered. Correlations and predictions of the model are representative of observed material behavior.
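    The competition between strain hardening and recovery described above can be illustrated with a schematic one-variable evolution law (not Freed's actual copper model or parameters): the state variable grows with plastic strain and saturates where recovery balances hardening:

```python
import numpy as np

# Schematic one-variable hardening/recovery law (illustrative only):
# over accumulated plastic strain e,
#   dB/de = h - r * B
# so B saturates at h/r, where recovery balances hardening.
h, r = 2000.0, 50.0               # hardening and recovery coefficients
de = 1e-4
strain = np.arange(0.0, 0.2, de)
B = np.zeros_like(strain)
for k in range(len(strain) - 1):  # explicit Euler integration
    B[k + 1] = B[k] + (h - r * B[k]) * de
print(round(B[-1], 1))            # approaches the saturation value h/r = 40
```

    In the actual model each of the three internal state variables evolves through such a competition, with thermal recovery terms constrained by the intrinsic dissipation condition.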

  6. Capturing nonlocal interaction effects in the Hubbard model: Optimal mappings and limits of applicability

    NASA Astrophysics Data System (ADS)

    van Loon, E. G. C. P.; Schüler, M.; Katsnelson, M. I.; Wehling, T. O.

    2016-10-01

    We investigate the Peierls-Feynman-Bogoliubov variational principle to map Hubbard models with nonlocal interactions to effective models with only local interactions. We study the renormalization of the local interaction induced by nearest-neighbor interaction and assess the quality of the effective Hubbard models in reproducing observables of the corresponding extended Hubbard models. We compare the renormalization of the local interactions as obtained from numerically exact determinant quantum Monte Carlo to approximate but more generally applicable calculations using dual boson, dynamical mean field theory, and the random phase approximation. These more approximate approaches are crucial for any application with real materials in mind. Furthermore, we use the dual boson method to calculate observables of the extended Hubbard models directly and benchmark these against determinant quantum Monte Carlo simulations of the effective Hubbard model.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carothers, Christopher D.; Meredith, Jeremy S.; Blanco, Marc

    Performance modeling of extreme-scale applications on accurate representations of potential architectures is critical for designing next generation supercomputing systems because it is impractical to construct prototype systems at scale with new network hardware in order to explore designs and policies. However, these simulations often rely on static application traces that can be difficult to work with because of their size and lack of flexibility to extend or scale up without rerunning the original application. To address this problem, we have created a new technique for generating scalable, flexible workloads from real applications, and we have implemented a prototype, called Durango, that combines a proven analytical performance modeling language, Aspen, with the massively parallel HPC network modeling capabilities of the CODES framework. Our models are compact, parameterized and representative of real applications with computation events. They are not resource intensive to create and are portable across simulator environments. We demonstrate the utility of Durango by simulating the LULESH application in the CODES simulation environment on several topologies and show that Durango is practical to use for simulation without loss of fidelity, as quantified by simulation metrics. During our validation of Durango's generated communication model of LULESH, we found that the original LULESH miniapp code had a latent bug where the MPI_Waitall operation was used incorrectly. This finding underscores the potential need for a tool such as Durango, beyond its benefits for flexible workload generation and modeling. Additionally, we demonstrate the efficacy of Durango's direct integration approach, which links Aspen into CODES as part of the running network simulation model. Here, Aspen generates the application-level computation timing events, which in turn drive the start of a network communication phase. Results show that Durango's performance scales well when executing both torus and dragonfly network models on up to 4K Blue Gene/Q nodes using 32K MPI ranks. Durango also avoids the overheads and complexities associated with extreme-scale trace files.

  8. Application of a hybrid MPI/OpenMP approach for parallel groundwater model calibration using multi-core computers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tang, Guoping; D'Azevedo, Ed F; Zhang, Fan

    2010-01-01

    Calibration of groundwater models involves hundreds to thousands of forward solutions, each of which may solve many transient coupled nonlinear partial differential equations, resulting in a computationally intensive problem. We describe a hybrid MPI/OpenMP approach to exploit two levels of parallelism in software and hardware to reduce calibration time on multi-core computers. HydroGeoChem 5.0 (HGC5) is parallelized using OpenMP for direct solutions for a reactive transport model application and a field-scale coupled flow and transport model application. In the reactive transport model, a single parallelizable loop is identified to account for over 97% of the total computational time using GPROF. Addition of a few lines of OpenMP compiler directives to the loop yields a speedup of about 10 on a 16-core compute node. For the field-scale model, parallelizable loops in 14 of 174 HGC5 subroutines that require 99% of the execution time are identified. As these loops are parallelized incrementally, the scalability is found to be limited by a loop where Cray PAT detects over 90% cache miss rates. With this loop rewritten, a speedup similar to that of the first application is achieved. The OpenMP-parallelized code can be run efficiently on multiple workstations in a network or multiple compute nodes on a cluster as slaves using parallel PEST to speed up model calibration. To run calibration on clusters as a single task, the Levenberg-Marquardt algorithm is added to HGC5 with the Jacobian calculation and lambda search parallelized using MPI. With this hybrid approach, 100 to 200 compute cores are used to reduce the calibration time from weeks to a few hours for these two applications. This approach is applicable to most existing groundwater model codes for many applications.
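    The parallel-Jacobian idea is straightforward to sketch: each column of a finite-difference Jacobian requires one independent forward model run, so the runs can be distributed to workers (MPI ranks in the paper; threads and a toy model here, purely for illustration):

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

# Each column of a finite-difference Jacobian needs one extra forward
# run, and the runs are independent, so they can be farmed out to
# workers (MPI ranks in HGC5; threads and a toy model here).
def forward_model(params, t):
    a, k = params                 # toy stand-in for an HGC5 forward solve
    return a * np.exp(-k * t)

def parallel_jacobian(params, t, h=1e-6, workers=4):
    base = forward_model(params, t)

    def column(i):
        p = np.array(params, dtype=float)
        p[i] += h
        return (forward_model(p, t) - base) / h

    with ThreadPoolExecutor(max_workers=workers) as ex:
        cols = list(ex.map(column, range(len(params))))
    return np.column_stack(cols)

t = np.linspace(0.0, 5.0, 50)
J = parallel_jacobian([2.0, 0.7], t)
print(J.shape)
```

    In a Levenberg-Marquardt loop this Jacobian feeds the normal equations at each iteration, and the lambda search can be parallelized the same way since each trial lambda also needs only an independent forward run.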

  9. Applications of system dynamics modelling to support health policy.

    PubMed

    Atkinson, Jo-An M; Wells, Robert; Page, Andrew; Dominello, Amanda; Haines, Mary; Wilson, Andrew

    2015-07-09

    The value of systems science modelling methods in the health sector is increasingly being recognised. Of particular promise is the potential of these methods to improve operational aspects of healthcare capacity and delivery, analyse policy options for health system reform and guide investments to address complex public health problems. Because it lends itself to a participatory approach, system dynamics modelling has been a particularly appealing method that aims to align stakeholder understanding of the underlying causes of a problem and achieve consensus for action. The aim of this review is to determine the effectiveness of system dynamics modelling for health policy, and explore the range and nature of its application. A systematic search was conducted to identify articles published up to April 2015 from the PubMed, Web of Knowledge, Embase, ScienceDirect and Google Scholar databases. The grey literature was also searched. Papers eligible for inclusion were those that described applications of system dynamics modelling to support health policy at any level of government. Six papers were identified, comprising eight case studies of the application of system dynamics modelling to support health policy. No analytic studies were found that examined the effectiveness of this type of modelling. Only three examples engaged multidisciplinary stakeholders in collective model building. Stakeholder participation in model building reportedly facilitated development of a common 'mental map' of the health problem, resulting in consensus about optimal policy strategy and garnering support for collaborative action. The paucity of relevant papers indicates that, although the volume of descriptive literature advocating the value of system dynamics modelling is considerable, its practical application to inform health policy making is yet to be routinely applied and rigorously evaluated. 
Advances in software are allowing the participatory model building approach to be extended to more sophisticated multimethod modelling that provides policy makers with more powerful tools to support the design of targeted, effective and equitable policy responses for complex health problems. Building capacity and investing in communication to promote these modelling methods, as well as documenting and evaluating their applications, will be vital to supporting uptake by policy makers.

  10. Conceptual Model Learning Objects and Design Recommendations for Small Screens

    ERIC Educational Resources Information Center

    Churchill, Daniel

    2011-01-01

    This article presents recommendations for the design of conceptual models for applications via handheld devices such as personal digital assistants and some mobile phones. The recommendations were developed over a number of years through experience that involves design of conceptual models, and applications of these multimedia representations with…

  11. Urban development applications project. Urban technology transfer study

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Technology transfer is defined along with reasons for attempting to transfer technology. Topics discussed include theoretical models, stages of the innovation model, communication process model, behavior of industrial organizations, problem identification, technology search and match, establishment of a market mechanism, applications engineering, commercialization, and management of technology transfer.

  12. Semantic Description of Educational Adaptive Hypermedia Based on a Conceptual Model

    ERIC Educational Resources Information Center

    Papasalouros, Andreas; Retalis, Symeon; Papaspyrou, Nikolaos

    2004-01-01

    The role of conceptual modeling in Educational Adaptive Hypermedia Applications (EAHA) is especially important. A conceptual model of an educational application depicts the instructional solution that is implemented, containing information about concepts that must be ac-quired by learners, tasks in which learners must be involved and resources…

  13. Response to Intervention and the Pyramid Model

    ERIC Educational Resources Information Center

    Fox, Lise; Carta, Judith; Strain, Phil; Dunlap, Glen; Hemmeter, Mary Louise

    2009-01-01

    Response to Intervention (RtI) offers a comprehensive model for the prevention of delays in learning and behavior. While this problem-solving framework was initially designed for application within Kindergarten to 12th grade programs, there is substantial research that supports the value of the model for application within early childhood…

  14. High-throughput screening (HTS) and modeling of the retinoid ...

    EPA Pesticide Factsheets

    Presentation at the Retinoids Review 2nd workshop in Brussels, Belgium on the application of high throughput screening and model to the retinoid system

  15. 76 FR 53137 - Bundled Payments for Care Improvement Initiative: Request for Applications

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-25

    ... (RFA) will test episode-based payment for acute care and associated post-acute care, using both retrospective and prospective bundled payment methods. The RFA requests applications to test models centered around acute care; these models will inform the design of future models, including care improvement for...

  16. FINE SCALE AIR QUALITY MODELING USING DISPERSION AND CMAQ MODELING APPROACHES: AN EXAMPLE APPLICATION IN WILMINGTON, DE

    EPA Science Inventory

    Characterization of spatial variability of air pollutants in an urban setting at fine scales is critical for improved air toxics exposure assessments, for model evaluation studies and also for air quality regulatory applications. For this study, we investigate an approach that su...

  17. Models of the Organizational Life Cycle: Applications to Higher Education.

    ERIC Educational Resources Information Center

    Cameron, Kim S.; Whetten, David A.

    1983-01-01

    A review of models of group and organization life cycle development is provided and the applicability of those models for institutions of higher education are discussed. An understanding of the problems and characteristics present in different life cycle stages can help institutions manage transitions more effectively. (Author/MLW)

  18. Annual Application and Evaluation of the Online Coupled WRF‐CMAQ System over North America under AQMEII Phase 2

    EPA Science Inventory

    We present an application of the online coupled WRF-CMAQ modeling system to two annual simulations over North America performed under Phase 2 of the Air Quality Model Evaluation International Initiative (AQMEII). Operational evaluation shows that model performance is comparable t...

  19. Enabling Real-time Water Decision Support Services Using Model as a Service

    NASA Astrophysics Data System (ADS)

    Zhao, T.; Minsker, B. S.; Lee, J. S.; Salas, F. R.; Maidment, D. R.; David, C. H.

    2014-12-01

Through application of computational methods and an integrated information system, data and river modeling services can help researchers and decision makers more rapidly understand river conditions under alternative scenarios. To enable this capability, workflows (i.e., analysis and model steps) are created and published as Web services delivered through an internet browser, including model inputs, a published workflow service, and visualized outputs. The RAPID model, which is a river routing model developed at the University of Texas at Austin for parallel computation of river discharge, has been implemented as a workflow and published as a Web application. This allows non-technical users to remotely execute the model and visualize results as a service through a simple Web interface. The model service and Web application have been prototyped in the San Antonio and Guadalupe River Basin in Texas, with input from university and agency partners. In the future, optimization model workflows will be developed to link with the RAPID model workflow to provide real-time water allocation decision support services.
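The publish-then-execute pattern this abstract describes can be sketched generically. The workflow name, registry, and routing function below are hypothetical illustrations, not the actual RAPID service interface: a workflow is registered once, then executed by name with user-supplied inputs, which is what a Web endpoint would do on each request.

```python
# Minimal model-as-a-service sketch (hypothetical names, not the
# actual RAPID web service): a registry of published workflows.

WORKFLOWS = {}

def publish(name):
    """Decorator that registers a function as a published workflow."""
    def register(fn):
        WORKFLOWS[name] = fn
        return fn
    return register

@publish("river_routing")
def river_routing(inflows, travel_factor=0.5):
    """Stand-in routing step: damped cumulative downstream discharge."""
    discharge, carry = [], 0.0
    for q in inflows:
        carry = q + travel_factor * carry
        discharge.append(carry)
    return discharge

def run_service(name, **inputs):
    """What a web endpoint would do: look up the workflow and run it."""
    if name not in WORKFLOWS:
        raise KeyError(f"no published workflow named {name!r}")
    return WORKFLOWS[name](**inputs)

print(run_service("river_routing", inflows=[1.0, 2.0, 4.0]))
```

In a real deployment the registry lookup would sit behind an HTTP route and the visualization layer, but the separation of publication from execution is the core of the model-as-a-service idea.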

  20. Development of Semantic Description for Multiscale Models of Thermo-Mechanical Treatment of Metal Alloys

    NASA Astrophysics Data System (ADS)

    Macioł, Piotr; Regulski, Krzysztof

    2016-08-01

    We present a process of semantic meta-model development for data management in an adaptable multiscale modeling framework. The main problems in ontology design are discussed, and a solution achieved as a result of the research is presented. The main concepts concerning the application and data management background for multiscale modeling were derived from the AM3 approach—object-oriented Agile multiscale modeling methodology. The ontological description of multiscale models enables validation of semantic correctness of data interchange between submodels. We also present a possibility of using the ontological model as a supervisor in conjunction with a multiscale model controller and a knowledge base system. Multiscale modeling formal ontology (MMFO), designed for describing multiscale models' data and structures, is presented. A need for applying meta-ontology in the MMFO development process is discussed. Examples of MMFO application in describing thermo-mechanical treatment of metal alloys are discussed. Present and future applications of MMFO are described.

  1. A Comprehensive Validation Methodology for Sparse Experimental Data

    NASA Technical Reports Server (NTRS)

    Norman, Ryan B.; Blattnig, Steve R.

    2010-01-01

    A comprehensive program of verification and validation has been undertaken to assess the applicability of models to space radiation shielding applications and to track progress as models are developed over time. The models are placed under configuration control, and automated validation tests are used so that comparisons can readily be made as models are improved. Though direct comparisons between theoretical results and experimental data are desired for validation purposes, such comparisons are not always possible due to lack of data. In this work, two uncertainty metrics are introduced that are suitable for validating theoretical models against sparse experimental databases. The nuclear physics models, NUCFRG2 and QMSFRG, are compared to an experimental database consisting of over 3600 experimental cross sections to demonstrate the applicability of the metrics. A cumulative uncertainty metric is applied to the question of overall model accuracy, while a metric based on the median uncertainty is used to analyze the models from the perspective of model development by analyzing subsets of the model parameter space.
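The two kinds of metric described above can be illustrated with a toy sketch. The data and helper names below are hypothetical examples, not the NUCFRG2/QMSFRG validation code: a cumulative (mean) relative difference summarizes overall accuracy, while the median is less sensitive to a few large outliers in a sparse database.

```python
# Toy sketch of sparse-data validation metrics (hypothetical data,
# not the actual NUCFRG2/QMSFRG cross-section comparison).

def relative_uncertainties(model, experiment):
    """Relative difference of each model value from its measurement."""
    return [abs(m - e) / abs(e) for m, e in zip(model, experiment)]

def cumulative_uncertainty(model, experiment):
    """Overall accuracy: mean relative difference across all points."""
    u = relative_uncertainties(model, experiment)
    return sum(u) / len(u)

def median_uncertainty(model, experiment):
    """Robust summary, less sensitive to a few large outliers."""
    u = sorted(relative_uncertainties(model, experiment))
    n = len(u)
    mid = n // 2
    return u[mid] if n % 2 else 0.5 * (u[mid - 1] + u[mid])

model = [10.0, 20.0, 30.0, 42.0]
experiment = [9.5, 21.0, 33.0, 40.0]
print(cumulative_uncertainty(model, experiment))
print(median_uncertainty(model, experiment))
```

Analyzing the median over subsets of the parameter space, as the abstract describes, would amount to calling `median_uncertainty` on filtered slices of the database.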

  2. Finite elements of nonlinear continua.

    NASA Technical Reports Server (NTRS)

    Oden, J. T.

    1972-01-01

    The finite element method is extended to a broad class of practical nonlinear problems, treating both theory and applications from a general and unifying point of view. The thermomechanical principles of continuous media and the properties of the finite element method are outlined, and are brought together to produce discrete physical models of nonlinear continua. The mathematical properties of the models are analyzed, and the numerical solution of the equations governing the discrete models is examined. The application of the models to nonlinear problems in finite elasticity, viscoelasticity, heat conduction, and thermoviscoelasticity is discussed. Other specific topics include the topological properties of finite element models, applications to linear and nonlinear boundary value problems, convergence, continuum thermodynamics, finite elasticity, solutions to nonlinear partial differential equations, and discrete models of the nonlinear thermomechanical behavior of dissipative media.

  3. Emission of pesticides into the air

    USGS Publications Warehouse

Van Den Berg; Kubiak, R.; Benjey, W.G.; Majewski, M.S.; Yates, S.R.; Reeves, G.L.; Smelt, J.H.; Van Der Linden, A. M. A.

    1999-01-01

    During and after the application of a pesticide in agriculture, a substantial fraction of the dosage may enter the atmosphere and be transported over varying distances downwind of the target. The rate and extent of the emission during application, predominantly as spray particle drift, depends primarily on the application method (equipment and technique), the formulation and environmental conditions, whereas the emission after application depends primarily on the properties of the pesticide, soils, crops and environmental conditions. The fraction of the dosage that misses the target area may be high in some cases and more experimental data on this loss term are needed for various application types and weather conditions. Such data are necessary to test spray drift models, and for further model development and verification as well. Following application, the emission of soil fumigants and soil incorporated pesticides into the air can be measured and computed with reasonable accuracy, but further model development is needed to improve the reliability of the model predictions. For soil surface applied pesticides reliable measurement methods are available, but there is not yet a reliable model. Further model development is required which must be verified by field experiments. Few data are available on pesticide volatilization from plants and more field experiments are also needed to study the fate processes on the plants. Once this information is available, a model needs to be developed to predict the volatilization of pesticides from plants, which, again, should be verified with field measurements. For regional emission estimates, a link between data on the temporal and spatial pesticide use and a geographical information system for crops and soils with their characteristics is needed.

  4. A generic bio-economic farm model for environmental and economic assessment of agricultural systems.

    PubMed

    Janssen, Sander; Louhichi, Kamel; Kanellopoulos, Argyris; Zander, Peter; Flichman, Guillermo; Hengsdijk, Huib; Meuter, Eelco; Andersen, Erling; Belhouchette, Hatem; Blanco, Maria; Borkowski, Nina; Heckelei, Thomas; Hecker, Martin; Li, Hongtao; Oude Lansink, Alfons; Stokstad, Grete; Thorne, Peter; van Keulen, Herman; van Ittersum, Martin K

    2010-12-01

    Bio-economic farm models are tools to evaluate ex-post or to assess ex-ante the impact of policy and technology change on agriculture, economics and environment. Recently, various BEFMs have been developed, often for one purpose or location, but hardly any of these models are re-used later for other purposes or locations. The Farm System Simulator (FSSIM) provides a generic framework enabling the application of BEFMs under various situations and for different purposes (generating supply response functions and detailed regional or farm type assessments). FSSIM is set up as a component-based framework with components representing farmer objectives, risk, calibration, policies, current activities, alternative activities and different types of activities (e.g., annual and perennial cropping and livestock). The generic nature of FSSIM is evaluated using five criteria by examining its applications. FSSIM has been applied for different climate zones and soil types (criterion 1) and to a range of different farm types (criterion 2) with different specializations, intensities and sizes. In most applications FSSIM has been used to assess the effects of policy changes and in two applications to assess the impact of technological innovations (criterion 3). In the various applications, different data sources, level of detail (e.g., criterion 4) and model configurations have been used. FSSIM has been linked to an economic and several biophysical models (criterion 5). The model is available for applications to other conditions and research issues, and it is open to be further tested and to be extended with new components, indicators or linkages to other models.

  5. A Generic Bio-Economic Farm Model for Environmental and Economic Assessment of Agricultural Systems

    PubMed Central

    Louhichi, Kamel; Kanellopoulos, Argyris; Zander, Peter; Flichman, Guillermo; Hengsdijk, Huib; Meuter, Eelco; Andersen, Erling; Belhouchette, Hatem; Blanco, Maria; Borkowski, Nina; Heckelei, Thomas; Hecker, Martin; Li, Hongtao; Oude Lansink, Alfons; Stokstad, Grete; Thorne, Peter; van Keulen, Herman; van Ittersum, Martin K.

    2010-01-01

    Bio-economic farm models are tools to evaluate ex-post or to assess ex-ante the impact of policy and technology change on agriculture, economics and environment. Recently, various BEFMs have been developed, often for one purpose or location, but hardly any of these models are re-used later for other purposes or locations. The Farm System Simulator (FSSIM) provides a generic framework enabling the application of BEFMs under various situations and for different purposes (generating supply response functions and detailed regional or farm type assessments). FSSIM is set up as a component-based framework with components representing farmer objectives, risk, calibration, policies, current activities, alternative activities and different types of activities (e.g., annual and perennial cropping and livestock). The generic nature of FSSIM is evaluated using five criteria by examining its applications. FSSIM has been applied for different climate zones and soil types (criterion 1) and to a range of different farm types (criterion 2) with different specializations, intensities and sizes. In most applications FSSIM has been used to assess the effects of policy changes and in two applications to assess the impact of technological innovations (criterion 3). In the various applications, different data sources, level of detail (e.g., criterion 4) and model configurations have been used. FSSIM has been linked to an economic and several biophysical models (criterion 5). The model is available for applications to other conditions and research issues, and it is open to be further tested and to be extended with new components, indicators or linkages to other models. PMID:21113782

  6. Improved cyberinfrastructure for integrated hydrometeorological predictions within the fully-coupled WRF-Hydro modeling system

    NASA Astrophysics Data System (ADS)

Gochis, David; Hooper, Rick; Parodi, Antonio; Jha, Shantenu; Yu, Wei; Zaslavsky, Ilya; Ganapati, Dinesh

    2014-05-01

The community WRF-Hydro system is currently being used in a variety of flood prediction and regional hydroclimate impacts assessment applications around the world. Despite its increasingly wide use, certain cyberinfrastructure bottlenecks exist in the setup, execution, and post-processing of WRF-Hydro model runs. These bottlenecks result in wasted time, labor, data transfer bandwidth, and computational resources. Appropriate development and use of cyberinfrastructure to set up and manage WRF-Hydro modeling applications will streamline the entire workflow of hydrologic model predictions. This talk will present recent advances in the development and use of new open-source cyberinfrastructure tools for the WRF-Hydro architecture. These tools include new web-accessible pre-processing applications, supercomputer job management applications, and automated verification and visualization applications. The tools will be described successively and then demonstrated in a set of flash flood use cases for recent destructive flood events in the U.S. and in Europe. Throughout, emphasis is placed on the implementation and use of community data standards for data exchange.

  7. Application distribution model and related security attacks in VANET

    NASA Astrophysics Data System (ADS)

    Nikaein, Navid; Kanti Datta, Soumya; Marecar, Irshad; Bonnet, Christian

    2013-03-01

In this paper, we present a model for application distribution and related security attacks in dense vehicular ad hoc networks (VANET) and sparse VANET, which forms a delay tolerant network (DTN). We study the vulnerabilities of VANET to evaluate the attack scenarios and introduce a new attacker's model as an extension to the work done in [6]. A VANET model is then proposed that supports application distribution through proxy app stores on top of mobile platforms installed in vehicles. The steps of application distribution have been studied in detail. We have identified key attacks (e.g., malware, spamming and phishing, software attacks, and threats to location privacy) for dense VANET and two attack scenarios for sparse VANET. It has been shown that attacks can be launched by distributing malicious applications and injecting malicious code into the On Board Unit (OBU) by exploiting OBU software security holes. Consequences of such security attacks have been described. Finally, countermeasures, including the concept of a sandbox, have also been presented in depth.

  8. A Model-Based Approach for Bridging Virtual and Physical Sensor Nodes in a Hybrid Simulation Framework

    PubMed Central

    Mozumdar, Mohammad; Song, Zhen Yu; Lavagno, Luciano; Sangiovanni-Vincentelli, Alberto L.

    2014-01-01

    The Model Based Design (MBD) approach is a popular trend to speed up application development of embedded systems, which uses high-level abstractions to capture functional requirements in an executable manner, and which automates implementation code generation. Wireless Sensor Networks (WSNs) are an emerging very promising application area for embedded systems. However, there is a lack of tools in this area, which would allow an application developer to model a WSN application by using high level abstractions, simulate it mapped to a multi-node scenario for functional analysis, and finally use the refined model to automatically generate code for different WSN platforms. Motivated by this idea, in this paper we present a hybrid simulation framework that not only follows the MBD approach for WSN application development, but also interconnects a simulated sub-network with a physical sub-network and then allows one to co-simulate them, which is also known as Hardware-In-the-Loop (HIL) simulation. PMID:24960083

9. TOPICAL REVIEW: Modelling the interaction of electromagnetic fields (10 MHz–10 GHz) with the human body: methods and applications

    NASA Astrophysics Data System (ADS)

    Hand, J. W.

    2008-08-01

    Numerical modelling of the interaction between electromagnetic fields (EMFs) and the dielectrically inhomogeneous human body provides a unique way of assessing the resulting spatial distributions of internal electric fields, currents and rate of energy deposition. Knowledge of these parameters is of importance in understanding such interactions and is a prerequisite when assessing EMF exposure or when assessing or optimizing therapeutic or diagnostic medical applications that employ EMFs. In this review, computational methods that provide this information through full time-dependent solutions of Maxwell's equations are summarized briefly. This is followed by an overview of safety- and medical-related applications where modelling has contributed significantly to development and understanding of the techniques involved. In particular, applications in the areas of mobile communications, magnetic resonance imaging, hyperthermal therapy and microwave radiometry are highlighted. Finally, examples of modelling the potentially new medical applications of recent technologies such as ultra-wideband microwaves are discussed.

  10. Damsel: A Data Model Storage Library for Exascale Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Choudhary, Alok; Liao, Wei-keng

Computational science applications have been described as having one of seven motifs (the “seven dwarfs”), each having a particular pattern of computation and communication. From a storage and I/O perspective, these applications can also be grouped into a number of data model motifs describing the way data is organized and accessed during simulation, analysis, and visualization. Major storage data models developed in the 1990s, such as those of the Network Common Data Format (netCDF) and Hierarchical Data Format (HDF) projects, created support for more complex data models. Development of both netCDF and HDF5 was influenced by multi-dimensional dataset storage requirements, but their access models and formats were designed with sequential storage in mind (e.g., a POSIX I/O model). Although these and other high-level I/O libraries have had a beneficial impact on large parallel applications, they do not always attain a high percentage of peak I/O performance due to fundamental design limitations, and they do not address the full range of current and future computational science data models. The goal of this project is to enable exascale computational science applications to interact conveniently and efficiently with storage through abstractions that match their data models. The project consists of three major activities: (1) identifying major data model motifs in computational science applications and developing representative benchmarks; (2) developing a data model storage library, called Damsel, that supports these motifs, provides efficient storage data layouts, incorporates optimizations to enable exascale operation, and is tolerant to failures; and (3) productizing Damsel and working with computational scientists to encourage adoption of this library by the scientific community. The product of this project, the Damsel library, is openly available for download from http://cucis.ece.northwestern.edu/projects/DAMSEL. Several case studies and an application programming interface reference are also available to assist new users in learning to use the library.

  11. Fast All-Sky Radiation Model for Solar Applications (FARMS): A Brief Overview of Mechanisms, Performance, and Applications: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xie, Yu; Sengupta, Manajit

Solar radiation can be computed using radiative transfer models, such as the Rapid Radiative Transfer Model (RRTM) and its general circulation model applications, and used for various energy applications. Due to the complexity of computing radiation fields in aerosol and cloudy atmospheres, simulating solar radiation can be extremely time-consuming, but many approximations (e.g., the two-stream approach and the delta-M truncation scheme) can be utilized. To provide a new fast option for computing solar radiation, we developed the Fast All-sky Radiation Model for Solar applications (FARMS) by parameterizing the simulated diffuse horizontal irradiance and direct normal irradiance for cloudy conditions from the RRTM runs using a 16-stream discrete ordinates radiative transfer method. The solar irradiance at the surface was simulated by combining the cloud irradiance parameterizations with a fast clear-sky model, REST2. To understand the accuracy and efficiency of the newly developed fast model, we analyzed FARMS runs using cloud optical and microphysical properties retrieved from GOES data from 2009-2012. The global horizontal irradiance for cloudy conditions was simulated using FARMS and RRTM for global circulation modeling with a two-stream approximation and compared to measurements taken from the U.S. Department of Energy's Atmospheric Radiation Measurement Climate Research Facility Southern Great Plains site. Our results indicate that the accuracy of FARMS is comparable to or better than the two-stream approach; however, FARMS is approximately 400 times more efficient because it does not explicitly solve the radiative transfer equation for each individual cloud condition. Radiative transfer model runs are computationally expensive, but this model is promising for broad applications in solar resource assessment and forecasting. It is currently being used in the National Solar Radiation Database, which is publicly available from the National Renewable Energy Laboratory at http://nsrdb.nrel.gov.

  12. High-Fidelity Roadway Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Wang, Jie; Papelis, Yiannis; Shen, Yuzhong; Unal, Ozhan; Cetin, Mecit

    2010-01-01

Roads are an essential feature in our daily lives. With the advances in computing technologies, 2D and 3D road models are employed in many applications, such as computer games and virtual environments. Traditional road models were generated manually by professional artists using modeling software tools such as Maya and 3ds Max. This approach requires both highly specialized and sophisticated skills and massive manual labor. Automatic road generation based on procedural modeling can create road models using specially designed computer algorithms or procedures, dramatically reducing the tedious manual editing needed for road modeling. But most existing procedural modeling methods for road generation put emphasis on the visual effects of the generated roads, not their geometrical and architectural fidelity. This limitation seriously restricts the applicability of the generated road models. To address this problem, this paper proposes a high-fidelity roadway generation method that takes into account road design principles practiced by civil engineering professionals; as a result, the generated roads can support not only general applications such as games and simulations, in which roads are used as 3D assets, but also demanding civil engineering applications, which require accurate geometrical models of roads. The inputs to the proposed method include road specifications, civil engineering road design rules, terrain information, and the surrounding environment. The proposed method then generates, in real time, 3D roads that have both high visual and geometrical fidelity. This paper discusses in detail the procedures that convert 2D roads specified in shape files into 3D roads, as well as the civil engineering road design principles applied. The proposed method can be used in many applications that have stringent requirements on high-precision 3D models, such as driving simulations and road design prototyping. Preliminary results demonstrate the effectiveness of the proposed method.

  13. 40 CFR 86.1807-01 - Vehicle labeling.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Conforms to U.S. EPA Regulations Applicable to XXX-Fueled 20XX Model Year New Motor Vehicles.” (B) For light-duty trucks, the statement: “This Vehicle Conforms to U.S. EPA Regulations Applicable to XXX...: “This Vehicle Conforms to U.S. EPA Regulations Applicable to XXX-fueled 20XX Model Year New Medium-Duty...

  14. 40 CFR 86.1807-01 - Vehicle labeling.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Conforms to U.S. EPA Regulations Applicable to XXX-Fueled 20XX Model Year New Motor Vehicles.” (B) For light-duty trucks, the statement: “This Vehicle Conforms to U.S. EPA Regulations Applicable to XXX...: “This Vehicle Conforms to U.S. EPA Regulations Applicable to XXX-fueled 20XX Model Year New Medium-Duty...

  15. Using GLEAMS to Select Environmental Windows for Herbicide Application in Forests

    Treesearch

    M.C. Smith; J.L. Michael; W.G. Koisel; D.G. Nealy

    1994-01-01

    Observed herbicide runoff and groundwater data from a pine-release herbicide application study near Gainesville, Florida were used to validate the GLEAMS model hydrology and pesticide component for forest application. The study revealed that model simulations agreed relatively well with the field data for the one-year study. Following validation, a modified version of...

  16. 40 CFR 86.1807-01 - Vehicle labeling.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Conforms to U.S. EPA Regulations Applicable to XXX-Fueled 20XX Model Year New Motor Vehicles.” (B) For light-duty trucks, the statement: “This Vehicle Conforms to U.S. EPA Regulations Applicable to XXX...: “This Vehicle Conforms to U.S. EPA Regulations Applicable to XXX-fueled 20XX Model Year New Medium-Duty...

  17. Inexpensive Laboratory Model with Many Applications.

    ERIC Educational Resources Information Center

    Archbold, Norbert L.; Johnson, Robert E.

    1987-01-01

    Presents a simple, inexpensive and realistic model which allows introductory geology students to obtain subsurface information through a simulated drilling experience. Offers ideas on additional applications to a variety of geologic situations. (ML)

  18. Forcing a three-dimensional, hydrostatic, primitive-equation model for application in the surf zone: 2. Application to DUCK94

    NASA Astrophysics Data System (ADS)

    Newberger, P. A.; Allen, J. S.

    2007-08-01

    A three-dimensional primitive-equation model for application to the nearshore surf zone has been developed. This model, an extension of the Princeton Ocean Model (POM), predicts the wave-averaged circulation forced by breaking waves. All of the features of the original POM are retained in the extended model so that applications can be made to regions where breaking waves, stratification, rotation, and wind stress make significant contributions to the flow behavior. In this study we examine the effects of breaking waves and wind stress. The nearshore POM circulation model is embedded within the NearCom community model and is coupled with a wave model. This combined modeling system is applied to the nearshore surf zone off Duck, North Carolina, during the DUCK94 field experiment of October 1994. Model results are compared to observations from this experiment, and the effects of parameter choices are examined. A process study examining the effects of tidal depth variation on depth-dependent wave-averaged currents is carried out. With identical offshore wave conditions and model parameters, the strength and spatial structure of the undertow and of the alongshore current vary systematically with water depth. Some three-dimensional solutions show the development of shear instabilities of the alongshore current. Inclusion of wave-current interactions makes an appreciable difference in the characteristics of the instability.

  19. Probabilistic arithmetic automata and their applications.

    PubMed

    Marschall, Tobias; Herms, Inke; Kaltenbach, Hans-Michael; Rahmann, Sven

    2012-01-01

We present a comprehensive review of probabilistic arithmetic automata (PAAs), a general model to describe chains of operations whose operands depend on chance, along with two algorithms to numerically compute the distribution of the results of such probabilistic calculations. PAAs provide a unifying framework to approach many problems arising in computational biology and elsewhere. We present five different applications, namely 1) pattern matching statistics on random texts, including the computation of the distribution of occurrence counts, waiting times, and clump sizes under hidden Markov background models; 2) exact analysis of window-based pattern matching algorithms; 3) sensitivity of filtration seeds used to detect candidate sequence alignments; 4) length and mass statistics of peptide fragments resulting from enzymatic cleavage reactions; and 5) read length statistics of 454 and IonTorrent sequencing reads. The diversity of these applications indicates the flexibility and unifying character of the presented framework. While the construction of a PAA depends on the particular application, we single out a frequently applicable construction method: We introduce deterministic arithmetic automata (DAAs) to model deterministic calculations on sequences, and demonstrate how to construct a PAA from a given DAA and a finite-memory random text model. This procedure is used for all five discussed applications and greatly simplifies the construction of PAAs. Implementations are available as part of the MoSDi package. Its application programming interface facilitates the rapid development of new applications based on the PAA framework.
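The core idea, propagating an exact distribution through a chain of operations on chance-dependent operands, can be shown with a toy instance. This is an illustrative sketch using dice sums, not the MoSDi package or its API:

```python
# Toy PAA-style computation: propagate the exact distribution of a
# running sum over a chain of random operands (illustrative sketch,
# not the MoSDi implementation).

def step(dist, operand_dist):
    """One automaton step: combine the current value distribution with
    the next operand's distribution under the 'add' operation."""
    out = {}
    for v, p in dist.items():
        for o, q in operand_dist.items():
            out[v + o] = out.get(v + o, 0.0) + p * q
    return out

def chain_distribution(operand_dists):
    """Exact distribution of the result after applying every step,
    starting from the deterministic value 0."""
    dist = {0: 1.0}
    for od in operand_dists:
        dist = step(dist, od)
    return dist

die = {face: 1 / 6 for face in range(1, 7)}
two_dice = chain_distribution([die, die])
print(two_dice[7])  # probability of the most likely total of two fair dice
```

A full PAA generalizes this by letting the operation and operand distribution depend on an automaton state, which is how the pattern matching and fragment statistics applications above are handled.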

  20. Population modeling for pesticide risk assessment of threatened species-A case study of a terrestrial plant, Boltonia decurrens.

    PubMed

    Schmolke, Amelie; Brain, Richard; Thorbek, Pernille; Perkins, Daniel; Forbes, Valery

    2017-02-01

Although population models are recognized as necessary tools in the ecological risk assessment of pesticides, particularly for species listed under the Endangered Species Act, their application in this context is currently limited to very few cases. The authors developed a detailed, individual-based population model for a threatened plant species, the decurrent false aster (Boltonia decurrens), for application in pesticide risk assessment. Floods and competition with other plant species are known factors that drive the species' population dynamics and were included in the model approach. The authors use the model to compare the population-level effects of 5 toxicity surrogates applied to B. decurrens under varying environmental conditions. The model results suggest that the environmental conditions under which herbicide applications occur may have a higher impact on populations than organism-level sensitivities to an herbicide within a realistic range. Indirect effects may be as important as the direct effects of herbicide applications by shifting competition strength if competing species have different sensitivities to the herbicide. The model approach provides a case study for population-level risk assessments of listed species. Population-level effects of herbicides can be assessed in a realistic and species-specific context, and uncertainties can be addressed explicitly. The authors discuss how their approach can inform the future development and application of modeling for population-level risk assessments of listed species, and ecological risk assessment in general. Environ Toxicol Chem 2017;36:480-491. © 2016 SETAC.

  1. Development of computational small animal models and their applications in preclinical imaging and therapy research.

    PubMed

    Xie, Tianwu; Zaidi, Habib

    2016-01-01

The development of multimodality preclinical imaging techniques and the rapid growth of realistic computer simulation tools have promoted the construction and application of computational laboratory animal models in preclinical research. Since the early 1990s, over 120 realistic computational animal models have been reported in the literature and used as surrogates to characterize the anatomy of actual animals for the simulation of preclinical studies involving the use of bioluminescence tomography, fluorescence molecular tomography, positron emission tomography, single-photon emission computed tomography, microcomputed tomography, magnetic resonance imaging, and optical imaging. Other applications include electromagnetic field simulation, ionizing and nonionizing radiation dosimetry, and the development and evaluation of new methodologies for multimodality image coregistration, segmentation, and reconstruction of small animal images. This paper provides a comprehensive review of the history and fundamental technologies used for the development of computational small animal models, with a particular focus on their application in preclinical imaging as well as nonionizing and ionizing radiation dosimetry calculations. An overview is given of the overall process involved in the design of these models, including the fundamental elements used for the construction of different types of computational models, the identification of original anatomical data, the simulation tools used for solving various computational problems, and the applications of computational animal models in preclinical research. The authors also analyze the characteristics of categories of computational models (stylized, voxel-based, and boundary representation) and discuss the technical challenges faced at the present time as well as research needs in the future.

  2. Feasibility of the "Bring Your Own Device" Model in Clinical Research: Results from a Randomized Controlled Pilot Study of a Mobile Patient Engagement Tool.

    PubMed

    Pugliese, Laura; Woodriff, Molly; Crowley, Olga; Lam, Vivian; Sohn, Jeremy; Bradley, Scott

    2016-03-16

Rising rates of smartphone ownership highlight opportunities for improved mobile application usage in clinical trials. While current methods call for device provisioning, the "bring your own device" (BYOD) model permits participants to use personal phones, allowing for improved patient engagement and lowered operational costs. However, more evidence is needed to demonstrate the BYOD model's feasibility in research settings. The objective was to assess whether CentrosHealth, a mobile application designed to support trial compliance, produces different outcomes in medication adherence and application engagement when distributed through study-provisioned devices compared with the BYOD model. 87 participants were randomly assigned to use the mobile application or no intervention for a 28-day pilot study at a 2:1 randomization ratio (2 intervention: 1 control) and asked to consume a twice-daily probiotic supplement. The application users were further randomized into two groups, receiving the application either on a personal smartphone (BYOD) or on a study-provided smartphone. In-depth interviews were performed in a randomly selected subset of the intervention group (five BYOD and five study-provided smartphone users). The BYOD subgroup showed significantly greater engagement than study-provided phone users, as shown by higher application use frequency and duration over the study period. The BYOD subgroup also demonstrated a significant effect of engagement on medication adherence for number of application sessions (unstandardized regression coefficient beta=0.0006, p=0.02) and time spent therein (beta=0.00001, p=0.03). Study-provided phone users showed higher initial adherence rates but a greater decline (5.7%) than BYOD users (0.9%) over the study period. In-depth interviews revealed that participants preferred the BYOD model over using study-provided devices.
Results indicate that the BYOD model is feasible in health research settings and improves participant experience, calling for further assessment of the BYOD model's validity. Although group differences in the decline of medication adherence were not statistically significant, the greater trend of decline among provisioned-device users warrants further investigation to determine whether the trend reaches significance over time. The significantly higher application engagement rates and the effect of engagement on medication adherence in the BYOD subgroup similarly imply that greater application engagement may correlate with better medication adherence over time.

  3. Search algorithm complexity modeling with application to image alignment and matching

    NASA Astrophysics Data System (ADS)

    DelMarco, Stephen

    2014-05-01

    Search algorithm complexity modeling, in the form of penetration rate estimation, provides a useful way to estimate search efficiency in application domains which involve searching over a hypothesis space of reference templates or models, as in model-based object recognition, automatic target recognition, and biometric recognition. The penetration rate quantifies the expected portion of the database that must be searched, and is useful for estimating search algorithm computational requirements. In this paper we perform mathematical modeling to derive general equations for penetration rate estimates that are applicable to a wide range of recognition problems. We extend previous penetration rate analyses to use more general probabilistic modeling assumptions. In particular we provide penetration rate equations within the framework of a model-based image alignment application domain in which a prioritized hierarchical grid search is used to rank subspace bins based on matching probability. We derive general equations, and provide special cases based on simplifying assumptions. We show how previously-derived penetration rate equations are special cases of the general formulation. We apply the analysis to model-based logo image alignment in which a hierarchical grid search is used over a geometric misalignment transform hypothesis space. We present numerical results validating the modeling assumptions and derived formulation.
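As a toy illustration of the quantity involved (a simplified assumption of ours, not the paper's derivation): if the hypothesis space is partitioned into bins ranked by prior matching probability, and the search scans bins in rank order, stopping at the end of the bin that contains the true hypothesis, the expected fraction of the database examined is:

```python
def penetration_rate(bin_probs, bin_sizes):
    """Expected fraction of the hypothesis space searched, assuming
    bins are visited in rank order and the search stops at the end of
    the bin containing the true hypothesis (illustrative model only)."""
    total = sum(bin_sizes)
    searched = 0
    expected = 0.0
    for p, size in zip(bin_probs, bin_sizes):
        searched += size                   # cumulative portion scanned
        expected += p * searched / total   # weight by bin's prior probability
    return expected
```

A sharply peaked ranking lowers the rate: two equal-size bins with probabilities 0.9 and 0.1 give 0.55, versus 0.75 for a uniform prior, which is why a well-prioritized hierarchical grid search pays off.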

  4. POMICS: A Simulation Disease Model for Timing Fungicide Applications in Management of Powdery Mildew of Cucurbits.

    PubMed

    Sapak, Z; Salam, M U; Minchinton, E J; MacManus, G P V; Joyce, D C; Galea, V J

    2017-09-01

    A weather-based simulation model, called Powdery Mildew of Cucurbits Simulation (POMICS), was constructed to predict fungicide application scheduling to manage powdery mildew of cucurbits. The model was developed on the principle that conditions favorable for Podosphaera xanthii, a causal pathogen of this crop disease, generate a number of infection cycles in a single growing season. The model consists of two components that (i) simulate the disease progression of P. xanthii in secondary infection cycles under natural conditions and (ii) predict the disease severity with application of fungicides at any recurrent disease cycles. The underlying environmental factors associated with P. xanthii infection were quantified from laboratory and field studies, and also gathered from literature. The performance of the POMICS model when validated with two datasets of uncontrolled natural infection was good (the mean difference between simulated and observed disease severity on a scale of 0 to 5 was 0.02 and 0.05). In simulations, POMICS was able to predict high- and low-risk disease alerts. Furthermore, the predicted disease severity was responsive to the number of fungicide applications. Such responsiveness indicates that the model has the potential to be used as a tool to guide the scheduling of judicious fungicide applications.

  5. [Review on HSPF model for simulation of hydrology and water quality processes].

    PubMed

    Li, Zhao-fu; Liu, Hong-Yu; Li, Yan

    2012-07-01

Hydrological Simulation Program-FORTRAN (HSPF), written in FORTRAN, is one of the best semi-distributed hydrology and water quality models, and was first developed from the Stanford Watershed Model. Many studies on HSPF model application have been conducted. The model can represent the contributions of sediment, nutrients, pesticides, conservatives, and fecal coliforms from agricultural areas; continuously simulate water quantity and quality processes; and capture the effects of climate change and land use change on water quantity and quality. HSPF consists of three basic application components: PERLND (Pervious Land Segment), IMPLND (Impervious Land Segment), and RCHRES (free-flowing reach or mixed reservoir). In general, HSPF has been applied extensively to modeling hydrology and water quality processes and to analyzing climate change and land use change, but it has seen limited use in China. The main problems with HSPF include: (1) some algorithms and procedures still need revision; (2) because of its demanding input-data requirements, the model's accuracy is limited by the available spatial and attribute data; and (3) the model is applicable only to the simulation of well-mixed rivers, reservoirs, and one-dimensional water bodies, so it must be integrated with other models to solve more complex problems. At present, work on HSPF model development is still ongoing, including revision of the model platform, extension of model functions, development of methods for model calibration, and analysis of parameter sensitivity. With the accumulation of basic data and the improvement of data sharing, the HSPF model will be applied more extensively in China.

  6. Compact modeling of CRS devices based on ECM cells for memory, logic and neuromorphic applications.

    PubMed

    Linn, E; Menzel, S; Ferch, S; Waser, R

    2013-09-27

    Dynamic physics-based models of resistive switching devices are of great interest for the realization of complex circuits required for memory, logic and neuromorphic applications. Here, we apply such a model of an electrochemical metallization (ECM) cell to complementary resistive switches (CRSs), which are favorable devices to realize ultra-dense passive crossbar arrays. Since a CRS consists of two resistive switching devices, it is straightforward to apply the dynamic ECM model for CRS simulation with MATLAB and SPICE, enabling study of the device behavior in terms of sweep rate and series resistance variations. Furthermore, typical memory access operations as well as basic implication logic operations can be analyzed, revealing requirements for proper spike and level read operations. This basic understanding facilitates applications of massively parallel computing paradigms required for neuromorphic applications.

  7. Evolution of a minimal parallel programming model

    DOE PAGES

    Lusk, Ewing; Butler, Ralph; Pieper, Steven C.

    2017-04-30

Here, we take a historical approach to our presentation of self-scheduled task parallelism, a programming model with its origins in early irregular and nondeterministic computations encountered in automated theorem proving and logic programming. We show how an extremely simple task model has evolved into a system, asynchronous dynamic load balancing (ADLB), and a scalable implementation capable of supporting sophisticated applications on today’s (and tomorrow’s) largest supercomputers; and we illustrate the use of ADLB with a Green’s function Monte Carlo application, a modern, mature nuclear physics code in production use. Our lesson is that by surrendering a certain amount of generality and thus applicability, a minimal programming model (in terms of its basic concepts and the size of its application programmer interface) can achieve extreme scalability without introducing complexity.
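The self-scheduled model can be sketched in a few lines (a shared-queue toy of ours, not the ADLB API): idle workers repeatedly pull whatever task is next, so the load balances dynamically without any static assignment of work to processes.

```python
import threading
import queue

def run_self_scheduled(tasks, worker_fn, n_workers=4):
    """Toy self-scheduled task pool: idle workers grab the next task
    from a shared queue, mimicking dynamic load balancing."""
    q = queue.Queue()
    for t in tasks:
        q.put(t)
    results, lock = [], threading.Lock()

    def worker():
        while True:
            try:
                t = q.get_nowait()   # pull the next available task
            except queue.Empty:
                return               # no work left: worker retires
            r = worker_fn(t)
            with lock:
                results.append(r)

    threads = [threading.Thread(target=worker) for _ in range(n_workers)]
    for th in threads:
        th.start()
    for th in threads:
        th.join()
    return results
```

In ADLB the task pool is distributed across dedicated server processes rather than a single in-memory queue, but the application-facing model (put work, get work) is similarly small.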

  8. Fundamental principles of data assimilation underlying the Verdandi library: applications to biophysical model personalization within euHeart.

    PubMed

    Chapelle, D; Fragu, M; Mallet, V; Moireau, P

    2013-11-01

    We present the fundamental principles of data assimilation underlying the Verdandi library, and how they are articulated with the modular architecture of the library. This translates--in particular--into the definition of standardized interfaces through which the data assimilation library interoperates with the model simulation software and the so-called observation manager. We also survey various examples of data assimilation applied to the personalization of biophysical models, in particular, for cardiac modeling applications within the euHeart European project. This illustrates the power of data assimilation concepts in such novel applications, with tremendous potential in clinical diagnosis assistance.
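The abstract does not show Verdandi's interfaces; as a generic illustration of the kind of analysis step a data assimilation library standardizes, here is the textbook scalar Kalman update (our notation, not Verdandi's API):

```python
def kalman_update(x_f, P_f, y, H, R):
    """One scalar Kalman analysis step: blend the forecast x_f (variance
    P_f) with an observation y (variance R) via observation operator H."""
    K = P_f * H / (H * P_f * H + R)   # Kalman gain
    x_a = x_f + K * (y - H * x_f)     # analysis state
    P_a = (1 - K * H) * P_f           # analysis variance
    return x_a, P_a
```

With an uninformative forecast and an equally uncertain observation (all variances 1), the analysis splits the difference: the state moves halfway toward the observation and the variance is halved. In model personalization the same machinery estimates patient-specific parameters instead of, or alongside, the state.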

  9. DEB modeling for nanotoxicology, microbial ecology, and environmental engineering. Comment on: ;Physics of metabolic organization; by Marko Jusup et al.

    NASA Astrophysics Data System (ADS)

    Holden, Patricia A.

    2017-03-01

Jusup et al. [1] appeal to mathematical physicists, and to biologists, by providing the theoretical basis for dynamic energy budget (DEB) modeling of individual organisms and populations, while emphasizing model simplicity, universality, and applicability to real world problems. Comments herein regard the disciplinary tensions proposed by the authors and suggest that, in addition to important applications in eco- and specifically nano-toxicology, there are opportunities for DEB frameworks to inform relative complexity in microbial ecological process modeling. This commentary also suggests another audience for bridging DEB theory and application: engineers solving environmental problems.

  10. Determination of scattering functions and their effects on remote sensing of turbidity in natural waters

    NASA Technical Reports Server (NTRS)

    Ghovanlou, A. H.; Gupta, J. N.; Henderson, R. G.

    1977-01-01

The development of quantitative analytical procedures for relating scattered signals measured by a remote sensor to water turbidity was considered, and the applications of a Monte Carlo simulation model for radiative transfer in turbid water are discussed. The model is designed to calculate the characteristics of the backscattered signal from an illuminated body of water as a function of the turbidity level and the spectral properties of the suspended particulates. The optical properties of the environmental waters necessary for model applications were derived from available experimental data and/or calculated from the Mie formalism. Results of applications of the model are presented.

  11. Application of a data base management system to a finite element model

    NASA Technical Reports Server (NTRS)

    Rogers, J. L., Jr.

    1980-01-01

In today's software market, much effort is being expended on the development of data base management systems (DBMS). Most commercially available DBMS were designed for business use; however, the need for such systems within the engineering and scientific communities is becoming apparent. A potential DBMS application that appears attractive is the handling of data for finite element engineering models. The application of a commercially available, business-oriented DBMS to a structural engineering finite element model is explored. The model, the DBMS, an approach to using the DBMS, and advantages and disadvantages are described. Plans for research on a scientific and engineering DBMS are discussed.

  12. A two-way nesting procedure for the WAM model: Application to the Spanish coast

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lahoz, M.G.; Albiach, J.C.C.

    1997-02-01

    The performance of the standard one-way nesting procedure for a regional application of a third-generation wave model is investigated. It is found that this nesting procedure is not applicable when the resolution has to be enhanced drastically, unless intermediate grids are placed between the coarse and the fine grid areas. This solution, in turn, requires an excess of computing resources. A two-way nesting procedure is developed and implemented in the model. Advantages and disadvantages of both systems are discussed. The model output for a test case is compared with observed data and the results are discussed in the paper.

  13. Comparison between beta radiation dose distribution due to LDR and HDR ocular brachytherapy applicators using GATE Monte Carlo platform.

    PubMed

    Mostafa, Laoues; Rachid, Khelifi; Ahmed, Sidi Moussa

    2016-08-01

Eye applicators with 90Sr/90Y and 106Ru/106Rh beta-ray sources are generally used in brachytherapy for the treatment of eye diseases such as uveal melanoma. Whenever radiation is used in treatment, dosimetry is essential, and knowledge of the exact dose distribution is critical to decisions affecting the outcome of the treatment. The Monte Carlo technique provides a powerful tool for calculating dose and dose distributions, helping to predict and determine more accurately the doses from various types and shapes of eye applicators. The aim of this work was to use the Monte Carlo GATE platform to calculate the 3D dose distribution on a mathematical model of the human eye according to international recommendations. Mathematical models were developed for four ophthalmic applicators: two HDR 90Sr applicators, SIA.20 and SIA.6, and two LDR 106Ru applicators, a concave CCB model and a flat CCB model. In the present work, considering a heterogeneous eye phantom and the chosen tumor, the mean dose distributions obtained with GATE according to international recommendations show a discrepancy with respect to those specified by the manufacturers. Quality control of the dosimetric parameters shows that, unlike the other applicators, the SIA.20 applicator is consistent with the recommendations. The GATE platform shows that the SIA.20 applicator presents the best results, namely doses delivered to critical structures that are lower than those obtained for the other applicators, while the SIA.6 applicator simulated with MCNPX generates higher lens doses than those generated by GATE. Copyright © 2016 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  14. Flow of Funds Modeling for Localized Financial Markets: An Application of Spatial Price and Allocation Activity Analysis Models.

    DTIC Science & Technology

    1981-01-01

on modeling the managerial aspects of the firm. The second has been the application to economic theory led by ... individual portfolio optimization problems which were embedded in a larger global optimization problem. In the global problem, portfolios were linked by market ... demand quantities or be given by linear demand relationships. As in the source markets, the model

  15. 40 CFR 86.000-9 - Emission standards for 2000 and later model year light-duty trucks.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ....000-9 Emission standards for 2000 and later model year light-duty trucks. Section 86.000-9 includes...) and CO Model year Percentage 2002 40 2003 80 2004 100 Table A00-6—Useful Life Standards (G/MI) for... applicable model year's heavy light-duty trucks shall not exceed the applicable SFTP standards in table A00-6...

  16. 40 CFR 86.000-9 - Emission standards for 2000 and later model year light-duty trucks.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ....000-9 Emission standards for 2000 and later model year light-duty trucks. Section 86.000-9 includes...) and CO Model year Percentage 2002 40 2003 80 2004 100 Table A00-6—Useful Life Standards (G/MI) for... applicable model year's heavy light-duty trucks shall not exceed the applicable SFTP standards in table A00-6...

  17. Hearing Protection for High-Noise Environments. Part 1

    DTIC Science & Technology

    2007-05-31

Table-of-contents and text excerpts: 3.5.1 Properties of biological tissues; 3.5.2 Elastic vs. acoustic modeling of tissues; 3.5.3 Range of applicability of acoustic modeling of tissues; A. Integral equations in acoustics; B. Discretization of integral equations in ... elasticity modeling. "We conclude the review of our Phase I results with a discussion on the range of applicability of acoustic modeling of biological ..."

  18. Testing woody fuel consumption models for application in Australian southern eucalypt forest fires

    Treesearch

    J.J. Hollis; S. Matthews; Roger Ottmar; S.J. Prichard; S. Slijepcevic; N.D. Burrows; B. Ward; K.G. Tolhurst; W.R. Anderson; J S. Gould

    2010-01-01

    Five models for the consumption of coarse woody debris or woody fuels with a diameter larger than 0.6 cm were assessed for application in Australian southern eucalypt forest fires including: CONSUME models for (1) activity fuels, (2) natural western woody and (3) natural southern woody fuels, (4) the BURNUP model and (5) the recommendation by the Australian National...

  19. A method for independent modelling in support of regulatory review of dose assessments.

    PubMed

    Dverstorp, Björn; Xu, Shulan

    2017-11-01

Several countries consider geological disposal facilities as the preferred option for spent nuclear fuel due to their potential to provide isolation from the surface environment on very long timescales. In 2011 the Swedish Nuclear Fuel & Waste Management Co. (SKB) submitted a license application for construction of a spent nuclear fuel repository. The disposal method involves emplacing spent fuel in copper canisters with a cast iron insert at about 500 m depth in crystalline basement rock, and each canister is surrounded by a buffer of swelling bentonite clay. SKB's license application is supported by a post-closure safety assessment, SR-Site. SR-Site has been reviewed by the Swedish Radiation Safety Authority (SSM) for five years. The main method for review of SKB's license application is document review, which is carried out by SSM's staff and supported by SSM's external experts. The review has proven a challenging task due to its broad scope, complexity and multidisciplinary nature. SSM and its predecessors have, for several decades, been developing independent models to support regulatory reviews of post-closure safety assessments for geological repositories. For the review of SR-Site, SSM has developed a modelling approach with a structured application of independent modelling activities, including replication modelling, use of alternative conceptual models and bounding calculations, to complement the traditional document review. This paper describes this scheme and its application to biosphere and dose assessment modelling. SSM's independent modelling has provided important insights regarding the quality and reasonableness of SKB's rather complex biosphere modelling and has helped quantify conservatisms and highlight conceptual uncertainty. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. Estimating the agricultural fertilizer NH3 emission in China based on the bi-directional CMAQ model and an agro-ecosystem model

    NASA Astrophysics Data System (ADS)

    Wang, S.

    2014-12-01

Atmospheric ammonia (NH3) plays an important role in fine-particle formation, and accurate ammonia estimates can reduce uncertainties in air quality modeling. China is one of the largest ammonia-emitting countries, with the majority of NH3 emissions coming from agricultural practices such as fertilizer application and animal operations. Current ammonia emission estimates in China are based mainly on pre-defined emission factors, so there are considerable uncertainties, especially in the distribution over time and space; for example, fertilizer applications vary in date and amount by geographical region and crop type. In this study, the NH3 emission from agricultural fertilizer use in China in 2011 was estimated online by an agricultural fertilizer modeling system coupling a regional air quality model with an agro-ecosystem model. The system contains three main components: 1) the Environmental Policy Integrated Climate (EPIC) model, 2) the mesoscale Weather Research and Forecasting (WRF) meteorology model, and 3) the CMAQ air quality model with bi-directional ammonia fluxes. The EPIC output on daily fertilizer application and soil characteristics serves as input to the CMAQ model. To run the EPIC model, a large amount of local Chinese information was collected and processed; for example, cropland data were computed from the MODIS land use data at 500-m resolution and crop categories at the Chinese county level, and fertilizer use rates for different fertilizer types, crops, and provinces were obtained from Chinese statistical materials. The system takes into consideration many factors influencing agricultural ammonia emission, including weather and the fertilizer application method, timing, amount, and rate for specific pastures and crops. The simulated fertilizer data are compared with NH3 emissions and fertilizer application data from other sources.
The results of the CMAQ modeling are also discussed and analyzed against field measurements. The agricultural fertilizer NH3 emission estimated in this study is about 3 Tg in 2011. The regions with the highest emission rates are located in the North China Plain, and the monthly peak of ammonia emissions occurs from April to July.

  1. Life cycle modelling of environmental impacts of application of processed organic municipal solid waste on agricultural land (EASEWASTE).

    PubMed

    Hansen, Trine Lund; Bhander, Gurbakhash S; Christensen, Thomas Højlund; Bruun, Sander; Jensen, Lars Stoumann

    2006-04-01

    A model capable of quantifying the potential environmental impacts of agricultural application of composted or anaerobically digested source-separated organic municipal solid waste (MSW) is presented. In addition to the direct impacts, the model accounts for savings by avoiding the production and use of commercial fertilizers. The model is part of a larger model, Environmental Assessment of Solid Waste Systems and Technology (EASEWASTE), developed as a decision-support model, focusing on assessment of alternative waste management options. The environmental impacts of the land application of processed organic waste are quantified by emission coefficients referring to the composition of the processed waste and related to specific crop rotation as well as soil type. The model contains several default parameters based on literature data, field experiments and modelling by the agro-ecosystem model, Daisy. All data can be modified by the user allowing application of the model to other situations. A case study including four scenarios was performed to illustrate the use of the model. One tonne of nitrogen in composted and anaerobically digested MSW was applied as fertilizer to loamy and sandy soil at a plant farm in western Denmark. Application of the processed organic waste mainly affected the environmental impact categories global warming (0.4-0.7 PE), acidification (-0.06 (saving)-1.6 PE), nutrient enrichment (-1.0 (saving)-3.1 PE), and toxicity. The main contributors to these categories were nitrous oxide formation (global warming), ammonia volatilization (acidification and nutrient enrichment), nitrate losses (nutrient enrichment and groundwater contamination), and heavy metal input to soil (toxicity potentials). The local agricultural conditions as well as the composition of the processed MSW showed large influence on the environmental impacts. 
A range of benefits, mainly related to improved soil quality from long-term application of the processed organic waste, could not be generally quantified with respect to the chosen life cycle assessment impact categories and were therefore not included in the model. These effects should be considered in conjunction with the results of the life cycle assessment.
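The accounting pattern the model applies per impact category (direct emissions characterized into person-equivalents, minus credits for avoided mineral-fertilizer production) can be sketched as follows; the factors in the example are invented, not EASEWASTE coefficients:

```python
def net_impact(emissions, avoided, cf):
    """Net characterized impact for one category: emissions from land
    application minus credits for avoided fertilizer production.
    `cf` maps each substance to an illustrative characterization factor."""
    direct = sum(q * cf[s] for s, q in emissions.items())
    credit = sum(q * cf[s] for s, q in avoided.items())
    return direct - credit
```

When the avoided-production credit exceeds the direct load, the net impact is negative, which is how savings of the kind quoted above for acidification and nutrient enrichment arise.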

  2. Requirements analysis, domain knowledge, and design

    NASA Technical Reports Server (NTRS)

    Potts, Colin

    1988-01-01

    Two improvements to current requirements analysis practices are suggested: domain modeling, and the systematic application of analysis heuristics. Domain modeling is the representation of relevant application knowledge prior to requirements specification. Artificial intelligence techniques may eventually be applicable for domain modeling. In the short term, however, restricted domain modeling techniques, such as that in JSD, will still be of practical benefit. Analysis heuristics are standard patterns of reasoning about the requirements. They usually generate questions of clarification or issues relating to completeness. Analysis heuristics can be represented and therefore systematically applied in an issue-based framework. This is illustrated by an issue-based analysis of JSD's domain modeling and functional specification heuristics. They are discussed in the context of the preliminary design of simple embedded systems.

  3. Three-dimensional thermal finite element modeling of lithium-ion battery in thermal abuse application

    NASA Astrophysics Data System (ADS)

    Guo, Guifang; Long, Bo; Cheng, Bo; Zhou, Shiqiong; Xu, Peng; Cao, Binggang

In order to better understand the thermal abuse behavior of high-capacity, high-power lithium-ion batteries for electric vehicle applications, a three-dimensional thermal model has been developed for analyzing the temperature distribution under abuse conditions. The model takes into account the effects of heat generation, internal conduction and convection, and external heat dissipation to predict the temperature distribution in a battery. The three-dimensional model also considers the geometrical features needed to simulate the oven test, which are significant in larger cells for electric vehicle applications. The model predictions are compared to oven test results for VLP 50/62/100S-Fe (3.2 V/55 Ah) LiFePO 4/graphite cells and shown to be in good agreement.
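In one dimension the conduction part of such a model reduces to a standard explicit finite-difference scheme; the sketch below is a generic textbook scheme, not the paper's 3-D model, and all parameters are arbitrary:

```python
def heat_1d(T0, alpha, dx, dt, steps, T_left, T_right):
    """Explicit finite-difference update for 1-D transient conduction,
    dT/dt = alpha * d2T/dx2, with fixed-temperature boundaries."""
    r = alpha * dt / dx ** 2
    assert r <= 0.5, "explicit scheme is unstable for r > 1/2"
    T = list(T0)
    T[0], T[-1] = T_left, T_right
    for _ in range(steps):
        Tn = T[:]
        for i in range(1, len(T) - 1):
            # central second difference in space, forward step in time
            Tn[i] = T[i] + r * (T[i + 1] - 2 * T[i] + T[i - 1])
        T = Tn
    return T
```

The stability constraint r = alpha*dt/dx^2 <= 1/2 is what forces small time steps in explicit schemes, one reason full 3-D thermal models typically use implicit finite element solvers instead.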

  4. Wyoming greater sage-grouse habitat prioritization: A collection of multi-scale seasonal models and geographic information systems land management tools

    USGS Publications Warehouse

    O'Donnell, Michael S.; Aldridge, Cameron L.; Doherty, Kevin E.; Fedy, Bradley C.

    2015-01-01

    We deliver all products described herein as online geographic information system data for visualization and downloading. We outline the data properties for each model and their data inputs, describe the process of selecting appropriate data products for multifarious applications, describe all data products and software, provide newly derived model composites, and discuss how land managers may use the models to inform future sage-grouse studies and potentially refine conservation efforts. The models, software tools, and associated opportunities for novel applications of these products should provide a suite of additional, but not exclusive, tools for assessing Wyoming Greater Sage-grouse habitats, which land managers, conservationists, and scientists can apply to myriad applications.

  5. Remote Sensing Sensors and Applications in Environmental Resources Mapping and Modelling

    PubMed Central

    Melesse, Assefa M.; Weng, Qihao; S.Thenkabail, Prasad; Senay, Gabriel B.

    2007-01-01

The history of remote sensing and the development of different sensors for environmental and natural resources mapping and data acquisition are reviewed and reported. Application examples in urban studies and in hydrological modeling, such as land-cover and floodplain mapping, fractional vegetation cover and impervious surface area mapping, surface energy fluxes, and micro-topography correlation studies, are discussed. The review also discusses the use of remotely sensed rainfall and potential evapotranspiration for estimating a crop water requirement satisfaction index, thereby providing early-warning information for growers. The review is not an exhaustive treatment of remote sensing applications but rather a summary of some important applications in environmental studies and modeling. PMID:28903290
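One quantity mentioned above, fractional vegetation cover, is commonly derived from band ratios such as NDVI. A minimal sketch follows: the NDVI formula is standard, while the endmember values in the cover estimate are illustrative placeholders:

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index from near-infrared and red
    reflectances; ranges over [-1, 1], higher over dense vegetation."""
    return (nir - red) / (nir + red)

def fractional_cover(ndvi_val, ndvi_soil=0.1, ndvi_veg=0.8):
    """Linear-mixing estimate of fractional vegetation cover, clamped to
    [0, 1]; the bare-soil and full-vegetation endmembers are illustrative."""
    f = (ndvi_val - ndvi_soil) / (ndvi_veg - ndvi_soil)
    return min(1.0, max(0.0, f))
```

The linear-mixing estimate simply rescales NDVI between bare-soil and full-vegetation endmembers; in practice the endmembers are calibrated per scene or sensor.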

  6. Multilevel Modeling and Policy Development: Guidelines and Applications to Medical Travel.

    PubMed

    Garcia-Garzon, Eduardo; Zhukovsky, Peter; Haller, Elisa; Plakolm, Sara; Fink, David; Petrova, Dafina; Mahalingam, Vaishali; Menezes, Igor G; Ruggeri, Kai

    2016-01-01

    Medical travel has expanded rapidly in recent years, resulting in new markets and increased access to medical care. Whereas several studies have investigated the motives of individuals seeking healthcare abroad, the conventional analytical approach is limited by substantial caveats. Classical techniques found in the literature cannot provide sufficient insight due to the nested nature of the data generated. The application of adequate analytical techniques, specifically multilevel modeling, is scarce to non-existent in the context of medical travel. This study introduces guidelines for applying multilevel techniques in public health research by presenting an application of multilevel modeling to the analysis of the decision-making patterns of potential medical travelers. Benefits and potential limitations are discussed.
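    The nested structure described above (travelers clustered within countries or regions) is the classic setting for a random-intercept multilevel model. A minimal sketch using `statsmodels` on synthetic data, with hypothetical variable names (`willingness`, `cost`, `country`) not taken from the study:

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Synthetic nested data: 50 respondents in each of 10 countries,
    # with a country-level random intercept on top of an individual effect.
    rng = np.random.default_rng(0)
    n_countries, n_per = 10, 50
    country = np.repeat(np.arange(n_countries), n_per)
    u = rng.normal(0.0, 1.0, n_countries)[country]          # country random intercepts
    cost = rng.normal(0.0, 1.0, n_countries * n_per)
    willingness = 2.0 - 0.5 * cost + u + rng.normal(0.0, 1.0, n_countries * n_per)

    df = pd.DataFrame({"willingness": willingness, "cost": cost, "country": country})

    # Random-intercept model: fixed slope for cost, random intercept per country
    model = smf.mixedlm("willingness ~ cost", df, groups=df["country"]).fit()
    print(model.params["cost"])  # fixed effect, close to the true -0.5
    ```

    A pooled OLS fit on the same data would understate the standard errors, which is exactly the caveat of the "classical techniques" the abstract refers to.
    
    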

  7. A Bridging Opportunities Work-frame to develop mobile applications for clinical decision making

    PubMed Central

    van Rooij, Tibor; Rix, Serena; Moore, James B; Marsh, Sharon

    2015-01-01

    Background: Mobile applications (apps) providing clinical decision support (CDS) may show the greatest promise when created by and for frontline clinicians. Our aim was to create a generic model enabling healthcare providers to direct the development of CDS apps. Methods: We combined Change Management with a three-tier information technology architecture to stimulate CDS app development. Results: A Bridging Opportunities Work-frame model was developed. A test case was used to successfully develop an app. Conclusion: Healthcare providers can re-use this globally applicable model to actively create and manage regional decision support applications to translate evidence-based medicine in the use of emerging medication or novel treatment regimens. PMID:28031883

  8. Multilevel Modeling and Policy Development: Guidelines and Applications to Medical Travel

    PubMed Central

    Garcia-Garzon, Eduardo; Zhukovsky, Peter; Haller, Elisa; Plakolm, Sara; Fink, David; Petrova, Dafina; Mahalingam, Vaishali; Menezes, Igor G.; Ruggeri, Kai

    2016-01-01

    Medical travel has expanded rapidly in recent years, resulting in new markets and increased access to medical care. Whereas several studies have investigated the motives of individuals seeking healthcare abroad, the conventional analytical approach is limited by substantial caveats. Classical techniques found in the literature cannot provide sufficient insight due to the nested nature of the data generated. The application of adequate analytical techniques, specifically multilevel modeling, is scarce to non-existent in the context of medical travel. This study introduces guidelines for applying multilevel techniques in public health research by presenting an application of multilevel modeling to the analysis of the decision-making patterns of potential medical travelers. Benefits and potential limitations are discussed. PMID:27252672

  9. Context-aware workflow management of mobile health applications.

    PubMed

    Salden, Alfons; Poortinga, Remco

    2006-01-01

    We propose a medical application management architecture that allows medical (IT) experts to readily design, develop, and deploy context-aware mobile health (m-health) applications or services. In particular, we elaborate on how our application workflow management architecture enables chaining, coordinating, composing, and adapting context-sensitive medical application components such that critical Quality of Service (QoS) and Quality of Context (QoC) requirements typical for m-health applications or services can be met. This functional architectural support requires learning modules for distilling application-critical selections of attention and anticipation models. These models will help medical experts construct and adjust on-the-fly m-health application workflows and workflow strategies. We illustrate our context-aware workflow management paradigm for an m-health data delivery problem, in which optimal communication network configurations have to be determined.

  10. SU-D-19A-05: The Dosimetric Impact of Using Xoft Axxent® Electronic Brachytherapy Source TG-43 Dosimetry Parameters for Treatment with the Xoft 30 Mm Diameter Vaginal Applicator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simiele, S; Micka, J; Culberson, W

    2014-06-01

    Purpose: A full TG-43 dosimetric characterization has not been performed for the Xoft Axxent® electronic brachytherapy source (Xoft, a subsidiary of iCAD, San Jose, CA) within the Xoft 30 mm diameter vaginal applicator. Currently, dose calculations are performed using the bare-source TG-43 parameters and do not account for the presence of the applicator. This work focuses on determining the difference between the bare-source and source-in-applicator TG-43 parameters. Both the radial dose function (RDF) and polar anisotropy function (PAF) were computationally determined for the source-in-applicator and bare-source models to determine the impact of using the bare-source dosimetry data. Methods: MCNP5 was used to model the source and the Xoft 30 mm diameter vaginal applicator. All simulations were performed using 0.84p and 0.03e cross-section libraries. All models were developed based on specifications provided by Xoft. The applicator is made of a proprietary polymer material, and simulations were performed using the most conservative chemical composition. An F6 collision-kerma tally was used to determine the RDF and PAF values in water at various dwell positions. The RDF values were normalized to 2.0 cm from the source to accommodate the applicator radius. Source-in-applicator results were compared with bare-source results from this work as well as published bare-source results. Results: For a 0 mm source pullback distance, the updated bare-source model and source-in-applicator RDF values differ by 2% at 3 cm and 4% at 5 cm. The largest PAF disagreements were observed at the distal end of the source and applicator, with up to 17% disagreement at 2 cm and 8% at 8 cm. The bare-source model had RDF values within 2.6% of the published TG-43 data and PAF results within 7.2% at 2 cm. Conclusion: Results indicate that notable differences exist between the bare-source and source-in-applicator TG-43 simulated parameters. Xoft Inc. provided partial funding for this work.

  11. Performance Prediction Toolkit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chennupati, Gopinath; Santhi, Nanadakishore; Eidenbenz, Stephen

    The Performance Prediction Toolkit (PPT) is a scalable co-design tool that contains hardware and middleware models, which accept proxy applications as input for runtime prediction. PPT relies on Simian, a parallel discrete event simulation engine in Python or Lua that uses the process concept, where each computing unit (host, node, core) is a Simian entity. Processes perform their task through message exchanges to remain active, sleep, wake up, begin, and end. The PPT hardware model of a compute core (such as a Haswell core) consists of a set of parameters, such as clock speed, memory hierarchy levels, their respective sizes, cache lines, access times for different cache levels, average cycle counts of ALU operations, etc. These parameters are ideally read off a spec sheet or are learned using regression models trained on hardware counter (PAPI) data. The compute core model offers an API to the software model, a function called time_compute(), which takes as input a tasklist. A tasklist is an unordered set of ALU and other CPU-type operations (in particular virtual memory loads and stores). The PPT application model mimics the loop structure of the application and replaces the computational kernels with a call to the hardware model's time_compute() function, giving as input tasklists that model the compute kernel. A PPT application model thus consists of tasklists representing kernels and the higher-level loop structure that we like to think of as pseudo code. The key challenge for the hardware model's time_compute() function is to translate virtual memory accesses into actual cache hierarchy level hits and misses. PPT also contains another CPU-core-level hardware model, the Analytical Memory Model (AMM). The AMM solves this challenge soundly, whereas our previous alternatives explicitly included the L1, L2, L3 hit rates as inputs to the tasklists. Explicit hit rates inevitably only reflect the application modeler's best guess, perhaps informed by a few small test problems using hardware counters; hard-coded hit rates also make the hardware model insensitive to changes in cache sizes. Instead, we use reuse distance distributions in the tasklists. In general, reuse profiles require the application modeler to run a very expensive trace analysis on the real code, which realistically can be done at best for small examples.
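    The tasklist/time_compute() idea described above can be sketched in a few lines of Python. This is a toy illustration only: the class name, parameter values, and tasklist encoding are invented here and are not PPT's actual API.

    ```python
    class CoreModel:
        """Toy compute-core model: parameters stand in for a spec sheet."""
        def __init__(self, clock_ghz=2.3, cycles_per_alu=1.0, mem_access_ns=None):
            self.clock_ns = 1.0 / clock_ghz                # ns per cycle
            self.cycles_per_alu = cycles_per_alu
            # Hypothetical access times per hierarchy level, in ns
            self.mem_access_ns = mem_access_ns or {"L1": 1.1, "L2": 4.0, "DRAM": 90.0}

        def time_compute(self, tasklist):
            """tasklist: unordered ops, e.g. ("alu", count) or ("load", level, count)."""
            t_ns = 0.0
            for op in tasklist:
                if op[0] == "alu":
                    t_ns += op[1] * self.cycles_per_alu * self.clock_ns
                elif op[0] in ("load", "store"):
                    t_ns += op[2] * self.mem_access_ns[op[1]]
            return t_ns * 1e-9  # seconds

    # Application model: keep the loop structure, replace the kernel with
    # a time_compute() call on a tasklist that summarizes the kernel.
    core = CoreModel()
    total = sum(core.time_compute([("alu", 1e6), ("load", "L1", 5e5)])
                for _ in range(10))
    print(total)
    ```

    The real PPT replaces the fixed per-level access times here with reuse distance distributions, for the reasons the abstract gives: hard-coded hit rates are guesses and go stale when cache sizes change.
    
    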

  12. Fuzzy bilevel programming with multiple non-cooperative followers: model, algorithm and application

    NASA Astrophysics Data System (ADS)

    Ke, Hua; Huang, Hu; Ralescu, Dan A.; Wang, Lei

    2016-04-01

    In centralized decision problems, it is not complicated for decision-makers to make modelling technique selections under uncertainty. When a decentralized decision problem is considered, however, choosing appropriate models is no longer easy due to the difficulty in estimating the other decision-makers' inconclusive decision criteria. These decision criteria may vary with different decision-makers because of their particular risk tolerances and management requirements. Considering the general differences among the decision-makers in decentralized systems, we propose a general framework of fuzzy bilevel programming including hybrid models (integrated with different modelling methods at different levels). Specifically, we discuss two of these models which may have wide applications in many fields. Furthermore, we apply the proposed two models to formulate a pricing decision problem in a decentralized supply chain with fuzzy coefficients. In order to solve these models, a hybrid intelligent algorithm integrating fuzzy simulation, neural network, and particle swarm optimization based on a penalty function approach is designed. Some suggestions on the applications of these models are also presented.

  13. Methodology and application of combined watershed and ground-water models in Kansas

    USGS Publications Warehouse

    Sophocleous, M.; Perkins, S.P.

    2000-01-01

    Increased irrigation in Kansas and other regions during the last several decades has caused serious water depletion, making the development of comprehensive strategies and tools to resolve such problems increasingly important. This paper makes the case for an intermediate complexity, quasi-distributed, comprehensive, large-watershed model, which falls between the fully distributed, physically based hydrological modeling system of the type of the SHE model and the lumped, conceptual rainfall-runoff modeling system of the type of the Stanford watershed model. This is achieved by integrating the quasi-distributed watershed model SWAT with the fully-distributed ground-water model MODFLOW. The advantage of this approach is the appreciably smaller input data requirements and the use of readily available data (compared to the fully distributed, physically based models), the statistical handling of watershed heterogeneities by employing the hydrologic-response-unit concept, and the significantly increased flexibility in handling stream-aquifer interactions, distributed well withdrawals, and multiple land uses. The mechanics of integrating the component watershed and ground-water models are outlined, and three real-world management applications of the integrated model from Kansas are briefly presented. Three different aspects of the integrated model are emphasized: (1) management applications of a Decision Support System for the integrated model (Rattlesnake Creek subbasin); (2) alternative conceptual models of spatial heterogeneity related to the presence or absence of an underlying aquifer with shallow or deep water table (Lower Republican River basin); and (3) the general nature of the integrated model linkage by employing a watershed simulator other than SWAT (Wet Walnut Creek basin). 
    These applications demonstrate the practicality and versatility of this relatively simple and conceptually clear approach, making public acceptance of the integrated watershed modeling system much easier. This approach also enhances model calibration and thus the reliability of model results. (C) 2000 Elsevier Science B.V.

  14. Families with Noncompliant Children: Applications of the Systemic Model.

    ERIC Educational Resources Information Center

    Neilans, Thomas H.; And Others

    This paper describes the application of a systems approach model to assessing families with a labeled noncompliant child. The first section describes and comments on the applied methodology for the model. The second section describes the classification of 61 families containing a child labeled by the family as noncompliant. An analysis of data…

  15. SWAT Check: A screening tool to assist users in the identification of potential model application problems

    USDA-ARS?s Scientific Manuscript database

    The Soil and Water Assessment Tool (SWAT) is a basin scale hydrologic model developed by the US Department of Agriculture-Agricultural Research Service. SWAT's broad applicability, user friendly model interfaces, and automatic calibration software have led to a rapid increase in the number of new u...

  16. Model My Watershed: Connecting Students' Conceptual Understanding of Watersheds to Real-World Decision Making

    ERIC Educational Resources Information Center

    Gill, Susan E.; Marcum-Dietrich, Nanette; Becker-Klein, Rachel

    2014-01-01

    The Model My Watershed (MMW) application, and associated curricula, provides students with meaningful opportunities to connect conceptual understanding of watersheds to real-world decision making. The application uses an authentic hydrologic model, TR-55 (developed by the U.S. Natural Resources Conservation Service), and real data applied in…

  17. Modeling Repeatable Events Using Discrete-Time Data: Predicting Marital Dissolution

    ERIC Educational Resources Information Center

    Teachman, Jay

    2011-01-01

    I join two methodologies by illustrating the application of multilevel modeling principles to hazard-rate models with an emphasis on procedures for discrete-time data that contain repeatable events. I demonstrate this application using data taken from the 1995 National Survey of Family Growth (NSFG) to ascertain the relationship between multiple…

  18. The Use of Constructive Modeling and Virtual Simulation in Large-Scale Team Training: A Military Case Study.

    ERIC Educational Resources Information Center

    Andrews, Dee H.; Dineen, Toni; Bell, Herbert H.

    1999-01-01

    Discusses the use of constructive modeling and virtual simulation in team training; describes a military application of constructive modeling, including technology issues and communication protocols; considers possible improvements; and discusses applications in team-learning environments other than military, including industry and education. (LRW)

  19. 77 FR 34359 - Applications for New Awards: Disability and Rehabilitation Research Projects and Centers Program...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-11

    ... Projects and Centers Program; Traumatic Brain Injury Model Systems Centers AGENCY: Office of Special... Brain Injury Model Systems Centers (TBIMS). Notice inviting applications for new awards for fiscal year... 28, 2006 (71 FR 25472). The Traumatic Brain Injury Model Systems Centers priority is from the notice...

  20. Mixture Distribution Latent State-Trait Analysis: Basic Ideas and Applications

    ERIC Educational Resources Information Center

    Courvoisier, Delphine S.; Eid, Michael; Nussbeck, Fridtjof W.

    2007-01-01

    Extensions of latent state-trait models for continuous observed variables to mixture latent state-trait models with and without covariates of change are presented that can separate individuals differing in their occasion-specific variability. An empirical application to the repeated measurement of mood states (N = 501) revealed that a model with 2…

  1. Atmospheric Models for Aeroentry and Aeroassist

    NASA Technical Reports Server (NTRS)

    Justus, C. G.; Duvall, Aleta; Keller, Vernon W.

    2004-01-01

    Eight destinations in the Solar System have sufficient atmosphere for aeroentry, aeroassist, or aerobraking/aerocapture: Venus, Earth, Mars, Jupiter, Saturn, Uranus, and Neptune, plus Saturn's moon Titan. Engineering-level atmospheric models for Earth, Mars, Titan, and Neptune have been developed for use in NASA's systems analysis studies of aerocapture applications. Development has begun on a similar atmospheric model for Venus. An important capability of these models is simulation of quasi-random perturbations for Monte Carlo analyses in developing guidance, navigation, and control algorithms, and for thermal systems design. Characteristics of these atmospheric models are compared, and example applications for aerocapture are presented. Recent Titan atmospheric model updates are discussed, in anticipation of applications for trajectory and atmospheric reconstruction of the Huygens Probe entry at Titan. Recent and planned updates to the Mars atmospheric model, in support of future Mars aerocapture systems analysis studies, are also presented.

  2. Rasmussen's legacy: A paradigm change in engineering for safety.

    PubMed

    Leveson, Nancy G

    2017-03-01

    This paper describes three applications of Rasmussen's ideas to systems engineering practice. The first is the application of the abstraction hierarchy to engineering specifications, particularly requirements specification. The second is the use of Rasmussen's ideas in safety modeling and analysis to create a new, more powerful type of accident causation model that extends traditional models to better handle human-operated, software-intensive, sociotechnical systems. Because this new model has a formal, mathematical foundation built on systems theory (as did Rasmussen's original model), new modeling and analysis tools become possible. The third application is to engineering hazard analysis. Engineers have traditionally either omitted humans from consideration in system hazard analysis or treated them rather superficially, for example by assuming that they behave randomly. Applying Rasmussen's model of human error to a powerful new hazard analysis technique allows human behavior to be included in engineering hazard analysis. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. Probabilistic choice models in health-state valuation research: background, theories, assumptions and applications.

    PubMed

    Arons, Alexander M M; Krabbe, Paul F M

    2013-02-01

    Interest is rising in measuring subjective health outcomes, such as treatment outcomes that are not directly quantifiable (functional disability, symptoms, complaints, side effects and health-related quality of life). Health economists in particular have applied probabilistic choice models in the area of health evaluation. They increasingly use discrete choice models based on random utility theory to derive values for healthcare goods or services. Recent attempts have been made to use discrete choice models as an alternative method to derive values for health states. In this article, various probabilistic choice models are described according to their underlying theory. A historical overview traces their development and applications in diverse fields. The discussion highlights some theoretical and technical aspects of the choice models and their similarities and dissimilarities. The objective of the article is to elucidate the position of each model and its applications for health-state valuation.
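    The random utility theory mentioned above leads, under standard Gumbel error assumptions, to the multinomial logit form P(i) = exp(V_i) / Σ_j exp(V_j). A minimal sketch (the utility values are made up for illustration):

    ```python
    import numpy as np

    def choice_probabilities(utilities):
        """Multinomial logit choice probabilities from systematic utilities V_i."""
        v = np.asarray(utilities, dtype=float)
        e = np.exp(v - v.max())  # subtract max for numerical stability
        return e / e.sum()

    # Hypothetical systematic utilities for three health states
    p = choice_probabilities([1.2, 0.4, -0.3])
    print(p)
    ```

    The alternative with the highest systematic utility receives the highest choice probability, and the probabilities sum to one by construction.
    
    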

  4. Generating Performance Models for Irregular Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Friese, Ryan D.; Tallent, Nathan R.; Vishnu, Abhinav

    2017-05-30

    Many applications have irregular behavior --- non-uniform input data, input-dependent solvers, irregular memory accesses, unbiased branches --- that cannot be captured using today's automated performance modeling techniques. We describe new hierarchical critical path analyses for the Palm model generation tool. To create a model's structure, we capture tasks along representative MPI critical paths. We create a histogram of critical tasks with parameterized task arguments and instance counts. To model each task, we identify hot instruction-level sub-paths and model each sub-path based on data flow, instruction scheduling, and data locality. We describe application models that generate accurate predictions for strong scaling when varying CPU speed, cache speed, memory speed, and architecture. We present results for the Sweep3D neutron transport benchmark; PageRank on multiple graphs; Support Vector Machine with pruning; and PFLOTRAN's reactive flow/transport solver with domain-induced load imbalance.

  5. Plant growth and architectural modelling and its applications

    PubMed Central

    Guo, Yan; Fourcaud, Thierry; Jaeger, Marc; Zhang, Xiaopeng; Li, Baoguo

    2011-01-01

    Over the last decade, a growing number of scientists around the world have invested in research on plant growth and architectural modelling and applications (often abbreviated to plant modelling and applications, PMA). By combining physical and biological processes, spatially explicit models have shown their ability to help in understanding plant–environment interactions. This Special Issue on plant growth modelling presents new information within this topic, which are summarized in this preface. Research results for a variety of plant species growing in the field, in greenhouses and in natural environments are presented. Various models and simulation platforms are developed in this field of research, opening new features to a wider community of researchers and end users. New modelling technologies relating to the structure and function of plant shoots and root systems are explored from the cellular to the whole-plant and plant-community levels. PMID:21638797

  6. Prediction of 1-octanol solubilities using data from the Open Notebook Science Challenge.

    PubMed

    Buonaiuto, Michael A; Lang, Andrew S I D

    2015-12-01

    1-Octanol solubility is important in a variety of applications involving pharmacology and environmental chemistry. Current models are linear in nature and often require foreknowledge of either melting point or aqueous solubility. Here we extend the range of applicability of 1-octanol solubility models by creating a random forest model that can predict 1-octanol solubilities directly from structure. We created a random forest model using CDK descriptors that has an out-of-bag (OOB) R² value of 0.66 and an OOB mean squared error of 0.34. The model has been deployed for general use as a Shiny application. The 1-octanol solubility model provides reasonably accurate predictions of the 1-octanol solubility of organic solutes directly from structure. The model was developed under Open Notebook Science conditions, which makes it open, reproducible, and as useful as possible.
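    The out-of-bag (OOB) evaluation described above is a standard feature of random forests: each tree is scored on the samples left out of its bootstrap. A minimal scikit-learn sketch on synthetic data (the features here only stand in for CDK descriptors; this is not the study's model):

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    # Synthetic regression problem: 500 "molecules" x 10 stand-in descriptors
    rng = np.random.default_rng(42)
    X = rng.normal(size=(500, 10))
    y = X[:, 0] - 0.5 * X[:, 1] + rng.normal(scale=0.3, size=500)

    # oob_score=True scores each tree on its out-of-bootstrap samples,
    # giving an internal R^2 estimate without a held-out test set
    rf = RandomForestRegressor(n_estimators=300, oob_score=True, random_state=0)
    rf.fit(X, y)
    print(round(rf.oob_score_, 2))  # OOB R^2
    ```

    The OOB R² serves the same role as the 0.66 figure the abstract reports: a generalization estimate obtained from the training data alone.
    
    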

  7. Assessment of IT solutions used in the Hungarian income tax microsimulation system

    NASA Astrophysics Data System (ADS)

    Molnar, I.; Hardhienata, S.

    2017-01-01

    This paper focuses on the use of information technology (IT) in diverse microsimulation studies and presents state-of-the-art solutions in the traditional application field of personal income tax simulation. The aim of the paper is to promote solutions which can improve the efficiency and quality of microsimulation model implementation, assess their applicability, and help shift attention from microsimulation model implementation and data analysis towards experiment design and model use. First, the authors briefly discuss the relevant characteristics of the microsimulation application field and the managerial decision-making problem. After examination of the salient problems, advanced IT solutions, such as meta-databases and service-oriented architecture, are presented. The authors show how selected technologies can be applied to support data-driven, behavior-driven, and even agent-based personal income tax microsimulation model development. Finally, examples are presented and references made to the Hungarian Income Tax Simulator (HITS) models and their results. The paper concludes with a summary of the IT assessment and application-related author remarks dedicated to an Indonesian Income Tax Microsimulation Model.

  8. Applicability of Kinematic and Diffusive models for mud-flows: a steady state analysis

    NASA Astrophysics Data System (ADS)

    Di Cristo, Cristiana; Iervolino, Michele; Vacca, Andrea

    2018-04-01

    The paper investigates the applicability of Kinematic and Diffusive Wave models for mud-flows with a power-law shear-thinning rheology. In analogy with a well-known approach for turbulent clear-water flows, the study compares the steady flow depth profiles predicted by the approximated models with those of the Full Dynamic Wave model. For all the models, and assuming an infinitely wide channel, the analytical solution of the flow depth profiles, in terms of hypergeometric functions, is derived. The accuracy of the approximated models is assessed by computing the average, along the channel length, of the errors, for several values of the Froude and kinematic wave numbers. Assuming an error threshold of 5%, the applicability conditions of the two approximations have been identified for several values of the power-law exponent, showing a crucial role of the rheology. The comparison with the clear-water results indicates that applicability criteria for clear-water flows do not apply to shear-thinning fluids, potentially leading to an incorrect use of the approximated models if the rheology is not properly accounted for.

  9. Rare events modeling with support vector machine: Application to forecasting large-amplitude geomagnetic substorms and extreme events in financial markets.

    NASA Astrophysics Data System (ADS)

    Gavrishchaka, V. V.; Ganguli, S. B.

    2001-12-01

    Reliable forecasting of rare events in a complex dynamical system is a challenging problem that is important for many practical applications. Due to the nature of rare events, the data set available for construction of a statistical and/or machine learning model is often very limited and incomplete. Therefore many widely used approaches, including such robust algorithms as neural networks, can easily become inadequate for rare event prediction. Moreover, in many practical cases models with high-dimensional inputs are required. This limits applications of the existing rare event modeling techniques (e.g., extreme value theory) that focus on univariate cases, since these approaches are not easily extended to multivariate cases. Support vector machine (SVM) is a machine learning system that can provide an optimal generalization using very limited and incomplete training data sets and can efficiently handle high-dimensional data. These features may allow SVM to be used to model rare events in some applications. We have applied an SVM-based system to the problem of large-amplitude substorm prediction and extreme event forecasting in stock and currency exchange markets. Encouraging preliminary results will be presented, and other possible applications of the system will be discussed.
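    A common practical difficulty with rare events is class imbalance: a plain classifier can score well by always predicting the majority class. A minimal scikit-learn sketch of one standard mitigation, re-weighting the SVM's penalty by class frequency (the data and settings here are synthetic illustrations, not the authors' substorm or market models):

    ```python
    import numpy as np
    from sklearn.svm import SVC

    # 950 "common" events near the origin, 50 "rare" events shifted away:
    # a heavily imbalanced, high-dimensional toy data set
    rng = np.random.default_rng(1)
    X_common = rng.normal(0.0, 1.0, size=(950, 5))
    X_rare = rng.normal(2.5, 1.0, size=(50, 5))
    X = np.vstack([X_common, X_rare])
    y = np.array([0] * 950 + [1] * 50)

    # class_weight="balanced" scales the misclassification penalty inversely
    # to class frequency so the rare class is not ignored
    clf = SVC(kernel="rbf", class_weight="balanced").fit(X, y)
    recall = clf.predict(X_rare).mean()  # in-sample recall on the rare class
    print(recall)
    ```

    In a real forecasting setting the recall would of course be evaluated on held-out data; the point here is only the re-weighting mechanism.
    
    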

  10. Semantic Web Infrastructure Supporting NextFrAMES Modeling Platform

    NASA Astrophysics Data System (ADS)

    Lakhankar, T.; Fekete, B. M.; Vörösmarty, C. J.

    2008-12-01

    Emerging modeling frameworks offer modelers new ways to develop model applications by providing a wide range of software components to handle common modeling tasks such as managing space and time, distributing computational tasks in a parallel processing environment, performing input/output, and providing diagnostic facilities. NextFrAMES, the next-generation update to the Framework for Aquatic Modeling of the Earth System, originally developed at the University of New Hampshire and currently hosted at The City College of New York, takes a step further by hiding most of these services from the modeler behind a platform-agnostic modeling platform that allows scientists to focus on the implementation of scientific concepts, through a new modeling markup language and a minimalist application programming interface that provides the means to implement model processes. At the core of the NextFrAMES modeling platform is a run-time engine that interprets the modeling markup language, loads the module plugins, establishes the model I/O, and executes the model defined by the modeling XML and the accompanying plugins. The current implementation of the run-time engine is designed for single-processor or symmetric multiprocessing (SMP) systems, but future implementations of the run-time engine optimized for different hardware architectures are anticipated. The modeling XML and the accompanying plugins define the model structure and the computational processes in a highly abstract manner, which is not only suitable for the run-time engine but also has the potential to integrate into semantic web infrastructure, where intelligent parsers can extract information about the model configurations, such as input/output requirements, applicable space and time scales, and underlying modeling processes.
    The NextFrAMES run-time engine itself is also designed to tap into web-enabled data services directly, so it can be incorporated into a complex workflow to implement end-to-end applications from observation to the delivery of highly aggregated information. Our presentation will discuss the web services, ranging from OPeNDAP and WaterOneFlow data services to metadata provided through catalog services, that could serve NextFrAMES modeling applications. We will also discuss the support infrastructure needed to streamline the integration of NextFrAMES into an end-to-end application to deliver highly processed information to end users. The end-to-end application will be demonstrated through examples from the State of the Global Water System effort, which builds on data services provided through WMO's Global Terrestrial Network for Hydrology to deliver water-resources-related information to policy makers for better water management. Key components of this E2E system are promoted as Community of Practice examples for the Global Observing System of Systems; therefore the State of the Global Water System can be viewed as a test case for the interoperability of the incorporated web service components.

  11. Evaluation of DeNitrification DeComposition Model to Estimate Ammonia Fluxes from Chemical Fertilizer Application

    NASA Astrophysics Data System (ADS)

    Balasubramanian, S.; Nelson, A. J.; Koloutsou-Vakakis, S.; Lin, J.; Myles, L.; Rood, M. J.

    2016-12-01

    Biogeochemical models such as DeNitrification DeComposition (DNDC) are used to model greenhouse and other trace gas fluxes (e.g., ammonia (NH3)) from agricultural ecosystems. NH3 is of interest to air quality because it is a precursor to ambient particulate matter. NH3 fluxes from chemical fertilizer application are uncertain due to their dependence on local weather, soil properties, and farm nitrogen management practices. DNDC can be advantageously implemented to model the underlying spatial and temporal trends to support air quality modeling. However, such implementation requires a detailed evaluation of model predictions and model behavior. This is the first study to assess DNDC predictions of NH3 fluxes to/from the atmosphere from chemical fertilizer application during an entire crop growing season in the United States. Relaxed eddy accumulation (REA) measurements over corn in Central Illinois in 2014 were used to evaluate the magnitude and trends of modeled NH3 fluxes. DNDC was able to replicate both the magnitude and trends of measured NH3 fluxes, with greater accuracy during the initial 33 days after application, when NH3 was mostly emitted to the atmosphere. However, poorer performance was observed when depositional fluxes were measured. Sensitivity analysis using Monte Carlo simulations indicated that modeled NH3 fluxes were most sensitive to air temperature and precipitation inputs; to soil organic carbon, field capacity, and pH; and to fertilizer loading rate, timing, application depth, and tilling date. By constraining these inputs for conditions in Central Illinois, the uncertainty in annual NH3 fluxes was estimated to range from -87% to 61%. Results from this study provide insight to further improve DNDC predictions and inform efforts for upscaling site predictions to the regional scale for the development of emission inventories for air quality modeling.
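    The Monte Carlo sensitivity screening described above can be illustrated with a toy surrogate model (the linear flux response and its coefficients are hypothetical stand-ins for DNDC, chosen only to show the sampling-and-ranking pattern):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000

# Hypothetical surrogate for an NH3 flux response (not the DNDC model):
# flux rises with temperature and fertilizer rate, falls with soil pH.
temp = rng.normal(20.0, 5.0, n)      # air temperature, deg C
rate = rng.uniform(50.0, 200.0, n)   # fertilizer loading, kg N/ha
ph = rng.normal(6.5, 0.5, n)         # soil pH

flux = 0.8 * temp + 0.05 * rate - 2.0 * ph + rng.normal(0.0, 1.0, n)

# Rank each input by the magnitude of its correlation with the output,
# a simple proxy for the Monte Carlo sensitivity screening in the abstract.
inputs = {"temperature": temp, "fertilizer rate": rate, "soil pH": ph}
sensitivity = {name: abs(np.corrcoef(x, flux)[0, 1]) for name, x in inputs.items()}
for name, s in sorted(sensitivity.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {s:.2f}")
```

    A full analysis would use variance-based indices rather than correlations, but the workflow, sampling inputs from their plausible ranges and ranking their influence on the output, is the same.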

  12. Web-based applications for building, managing and analysing kinetic models of biological systems.

    PubMed

    Lee, Dong-Yup; Saha, Rajib; Yusufi, Faraaz Noor Khan; Park, Wonjun; Karimi, Iftekhar A

    2009-01-01

    Mathematical modelling and computational analysis play an essential role in improving our capability to elucidate the functions and characteristics of complex biological systems such as metabolic, regulatory and cell signalling pathways. The modelling and concomitant simulation render it possible to predict the cellular behaviour of systems under various genetically and/or environmentally perturbed conditions. This motivates systems biologists/bioengineers/bioinformaticians to develop new tools and applications, allowing non-experts to easily conduct such modelling and analysis. However, among a multitude of systems biology tools developed to date, only a handful of projects have adopted a web-based approach to kinetic modelling. In this report, we evaluate the capabilities and characteristics of current web-based tools in systems biology and identify desirable features, limitations and bottlenecks for further improvements in terms of usability and functionality. A short discussion on software architecture issues involved in web-based applications and the approaches taken by existing tools is included for those interested in developing their own simulation applications.

  13. A new service-oriented grid-based method for AIoT application and implementation

    NASA Astrophysics Data System (ADS)

    Zou, Yiqin; Quan, Li

    2017-07-01

    The traditional three-layer Internet of Things (IoT) model, which includes a physical perception layer, an information transfer layer and a service application layer, cannot fully express the complexity and diversity of the agricultural engineering area. It is hard to categorize, organize and manage agricultural things with these three layers. Based on the above requirements, we propose a new service-oriented grid-based method to set up and build the agricultural IoT. Considering the heterogeneity, limitations, transparency and leveling attributes of agricultural things, we propose an abstract model for all agricultural resources. This model is service-oriented and expressed with the Open Grid Services Architecture (OGSA). Information and data about agricultural things are described and encapsulated using XML in this model. Every agricultural engineering application provides service by enabling one application node in this service-oriented grid. The description of the Web Service Resource Framework (WSRF)-based Agricultural Internet of Things (AIoT) and the encapsulation method for resource management in this model are also discussed in this paper.
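    As a rough illustration of describing and encapsulating an agricultural thing in XML, a minimal sketch using Python's standard library (all element names and values are hypothetical; the paper does not specify this schema):

```python
import xml.etree.ElementTree as ET

# Hypothetical resource description; the tag names are illustrative only.
resource = ET.Element("AgriculturalResource", id="sensor-042")
ET.SubElement(resource, "Type").text = "SoilMoistureSensor"
ET.SubElement(resource, "Location").text = "field-7"
state = ET.SubElement(resource, "State")
ET.SubElement(state, "Moisture", unit="percent").text = "23.5"

xml_text = ET.tostring(resource, encoding="unicode")
print(xml_text)
```

    In an OGSA/WSRF setting such a document would be exposed as a stateful web service resource; here it only shows the encapsulation step.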

  14. Planning for Downtown Circulation Systems. Volume 3. Appendices.

    DOT National Transportation Integrated Search

    1983-10-01

    This volume contains worksheets for estimating circulator patronage, costs, revenues and travel impacts, detailed discussions of estimation and application procedures for the demand models developed, and a case study of the models' application using ...

  15. Qualitative model-based diagnosis using possibility theory

    NASA Technical Reports Server (NTRS)

    Joslyn, Cliff

    1994-01-01

    The potential for the use of possibility in the qualitative model-based diagnosis of spacecraft systems is described. The first sections of the paper briefly introduce the Model-Based Diagnostic (MBD) approach to spacecraft fault diagnosis; Qualitative Modeling (QM) methodologies; and the concepts of possibilistic modeling in the context of Generalized Information Theory (GIT). Then the necessary conditions for the applicability of possibilistic methods to qualitative MBD, and a number of potential directions for such an application, are described.

  16. Numerical model (switchable/dual model) of the human head for rigid body and finite elements applications.

    PubMed

    Tabacu, Stefan

    2015-01-01

    In this paper, a methodology for the development and validation of a numerical model of the human head using generic procedures is presented. All required steps, from model generation through model validation to applications, are discussed. The proposed model may be considered a dual one due to its capability to switch from a deformable to a rigid body according to the application's requirements. The first step is to generate the numerical model of the human head using geometry files or medical images. The stiffness and damping required for the elastic connection used in the rigid body model are identified by performing a natural frequency analysis. The presented applications for model validation are related to impact analysis. The first case reproduces Nahum's experiments (Nahum and Smith 1970); pressure data are evaluated and a pressure map is generated using the results from discrete elements. In the second case, the relative displacement between the brain and the skull is evaluated against Hardy's experiments (Hardy WH, Foster CD, Mason MJ, Yang KH, King A, Tashman S. 2001. Investigation of head injury mechanisms using neutral density technology and high-speed biplanar X-ray. Stapp Car Crash J. 45:337-368, SAE Paper 2001-22-0016). The main objective is to validate the rigid model as a quick and versatile tool for acquiring the input data for specific brain analyses.

  17. An Open Source Simulation Model for Soil and Sediment Bioturbation

    PubMed Central

    Schiffers, Katja; Teal, Lorna Rachel; Travis, Justin Mark John; Solan, Martin

    2011-01-01

    Bioturbation is one of the most widespread forms of ecological engineering and has significant implications for the structure and functioning of ecosystems, yet our understanding of the processes involved in biotic mixing remains incomplete. One reason is that, despite their value and utility, most mathematical models currently applied to bioturbation data tend to neglect aspects of the natural complexity of bioturbation in favour of mathematical simplicity. At the same time, the abstract nature of these approaches limits the application of such models to a limited range of users. Here, we contend that a movement towards process-based modelling can improve both the representation of the mechanistic basis of bioturbation and the intuitiveness of modelling approaches. In support of this initiative, we present an open source modelling framework that explicitly simulates particle displacement and a worked example to facilitate application and further development. The framework combines the advantages of rule-based lattice models with the application of parameterisable probability density functions to generate mixing on the lattice. Model parameters can be fitted by experimental data and describe particle displacement at the spatial and temporal scales at which bioturbation data is routinely collected. By using the same model structure across species, but generating species-specific parameters, a generic understanding of species-specific bioturbation behaviour can be achieved. An application to a case study and comparison with a commonly used model attest the predictive power of the approach. PMID:22162997
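    The framework's core idea, rule-based lattice mixing driven by a parameterisable displacement distribution, can be sketched in a few lines (a 1-D sediment column with a Gaussian displacement kernel; the species parameters are hypothetical, and this is not the published framework itself):

```python
import numpy as np

rng = np.random.default_rng(2)

# 1-D sediment column: tracer particles concentrated in the surface cell.
depth_cells = 30
tracer = np.zeros(depth_cells)
tracer[0] = 1000.0

# Species-specific displacement kernel: each particle moves a distance
# drawn from a discretised Gaussian per step (sigma is hypothetical and
# would be fitted to experimental profile data).
sigma = 1.5
steps = 200

for _ in range(steps):
    moved = np.zeros_like(tracer)
    for cell, amount in enumerate(tracer):
        if amount == 0:
            continue
        # Draw one displacement per particle in this cell.
        jumps = np.rint(rng.normal(0.0, sigma, int(amount))).astype(int)
        targets = np.clip(cell + jumps, 0, depth_cells - 1)  # reflecting walls
        np.add.at(moved, targets, 1.0)
    tracer = moved

# Mass is conserved while the tracer spreads down the column.
print(tracer.sum(), tracer[0])
```

    Fitting sigma (or a non-Gaussian kernel) per species reproduces the paper's idea of a shared model structure with species-specific parameters.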

  18. An open source simulation model for soil and sediment bioturbation.

    PubMed

    Schiffers, Katja; Teal, Lorna Rachel; Travis, Justin Mark John; Solan, Martin

    2011-01-01

    Bioturbation is one of the most widespread forms of ecological engineering and has significant implications for the structure and functioning of ecosystems, yet our understanding of the processes involved in biotic mixing remains incomplete. One reason is that, despite their value and utility, most mathematical models currently applied to bioturbation data tend to neglect aspects of the natural complexity of bioturbation in favour of mathematical simplicity. At the same time, the abstract nature of these approaches limits the application of such models to a limited range of users. Here, we contend that a movement towards process-based modelling can improve both the representation of the mechanistic basis of bioturbation and the intuitiveness of modelling approaches. In support of this initiative, we present an open source modelling framework that explicitly simulates particle displacement and a worked example to facilitate application and further development. The framework combines the advantages of rule-based lattice models with the application of parameterisable probability density functions to generate mixing on the lattice. Model parameters can be fitted by experimental data and describe particle displacement at the spatial and temporal scales at which bioturbation data is routinely collected. By using the same model structure across species, but generating species-specific parameters, a generic understanding of species-specific bioturbation behaviour can be achieved. An application to a case study and comparison with a commonly used model attest the predictive power of the approach.

  19. Update: Validation, Edits, and Application Processing. Phase II and Error-Prone Model Report.

    ERIC Educational Resources Information Center

    Gray, Susan; And Others

    An update to the Validation, Edits, and Application Processing and Error-Prone Model Report (Section 1, July 3, 1980) is presented. The objective is to present the most current data obtained from the June 1980 Basic Educational Opportunity Grant applicant and recipient files and to determine whether the findings reported in Section 1 of the July…

  20. New agrophysics divisions: application of ANFIS, fuzzy indicator modeling, physic-technical bases of plant breeding, and materials based on humic acids (review)

    USDA-ARS?s Scientific Manuscript database

    This work is devoted to review the new scientific divisions that emerged in agrophysics in the last 10-15 years. Among them are the following: 1) application of Adaptive Neuro-Fuzzy Inference System (ANFIS), 2) development and application of fuzzy indicator modeling, 3) agrophysical and physic-tech...

  1. Description and availability of the SMARTS spectral model for photovoltaic applications

    NASA Astrophysics Data System (ADS)

    Myers, Daryl R.; Gueymard, Christian A.

    2004-11-01

    The limited spectral response range of photovoltaic (PV) devices requires that device performance be characterized with respect to widely varying terrestrial solar spectra. The FORTRAN code "Simple Model for Atmospheric Transmission of Sunshine" (SMARTS) was developed for various clear-sky solar renewable energy applications. The model is partly based on parameterizations of transmittance functions in the MODTRAN/LOWTRAN band model family of radiative transfer codes. SMARTS computes spectra with a resolution of 0.5 nanometers (nm) below 400 nm, 1.0 nm from 400 nm to 1700 nm, and 5 nm from 1700 nm to 4000 nm. Fewer than 20 input parameters are required to compute spectral irradiance distributions, including spectral direct beam, total, and diffuse hemispherical radiation, and up to 30 other spectral parameters. A spreadsheet-based graphical user interface can be used to simplify the construction of input files for the model. The model is the basis for new terrestrial reference spectra developed by the American Society for Testing and Materials (ASTM) for photovoltaic and materials degradation applications. We describe the model's accuracy, functionality, and the availability of source and executable code. Applications to PV rating and efficiency and the combined effects of spectral selectivity and varying atmospheric conditions are briefly discussed.

  2. Application of System Operational Effectiveness Methodology to Space Launch Vehicle Development and Operations

    NASA Technical Reports Server (NTRS)

    Watson, Michael D.; Kelley, Gary W.

    2012-01-01

    The Department of Defense (DoD) defined System Operational Effectiveness (SOE) model provides an exceptional framework for an affordable approach to the development and operation of space launch vehicles and their supporting infrastructure. The SOE model provides a focal point from which to direct and measure technical effectiveness and process efficiencies of space launch vehicles. The application of the SOE model to a space launch vehicle's development and operation effort leads to very specific approaches and measures that require consideration during the design phase. This paper provides a mapping of the SOE model to the development of space launch vehicles for human exploration by addressing the SOE model key points of measurement including System Performance, System Availability, Technical Effectiveness, Process Efficiency, System Effectiveness, Life Cycle Cost, and Affordable Operational Effectiveness. In addition, the application of the SOE model to the launch vehicle development process is defined providing the unique aspects of space launch vehicle production and operations in lieu of the traditional broader SOE context that examines large quantities of fielded systems. The tailoring and application of the SOE model to space launch vehicles provides some key insights into the operational design drivers, capability phasing, and operational support systems.

  3. W3MAMCAT: a world wide web based tool for mammillary and catenary compartmental modeling and expert system distinguishability.

    PubMed

    Russell, Solomon; Distefano, Joseph J

    2006-07-01

    W3MAMCAT is a new web-based and interactive system for building and quantifying the parameters or parameter ranges of n-compartment mammillary and catenary model structures, with input and output in the first compartment, from unstructured multiexponential (sum-of-n-exponentials) models. It handles unidentifiable as well as identifiable models and, as such, provides finite parameter interval solutions for unidentifiable models, whereas direct parameter search programs typically do not. It also tutorially develops the theory of model distinguishability for same-order mammillary versus catenary models, as did its desktop application predecessor MAMCAT+. This includes expert system analysis for distinguishing mammillary from catenary structures, given input and output in similarly numbered compartments. W3MAMCAT provides for universal deployment via the internet and enhanced application error checking. It uses supported Microsoft technologies to form an extensible application framework for maintaining a stable and easily updatable application. Most importantly, anyone, anywhere, is welcome to access it over the internet using Internet Explorer 6.0 for their teaching or research needs. It is available on the Biocybernetics Laboratory website at UCLA: www.biocyb.cs.ucla.edu.

  4. The Power Prior: Theory and Applications

    PubMed Central

    Ibrahim, Joseph G.; Chen, Ming-Hui; Gwon, Yeongjin; Chen, Fang

    2015-01-01

    The power prior has been widely used in many applications covering a large number of disciplines. The power prior is intended to be an informative prior constructed from historical data. It has been used in clinical trials, genetics, health care, psychology, environmental health, engineering, economics, and business. It has also been applied for a wide variety of models and settings, both in the experimental design and analysis contexts. In this review article, we give an A to Z exposition of the power prior and its applications to date. We review its theoretical properties, variations in its formulation, statistical contexts for which it has been used, applications, and its advantages over other informative priors. We review models for which it has been used, including generalized linear models, survival models, and random effects models. Statistical areas where the power prior has been used include model selection, experimental design, hierarchical modeling, and conjugate priors. Frequentist properties of power priors in posterior inference are established and a simulation study is conducted to further examine the empirical performance of the posterior estimates with power priors. Real data analyses are given illustrating the power prior as well as the use of the power prior in the Bayesian design of clinical trials. PMID:26346180
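    For reference, the standard formulation of the power prior reviewed here combines an initial prior $\pi_0(\theta)$ with the likelihood of the historical data $D_0$ raised to a discounting power $a_0$:

```latex
\pi(\theta \mid D_0, a_0) \;\propto\; L(\theta \mid D_0)^{a_0}\, \pi_0(\theta),
\qquad 0 \le a_0 \le 1 .
```

    Setting $a_0 = 0$ discards the historical data entirely, while $a_0 = 1$ weights $D_0$ as heavily as current data; intermediate values give the partial borrowing that makes the prior useful in the clinical trial designs discussed in the article.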

  5. Nonlinear Constitutive Relations for High Temperature Application, 1984

    NASA Technical Reports Server (NTRS)

    1985-01-01

    Nonlinear constitutive relations for high temperature applications were discussed. The state of the art in nonlinear constitutive modeling of high temperature materials was reviewed and the need for future research and development efforts in this area was identified. Considerable research efforts are urgently needed in the development of nonlinear constitutive relations for high temperature applications prompted by recent advances in high temperature materials technology and new demands on material and component performance. Topics discussed include: constitutive modeling, numerical methods, material testing, and structural applications.

  6. A Comparative Study of Multi-material Data Structures for Computational Physics Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garimella, Rao Veerabhadra; Robey, Robert W.

    The data structures used to represent the multi-material state of a computational physics application can have a drastic impact on the performance of the application. We look at efficient data structures for sparse applications where there may be many materials, but only one or few in most computational cells. We develop simple performance models for use in selecting possible data structures and programming patterns. We verify the analytic models of performance through a small test program of the representative cases.
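    One common pattern for the sparse case the abstract describes, where most cells hold a single material, is a packed cell-centric layout with per-cell offsets into a compressed material array. The sketch below is illustrative only and is not taken from the report:

```python
# Compressed multi-material layout: most cells are pure, so we store
# per-cell (start, count) offsets into packed material/fraction arrays
# instead of a dense cells-by-materials table.

# Three cells: cells 0 and 2 are pure, cell 1 is mixed (two materials).
cell_start = [0, 1, 3]
cell_count = [1, 2, 1]
mat_id = [7, 7, 3, 3]            # material id in each packed slot
vol_frac = [1.0, 0.6, 0.4, 1.0]  # volume fraction per slot

def cell_materials(c):
    """Return (material id, volume fraction) pairs for cell c."""
    s, n = cell_start[c], cell_count[c]
    return list(zip(mat_id[s:s + n], vol_frac[s:s + n]))

print(cell_materials(1))
```

    Storage then scales with the number of occupied (cell, material) pairs rather than with cells times materials, which is the efficiency the performance models in the report are meant to compare.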

  7. Nonparametric model validations for hidden Markov models with applications in financial econometrics.

    PubMed

    Zhao, Zhibiao

    2011-06-01

    We address the nonparametric model validation problem for hidden Markov models with partially observable variables and hidden states. We achieve this goal by constructing a nonparametric simultaneous confidence envelope for transition density function of the observable variables and checking whether the parametric density estimate is contained within such an envelope. Our specification test procedure is motivated by a functional connection between the transition density of the observable variables and the Markov transition kernel of the hidden states. Our approach is applicable for continuous time diffusion models, stochastic volatility models, nonlinear time series models, and models with market microstructure noise.

  8. Numerical study of turbulence-influence mechanism on arc characteristics in an air direct current circuit breaker

    NASA Astrophysics Data System (ADS)

    Wu, Mingliang; Yang, Fei; Rong, Mingzhe; Wu, Yi; Qi, Yang; Cui, Yufei; Liu, Zirui; Guo, Anxiang

    2016-04-01

    This paper focuses on the numerical investigation of arc characteristics in an air direct current circuit breaker (air DCCB). Using magneto-hydrodynamics (MHD) theory, a 3D laminar model and a turbulence model are constructed and calculated. The standard k-epsilon model is utilized to account for the turbulence effect in the arc chamber of the DCCB. Several important phenomena are found: the arc column in the turbulence-model case is more extensive, moves much more slowly than its counterpart in the laminar-model case, and shows stagnation at the entrance of the chamber, unlike in the laminar-model case. Moreover, the arc voltage in the turbulence-model case is much lower than in the laminar-model case. However, the turbulence-model results agree much better with the DC breaking experiments than the laminar-model results do, which contradicts previous conclusions from arc research on both the low-voltage circuit breaker and the sulfur hexafluoride (SF6) nozzle. First, previous air-arc research on the low-voltage circuit breaker assumed that the air plasma inside the chamber is in a laminar state, and the laminar-model application gave quite satisfactory results compared with experiments, whereas in this paper the laminar-model application performs badly. Second, the turbulence-model application in arc research on the SF6 nozzle performs much better and gives a higher arc voltage than the laminar-model application does, whereas in this paper the turbulence-model application predicts a lower arc voltage than the laminar-model application does. Based on a detailed analysis of the simulation results, the mechanism behind these phenomena is revealed. The transport coefficients are strongly changed by turbulence, which enhances arc diffusion and makes the arc volume much larger. Consequently, the arc appearance and the distribution of Lorentz force in the turbulence-model case substantially differ from those in the laminar-model case, and the motion of the arc in the turbulence-model case is slowed relative to the laminar-model case. Moreover, the more extensive arc column in the turbulence-model case reduces the total arc resistance, which results in a lower arc voltage that is more consistent with the experimental results than the arc voltage in the laminar-model case. Therefore, the air plasma inside this air DCCB is believed to be in a turbulent state, and the turbulence model is more suitable than the laminar model for arc simulation of this kind of air DCCB.

  9. Numerical study of turbulence-influence mechanism on arc characteristics in an air direct current circuit breaker

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Mingliang; Yang, Fei, E-mail: yfei2007@mail.xjtu.edu.cn; Rong, Mingzhe

    This paper focuses on the numerical investigation of arc characteristics in an air direct current circuit breaker (air DCCB). Using magneto-hydrodynamics (MHD) theory, a 3D laminar model and a turbulence model are constructed and calculated. The standard k-epsilon model is utilized to account for the turbulence effect in the arc chamber of the DCCB. Several important phenomena are found: the arc column in the turbulence-model case is more extensive, moves much more slowly than its counterpart in the laminar-model case, and shows stagnation at the entrance of the chamber, unlike in the laminar-model case. Moreover, the arc voltage in the turbulence-model case is much lower than in the laminar-model case. However, the turbulence-model results agree much better with the DC breaking experiments than the laminar-model results do, which contradicts previous conclusions from arc research on both the low-voltage circuit breaker and the sulfur hexafluoride (SF6) nozzle. First, previous air-arc research on the low-voltage circuit breaker assumed that the air plasma inside the chamber is in a laminar state, and the laminar-model application gave quite satisfactory results compared with experiments, whereas in this paper the laminar-model application performs badly. Second, the turbulence-model application in arc research on the SF6 nozzle performs much better and gives a higher arc voltage than the laminar-model application does, whereas in this paper the turbulence-model application predicts a lower arc voltage than the laminar-model application does. Based on a detailed analysis of the simulation results, the mechanism behind these phenomena is revealed. The transport coefficients are strongly changed by turbulence, which enhances arc diffusion and makes the arc volume much larger. Consequently, the arc appearance and the distribution of Lorentz force in the turbulence-model case substantially differ from those in the laminar-model case, and the motion of the arc in the turbulence-model case is slowed relative to the laminar-model case. Moreover, the more extensive arc column in the turbulence-model case reduces the total arc resistance, which results in a lower arc voltage that is more consistent with the experimental results than the arc voltage in the laminar-model case. Therefore, the air plasma inside this air DCCB is believed to be in a turbulent state, and the turbulence model is more suitable than the laminar model for arc simulation of this kind of air DCCB.

  10. Introduction to focus issue: Synchronization in large networks and continuous media—data, models, and supermodels

    NASA Astrophysics Data System (ADS)

    Duane, Gregory S.; Grabow, Carsten; Selten, Frank; Ghil, Michael

    2017-12-01

    The synchronization of loosely coupled chaotic systems has increasingly found applications to large networks of differential equations and to models of continuous media. These applications are at the core of the present Focus Issue. Synchronization between a system and its model, based on limited observations, gives a new perspective on data assimilation. Synchronization among different models of the same system defines a supermodel that can achieve partial consensus among models that otherwise disagree in several respects. Finally, novel methods of time series analysis permit a better description of synchronization in a system that is only observed partially and for a relatively short time. This Focus Issue discusses synchronization in extended systems or in components thereof, with particular attention to data assimilation, supermodeling, and their applications to various areas, from climate modeling to macroeconomics.

  11. Introduction to focus issue: Synchronization in large networks and continuous media-data, models, and supermodels.

    PubMed

    Duane, Gregory S; Grabow, Carsten; Selten, Frank; Ghil, Michael

    2017-12-01

    The synchronization of loosely coupled chaotic systems has increasingly found applications to large networks of differential equations and to models of continuous media. These applications are at the core of the present Focus Issue. Synchronization between a system and its model, based on limited observations, gives a new perspective on data assimilation. Synchronization among different models of the same system defines a supermodel that can achieve partial consensus among models that otherwise disagree in several respects. Finally, novel methods of time series analysis permit a better description of synchronization in a system that is only observed partially and for a relatively short time. This Focus Issue discusses synchronization in extended systems or in components thereof, with particular attention to data assimilation, supermodeling, and their applications to various areas, from climate modeling to macroeconomics.

  12. OpenDA Open Source Generic Data Assimilation Environment and its Application in Process Models

    NASA Astrophysics Data System (ADS)

    El Serafy, Ghada; Verlaan, Martin; Hummel, Stef; Weerts, Albrecht; Dhondia, Juzer

    2010-05-01

    Data Assimilation techniques are essential elements in state-of-the-art development of models and their optimization with data in the field of groundwater, surface water and soil systems. They are essential tools in calibration of complex modelling systems and improvement of model forecasts. The OpenDA is a new and generic open source data assimilation environment for application to a choice of physical process models, applied to case dependent domains. OpenDA was introduced recently when the developers of Costa, an open-source TU Delft project [http://www.costapse.org; Van Velzen and Verlaan; 2007] and those of the DATools from the former WL|Delft Hydraulics [El Serafy et al 2007; Weerts et al. 2009] decided to join forces. OpenDA makes use of a set of interfaces that describe the interaction between models, observations and data assimilation algorithms. It focuses on flexible applications in portable systems for modelling geophysical processes. It provides a generic interfacing protocol that allows combination of the implemented data assimilation techniques with, in principle, any time-stepping model duscribing a process(atmospheric processes, 3D circulation, 2D water level, sea surface temperature, soil systems, groundwater etc.). Presently, OpenDA features filtering techniques and calibration techniques. The presentation will give an overview of the OpenDA and the results of some of its practical applications. Application of data assimilation in portable operational forecasting systems—the DATools assimilation environment, El Serafy G.Y., H. Gerritsen, S. Hummel, A. H. Weerts, A.E. Mynett and M. Tanaka (2007), Journal of Ocean Dynamics, DOI 10.1007/s10236-007-0124-3, pp.485-499. COSTA a problem solving environment for data assimilation applied for hydrodynamical modelling, Van Velzen and Verlaan (2007), Meteorologische Zeitschrift, Volume 16, Number 6, December 2007 , pp. 777-793(17). 
Application of generic data assimilation tools (DATools) for flood forecasting purposes, A.H. Weerts, G.Y.H. El Serafy, S. Hummel, J. Dhondia, and H. Gerritsen (2009), accepted by Computers & Geosciences.
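
    The filtering techniques such an environment provides can be illustrated with a minimal ensemble Kalman filter analysis step. The sketch below is generic Python, not OpenDA's actual API; the state size, observation operator, and all parameter values are invented for illustration.

```python
import numpy as np

def enkf_analysis(ensemble, obs, obs_op, obs_var, rng):
    """One stochastic EnKF analysis step.

    ensemble : (n_state, n_members) forecast ensemble
    obs      : (n_obs,) observation vector
    obs_op   : (n_obs, n_state) linear observation operator H
    obs_var  : observation error variance (assumed uncorrelated)
    """
    n_state, n_members = ensemble.shape
    # Ensemble anomalies about the ensemble mean
    mean = ensemble.mean(axis=1, keepdims=True)
    A = ensemble - mean
    HA = obs_op @ A
    # Sample covariances P H^T and S = H P H^T + R
    PHt = A @ HA.T / (n_members - 1)
    S = HA @ HA.T / (n_members - 1) + obs_var * np.eye(len(obs))
    K = PHt @ np.linalg.inv(S)                      # Kalman gain
    # Perturbed observations keep the analysis spread statistically consistent
    perturbed = obs[:, None] + rng.normal(0.0, np.sqrt(obs_var),
                                          (len(obs), n_members))
    return ensemble + K @ (perturbed - obs_op @ ensemble)

rng = np.random.default_rng(0)
ens = rng.normal(0.0, 1.0, (3, 50))     # 3 state variables, 50 members
H = np.array([[1.0, 0.0, 0.0]])         # observe the first state only
analysis = enkf_analysis(ens, np.array([2.0]), H, 0.1, rng)
```

    After the update, the observed component of the analysis mean is pulled toward the observation, with a weight set by the ratio of forecast to observation error variance.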

  13. Numerical simulation of dynamics of brushless dc motors for aerospace and other applications. Volume 1: Model development and applications, part A

    NASA Technical Reports Server (NTRS)

    Demerdash, N. A. O.; Nehl, T. W.

    1979-01-01

    The development, fabrication and evaluation of a prototype electromechanical actuator (EMA) is discussed. Application of the EMA as a motor for control surfaces in aerospace flight is examined. A mathematical model of the EMA is developed for design optimization. Nonlinearities which complicate the mathematical model are discussed. The dynamics of the EMA are determined from the underlying physical principles, and simulation of the control logic by means of equivalent Boolean expressions is discussed.

  14. Application Note: Power Grid Modeling With Xyce.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sholander, Peter E.

    This application note describes how to model steady-state power flows and transient events in electric power grids with the SPICE-compatible Xyce™ Parallel Electronic Simulator developed at Sandia National Labs. This application note provides a brief tutorial on the basic devices (branches, bus shunts, transformers and generators) found in power grids. The focus is on the features supported and assumptions made by the Xyce models for power grid elements. It then provides a detailed explanation, including working Xyce netlists, for simulating some simple power grid examples such as the IEEE 14-bus test case.

  15. A cable-driven parallel robots application: modelling and simulation of a dynamic cable model in Dymola

    NASA Astrophysics Data System (ADS)

    Othman, M. F.; Kurniawan, R.; Schramm, D.; Ariffin, A. K.

    2018-05-01

    Modeling a cable in a multibody dynamics simulation tool such that it dynamically varies in length, mass and stiffness is a challenging task. Simulation of cable-driven parallel robots (CDPR), for instance, requires a cable model that can dynamically change in length for every desired pose of the platform. Thus, in this paper, a detailed procedure for modeling and simulating a dynamic cable model in Dymola is proposed. The approach is also applicable to other types of Modelica simulation environments. The cable is modeled using standard mechanical elements like mass, spring, damper and joint. The parameters of the cable model are based on the manufacturer's factsheet and experimental results. Its dynamic ability is tested by applying it to a complete planar CDPR model whose parameters are based on a prototype named CABLAR, developed at the Chair of Mechatronics, University of Duisburg-Essen. The prototype demonstrates an application of CDPR as a goods storage and retrieval machine. The performance of the cable model during the simulation is analyzed and discussed.
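
    The lumped-element idea, a cable segment treated as a point mass suspended by a spring-damper element, can be sketched outside Dymola as well. This is a hypothetical Python illustration, not the paper's Modelica model; the mass, stiffness and damping values are invented, not taken from the CABLAR factsheet.

```python
# One lumped cable segment: point mass on a spring-damper under gravity.
# k, c, m are illustrative values, not prototype parameters.
m, k, c, g = 2.0, 5000.0, 50.0, 9.81

def step(x, v, dt):
    """Semi-implicit Euler step for m*x'' = -k*x - c*x' + m*g."""
    a = (-k * x - c * v + m * g) / m
    v = v + dt * a
    x = x + dt * v
    return x, v

# Integrate for 2 s from rest; the transient decays and the segment
# settles at the static spring extension m*g/k.
x, v = 0.0, 0.0
for _ in range(20000):
    x, v = step(x, v, 1e-4)

print(round(x, 6))
```

    A full cable is then a chain of such segments; varying the rest length of the elements over time is what gives the model its dynamically changing length.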

  16. Instruction-level performance modeling and characterization of multimedia applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luo, Y.; Cameron, K.W.

    1999-06-01

    One of the challenges in characterizing and modeling realistic multimedia applications is the lack of access to source code. On-chip performance counters effectively resolve this problem by monitoring run-time behavior at the instruction level. This paper presents a novel technique for characterizing and modeling workloads at the instruction level for realistic multimedia applications using hardware performance counters. A variety of instruction counts are collected from multimedia applications such as RealPlayer, GSM Vocoder, MPEG encoder/decoder, and a speech synthesizer. These instruction counts can be used to form a set of abstract characteristic parameters directly related to a processor's architectural features. Based on microprocessor architectural constraints and these calculated abstract parameters, the architectural performance bottleneck for a specific application can be estimated. Meanwhile, the bottleneck estimation can provide suggestions about viable architectural/functional improvements for certain workloads. The biggest advantage of this new characterization technique is a better understanding of processor utilization efficiency and the architectural bottleneck for each application. This technique also provides predictive insight into future architectural enhancements and their effect on current codes. In this paper the authors also attempt to model architectural effects on processor utilization without memory influence. They derive formulas for calculating CPI₀, the CPI without memory effect, and they quantify utilization of architectural parameters. These equations are architecturally diagnostic and predictive in nature. Results show promise in code characterization and empirical/analytical modeling.
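
    The core of such a CPI₀ estimate is a mix-weighted average over instruction classes. The sketch below is a hypothetical illustration of that arithmetic; the instruction classes, counts and per-class latencies are invented, not the authors' measured values.

```python
# Hypothetical instruction mix from performance counters, and
# per-class issue latencies in cycles (illustrative values only).
counts = {"int_alu": 6.0e9, "fp": 1.5e9, "load_store": 2.5e9, "branch": 1.0e9}
latency = {"int_alu": 1.0, "fp": 3.0, "load_store": 1.0, "branch": 1.5}

total = sum(counts.values())
# CPI without memory effects: each class contributes its latency
# weighted by its share of the dynamic instruction count.
cpi0 = sum(counts[c] / total * latency[c] for c in counts)
print(round(cpi0, 3))
```

    Comparing such a counter-derived CPI₀ against the measured CPI isolates the portion of stall time attributable to the memory hierarchy.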

  17. Tapered fiber optic applicator for laser ablation: Theoretical and experimental assessment of thermal effects on ex vivo model.

    PubMed

    Saccomandi, P; Di Matteo, F M; Schena, E; Quero, G; Massaroni, C; Giurazza, F; Costamagna, G; Silvestri, S

    2017-07-01

    Laser ablation (LA) is a minimally invasive technique for tumor removal. The laser light is guided into the target tissue by a fiber optic applicator; thus the physical features of the applicator tip strongly influence the size and shape of the tissue lesion. This study aims to verify the geometry of the lesion achieved by a tapered-tip applicator, and to investigate the percentage of thermally damaged cells it induces. A theoretical model was implemented to simulate: i) the distribution of laser light fluence rate in the tissue, through the Monte Carlo method; ii) the induced temperature distribution, by means of the bioheat equation; and iii) the tissue injury, by the Arrhenius integral. The results obtained with the theoretical model were experimentally assessed. Ex vivo porcine liver underwent LA with the tapered-tip applicator at different laser settings (laser power of 1 W and 1.7 W, deposited energy equal to 330 J and 500 J, respectively). Almost spherical lesion volumes were produced. The thermal damage was assessed by measuring the diameter of the circular-shaped lesion. The comparison between experimental results and theoretical prediction shows that the thermal damage discriminated by visual inspection always corresponds to a percentage of damaged cells of 96%. A tapered-tip applicator yields localized and reproducible damage of nearly spherical shape, whose diameter is related to the laser settings, and the simple theoretical model described is suitable for predicting the effects, in terms of thermal damage, on ex vivo liver. Further trials should address adapting the model to in vivo tissue, with the aim of developing a tool to support the physician in clinical applications of LA.
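
    The Arrhenius damage estimate in step iii) can be sketched in a few lines: the damage integral Ω accumulates A·exp(-Ea/(R·T)) over the temperature history, and the damaged-cell fraction is 1 - exp(-Ω). The frequency factor and activation energy below are typical literature values for liver, used only for illustration; they are not necessarily those of the study.

```python
import math

# Arrhenius thermal damage: Omega = integral of A * exp(-Ea / (R*T)) dt.
# A (1/s) and Ea (J/mol) are typical liver values from the literature.
A, Ea, R = 7.39e39, 2.577e5, 8.314

def damaged_fraction(temps_K, dt):
    """Fraction of thermally damaged cells, 1 - exp(-Omega), for a
    temperature history (in kelvin) sampled every dt seconds."""
    omega = sum(A * math.exp(-Ea / (R * T)) * dt for T in temps_K)
    return 1.0 - math.exp(-omega)

# 120 s held at 60 degC: far past the Omega = 1 (~63% damage) threshold
history = [333.15] * 120
print(round(damaged_fraction(history, 1.0), 4))
```

    The strong exponential dependence on temperature is why the visually discriminated lesion boundary corresponds to a nearly fixed damaged-cell percentage.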

  18. Hydraulic modeling development and application in water resources engineering

    USGS Publications Warehouse

    Simoes, Francisco J.; Yang, Chih Ted; Wang, Lawrence K.

    2015-01-01

    The use of modeling has become widespread in water resources engineering and science to study rivers, lakes, estuaries, and coastal regions. For example, computer models are commonly used to forecast anthropogenic effects on the environment, and to help provide advanced mitigation measures against catastrophic events such as natural and dam-break floods. Linking hydraulic models to vegetation and habitat models has expanded their use in multidisciplinary applications to the riparian corridor. Implementation of these models in software packages on personal desktop computers has made them accessible to the general engineering community, and their use has been popularized by the minimal training they require thanks to intuitive graphical user interface front ends. Models are, however, complex and nontrivial, to the extent that even common terminology is sometimes ambiguous and often applied incorrectly. In fact, many efforts are currently under way to standardize terminology and offer guidelines for good practice, but none has yet reached unanimous acceptance. This chapter provides a view of the elements involved in modeling surface flows for application in environmental water resources engineering. It presents the concepts and steps necessary for rational model development and use, starting with an exploration of the ideas involved in defining a model. Tangible form is given to those ideas by the development of a mathematical and corresponding numerical hydraulic model, which is presented in substantial detail. The issues of model deployment in a practical and productive work environment are also addressed. The chapter ends by presenting a few model applications highlighting the need for good quality control in model validation.

  19. A case Study of Applying Object-Relational Persistence in Astronomy Data Archiving

    NASA Astrophysics Data System (ADS)

    Yao, S. S.; Hiriart, R.; Barg, I.; Warner, P.; Gasson, D.

    2005-12-01

    The NOAO Science Archive (NSA) team is developing a comprehensive domain model to capture the science data in the archive. Java and an object model derived from the domain model well address the application layer of the archive system. However, since the RDBMS is the best-proven technology for data management, the challenge is the paradigm mismatch between the object and relational models. Transparent object-relational mapping (ORM) persistence is a successful solution to this challenge. In the data modeling and persistence implementation of NSA, we are using Hibernate, a well-accepted ORM tool, to bridge the object model in the business tier and the relational model in the database tier. Thus, the database is isolated from the Java application. The application queries directly on objects using a DBMS-independent object-oriented query API, which frees the application developers from low-level JDBC and SQL so that they can focus on the domain logic. We present the detailed design of the NSA R3 (Release 3) data model and object-relational persistence, including mapping, retrieving and caching. Persistence layer optimization and performance tuning will be analyzed. The system is being built on J2EE, so the integration of Hibernate into the EJB container and the transaction management are also explored.

  20. Application of particle and lattice codes to simulation of hydraulic fracturing

    NASA Astrophysics Data System (ADS)

    Damjanac, Branko; Detournay, Christine; Cundall, Peter A.

    2016-04-01

    With the development of unconventional oil and gas reservoirs over the last 15 years, the understanding and capability to model the propagation of hydraulic fractures in inhomogeneous and naturally fractured reservoirs have become very important for the petroleum industry (but also for some other industries, like mining and geothermal). Particle-based models provide advantages over other models and solutions for the simulation of fracturing of rock masses that cannot be assumed to be continuous and homogeneous. It has been demonstrated (Potyondy and Cundall, Int J Rock Mech Min Sci Geomech Abstr 41:1329-1364, 2004) that particle models based on a simple force criterion for fracture propagation match theoretical solutions and scale effects derived using the principles of linear elastic fracture mechanics (LEFM). The challenge is how to apply these models effectively (i.e., with acceptable model sizes and computer run times) to coupled hydro-mechanical problems at the time and length scales relevant for practical field applications (i.e., reservoir scale and hours of injection time). A formulation of a fully coupled hydro-mechanical particle-based model and its application to the simulation of hydraulic treatment of unconventional reservoirs are presented. Model validation by comparison with available analytical asymptotic solutions (penny-shaped crack) and some examples of field application (e.g., interaction with DFN) are also included.

  1. Applicability of the 2013 ACC/AHA Risk Assessment and Cholesterol Treatment Guidelines in the real world: results from a multiethnic case-control study.

    PubMed

    Magnoni, Marco; Berteotti, Martina; Norata, Giuseppe Danilo; Limite, Luca Rosario; Peretto, Giovanni; Cristell, Nicole; Maseri, Attilio; Cianflone, Domenico

    2016-01-01

    The 2013 ACC/AHA cholesterol treatment guidelines have introduced a new cardiovascular risk assessment approach (PCE) and have revisited the threshold for prescribing statins. This study aims to compare the ex ante application of the ACC/AHA and ATP-III guideline models by using a multiethnic case-control study. ATP-III-FRS and PCE were assessed in 739 patients with first STEMI and 739 age- and gender-matched controls; the proportion of cases and controls that would have been eligible for statins as primary prevention therapy and the discriminatory ability of both models were evaluated. The application of the ACC/AHA model, compared to the ATP-III model, resulted in an increase in sensitivity [94% (95% CI: 91%-95%) vs. 65% (61%-68%), p < 0.0001] and a reduction in specificity [19% (15%-22%) vs. 55% (51%-59%), p < 0.0001], with similar global accuracy [0.56 (0.53-0.59) vs. 0.59 (0.57-0.63), p ns]. When stratifying for ethnicity, the accuracy of the ACC/AHA model was higher in Europeans than in Chinese (p = 0.003), and the model identified premature STEMI patients among Europeans much better than the ATP-III model did (p = 0.0289). The application of the ACC/AHA model resulted in a significant reduction in the number of first-STEMI patients who would have escaped preventive treatment. Age and ethnicity affected the accuracy of the ACC/AHA model, improving the identification of premature STEMI among Europeans only. Key messages According to the ATP-III guideline model, about one-third of patients with STEMI would not be eligible for primary preventive treatment before STEMI. The application of the new ACC/AHA cholesterol treatment guideline model leads to a significant reduction in the percentage of patients with STEMI who would have been considered at lower risk before the STEMI.
The global accuracy of the new ACC/AHA model is higher in the Europeans than in the Chinese and, moreover, among the Europeans, the application of the new ACC/AHA guideline model also improved identification of premature STEMI patients.
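
    The reported sensitivity and specificity figures reduce to simple proportions over the matched cases and controls. A generic sketch of that arithmetic follows; the flagged-patient counts are invented to reproduce round percentages for illustration and are not the study's data.

```python
def sens_spec(cases_flagged, n_cases, controls_flagged, n_controls):
    """Sensitivity: statin-eligible cases / all cases.
    Specificity: non-eligible controls / all controls."""
    sensitivity = cases_flagged / n_cases
    specificity = (n_controls - controls_flagged) / n_controls
    return sensitivity, specificity

# Invented counts for a 739-case / 739-control design
sens, spec = sens_spec(cases_flagged=695, n_cases=739,
                       controls_flagged=599, n_controls=739)
print(round(sens, 2), round(spec, 2))
```

    The trade-off in the abstract is visible directly: flagging more of the cohort raises sensitivity while driving specificity down.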

  2. Evolving PBPK applications in regulatory risk assessment: current situation and future goals

    EPA Science Inventory

    The presentation includes current applications of PBPK modeling in regulatory risk assessment and discussions of the current conflict between assuring consistency with experimental data and the desire for animal-free model development.

  3. Applications manual for logit models of express bus-fringe parking choices.

    DOT National Transportation Integrated Search

    1976-01-01

    Manual computations and computerized applications of logit models are described. The models demonstrated reflect travel behavior concerning express bus-fringe parking transit. The specific travel issues addressed include the basic automobile vs. expr...

  4. Leveraging OpenStudio's Application Programming Interfaces: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Long, N.; Ball, B.; Goldwasser, D.

    2013-11-01

    OpenStudio development efforts have been focused on providing Application Programming Interfaces (APIs) through which users can extend OpenStudio without the need to compile the open-source libraries. This paper will discuss the basic purposes and functionalities of the core libraries that have been wrapped with APIs, including the Building Model, Results Processing, Advanced Analysis, Uncertainty Quantification, and Data Interoperability through Translators. Several building energy modeling applications have been produced using OpenStudio's API and Software Development Kits (SDK), including the United States Department of Energy's Asset Score Calculator, a mobile-based audit tool, an energy design assistance reporting protocol, and a portfolio-scale incentive optimization analysis methodology. Each of these software applications will be discussed briefly, describing how the APIs were leveraged for various uses including high-level modeling, data transformations from detailed building audits, error checking/quality assurance of models, and use of high-performance computing for mass simulations.

  5. A generative tool for building health applications driven by ISO 13606 archetypes.

    PubMed

    Menárguez-Tortosa, Marcos; Martínez-Costa, Catalina; Fernández-Breis, Jesualdo Tomás

    2012-10-01

    The use of Electronic Healthcare Record (EHR) standards in the development of healthcare applications is crucial for achieving the semantic interoperability of clinical information. Advanced EHR standards make use of the dual model architecture, which provides a solution for clinical interoperability based on the separation of information and knowledge. However, the impact of such standards is limited by the scarce availability of tools that facilitate their usage and practical implementation. In this paper, we present an approach for the automatic generation of clinical applications for the ISO 13606 EHR standard, which is based on the dual model architecture. This generator has been designed generically, so it can be easily adapted to other dual model standards and can generate applications for multiple technological platforms. Such good properties are based on the combination of standards for the representation of generic user interfaces and model-driven engineering techniques.

  6. The Fringe-Imaging Skin Friction Technique PC Application User's Manual

    NASA Technical Reports Server (NTRS)

    Zilliac, Gregory G.

    1999-01-01

    A personal computer application (CXWIN4G) has been written which greatly simplifies the task of extracting skin friction measurements from interferograms of oil flows on the surface of wind tunnel models. Images are first calibrated, using a novel approach to one-camera photogrammetry, to obtain accurate spatial information on surfaces with curvature. As part of the image calibration process, an auxiliary file containing the wind tunnel model geometry is used in conjunction with a two-dimensional direct linear transformation to relate the image plane to the physical (model) coordinates. The application then applies a nonlinear regression model to accurately determine the fringe spacing from interferometric intensity records as required by the Fringe Imaging Skin Friction (FISF) technique. The skin friction is found through application of a simple expression that makes use of lubrication theory to relate fringe spacing to skin friction.

  7. Human mobility: Models and applications

    NASA Astrophysics Data System (ADS)

    Barbosa, Hugo; Barthelemy, Marc; Ghoshal, Gourab; James, Charlotte R.; Lenormand, Maxime; Louail, Thomas; Menezes, Ronaldo; Ramasco, José J.; Simini, Filippo; Tomasini, Marcello

    2018-03-01

    Recent years have witnessed an explosion of extensive geolocated datasets related to human movement, enabling scientists to quantitatively study individual and collective mobility patterns, and to generate models that can capture and reproduce the spatiotemporal structures and regularities in human trajectories. The study of human mobility is especially important for applications such as estimating migratory flows, traffic forecasting, urban planning, and epidemic modeling. In this survey, we review the approaches developed to reproduce various mobility patterns, with the main focus on recent developments. This review can be used both as an introduction to the fundamental modeling principles of human mobility, and as a collection of technical methods applicable to specific mobility-related problems. The review organizes the subject by differentiating between individual and population mobility and also between short-range and long-range mobility. Throughout the text the description of the theory is intertwined with real-world applications.

  8. A function space approach to smoothing with applications to model error estimation for flexible spacecraft control

    NASA Technical Reports Server (NTRS)

    Rodriguez, G.

    1981-01-01

    A function space approach to smoothing is used to obtain a set of model error estimates inherent in a reduced-order model. By establishing knowledge of inevitable deficiencies in the truncated model, the error estimates provide a foundation for updating the model and thereby improving system performance. The function space smoothing solution leads to a specification of a method for computation of the model error estimates and development of model error analysis techniques for comparison between actual and estimated errors. The paper summarizes the model error estimation approach as well as an application arising in the area of modeling for spacecraft attitude control.

  9. Identifying critical nitrogen application rate for maize yield and nitrate leaching in a Haplic Luvisol soil using the DNDC model.

    PubMed

    Zhang, Yitao; Wang, Hongyuan; Liu, Shen; Lei, Qiuliang; Liu, Jian; He, Jianqiang; Zhai, Limei; Ren, Tianzhi; Liu, Hongbin

    2015-05-01

    Identification of the critical nitrogen (N) application rate can provide management support for ensuring grain yield and reducing the amount of nitrate leaching to ground water. A five-year (2008-2012) field lysimeter (1 m × 2 m × 1.2 m) experiment with three N treatments (0, 180 and 240 kg N ha⁻¹) was conducted to quantify maize yields and the amount of nitrate leaching from a Haplic Luvisol soil in the North China Plain. The experimental data were used to calibrate and validate the process-based Denitrification-Decomposition (DNDC) model. After this, the model was used to simulate maize yield production and the amount of nitrate leaching under a series of N application rates and to identify the critical N application rate based on acceptable yield and amount of nitrate leaching for this cropping system. The results of model calibration and validation indicated that the model could correctly simulate maize yield and the amount of nitrate leaching, with satisfactory values of the RMSE-observation standard deviation ratio, model efficiency and determination coefficient. The model simulations confirmed the measurements that N application increased maize yield compared with the control, but the high N rate (240 kg N ha⁻¹) did not produce more yield than the low one (180 kg N ha⁻¹), and that the amount of nitrate leaching increased with increasing N application rate. The simulation results suggested that the optimal N application rate lies in a range between 150 and 240 kg ha⁻¹, which would keep the amount of nitrate leaching below 18.4 kg NO₃⁻-N ha⁻¹ while maintaining an acceptable maize yield above 9410 kg ha⁻¹. Furthermore, 180 kg N ha⁻¹ produced the highest yield (9837 kg ha⁻¹) and a comparatively lower amount of nitrate leaching (10.0 kg NO₃⁻-N ha⁻¹). This study will provide a valuable reference for determining the optimal N application rate (or range) in other crop systems and regions in China. Copyright © 2015 Elsevier B.V. All rights reserved.
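
    The three calibration statistics mentioned, RMSE-observation standard deviation ratio (RSR), model efficiency (NSE) and determination coefficient (R²), are straightforward to compute for paired observed/simulated series. The sketch below uses invented maize yields for illustration, not the study's data.

```python
import math

def eval_metrics(obs, sim):
    """RSR, Nash-Sutcliffe efficiency (NSE) and R^2 for paired series."""
    n = len(obs)
    mean_obs = sum(obs) / n
    mean_sim = sum(sim) / n
    sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
    sst = sum((o - mean_obs) ** 2 for o in obs)
    rsr = math.sqrt(sse) / math.sqrt(sst)      # RMSE / std. dev. of observations
    nse = 1.0 - sse / sst                      # model efficiency
    cov = sum((o - mean_obs) * (s - mean_sim) for o, s in zip(obs, sim))
    var_s = sum((s - mean_sim) ** 2 for s in sim)
    r2 = cov ** 2 / (sst * var_s)              # determination coefficient
    return rsr, nse, r2

# Invented maize yields (kg/ha), observed vs. simulated
obs = [8200, 9100, 9500, 8800, 9300]
sim = [8400, 9000, 9600, 8600, 9200]
print(eval_metrics(obs, sim))
```

    Low RSR together with NSE and R² near 1 is the usual "satisfactory" pattern such calibration studies report.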

  10. A comprehensive overview of the applications of artificial life.

    PubMed

    Kim, Kyung-Joong; Cho, Sung-Bae

    2006-01-01

    We review the applications of artificial life (ALife), the creation of synthetic life on computers to study, simulate, and understand living systems. The definition and features of ALife are illustrated through application studies. The ALife application fields treated include robot control, robot manufacturing, practical robots, computer graphics, natural phenomenon modeling, entertainment, games, music, economics, Internet, information processing, industrial design, simulation software, electronics, security, data mining, and telecommunications. In order to show the status of ALife application research, this review primarily features a survey of about 180 ALife application articles rather than a selected representation of a few. Evolutionary computation is the most popular method for designing such applications, but recently swarm intelligence, artificial immune networks, and agent-based modeling have also produced results. Applications were initially restricted to robotics and computer graphics, but presently many different applications in engineering areas are of interest.

  11. Modeling of facade leaching in urban catchments

    NASA Astrophysics Data System (ADS)

    Coutu, S.; Del Giudice, D.; Rossi, L.; Barry, D. A.

    2012-12-01

    Building facades are protected from microbial attack by incorporation of biocides within them. Flow over facades leaches these biocides and transports them to the urban environment. A parsimonious water quantity/quality model applicable to engineered urban watersheds was developed to compute biocide release from facades and their transport at the urban basin scale. The model couples two lumped submodels applicable at the basin scale and a local model of biocide leaching at the facade scale. For the facade leaching, an existing model applicable at the individual wall scale was utilized. The two lumped models describe urban hydrodynamics and leachate transport. The integrated model allows prediction of biocide concentrations in urban rivers. It was applied to a 15 km² urban hydrosystem in western Switzerland, the Vuachère river basin, to study three facade biocides (terbutryn, carbendazim, diuron). The water quality simulated by the model matched most of the pollutographs at the outlet of the Vuachère watershed well. The model was then used to estimate possible ecotoxicological impacts of facade leachates. To this end, exceedance probabilities and cumulative pollutant loads from the catchment were estimated. Results showed that the considered biocides rarely exceeded the relevant predicted no-effect concentrations for the riverine system. Despite the heterogeneities and complexity of (engineered) urban catchments, the model application demonstrated that a computationally "light" model can be employed to simulate the hydrograph and pollutograph response within them. It thus allows catchment-scale assessment of the potential ecotoxicological impact of biocides on receiving waters.

  12. Nonlinear modeling of chaotic time series: Theory and applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Casdagli, M.; Eubank, S.; Farmer, J.D.

    1990-01-01

    We review recent developments in the modeling and prediction of nonlinear time series. In some cases apparent randomness in time series may be due to chaotic behavior of a nonlinear but deterministic system. In such cases it is possible to exploit the determinism to make short-term forecasts that are much more accurate than one could make from a linear stochastic model. This is done by first reconstructing a state space, and then using nonlinear function approximation methods to create a dynamical model. Nonlinear models are valuable not only as short-term forecasters, but also as diagnostic tools for identifying and quantifying low-dimensional chaotic behavior. During the past few years methods for nonlinear modeling have developed rapidly, and have already led to several applications where nonlinear models motivated by chaotic dynamics provide superior predictions to linear models. These applications include prediction of fluid flows, sunspots, mechanical vibrations, ice ages, measles epidemics and human speech. 162 refs., 13 figs.
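
    The two steps described, state-space reconstruction followed by nonlinear function approximation, can be sketched with delay embedding and a nearest-neighbour predictor. This is a generic illustration on the chaotic logistic map, not the authors' code; embedding dimension and map parameters are chosen arbitrarily.

```python
import numpy as np

def delay_embed(x, dim, tau=1):
    """Reconstruct a state space from a scalar series by delay embedding."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

def nn_forecast(train, query, dim=2):
    """Predict the value following the query's last embedded state,
    using its nearest neighbour in the embedded training series."""
    emb = delay_embed(train, dim)
    states, nexts = emb[:-1], train[dim:]     # state -> next observation
    q = np.array(query[-dim:])
    idx = np.argmin(np.linalg.norm(states - q, axis=1))
    return nexts[idx]

# Chaotic logistic map: deterministic, so short-term forecasts succeed
x = [0.3]
for _ in range(999):
    x.append(3.9 * x[-1] * (1.0 - x[-1]))
x = np.array(x)

pred = nn_forecast(x[:900], x[:950])   # forecast x[950] from earlier data
actual = x[950]
print(abs(pred - actual))
```

    On a noisy linear-stochastic series the same predictor would do no better than the series variance; the forecast skill itself is the diagnostic for low-dimensional determinism.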

  13. Reflection of a Year Long Model-Driven Business and UI Modeling Development Project

    NASA Astrophysics Data System (ADS)

    Sukaviriya, Noi; Mani, Senthil; Sinha, Vibha

    Model-driven software development enables users to specify an application at a high level, a level that better matches the problem domain. It also promises users better analysis and automation. Our work brings together two collaborating domains, business processes and human interactions, to build an application. Business modeling expresses business operations and flows, and then creates the business flow implementation. Human interaction modeling expresses a UI design and its relationship with business data, logic, and flow, and can generate a working UI. This double modeling approach automates the production of a working system with UI and business logic connected. This paper discusses the human aspects of this modeling approach after a year spent building a procurement outsourcing contract application with it; the result was deployed in December 2008. The paper discusses the happy endings, and some heartache, in multiple areas. We end with insights on how a model-driven approach could do better for the humans in the process.

  14. Low GWP Refrigerants Modelling Study for a Room Air Conditioner Having Microchannel Heat Exchangers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shen, Bo; Bhandari, Mahabir S

    Microchannel heat exchangers (MHX) have found great success in residential and commercial air conditioning applications as compact heat exchangers that reduce refrigerant charge and material cost. This investigation aims to extend the application of MHXs to split room air conditioners (RAC) through fundamental heat exchanger and system modelling. For this paper, microchannel condenser and evaporator models were developed using a segment-to-segment modelling approach. The microchannel heat exchanger models were integrated into a system design model. The system model is able to predict performance indices such as cooling capacity, efficiency, and sensible heat ratio. Using the calibrated system and heat exchanger models, we evaluated numerous low-GWP (global warming potential) refrigerants. The predicted system performance indices, e.g. cooling efficiency, compressor discharge temperature, and required compressor displacement volume, are compared. Suitable replacements for R-22 and R-410A for the room air conditioner application are recommended.

  15. Model tracking system for low-level radioactive waste disposal facilities: License application interrogatories and responses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Benbennick, M.E.; Broton, M.S.; Fuoto, J.S.

    This report describes a model tracking system for a low-level radioactive waste (LLW) disposal facility license application. In particular, the model tracks interrogatories (questions, requests for information, comments) and responses. A set of requirements and desired features for the model tracking system was developed, including required structure and computer screens. Nine tracking systems were then reviewed against the model system requirements and only two were found to meet all requirements. Using Kepner-Tregoe decision analysis, a model tracking system was selected.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, Katherine H.; Cutler, Dylan S.; Olis, Daniel R.

    REopt is a techno-economic decision support model used to optimize energy systems for buildings, campuses, communities, and microgrids. The primary application of the model is for optimizing the integration and operation of behind-the-meter energy assets. This report provides an overview of the model, including its capabilities and typical applications; inputs and outputs; economic calculations; technology descriptions; and model parameters, variables, and equations. The model is highly flexible, and is continually evolving to meet the needs of each analysis. Therefore, this report is not an exhaustive description of all capabilities, but rather a summary of the core components of the model.

  17. 40 CFR 86.1807-01 - Vehicle labeling.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... XXX-Fueled 20XX Model Year New Motor Vehicles.” (B) For light-duty trucks, the statement: “This Vehicle Conforms to U.S. EPA Regulations Applicable to XXX-Fueled 20XX Model Year New Light-Duty Trucks... Applicable to XXX-fueled 20XX Model Year New Medium-Duty Passenger Vehicles.” (D) For heavy-duty vehicles...

  18. The 15-Minute Audition: Translating a Proof of Concept into a Domain-Specific Screening Device for Mathematical Talent

    ERIC Educational Resources Information Center

    Subotnik, Rena F.; Worrell, Frank C.; Olszewski-Kubilius, Paula

    2017-01-01

    In 2011, Subotnik, Olszewski-Kubilius, and Worrell proposed a conceptual model for talent development applicable to all domains. Although grounded in available psychological research, significant questions remain regarding practical applications of each tenet of the model. In this article, we highlight a method of implementing the model's focus on…

  19. Person-Fit and the Rasch Model, with an Application to Knowledge of Logical Quantors.

    ERIC Educational Resources Information Center

    Molenaar, Ivo W.; Hoijtink, Herbert

    1996-01-01

    Some specific person-fit results for the Rasch model are presented, followed by an application to a test measuring knowledge of reasoning with logical quantors. Some issues are relevant to all attempts to use person-fit statistics in research, but the special role of the Rasch model is highlighted. (SLD)

  20. Addressing HIV in the School Setting: Application of a School Change Model

    ERIC Educational Resources Information Center

    Walsh, Audra St. John; Chenneville, Tiffany

    2013-01-01

    This paper describes best practices for responding to youth with human immunodeficiency virus (HIV) in the school setting through the application of a school change model designed by the World Health Organization. This model applies a whole school approach and includes four levels that span the continuum from universal prevention to direct…

  1. An application of tensor ideas to nonlinear modeling of a turbofan jet engine

    NASA Technical Reports Server (NTRS)

    Klingler, T. A.; Yurkovich, S.; Sain, M. K.

    1982-01-01

    An application of tensor modelling to a digital simulation of NASA's Quiet, Clean, Shorthaul Experimental (QCSE) gas turbine engine is presented. The results show that the tensor algebra offers a universal parametrization that is helpful in conceptualization and identification for plant modelling prior to feedback, or for representing scheduled controllers over an operating line.

  2. Comparison of crown fire modeling systems used in three fire management applications

    Treesearch

    Joe H. Scott

    2006-01-01

    The relative behavior of surface-crown fire spread rate modeling systems used in three fire management applications-CFIS (Crown Fire Initiation and Spread), FlamMap and NEXUS- is compared using fire environment characteristics derived from a dataset of destructively measured canopy fuel and associated stand characteristics. Although the surface-crown modeling systems...

  3. Understanding Language Learning: Review of the Application of the Interaction Model in Foreign Language Contexts

    ERIC Educational Resources Information Center

    Dixon, L. Quentin; Wu, Shuang

    2014-01-01

    Purpose: This paper examined the application of the input-interaction-output model in English-as-Foreign-Language (EFL) learning environments with four specific questions: (1) How do the three components function in the model? (2) Does interaction in the foreign language classroom seem to be effective for foreign language acquisition? (3) What…

  4. Skill Acquisition in Ski Instruction and the Skill Model's Application to Treating Anorexia Nervosa

    ERIC Educational Resources Information Center

    Duesund, Liv; Jespersen, Ejgil

    2004-01-01

    The Dreyfus skill model has a wide range of applications to various domains, including sport, nursing, engineering, flying, and so forth. In this article, the authors discuss the skill model in connection with two different research projects concerning ski instruction and treating anorexia nervosa. The latter project has been published but not in…

  5. Application of a Cognitive Diagnostic Model to a High-Stakes Reading Comprehension Test

    ERIC Educational Resources Information Center

    Ravand, Hamdollah

    2016-01-01

    General cognitive diagnostic models (CDM) such as the generalized deterministic input, noisy, "and" gate (G-DINA) model are flexible in that they allow for both compensatory and noncompensatory relationships among the subskills within the same test. Most of the previous CDM applications in the literature have been add-ons to simulation…

  6. Upper and Middle Atmospheric Density Modeling Requirements for Spacecraft Design and Operations

    NASA Technical Reports Server (NTRS)

    Davis, M. H. (Editor); Smith, R. E. (Editor); Johnson, D. L. (Editor)

    1987-01-01

    Concerns with applications of neutral atmospheric density models to space vehicle engineering design and operational problems are presented and discussed. The areas of concern considered by the atmospheric model developers and model users involved middle-atmosphere (50 to 90 km altitude) and thermospheric (above 90 km) models and their engineering application. Engineering emphasis involved areas such as orbital decay and lifetime prediction, along with attitude and control studies for different types of space and reentry vehicles.

  7. Flow through collapsible tubes at low Reynolds numbers. Applicability of the waterfall model.

    PubMed

    Lyon, C K; Scott, J B; Wang, C Y

    1980-07-01

    The applicability of the waterfall model was tested using the Starling resistor and different viscosities of fluids to vary the Reynolds number. The waterfall model proved adequate to describe flow in the Starling resistor model only at very low Reynolds numbers (Reynolds number less than 1). Blood flow characterized by such low Reynolds numbers occurs only in the microvasculature. Thus, it is inappropriate to apply the waterfall model indiscriminately to flow through large collapsible veins.
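The waterfall (Starling resistor) model tested above has a simple piecewise form: once external pressure exceeds downstream pressure, flow depends only on upstream and external pressures. A schematic sketch of the model in arbitrary units, not of the authors' experimental apparatus:

```python
def waterfall_flow(p_up, p_down, p_ext, resistance):
    """Flow through a collapsible tube under the vascular waterfall model."""
    if p_down >= p_ext:                  # tube fully open: ordinary pressure-driven flow
        return max(p_up - p_down, 0.0) / resistance
    if p_up > p_ext:                     # waterfall regime: p_ext acts as effective outlet
        return (p_up - p_ext) / resistance
    return 0.0                           # collapsed: no flow

# Downstream pressure no longer matters once it drops below p_ext:
q1 = waterfall_flow(100.0, 20.0, 40.0, 2.0)   # (100 - 40) / 2
q2 = waterfall_flow(100.0, 5.0, 40.0, 2.0)    # unchanged despite lower p_down
```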

  8. Solar energy market penetration models - Science or number mysticism

    NASA Technical Reports Server (NTRS)

    Warren, E. H., Jr.

    1980-01-01

    The forecast market potential of a solar technology is an important factor determining its R&D funding. Since solar energy market penetration models are the method used to forecast market potential, they have a pivotal role in a solar technology's development. This paper critiques the applicability of the most common solar energy market penetration models. It is argued that the assumptions underlying the foundations of rigorously developed models, or the absence of a reasonable foundation for the remaining models, restrict their applicability.

  9. State-of-the-Art Resources (SOAR) for Software Vulnerability Detection, Test, and Evaluation

    DTIC Science & Technology

    2014-07-01

    preclude in-depth analysis, and widespread use of a Software-as-a-Service (SaaS) model that limits data availability and application to DoD systems... provide mobile application analysis using a Software-as-a-Service (SaaS) model. In this case, any software to be analyzed must be sent to the... tools are only available through a SaaS model. The widespread use of a Software-as-a-Service (SaaS) model as a sole evaluation model limits data

  10. TouchTerrain: A simple web-tool for creating 3D-printable topographic models

    NASA Astrophysics Data System (ADS)

    Hasiuk, Franciszek J.; Harding, Chris; Renner, Alex Raymond; Winer, Eliot

    2017-12-01

    An open-source web-application, TouchTerrain, was developed to simplify the production of 3D-printable terrain models. Direct Digital Manufacturing (DDM) using 3D printers can change how geoscientists, students, and stakeholders interact with 3D data, with the potential to improve geoscience communication and environmental literacy. No other manufacturing technology can convert digital data into tangible objects quickly at relatively low cost; however, the expertise necessary to produce a 3D-printed terrain model can be a substantial burden: knowledge of geographical information systems, computer-aided design (CAD) software, and 3D printers may all be required. Furthermore, printing models larger than the build volume of a 3D printer can pose additional technical hurdles. The TouchTerrain web-application simplifies DDM for elevation data by generating digital 3D models customized for a specific 3D printer's capabilities. The only required user input is the selection of a region of interest using the provided web-application with a Google Maps-style interface. Publicly available digital elevation data is processed via the Google Earth Engine API. To allow the manufacture of 3D terrain models larger than a 3D printer's build volume, the selected area can be split into multiple tiles without third-party software. This application significantly reduces the time and effort required for a non-expert, such as an educator, to obtain 3D terrain models for use in class. The web application is deployed at http://touchterrain.geol.iastate.edu/.
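The tiling step described above, splitting a selected region into pieces that each fit a printer's build volume, can be sketched as follows. This is an illustrative assumption about the approach, not TouchTerrain's actual implementation:

```python
import math

def split_into_tiles(dem_rows, dem_cols, max_tile):
    """Cover a DEM of dem_rows x dem_cols cells with the smallest grid of
    tiles whose sides do not exceed max_tile cells; edge tiles absorb the
    remainder.  Returns (row_start, row_end, col_start, col_end) tuples."""
    n_r = math.ceil(dem_rows / max_tile)
    n_c = math.ceil(dem_cols / max_tile)
    tiles = []
    for i in range(n_r):
        for j in range(n_c):
            r0, c0 = i * max_tile, j * max_tile
            tiles.append((r0, min(r0 + max_tile, dem_rows),
                          c0, min(c0 + max_tile, dem_cols)))
    return tiles

# A 1000 x 1500 cell DEM with a 600-cell printable tile side -> a 2 x 3 grid
tiles = split_into_tiles(1000, 1500, 600)
```

Each tile is then meshed and exported separately, and the printed pieces assemble into the full terrain.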

  11. Development of computational small animal models and their applications in preclinical imaging and therapy research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xie, Tianwu; Zaidi, Habib, E-mail: habib.zaidi@hcuge.ch; Geneva Neuroscience Center, Geneva University, Geneva CH-1205

    The development of multimodality preclinical imaging techniques and the rapid growth of realistic computer simulation tools have promoted the construction and application of computational laboratory animal models in preclinical research. Since the early 1990s, over 120 realistic computational animal models have been reported in the literature and used as surrogates to characterize the anatomy of actual animals for the simulation of preclinical studies involving the use of bioluminescence tomography, fluorescence molecular tomography, positron emission tomography, single-photon emission computed tomography, microcomputed tomography, magnetic resonance imaging, and optical imaging. Other applications include electromagnetic field simulation, ionizing and nonionizing radiation dosimetry, and the development and evaluation of new methodologies for multimodality image coregistration, segmentation, and reconstruction of small animal images. This paper provides a comprehensive review of the history and fundamental technologies used for the development of computational small animal models, with a particular focus on their application in preclinical imaging as well as nonionizing and ionizing radiation dosimetry calculations. An overview is given of the overall process involved in the design of these models, including the fundamental elements used for the construction of different types of computational models, the identification of original anatomical data, the simulation tools used for solving various computational problems, and the applications of computational animal models in preclinical research. The authors also analyze the characteristics of the categories of computational models (stylized, voxel-based, and boundary representation) and discuss the technical challenges faced at the present time as well as research needs in the future.

  12. The Vulnerability, Impacts, Adaptation and Climate Services Advisory Board (VIACS AB V1.0) Contribution to CMIP6

    NASA Technical Reports Server (NTRS)

    Ruane, Alex C.; Teichmann, Claas; Arnell, Nigel W.; Carter, Timothy R.; Ebi, Kristie L.; Frieler, Katja; Goodess, Clare M.; Hewitson, Bruce; Horton, Radley; Kovats, R. Sari; hide

    2016-01-01

    This paper describes the motivation for the creation of the Vulnerability, Impacts, Adaptation and Climate Services (VIACS) Advisory Board for the Sixth Phase of the Coupled Model Intercomparison Project (CMIP6), its initial activities, and its plans to serve as a bridge between climate change applications experts and climate modelers. The climate change application community comprises researchers and other specialists who use climate information (alongside socioeconomic and other environmental information) to analyze vulnerability, impacts, and adaptation of natural systems and society in relation to past, ongoing, and projected future climate change. Much of this activity is directed toward the co-development of information needed by decisionmakers for managing projected risks. CMIP6 provides a unique opportunity to facilitate a two-way dialog between climate modelers and VIACS experts who are looking to apply CMIP6 results for a wide array of research and climate services objectives. The VIACS Advisory Board convenes leaders of major impact sectors, international programs, and climate services to solicit community feedback that increases the applications relevance of the CMIP6-Endorsed Model Intercomparison Projects (MIPs). As an illustration of its potential, the VIACS community provided CMIP6 leadership with a list of prioritized climate model variables and MIP experiments of the greatest interest to the climate model applications community, indicating the applicability and societal relevance of climate model simulation outputs. The VIACS Advisory Board also recommended an impacts version of Obs4MIPs (observational datasets) and indicated user needs for the gridding and processing of model output.

  13. Application of the Aquifer Impact Model to support decisions at a CO 2 sequestration site: Modeling and Analysis: Application of the Aquifer Impact Model to support decisions at a CO 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bacon, Diana Holford; Locke II, Randall A.; Keating, Elizabeth

    The National Risk Assessment Partnership (NRAP) has developed a suite of tools to assess and manage risk at CO2 sequestration sites (1). The NRAP tool suite includes the Aquifer Impact Model (AIM), based on reduced order models (ROMs) developed using site-specific data from two aquifers (alluvium and carbonate). The models accept aquifer parameters as a range of variable inputs so they may have broader applicability. Guidelines have been developed for determining the aquifer types for which the ROMs should be applicable. This paper considers the applicability of the aquifer models in AIM to predicting the impact of CO2 or brine leakage were it to occur at the Illinois Basin Decatur Project (IBDP). Based on the results of the sensitivity analysis, the hydraulic parameters and leakage source term magnitude are more sensitive than clay fraction or cation exchange capacity. Sand permeability was the only hydraulic parameter measured at the IBDP site. More information on the other hydraulic parameters, such as sand fraction and sand/clay correlation lengths, could reduce uncertainty in risk estimates. Some non-adjustable parameters, such as the initial pH and TDS and the pH no-impact threshold, are significantly different for the ROM than for the observations at the IBDP site. The reduced order model could be made more useful to a wider range of sites if the initial conditions and no-impact threshold values were adjustable parameters.

  14. The Vulnerability, Impacts, Adaptation and Climate Services Advisory Board (VIACS AB v1.0) contribution to CMIP6

    NASA Astrophysics Data System (ADS)

    Ruane, Alex C.; Teichmann, Claas; Arnell, Nigel W.; Carter, Timothy R.; Ebi, Kristie L.; Frieler, Katja; Goodess, Clare M.; Hewitson, Bruce; Horton, Radley; Sari Kovats, R.; Lotze, Heike K.; Mearns, Linda O.; Navarra, Antonio; Ojima, Dennis S.; Riahi, Keywan; Rosenzweig, Cynthia; Themessl, Matthias; Vincent, Katharine

    2016-09-01

    This paper describes the motivation for the creation of the Vulnerability, Impacts, Adaptation and Climate Services (VIACS) Advisory Board for the Sixth Phase of the Coupled Model Intercomparison Project (CMIP6), its initial activities, and its plans to serve as a bridge between climate change applications experts and climate modelers. The climate change application community comprises researchers and other specialists who use climate information (alongside socioeconomic and other environmental information) to analyze vulnerability, impacts, and adaptation of natural systems and society in relation to past, ongoing, and projected future climate change. Much of this activity is directed toward the co-development of information needed by decision-makers for managing projected risks. CMIP6 provides a unique opportunity to facilitate a two-way dialog between climate modelers and VIACS experts who are looking to apply CMIP6 results for a wide array of research and climate services objectives. The VIACS Advisory Board convenes leaders of major impact sectors, international programs, and climate services to solicit community feedback that increases the applications relevance of the CMIP6-Endorsed Model Intercomparison Projects (MIPs). As an illustration of its potential, the VIACS community provided CMIP6 leadership with a list of prioritized climate model variables and MIP experiments of the greatest interest to the climate model applications community, indicating the applicability and societal relevance of climate model simulation outputs. The VIACS Advisory Board also recommended an impacts version of Obs4MIPs and indicated user needs for the gridding and processing of model output.

  15. LANL* V1.0: a radiation belt drift shell model suitable for real-time and reanalysis applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koller, Josep; Reeves, Geoffrey D; Friedel, Reiner H W

    2008-01-01

    Space weather modeling, forecasts, and predictions, especially for the radiation belts in the inner magnetosphere, require detailed information about the Earth's magnetic field. Results depend on the magnetic field model and the L* (pron. L-star) values which are used to describe particle drift shells. Space weather models require integrating particle motions along trajectories that encircle the Earth. Numerical integration typically takes on the order of 10^5 calls to a magnetic field model, which makes the L* calculations very slow, in particular when using a dynamic and more accurate magnetic field model. Researchers currently tend to pick simplistic models over more accurate ones, risking large inaccuracies and even wrong conclusions. For example, magnetic field models affect the calculation of electron phase space density by applying adiabatic invariants including the drift shell value L*. We present here a new method using a surrogate model based on a neural network technique to replace the time-consuming L* calculations made with modern magnetic field models. The advantage of surrogate models (or meta-models) is that they can compute the same output in a fraction of the time while adding only a marginal error. Our drift shell model LANL* (Los Alamos National Lab L-star) is based on L* calculation using the TSK03 model. The surrogate model has currently been tested and validated only for geosynchronous regions, but the method is generally applicable to any satellite orbit. Computations with the new model are several million times faster compared to the standard integration method while adding less than 1% error. Currently, real-time applications for forecasting and even nowcasting inner magnetospheric space weather are limited partly due to the long computing time of accurate L* values. Without them, real-time applications are limited in accuracy. Reanalysis applications of past conditions in the inner magnetosphere are used to understand physical processes and their effects. Without sufficiently accurate L* values, the interpretation of reanalysis results becomes difficult and uncertain. However, with a method that can calculate accurate L* values orders of magnitude faster, analyzing whole solar cycles' worth of data suddenly becomes feasible.
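The surrogate idea is that a small trained network evaluates in microseconds what field-line integration takes far longer to compute. A toy forward pass is sketched below; the architecture, inputs, and weights are invented for illustration, whereas the real LANL* surrogate is trained against L* values computed with the TSK03 field model:

```python
import math

def mlp_forward(x, w1, b1, w2, b2):
    """One-hidden-layer feed-forward network: the kind of cheap surrogate
    that can stand in for an expensive drift-shell integration."""
    hidden = [math.tanh(sum(wi * xi for wi, xi in zip(row, x)) + b)
              for row, b in zip(w1, b1)]
    return [sum(wi * hi for wi, hi in zip(row, hidden)) + b
            for row, b in zip(w2, b2)]

# Toy network: 3 inputs (e.g. position + activity index) -> 2 hidden -> 1 output.
w1 = [[0.5, -0.2, 0.1], [0.3, 0.4, -0.1]]
b1 = [0.0, 0.1]
w2 = [[1.2, -0.7]]
b2 = [4.0]                      # rough offset toward near-geosynchronous L* values
l_star = mlp_forward([1.0, 0.5, 0.2], w1, b1, w2, b2)[0]
```

Once trained, evaluating such a network is a handful of multiply-adds per sample, which is what makes reanalysis over whole solar cycles tractable.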

  16. Dynamic modeling of brushless dc motor-power conditioner unit for electromechanical actuator application

    NASA Technical Reports Server (NTRS)

    Demerdash, N. A.; Nehl, T. W.

    1979-01-01

    A comprehensive digital model for the analysis of the dynamic-instantaneous performance of a power-conditioner-fed samarium-cobalt permanent magnet brushless DC motor is presented. The particular power conditioner-machine system at hand, for which this model was developed, is a component of an actual prototype electromechanical actuator built for NASA-JSC as a possible alternative to hydraulic actuators, as part of feasibility studies for shuttle orbiter applications. Excellent correlation between digitally simulated and experimentally obtained performance data was achieved for this specific prototype. Details of one component of the model, its applications, and the corresponding results are given in this paper.

  17. The NASA/industry Design Analysis Methods for Vibrations (DAMVIBS) program: McDonnell-Douglas Helicopter Company achievements

    NASA Technical Reports Server (NTRS)

    Toossi, Mostafa; Weisenburger, Richard; Hashemi-Kia, Mostafa

    1993-01-01

    This paper presents a summary of some of the work performed by McDonnell Douglas Helicopter Company under NASA Langley-sponsored rotorcraft structural dynamics program known as DAMVIBS (Design Analysis Methods for VIBrationS). A set of guidelines which is applicable to dynamic modeling, analysis, testing, and correlation of both helicopter airframes and a large variety of structural finite element models is presented. Utilization of these guidelines and the key features of their applications to vibration modeling of helicopter airframes are discussed. Correlation studies with the test data, together with the development and applications of a set of efficient finite element model checkout procedures, are demonstrated on a large helicopter airframe finite element model. Finally, the lessons learned and the benefits resulting from this program are summarized.

  18. A modified elastic foundation contact model for application in 3D models of the prosthetic knee.

    PubMed

    Pérez-González, Antonio; Fenollosa-Esteve, Carlos; Sancho-Bru, Joaquín L; Sánchez-Marín, Francisco T; Vergara, Margarita; Rodríguez-Cervantes, Pablo J

    2008-04-01

    Different models have been used in the literature for the simulation of surface contact in biomechanical knee models. However, there is a lack of systematic comparisons of these models applied to the simulation of a common case, which would provide relevant information about their accuracy and suitability for application in models of the implanted knee. In this work a comparison of the Hertz model (HM), the elastic foundation model (EFM) and the finite element model (FEM) for the simulation of the elastic contact in a 3D model of the prosthetic knee is presented. From the results of this comparison it is found that although the nature of the EFM offers advantages when compared with that of the HM for its application to realistic prosthetic surfaces, and when compared with the FEM in CPU time, its predictions can differ from those of the FEM in some circumstances. These differences are considerable if the comparison is performed for prescribed displacements, although they are less important for prescribed loads. To solve these problems a new modified elastic foundation model (mEFM) is proposed that maintains basically the simplicity of the original model while producing much more accurate results. In this paper it is shown that this new mEFM calculates pressure distribution and contact area accurately and with short computation times for toroidal contacting surfaces. Although further work is needed to confirm its validity for more complex geometries, the mEFM is envisaged as a good option for application in 3D knee models to predict prosthetic knee performance.
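The elastic foundation model's speed advantage over the FEM comes from treating the contact layer as a bed of independent springs, so each point's pressure follows from its local penetration alone. A minimal sketch; the stiffness expression is one common linear-elastic-layer form, and the parameter values are illustrative, not the paper's:

```python
def efm_pressures(penetrations, layer_thickness, e_mod, nu):
    """Elastic foundation ('bed of springs') contact pressure.

    Each surface point carries an independent spring, so pressure depends
    only on the local penetration depth -- the simplification that makes
    the EFM cheap compared with a full finite element contact solve.
    Linear elastic layer stiffness (one common choice):
        p = (1 - nu) * E * delta / ((1 + nu) * (1 - 2 nu) * t)
    """
    k = (1.0 - nu) * e_mod / ((1.0 + nu) * (1.0 - 2.0 * nu) * layer_thickness)
    return [k * max(d, 0.0) for d in penetrations]  # no tension where a gap remains

# Illustrative UHMWPE-like layer: E = 500 MPa, nu = 0.46, t = 8 mm (SI units)
p = efm_pressures([0.0005, 0.0002, -0.0001],
                  layer_thickness=0.008, e_mod=500e6, nu=0.46)
```

Because the springs are uncoupled, pressure scales linearly with penetration; the mEFM's contribution is precisely to correct the cases where that local assumption diverges from the FEM.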

  19. Implementing Multidisciplinary and Multi-Zonal Applications Using MPI

    NASA Technical Reports Server (NTRS)

    Fineberg, Samuel A.

    1995-01-01

    Multidisciplinary and multi-zonal applications are an important class of applications in the area of Computational Aerosciences. In these codes, two or more distinct parallel programs or copies of a single program are utilized to model a single problem. To support such applications, it is common to use a programming model where a program is divided into several single program multiple data stream (SPMD) applications, each of which solves the equations for a single physical discipline or grid zone. These SPMD applications are then bound together to form a single multidisciplinary or multi-zonal program in which the constituent parts communicate via point-to-point message passing routines. Unfortunately, simple message passing models, like Intel's NX library, only allow point-to-point and global communication within a single system-defined partition. This makes implementation of these applications quite difficult, if not impossible. In this report it is shown that the new Message Passing Interface (MPI) standard is a viable portable library for implementing the message passing portion of multidisciplinary applications. Further, with the extension of a portable loader, fully portable multidisciplinary application programs can be developed. Finally, the performance of MPI is compared to that of some native message passing libraries. This comparison shows that MPI can be implemented to deliver performance commensurate with native message libraries.

  20. Computational models of the pulmonary circulation: Insights and the move towards clinically directed studies

    PubMed Central

    Tawhai, Merryn H.; Clark, Alys R.; Burrowes, Kelly S.

    2011-01-01

    Biophysically-based computational models provide a tool for integrating and explaining experimental data, observations, and hypotheses. Computational models of the pulmonary circulation have evolved from minimal and efficient constructs that have been used to study individual mechanisms that contribute to lung perfusion, to sophisticated multi-scale and -physics structure-based models that predict integrated structure-function relationships within a heterogeneous organ. This review considers the utility of computational models in providing new insights into the function of the pulmonary circulation, and their application in clinically motivated studies. We review mathematical and computational models of the pulmonary circulation based on their application; we begin with models that seek to answer questions in basic science and physiology and progress to models that aim to have clinical application. In looking forward, we discuss the relative merits and clinical relevance of computational models: what important features are still lacking; and how these models may ultimately be applied to further increasing our understanding of the mechanisms occurring in disease of the pulmonary circulation. PMID:22034608

  1. Practical Formal Verification of Diagnosability of Large Models via Symbolic Model Checking

    NASA Technical Reports Server (NTRS)

    Cavada, Roberto; Pecheur, Charles

    2003-01-01

    This document reports on the activities carried out during a four-week visit of Roberto Cavada at the NASA Ames Research Center. The main goal was to test the practical applicability of the framework proposed, where a diagnosability problem is reduced to a Symbolic Model Checking problem. Section 2 contains a brief explanation of major techniques currently used in Symbolic Model Checking, and how these techniques can be tuned in order to obtain good performances when using Model Checking tools. Diagnosability is performed on large and structured models of real plants. Section 3 describes how these plants are modeled, and how models can be simplified to improve the performance of Symbolic Model Checkers. Section 4 reports scalability results. Three test cases are briefly presented, and several parameters and techniques have been applied on those test cases in order to produce comparison tables. Furthermore, comparison between several Model Checkers is reported. Section 5 summarizes the application of diagnosability verification to a real application. Several properties have been tested, and results have been highlighted. Finally, section 6 draws some conclusions, and outlines future lines of research.
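The reduction mentioned above turns diagnosability into a reachability question on a "twin plant": two copies of the model synchronized on observable events, where a reachable cycle through a state pair in which exactly one copy has executed the fault witnesses non-diagnosability. A toy explicit-state version of this construction (an illustrative sketch; the report works with symbolic model checkers on much larger models):

```python
from collections import deque

def has_ambiguous_cycle(trans, init, obs, fault):
    """Explicit-state twin-plant check: True iff a reachable cycle exists
    through a pair where exactly one copy has taken the fault event,
    i.e. the fault can remain undetected forever (non-diagnosable)."""
    def succs(node):
        s1, f1, s2, f2 = node
        out = []
        for e, t in trans.get(s1, []):
            if e in obs:                      # both copies move on an observable
                out += [(t, f1, t2, f2)
                        for e2, t2 in trans.get(s2, []) if e2 == e]
            else:                             # copy 1 moves alone (unobservable)
                out.append((t, f1 or e == fault, s2, f2))
        for e, t in trans.get(s2, []):        # copy 2 moves alone (unobservable)
            if e not in obs:
                out.append((s1, f1, t, f2 or e == fault))
        return out

    start = (init, False, init, False)
    seen, queue = {start}, deque([start])
    while queue:                              # reachable pair states
        for m in succs(queue.popleft()):
            if m not in seen:
                seen.add(m)
                queue.append(m)
    ambiguous = {n for n in seen if n[1] != n[3]}
    for a in ambiguous:                       # can a reach itself via ambiguous pairs?
        stack, visited = [a], set()
        while stack:
            for m in succs(stack.pop()):
                if m == a:
                    return True
                if m in ambiguous and m not in visited:
                    visited.add(m)
                    stack.append(m)
    return False

obs = {"a", "b", "o"}                         # observable events; "f", "u" are not
bad = {"s0": [("f", "s1"), ("u", "s2")], "s1": [("o", "s1")], "s2": [("o", "s2")]}
good = {"s0": [("f", "s1"), ("u", "s2")], "s1": [("a", "s1")], "s2": [("b", "s2")]}
```

In `bad`, faulty and fault-free runs both emit "o" forever and are indistinguishable; in `good`, they eventually emit different observables, so the fault is diagnosable.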

  2. Modeling Errors in Daily Precipitation Measurements: Additive or Multiplicative?

    NASA Technical Reports Server (NTRS)

    Tian, Yudong; Huffman, George J.; Adler, Robert F.; Tang, Ling; Sapiano, Matthew; Maggioni, Viviana; Wu, Huan

    2013-01-01

    The definition and quantification of uncertainty depend on the error model used. For uncertainties in precipitation measurements, two types of error models have been widely adopted: the additive error model and the multiplicative error model. This leads to incompatible specifications of uncertainties and impedes intercomparison and application. In this letter, we assess the suitability of both models for satellite-based daily precipitation measurements in an effort to clarify the uncertainty representation. Three criteria were employed to evaluate the applicability of either model: (1) better separation of the systematic and random errors; (2) applicability to the large range of variability in daily precipitation; and (3) better predictive skills. It is found that the multiplicative error model is a much better choice under all three criteria. It extracted the systematic errors more cleanly, was more consistent with the large variability of precipitation measurements, and produced superior predictions of the error characteristics. The additive error model had several weaknesses, such as nonconstant variance resulting from systematic errors leaking into random errors, and the lack of prediction capability. Therefore, the multiplicative error model is a better choice.
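The distinction between the two error models can be made concrete: an additive model writes y = x + ε with fixed variance, while a multiplicative model writes y = x·exp(ε), which becomes additive with constant variance in log space and so copes with the orders-of-magnitude range of daily precipitation. A small sketch with synthetic values (not the letter's data):

```python
import math, random

random.seed(42)
truth = [0.5, 2.0, 10.0, 50.0]     # synthetic rain rates spanning a wide range, mm/day

# Multiplicative model: y = x * exp(eps), eps ~ N(0, sigma^2).  Measurements
# stay positive, and the error is additive with constant variance in log space.
sigma = 0.3
measured = [x * math.exp(random.gauss(0.0, sigma)) for x in truth]
log_resid = [math.log(y) - math.log(x) for x, y in zip(truth, measured)]

# An additive model y = x + eps with one fixed-variance eps for all rates would,
# by contrast, let light-rain values go negative and mix error scales.
```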

  3. Modeling microbiological and chemical processes in municipal solid waste bioreactor, Part II: Application of numerical model BIOKEMOD-3P.

    PubMed

    Gawande, Nitin A; Reinhart, Debra R; Yeh, Gour-Tsyh

    2010-02-01

    Biodegradation process modeling of municipal solid waste (MSW) bioreactor landfills requires the knowledge of various process reactions and corresponding kinetic parameters. Mechanistic models available to date are able to simulate biodegradation processes with the help of pre-defined species and reactions. Some of these models consider the effect of critical parameters such as moisture content, pH, and temperature. Biomass concentration is a vital parameter for any biomass growth model and often not compared with field and laboratory results. A more complex biodegradation model includes a large number of chemical and microbiological species. Increasing the number of species and user defined process reactions in the simulation requires a robust numerical tool. A generalized microbiological and chemical model, BIOKEMOD-3P, was developed to simulate biodegradation processes in three-phases (Gawande et al. 2009). This paper presents the application of this model to simulate laboratory-scale MSW bioreactors under anaerobic conditions. BIOKEMOD-3P was able to closely simulate the experimental data. The results from this study may help in application of this model to full-scale landfill operation.
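Biodegradation models of this kind are built from microbial rate laws such as Monod kinetics. A minimal explicit-Euler sketch of one biomass-substrate pair; the parameter values are illustrative, not BIOKEMOD-3P's calibrated species or reactions:

```python
def monod_step(biomass, substrate, mu_max, ks, yield_coeff, dt):
    """One explicit-Euler step of Monod growth kinetics: the specific growth
    rate saturates with substrate, and substrate is consumed in proportion
    to biomass produced (via the yield coefficient)."""
    mu = mu_max * substrate / (ks + substrate)   # specific growth rate, 1/day
    growth = mu * biomass * dt                   # biomass produced this step
    return biomass + growth, max(substrate - growth / yield_coeff, 0.0)

x, s = 0.1, 10.0          # g/L biomass, g/L substrate (illustrative)
for _ in range(100):      # 10 days at dt = 0.1 day
    x, s = monod_step(x, s, mu_max=0.4, ks=2.0, yield_coeff=0.5, dt=0.1)
```

A full bioreactor model chains many such rate laws (hydrolysis, acidogenesis, methanogenesis) with pH, moisture, and temperature corrections, which is what motivates a generalized solver.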

  4. Degree-day model for timing insecticide applications to control Dioryctria amatella (Lepidoptera: Pyralidae) in loblolly pine seed orchards

    Treesearch

    James L. Hanula; Gary L. DeBarr; Julie C. Weatherby; Larry R. Barber; C. Wayne Berisford

    2002-01-01

    Because Dioryctria amatella (Hulst) is a key pest in loblolly pine, Pinus taeda L. (Pinaceac), seed orchards in the southeastern United States, improved timing of insecticide applications would be valuable for its control. To time two fenvalerate (Pydrin® 2.4 EC) applications we tested four variations of a degree day model that...
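A degree-day model of the kind tested above accumulates daily heat units above a base temperature from a starting (biofix) date and triggers a spray when the running total crosses a threshold. A sketch using the simple averaging method; the base temperature and threshold are hypothetical, not the calibrated values for D. amatella:

```python
def spray_day(daily_min_max, base_temp, threshold, start_day=1):
    """Return the first day the accumulated degree-days (simple averaging
    method) reach the spray threshold, or None if never reached."""
    dd = 0.0
    for day, (t_min, t_max) in enumerate(daily_min_max, start=start_day):
        dd += max(0.0, (t_min + t_max) / 2.0 - base_temp)  # no negative accumulation
        if dd >= threshold:
            return day
    return None

# Hypothetical daily (min, max) temperatures in deg C from the biofix date
temps = [(10, 20), (12, 24), (8, 16), (14, 26), (15, 27)]
day = spray_day(temps, base_temp=10.0, threshold=25.0)
```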

  5. Stochastic Hybrid Systems Modeling and Middleware-enabled DDDAS for Next-generation US Air Force Systems

    DTIC Science & Technology

    2017-03-30

    experimental evaluations for hosting DDDAS-like applications in public cloud infrastructures. Finally, we report on ongoing work towards using the DDDAS... Dynamic resource management, model learning, simulation-based optimizations, cloud infrastructures for DDDAS applications.

  6. Quality Control Analysis of Selected Aspects of Programs Administered by the Bureau of Student Financial Assistance. Task 1 and Quality Control Sample; Error-Prone Modeling Analysis Plan.

    ERIC Educational Resources Information Center

    Saavedra, Pedro; And Others

    Parameters and procedures for developing an error-prone model (EPM) to predict financial aid applicants who are likely to misreport on Basic Educational Opportunity Grant (BEOG) applications are introduced. Specifications to adapt these general parameters to secondary data analysis of the Validation, Edits, and Applications Processing Systems…

  7. Software applications for flux balance analysis.

    PubMed

    Lakshmanan, Meiyappan; Koh, Geoffrey; Chung, Bevan K S; Lee, Dong-Yup

    2014-01-01

    Flux balance analysis (FBA) is a widely used computational method for characterizing and engineering intrinsic cellular metabolism. The increasing number of its successful applications and growing popularity are possibly attributable to the availability of specific software tools for FBA. Each tool has its unique features and limitations with respect to operational environment, user interface and supported analysis algorithms. Presented herein is an in-depth evaluation of currently available FBA applications, focusing mainly on usability, functionality, graphical representation and interoperability. Overall, most of the applications are able to perform the basic features of model creation and FBA simulation. The COBRA toolbox, OptFlux and FASIMU are versatile, supporting advanced in silico algorithms to identify environmental and genetic targets for strain design. SurreyFBA, WEbcoli, Acorn, FAME, GEMSiRV and MetaFluxNet are distinct in providing user-friendly interfaces for model handling. In terms of software architecture, FBA-SimVis and OptFlux offer flexible environments, as they enable plug-in/add-on features to aid prospective functional extensions. Notably, an increasing trend towards the implementation of more tailored e-services, such as central model repositories and assistance for collaborative efforts, was observed among the web-based applications, with the help of advanced web technologies. Furthermore, the most recent applications, such as Model SEED, FAME, MetaFlux and MicrobesFlux, have even included several routines to facilitate the reconstruction of genome-scale metabolic models. Finally, a brief discussion on the future directions of FBA applications is given for the benefit of potential tool developers.
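    The core computation all of these tools perform can be sketched as a small linear program: maximize a biomass flux subject to steady state (S v = 0) and flux bounds. The toy network, bounds, and objective below are invented for illustration.

```python
# Toy FBA on a 3-reaction network: maximize biomass flux v2 subject to
# steady state S @ v = 0 and flux bounds. Network and numbers are invented.
import numpy as np
from scipy.optimize import linprog

# Reactions: v0 uptake (-> A), v1 conversion (A -> B), v2 biomass drain (B ->)
S = np.array([[1, -1,  0],    # metabolite A balance
              [0,  1, -1]])   # metabolite B balance
bounds = [(0, 10), (0, 8), (0, None)]   # uptake capped at 10, conversion at 8
c = np.array([0.0, 0.0, -1.0])          # linprog minimizes, so negate biomass

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
optimal_biomass = -res.fun              # limited by the conversion step
```

    Genome-scale models differ only in size: S has thousands of reactions, but the optimization has the same shape.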

  8. Fiia: A Model-Based Approach to Engineering Collaborative Augmented Reality

    NASA Astrophysics Data System (ADS)

    Wolfe, Christopher; Smith, J. David; Phillips, W. Greg; Graham, T. C. Nicholas

    Augmented reality systems often involve collaboration among groups of people. While there are numerous toolkits that aid the development of such augmented reality groupware systems (e.g., ARToolkit and Groupkit), there remains an enormous gap between the specification of an AR groupware application and its implementation. In this chapter, we present Fiia, a toolkit which simplifies the development of collaborative AR applications. Developers specify the structure of their applications using the Fiia modeling language, which abstracts details of networking and provides high-level support for specifying adapters between the physical and virtual world. The Fiia.Net runtime system then maps this conceptual model to a runtime implementation. We illustrate Fiia via Raptor, an augmented reality application used to help small groups collaboratively prototype video games.

  9. The public health nutrition intervention management bi-cycle: a model for training and practice improvement.

    PubMed

    Hughes, Roger; Margetts, Barrie

    2012-11-01

    The present paper describes a model for public health nutrition practice designed to facilitate practice improvement and provide a step-wise approach to assist with workforce development. The bi-cycle model for public health nutrition practice has been developed based on existing cyclical models for intervention management, but modified to integrate discrete capacity-building practices. It is intended for education and practice settings and will have applications for educators and practitioners. Modifications to existing models have been informed by the authors' observations and experiences as practitioners and educators, and reflect a conceptual framework with applications in workforce development and practice improvement. From a workforce development and educational perspective, the model is designed to reflect adult learning principles, exposing students to experiential, problem-solving and practical learning experiences that reflect the realities of work as a public health nutritionist. In doing so, it assists the development of competency beyond knowing to knowing how, showing how and doing. This progression of learning from knowledge to performance is critical to effective competency development for effective practice. Public health nutrition practice is dynamic and varied, and models need to be adaptable and applicable to the practice context to have utility. The paper serves to stimulate debate in the public health nutrition community and to encourage critical feedback about the validity, applicability and utility of this model in different practice contexts.

  10. Strategies for Energy Efficient Resource Management of Hybrid Programming Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Dong; Supinski, Bronis de; Schulz, Martin

    2013-01-01

    Many scientific applications are programmed using hybrid programming models that use both message-passing and shared-memory, due to the increasing prevalence of large-scale systems with multicore, multisocket nodes. Previous work has shown that energy efficiency can be improved using software-controlled execution schemes that consider both the programming model and the power-aware execution capabilities of the system. However, such approaches have focused on identifying optimal resource utilization for one programming model, either shared-memory or message-passing, in isolation. The potential solution space, and thus the challenge, increases substantially when optimizing hybrid models, since the possible resource configurations increase exponentially. Nonetheless, with the accelerating adoption of hybrid programming models, we increasingly need improved energy efficiency in hybrid parallel applications on large-scale systems. In this work, we present new software-controlled execution schemes that consider the effects of dynamic concurrency throttling (DCT) and dynamic voltage and frequency scaling (DVFS) in the context of hybrid programming models. Specifically, we present predictive models and novel algorithms based on statistical analysis that anticipate application power and time requirements under different concurrency and frequency configurations. We apply our models and methods to the NPB MZ benchmarks and selected applications from the ASC Sequoia codes. Overall, we achieve substantial energy savings (8.74% on average and up to 13.8%) with some performance gain (up to 7.5%) or negligible performance loss.
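    The selection step such predictive models enable can be sketched simply: given predicted execution time and average power for each (concurrency, frequency) configuration, choose the configuration minimizing energy = power x time. The prediction table below is fabricated for illustration; the paper derives such estimates statistically per application.

```python
# Pick the (threads, GHz) configuration with the least predicted energy.
# The prediction table is invented; a real system would fill it from
# statistical models of the application's power and time behavior.

def best_config(predictions):
    """predictions: {(threads, ghz): (time_s, power_w)} -> config minimizing energy."""
    return min(predictions, key=lambda k: predictions[k][0] * predictions[k][1])

predictions = {
    (16, 2.6): (100.0, 200.0),   # 20.0 kJ
    (16, 2.0): (115.0, 150.0),   # 17.25 kJ: slower but much lower power
    (8,  2.6): (130.0, 140.0),   # 18.2 kJ
}
config = best_config(predictions)   # -> (16, 2.0)
```

    Real schemes also constrain the allowed slowdown, which is why the paper reports both energy savings and performance effects.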

  11. Lord-Wingersky Algorithm Version 2.0 for Hierarchical Item Factor Models with Applications in Test Scoring, Scale Alignment, and Model Fit Testing.

    PubMed

    Cai, Li

    2015-06-01

    Lord and Wingersky's (Appl Psychol Meas 8:453-461, 1984) recursive algorithm for creating summed score based likelihoods and posteriors has a proven track record in unidimensional item response theory (IRT) applications. Extending the recursive algorithm to handle multidimensionality is relatively simple, especially with fixed quadrature because the recursions can be defined on a grid formed by direct products of quadrature points. However, the increase in computational burden remains exponential in the number of dimensions, making the implementation of the recursive algorithm cumbersome for truly high-dimensional models. In this paper, a dimension reduction method that is specific to the Lord-Wingersky recursions is developed. This method can take advantage of the restrictions implied by hierarchical item factor models, e.g., the bifactor model, the testlet model, or the two-tier model, such that a version of the Lord-Wingersky recursive algorithm can operate on a dramatically reduced set of quadrature points. For instance, in a bifactor model, the dimension of integration is always equal to 2, regardless of the number of factors. The new algorithm not only provides an effective mechanism to produce summed score to IRT scaled score translation tables properly adjusted for residual dependence, but leads to new applications in test scoring, linking, and model fit checking as well. Simulated and empirical examples are used to illustrate the new applications.
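    The unidimensional recursion that the paper extends can be sketched directly: at a fixed ability value theta, it folds items in one at a time to build the distribution of the summed score. The 2PL item parameters below are illustrative, not from the paper.

```python
# Lord-Wingersky recursion at a single ability value theta.
# Item parameters (a, b) are illustrative 2PL values.
import math

def p_correct(theta, a, b):
    """2PL item response probability."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def summed_score_probs(theta, items):
    """items: list of (a, b). Returns P(summed score = s) for s = 0..len(items)."""
    L = [1.0]                                # score distribution over zero items
    for a, b in items:
        p = p_correct(theta, a, b)
        new = [0.0] * (len(L) + 1)
        for s, prob in enumerate(L):         # wrong answer keeps s, right adds 1
            new[s] += prob * (1.0 - p)
            new[s + 1] += prob * p
        L = new
    return L

probs = summed_score_probs(0.0, [(1.0, 0.0), (1.2, -0.5), (0.8, 0.5)])
```

    The multidimensional extension replaces the single theta with a quadrature grid; the paper's dimension-reduction result keeps that grid two-dimensional for bifactor-type models.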

  12. Application of postured human model for SAR measurements

    NASA Astrophysics Data System (ADS)

    Vuchkovikj, M.; Munteanu, I.; Weiland, T.

    2013-07-01

    In the last two decades, the increasing number of electronic devices used in day-to-day life has led to a growing interest in the study of electromagnetic field interaction with biological tissues. The design of medical devices and wireless communication devices such as mobile phones benefits greatly from bio-electromagnetic simulations in which digital human models are used. The digital human models currently available have an upright position, which limits research activities in realistic scenarios where postured human bodies must be considered. For this reason, a software application called "BodyFlex for CST STUDIO SUITE" was developed. In its current version, this application can deform the voxel-based human model named HUGO (Dipp GmbH, 2010) to generate common postures that people adopt in normal life, ensuring the continuity of tissues and conserving mass to an acceptable level. This paper describes an enhancement of the "BodyFlex" application related to the movements of the forearm and the wrist of a digital human model. One electromagnetic application in which the forearm and wrist movement of a voxel-based human model is significant is the measurement of the specific absorption rate (SAR) when the model is exposed to a radio-frequency electromagnetic field produced by a mobile phone. Current SAR measurements of the exposure from mobile phones are performed with the SAM (Specific Anthropomorphic Mannequin) phantom, which is filled with a dispersive but homogeneous material. We are interested in how the SAR values change if a realistic inhomogeneous human model is used. To this end, two human models, a homogeneous and an inhomogeneous one, are used in two simulation scenarios in order to examine and observe the differences in the resulting SAR values.
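    For reference, the point SAR compared between the homogeneous and inhomogeneous cases is computed from the local field as SAR = sigma |E|^2 / rho. A minimal sketch with rough, assumed tissue values (not those of the HUGO model):

```python
# Point SAR (W/kg) from local RMS field, conductivity, and mass density.
# Tissue values below are rough illustrative numbers.

def point_sar(e_rms, sigma, rho):
    """e_rms: RMS electric field (V/m); sigma: conductivity (S/m); rho: density (kg/m^3)."""
    return sigma * e_rms ** 2 / rho

# e.g. muscle-like tissue in a 100 V/m RMS field
sar = point_sar(100.0, 0.9, 1050.0)   # roughly 8.6 W/kg
```

    Regulatory limits apply to SAR averaged over 1 g or 10 g of tissue, which is where the inhomogeneous model's spatially varying sigma and rho matter.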

  13. Application of Recent Advances in Forward Modeling of Emissions from Boreal and Temperate Wildfires to Real-time Forecasting of Aerosol and Trace Gas Concentrations

    NASA Astrophysics Data System (ADS)

    Hyer, E. J.; Reid, J. S.; Kasischke, E. S.; Allen, D. J.

    2005-12-01

    The magnitude of trace gas and aerosol emissions from wildfires is a scientific problem with important implications for atmospheric composition, and is also integral to understanding carbon cycling in terrestrial ecosystems. Recent ecological research on modeling wildfire emissions has integrated theoretical advances derived from ecological fieldwork with improved spatial and temporal databases to produce "post facto" estimates of emissions with high spatial and temporal resolution. These advances have been shown to improve agreement with atmospheric observations at coarse scales, but can in principle be applied to applications, such as forecasting, at finer scales. However, several of the approaches employed in these forward models are incompatible with the requirements of real-time forecasting, requiring modification of data inputs and calculation methods. Because of the differences in data inputs used for real-time and "post-facto" emissions modeling, the key uncertainties in the forward problem are not necessarily the same for these two applications. However, adaptation of these advances in forward modeling to forecasting applications has the potential to improve air quality forecasts, and also to provide a large body of experimental data which can be used to constrain crucial uncertainties in current conceptual models of wildfire emissions. This talk describes a forward modeling method developed at the University of Maryland and its application to the Fire Locating and Modeling of Burning Emissions (FLAMBE) system at the Naval Research Laboratory. Methods for applying the outputs of the NRL aerosol forecasting system to the inverse problem of constraining emissions will also be discussed. The system described can use the feedback supplied by atmospheric observations to improve the emissions source description in the forecasting model, and can also be used for hypothesis testing regarding fire behavior and data inputs.
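    The forward emissions calculation underlying such systems is commonly the Seiler-Crutzen product of burned area, fuel load, combustion completeness, and a species emission factor; the numbers below are illustrative only, not FLAMBE parameters.

```python
# Seiler-Crutzen style forward emissions estimate for one fire.
# All numeric values are illustrative, not from the FLAMBE system.

def fire_emissions(area_m2, fuel_kg_m2, completeness, emission_factor_g_kg):
    """Emitted mass (g) of one species: burned fuel mass times emission factor."""
    burned_kg = area_m2 * fuel_kg_m2 * completeness
    return burned_kg * emission_factor_g_kg

# 1 km^2 fire, boreal-ish fuel load, 40% combustion completeness, 12 g/kg PM2.5
pm25_g = fire_emissions(1.0e6, 2.5, 0.4, 12.0)
```

    Real-time forecasting must estimate each factor from satellite fire detections rather than post facto databases, which is the adaptation the talk describes.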

  14. Protein loop modeling using a new hybrid energy function and its application to modeling in inaccurate structural environments.

    PubMed

    Park, Hahnbeom; Lee, Gyu Rie; Heo, Lim; Seok, Chaok

    2014-01-01

    Protein loop modeling is a tool for predicting protein local structures of particular interest, providing opportunities for applications involving protein structure prediction and de novo protein design. Until recently, the majority of loop modeling methods have been developed and tested by reconstructing loops in frameworks of experimentally resolved structures. In many practical applications, however, the protein loops to be modeled are located in inaccurate structural environments. These include loops in model structures, low-resolution experimental structures, or experimental structures of different functional forms. Accordingly, discrepancies in the accuracy of the structural environment assumed in development of the method and that in practical applications present additional challenges to modern loop modeling methods. This study demonstrates a new strategy for employing a hybrid energy function combining physics-based and knowledge-based components to help tackle this challenge. The hybrid energy function is designed to combine the strengths of each energy component, simultaneously maintaining accurate loop structure prediction in a high-resolution framework structure and tolerating minor environmental errors in low-resolution structures. A loop modeling method based on global optimization of this new energy function is tested on loop targets situated in different levels of environmental errors, ranging from experimental structures to structures perturbed in backbone as well as side chains and template-based model structures. The new method performs comparably to force field-based approaches in loop reconstruction in crystal structures and better in loop prediction in inaccurate framework structures. This result suggests that higher-accuracy predictions would be possible for a broader range of applications. The web server for this method is available at http://galaxy.seoklab.org/loop with the PS2 option for the scoring function.
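    The hybrid energy idea in its simplest form is a weighted combination of a physics-based score and a knowledge-based score for each candidate loop conformation. The weight and the candidate scores below are invented for illustration and are not the paper's actual energy terms.

```python
# Sketch: rank candidate loop conformations by a weighted hybrid score.
# Lower is better for both components; weight and scores are invented.

def hybrid_energy(e_physics, e_knowledge, w=0.6):
    """Weighted combination of a physics-based and a knowledge-based energy."""
    return w * e_physics + (1.0 - w) * e_knowledge

candidates = {"loopA": (-120.0, -3.1), "loopB": (-118.0, -5.0)}
best = min(candidates, key=lambda k: hybrid_energy(*candidates[k]))
```

    The paper's contribution is in how the components are constructed and balanced so that the combined score tolerates errors in the surrounding structure; the combination step itself is this simple.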

  15. Automated Deployment of Advanced Controls and Analytics in Buildings

    NASA Astrophysics Data System (ADS)

    Pritoni, Marco

    Buildings account for 40% of primary energy use in the US. Recent studies show that developing energy analytics and enhancing control strategies can significantly improve their energy performance. However, the deployment of advanced control software applications has been mostly limited to academic studies. Larger-scale implementations are prevented by the significant engineering time and customization required, due to significant differences among buildings. This study demonstrates how physics-inspired data-driven models can be used to develop portable analytics and control applications for buildings. Specifically, I demonstrate the application of these models in all phases of the deployment of advanced controls and analytics in buildings. In the first phase, "Site Preparation and Interface with Legacy Systems", I used models to discover or map relationships among building components, automatically gathering metadata (information about data points) necessary to run the applications. During the second phase, "Application Deployment and Commissioning", models automatically learn system parameters used for advanced controls and analytics. In the third phase, "Continuous Monitoring and Verification", I utilized models to automatically measure the energy performance of a building that has implemented advanced control strategies. In the conclusions, I discuss future challenges and suggest potential strategies for these innovative control systems to be widely deployed in the market. This dissertation provides useful new tools in terms of procedures, algorithms, and models to facilitate the automated deployment of advanced controls and analytics and accelerate their wide adoption in buildings.

  16. Mobile-Based Dictionary of Information and Communication Technology

    NASA Astrophysics Data System (ADS)

    Liando, O. E. S.; Mewengkang, A.; Kaseger, D.; Sangkop, F. I.; Rantung, V. P.; Rorimpandey, G. C.

    2018-02-01

    This study aims to design and build a mobile-based dictionary application for information and communication technology, providing access to a glossary of terms used in the field. The application was built on the Android platform with an SQLite database. The research uses the prototype development method, which covers the stages of Communication, Quick Plan, Quick Design Modeling, Construction of Prototype, Deployment Delivery & Feedback, and Full System Transformation. The application is designed to help users learn and understand the new terms and vocabulary they encounter in the world of information and communication technology. The mobile-based dictionary of information and communication technology that has been built can serve as an alternative learning resource; in its simplest form, it meets the need for a comprehensive and accurate dictionary of information and communication technology terms.
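    A minimal sketch of the storage layer such an application could use: an SQLite glossary table with a prefix lookup. The schema and sample rows are assumptions for illustration, since the paper does not publish its schema.

```python
# Hypothetical SQLite glossary table with a parameterized prefix lookup,
# the kind of storage layer a mobile dictionary app might use.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE glossary (term TEXT PRIMARY KEY, definition TEXT)")
conn.executemany(
    "INSERT INTO glossary VALUES (?, ?)",
    [("HTTP", "Hypertext Transfer Protocol"),
     ("LAN", "Local Area Network")],
)

def lookup(prefix):
    """Return (term, definition) rows whose term starts with the given prefix."""
    cur = conn.execute(
        "SELECT term, definition FROM glossary WHERE term LIKE ? ORDER BY term",
        (prefix + "%",),
    )
    return cur.fetchall()

rows = lookup("H")
```

    On Android the same queries would run through SQLiteDatabase rather than Python's sqlite3, but the schema and parameterized-query pattern carry over.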

  17. Highest weight representation for Sklyanin algebra sl(3)(u) with application to the Gaudin model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burdik, C., E-mail: burdik@kmlinux.fjfi.cvut.cz; Navratil, O.

    2011-06-15

    We study the infinite-dimensional Sklyanin algebra sl(3)(u). Specifically we construct the highest weight representation for this algebra in an explicit form. Its application to the Gaudin model is mentioned.

  18. A practical guide on DTA model applications for regional planning

    DOT National Transportation Integrated Search

    2016-06-07

    This document is intended as a guide for use by Metropolitan Planning Organizations (MPO) and other planning agencies that are interested in applying Dynamic Traffic Assignment (DTA) models for planning applications. The objective of this document is...

  19. The structural bioinformatics library: modeling in biomolecular science and beyond.

    PubMed

    Cazals, Frédéric; Dreyfus, Tom

    2017-04-01

    Software in structural bioinformatics has mainly been application driven. To serve practitioners seeking off-the-shelf applications, but also developers seeking advanced building blocks to develop novel applications, we undertook the design of the Structural Bioinformatics Library (SBL, http://sbl.inria.fr), a generic C++/python cross-platform software library targeting complex problems in structural bioinformatics. Its tenet is a modular design offering a rich and versatile framework allowing the development of novel applications requiring well-specified complex operations, without compromising robustness and performance. The SBL involves four software components (1-4 hereafter). For end-users, the SBL provides ready-to-use, state-of-the-art (1) applications to handle molecular models defined by unions of balls, to deal with molecular flexibility, and to model macro-molecular assemblies. These applications can also be combined to tackle integrated analysis problems. For developers, the SBL provides a broad C++ toolbox with modular design, involving core (2) algorithms, (3) biophysical models and (4) modules, the latter being especially suited to developing novel applications. The SBL comes with thorough documentation consisting of user and reference manuals, and a bugzilla platform to handle community feedback. The SBL is available from http://sbl.inria.fr. Contact: Frederic.Cazals@inria.fr. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com

  20. Current developments in soil organic matter modeling and the expansion of model applications: a review

    NASA Astrophysics Data System (ADS)

    Campbell, Eleanor E.; Paustian, Keith

    2015-12-01

    Soil organic matter (SOM) is an important natural resource. It is fundamental to soil and ecosystem functions across a wide range of scales, from site-specific soil fertility and water holding capacity to global biogeochemical cycling. It is also a highly complex material that is sensitive to direct and indirect human impacts. In SOM research, simulation models play an important role by providing a mathematical framework to integrate, examine, and test the understanding of SOM dynamics. Simulation models of SOM are also increasingly used in more ‘applied’ settings to evaluate human impacts on ecosystem function, and to manage SOM for greenhouse gas mitigation, improved soil health, and sustainable use as a natural resource. Within this context, there is a need to maintain a robust connection between scientific developments in SOM modeling approaches and SOM model applications. This need forms the basis of this review. In this review we first provide an overview of SOM modeling, focusing on SOM theory, data-model integration, and model development as evidenced by a quantitative review of SOM literature. Second, we present the landscape of SOM model applications, focusing on examples in climate change policy. We conclude by discussing five areas of recent developments in SOM modeling including: (1) microbial roles in SOM stabilization; (2) modeling SOM saturation kinetics; (3) temperature controls on decomposition; (4) SOM dynamics in deep soil layers; and (5) SOM representation in earth system models. Our aim is to comprehensively connect SOM model development to its applications, revealing knowledge gaps in need of focused interdisciplinary attention and exposing pitfalls that, if avoided, can lead to best use of SOM models to support policy initiatives and sustainable land management solutions.
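    A minimal multi-pool sketch in the spirit of the first-order SOM models the review surveys: two pools decaying at different rates, with a Q10 temperature modifier on decomposition, touching developments (3) above. Rates, Q10, and pool sizes are illustrative, not taken from any particular model.

```python
# Two-pool first-order SOM decay with a Q10 temperature modifier.
# All parameter values are illustrative assumptions.

def q10_modifier(temp_c, q10=2.0, ref_c=15.0):
    """Relative decomposition rate at temp_c versus the reference temperature."""
    return q10 ** ((temp_c - ref_c) / 10.0)

def step_pools(fast, slow, temp_c, dt=1.0, k_fast=0.05, k_slow=0.001):
    """Advance fast/slow SOM pools (e.g. kg C/m^2) by one time step (years)."""
    f = q10_modifier(temp_c)
    return fast * (1.0 - k_fast * f * dt), slow * (1.0 - k_slow * f * dt)

fast, slow = 1.0, 5.0
for _ in range(10):                  # a decade at 25 °C (modifier = 2)
    fast, slow = step_pools(fast, slow, 25.0)
```

    Developments (1) and (2) in the review replace exactly this structure: microbially explicit models make the rate depend on a biomass pool, and saturation kinetics make it depend on the pool size itself.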

  1. A review of model applications for structured soils: b) Pesticide transport.

    PubMed

    Köhne, John Maximilian; Köhne, Sigrid; Simůnek, Jirka

    2009-02-16

    The past decade has seen considerable progress in the development of models simulating pesticide transport in structured soils subject to preferential flow (PF). Most PF pesticide transport models are based on the two-region concept and usually assume one (vertical) dimensional flow and transport. Stochastic parameter sets are sometimes used to account for the effects of spatial variability at the field scale. In the past decade, PF pesticide models were also coupled with Geographical Information Systems (GIS) and groundwater flow models for application at the catchment and larger regional scales. A review of PF pesticide model applications reveals that the principal difficulty of their application is still the appropriate parameterization of PF and pesticide processes. Experimental solution strategies involve improving measurement techniques and experimental designs. Model strategies aim at enhancing process descriptions, studying parameter sensitivity, uncertainty, inverse parameter identification, model calibration, and effects of spatial variability, as well as generating model emulators and databases. Model comparison studies demonstrated that, after calibration, PF pesticide models clearly outperform chromatographic models for structured soils. Considering nonlinear and kinetic sorption reactions further enhanced the pesticide transport description. However, inverse techniques combined with typically available experimental data are often limited in their ability to simultaneously identify parameters for describing PF, sorption, degradation and other processes. On the other hand, the predictive capacity of uncalibrated PF pesticide models currently allows at best an approximate (order-of-magnitude) estimation of concentrations. Moreover, models should target the entire soil-plant-atmosphere system, including often neglected above-ground processes such as pesticide volatilization, interception, sorption to plant residues, root uptake, and losses by runoff. The conclusions compile progress, problems, and future research choices for modelling pesticide displacement in structured soils.
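    The two-region concept underlying most of these models can be reduced to its essential exchange term: solute in the mobile region equilibrates with the immobile region by first-order mass transfer. The sketch below shows only that exchange step (no advection or sorption), and all parameters are invented for illustration.

```python
# First-order mass transfer between mobile and immobile regions,
# the core of the two-region (dual-porosity) concept. Equal water
# contents are assumed; alpha and dt are illustrative values.

def exchange_step(c_mobile, c_immobile, alpha=0.1, dt=0.1):
    """Exchange solute between regions; mass is conserved by construction."""
    flux = alpha * (c_mobile - c_immobile) * dt
    return c_mobile - flux, c_immobile + flux

cm, cim = 1.0, 0.0
for _ in range(200):                 # relax toward equal concentrations
    cm, cim = exchange_step(cm, cim)
```

    In a full PF model this exchange is coupled to fast advection in the mobile region, which is what lets pesticide bypass the soil matrix.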

  2. Current developments in soil organic matter modeling and the expansion of model applications. A review

    DOE PAGES

    Campbell, Eleanor E.; Paustian, Keith

    2015-12-23

    Soil organic matter (SOM) is an important natural resource. It is fundamental to soil and ecosystem functions across a wide range of scales, from site-specific soil fertility and water holding capacity to global biogeochemical cycling. It is also a highly complex material that is sensitive to direct and indirect human impacts. In SOM research, simulation models play an important role by providing a mathematical framework to integrate, examine, and test the understanding of SOM dynamics. Simulation models of SOM are also increasingly used in more ‘applied’ settings to evaluate human impacts on ecosystem function, and to manage SOM for greenhouse gas mitigation, improved soil health, and sustainable use as a natural resource. Within this context, there is a need to maintain a robust connection between scientific developments in SOM modeling approaches and SOM model applications. This need forms the basis of this review. In this review we first provide an overview of SOM modeling, focusing on SOM theory, data-model integration, and model development as evidenced by a quantitative review of SOM literature. Second, we present the landscape of SOM model applications, focusing on examples in climate change policy. Finally, we conclude by discussing five areas of recent developments in SOM modeling including: (1) microbial roles in SOM stabilization; (2) modeling SOM saturation kinetics; (3) temperature controls on decomposition; (4) SOM dynamics in deep soil layers; and (5) SOM representation in earth system models. Our aim is to comprehensively connect SOM model development to its applications, revealing knowledge gaps in need of focused interdisciplinary attention and exposing pitfalls that, if avoided, can lead to best use of SOM models to support policy initiatives and sustainable land management solutions.

  3. Feed Safe: a multidisciplinary partnership approach results in a successful mobile application for breastfeeding mothers.

    PubMed

    White, Becky; White, James; Giglia, Roslyn; Tawia, Susan

    2016-05-30

    Issue addressed: Mobile applications are increasingly being used in health promotion initiatives. Although there is evidence that developing these mobile health applications in multidisciplinary teams is good practice, there is a gap in the literature with respect to evaluating the process of this partnership model and how best to disseminate the application into the community. The aim of this paper is twofold: to describe the partnership model in which the Feed Safe application was developed, and to investigate what worked in terms of dissemination. Methods: The process of working in partnership was measured using the VicHealth partnership analysis tool for health promotion. The dissemination strategy and reach of the application were measured using both automated analytics data and estimates of community-initiated promotion. Results: The combined average score from the partnership analysis tool was 138 out of a possible 175. A multipronged dissemination strategy led to good uptake of the application among Australian women. Conclusions: Multidisciplinary partnership models are important in the development of health promotion mobile applications. Recognising and utilising the skills of each partner organisation can help expand the reach of mobile health applications into the Australian population and aid good uptake of health promotion resources. So what?: Developing mobile applications in multidisciplinary partnerships is good practice and can lead to wide community uptake of the health promotion resource.

  4. Development and Application of Numerical Models for Reactive Flows

    DTIC Science & Technology

    1990-08-15

    Shear Layers: III. Effect of Convective Mach Number (Raafat H. Guirguis). This paper addresses some of the fundamental... We have made the... Development and Application of Numerical Models for Reactive Flows, Berkeley Research Associates... efforts at the Laboratory for Computational Physics (LCP) have focused on developing mathematical and computational models which accurately and efficiently describe the

  5. Discussion on accuracy degree evaluation of accident velocity reconstruction model

    NASA Astrophysics Data System (ADS)

    Zou, Tiefang; Dai, Yingbiao; Cai, Ming; Liu, Jike

    In order to investigate the applicability of accident velocity reconstruction models in different cases, a method for evaluating the accuracy degree of an accident velocity reconstruction model is given. Based on the theoretical and calculated pre-crash velocities, an accuracy-degree evaluation formula is obtained. Using a numerical simulation case, the accuracy degrees and applicability of two accident velocity reconstruction models are analyzed; the results show that the method is feasible in practice.
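    One plausible form of such an accuracy-degree measure, stated here as an assumption rather than the paper's exact definition, is the relative agreement between the model-reconstructed and theoretical pre-crash velocities:

```python
# Hypothetical accuracy-degree measure: 1 minus the relative error of the
# reconstructed velocity against the theoretical value, floored at zero.

def accuracy_degree(v_calc, v_theory):
    """1.0 means perfect agreement; 0.0 means 100% relative error or worse."""
    return max(1.0 - abs(v_calc - v_theory) / abs(v_theory), 0.0)

acc = accuracy_degree(13.1, 12.5)   # 4.8% relative error, so about 0.952
```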

  6. Interpretation of commonly used statistical regression models.

    PubMed

    Kasza, Jessica; Wolfe, Rory

    2014-01-01

    A review of some regression models commonly used in respiratory health applications is provided in this article. Simple linear regression, multiple linear regression, logistic regression and ordinal logistic regression are considered. The focus of this article is on the interpretation of the regression coefficients of each model, which are illustrated through the application of these models to a respiratory health research study. © 2013 The Authors. Respirology © 2013 Asian Pacific Society of Respirology.
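    The central interpretation point for the logistic model can be made concrete: a one-unit increase in a predictor multiplies the odds of the outcome by exp(coefficient). A sketch with invented coefficients:

```python
# Logistic regression coefficient interpretation: exp(b1) is the odds ratio
# per unit increase in the predictor. Coefficients are invented.
import math

def logistic(eta):
    """Inverse logit: linear predictor to probability."""
    return 1.0 / (1.0 + math.exp(-eta))

def odds(p):
    return p / (1.0 - p)

b0, b1 = -2.0, 0.7                  # invented intercept and slope
p_x = logistic(b0 + b1 * 3.0)       # probability at x = 3
p_x1 = logistic(b0 + b1 * 4.0)      # probability at x = 4
odds_ratio = odds(p_x1) / odds(p_x) # equals exp(b1) regardless of x
```

    Note that the ratio of probabilities p_x1 / p_x does depend on x; only the odds ratio is constant, which is the distinction these reviews emphasize.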

  7. A depictive neural model for the representation of motion verbs.

    PubMed

    Rao, Sunil; Aleksander, Igor

    2011-11-01

    In this paper, we present a depictive neural model for the representation of motion verb semantics in neural models of visual awareness. The problem of modelling motion verb representation is shown to be one of function application, mapping a set of given input variables defining the moving object and the path of motion to a defined output outcome in the motion recognition context. The particular function-applicative implementation and consequent recognition model design presented are seen as arising from a noun-adjective recognition model enabling the recognition of colour adjectives as applied to a set of shapes representing objects to be recognised. The presence of such a function application scheme and a separately implemented position identification and path labelling scheme are accordingly shown to be the primitives required to enable the design and construction of a composite depictive motion verb recognition scheme. Extensions to the presented design to enable the representation of transitive verbs are also discussed.

  8. Hybrid model for simulation of plasma jet injection in tokamak

    NASA Astrophysics Data System (ADS)

    Galkin, Sergei A.; Bogatu, I. N.

    2016-10-01

    The hybrid kinetic model of plasma treats the ions as kinetic particles and the electrons as a charge-neutralizing massless fluid. The model is most applicable when most of the energy is concentrated in the ions rather than in the electrons, i.e., it is well suited for the high-density hyper-velocity C60 plasma jet. The hybrid model separates the slower ion time scale from the faster electron time scale, which can then be neglected. That is why hybrid codes consistently outperform traditional PIC codes in computational efficiency while still resolving kinetic ion effects. We discuss a 2D hybrid model and code with an exactly energy-conserving numerical algorithm and present some results of its application to simulation of C60 plasma jet penetration through a tokamak-like magnetic barrier. We also examine the 3D model/code extension and its possible applications to tokamak and ionospheric plasmas. The work is supported in part by US DOE Grant DE-SC0015776.

  9. Systems modeling and simulation applications for critical care medicine

    PubMed Central

    2012-01-01

    Critical care delivery is a complex, expensive, error-prone medical specialty and remains the focal point of major improvement efforts in healthcare delivery. Various modeling and simulation techniques offer unique opportunities to better understand the interactions between clinical physiology and care delivery. The novel insights gained from the systems perspective can then be used to develop and test new treatment strategies and make critical care delivery more efficient and effective. However, modeling and simulation applications in critical care remain underutilized. This article provides an overview of major computer-based simulation techniques as applied to critical care medicine. We provide three application examples of different simulation techniques: a) a pathophysiological model of acute lung injury, b) a process model of critical care delivery, and c) an agent-based model to study the interaction between pathophysiology and healthcare delivery. Finally, we identify certain challenges to, and opportunities for, future research in the area. PMID:22703718

  10. Statistical Analysis of Q-matrix Based Diagnostic Classification Models

    PubMed Central

    Chen, Yunxiao; Liu, Jingchen; Xu, Gongjun; Ying, Zhiliang

    2014-01-01

    Diagnostic classification models have recently gained prominence in educational assessment, psychiatric evaluation, and many other disciplines. Central to the model specification is the so-called Q-matrix that provides a qualitative specification of the item-attribute relationship. In this paper, we develop theories on the identifiability for the Q-matrix under the DINA and the DINO models. We further propose an estimation procedure for the Q-matrix through the regularized maximum likelihood. The applicability of this procedure is not limited to the DINA or the DINO model and it can be applied to essentially all Q-matrix based diagnostic classification models. Simulation studies are conducted to illustrate its performance. Furthermore, two case studies are presented. The first case is a data set on fraction subtraction (educational application) and the second case is a subsample of the National Epidemiological Survey on Alcohol and Related Conditions concerning the social anxiety disorder (psychiatric application). PMID:26294801
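Under the DINA model specifically, the Q-matrix enters through a conjunctive "ideal response": an examinee answers correctly with high probability only when every attribute the Q-matrix row requires is mastered. A small sketch of that item response function (the slip and guess values are illustrative):

```python
def dina_prob(alpha, q, slip, guess):
    """DINA item response probability: eta = 1 iff the attribute
    vector alpha covers every attribute required by Q-matrix row q;
    P(correct) = 1 - slip if eta, else guess."""
    eta = all(a >= qk for a, qk in zip(alpha, q))
    return (1.0 - slip) if eta else guess

q_row = [1, 0, 1]  # item requires attributes 1 and 3
p_master = dina_prob([1, 1, 1], q_row, slip=0.1, guess=0.2)   # 0.9
p_partial = dina_prob([1, 1, 0], q_row, slip=0.1, guess=0.2)  # 0.2
```

The DINO model replaces the conjunctive `all` with a disjunctive `any`, which is one reason the identifiability conditions for the Q-matrix differ between the two models.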

  11. Predictive QSAR modeling workflow, model applicability domains, and virtual screening.

    PubMed

    Tropsha, Alexander; Golbraikh, Alexander

    2007-01-01

    Quantitative Structure Activity Relationship (QSAR) modeling has traditionally been applied as an evaluative approach, i.e., with the focus on developing retrospective and explanatory models of existing data. Model extrapolation was considered, if at all, only in a hypothetical sense, in terms of potential modifications of known biologically active chemicals that could improve compounds' activity. This critical review re-examines the strategy and the output of modern QSAR modeling approaches. We provide examples and arguments suggesting that current methodologies may afford robust and validated models capable of accurate prediction of compound properties for molecules not included in the training sets. We discuss a data-analytical modeling workflow developed in our laboratory that incorporates modules for combinatorial QSAR model development (i.e., using all possible binary combinations of available descriptor sets and statistical data modeling techniques), rigorous model validation, and virtual screening of available chemical databases to identify novel biologically active compounds. Our approach places particular emphasis on model validation as well as the need to define model applicability domains in the chemistry space. We present examples of studies where the application of rigorously validated QSAR models to virtual screening identified computational hits that were confirmed by subsequent experimental investigations. The emerging focus of QSAR modeling on target property forecasting brings it forward as a predictive, as opposed to evaluative, modeling approach.
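One widely used way to define an applicability domain is a k-nearest-neighbor distance cutoff in descriptor space: a query compound is in-domain only if it sits about as close to the training set as training compounds sit to each other. The sketch below shows that generic scheme (it is an illustration of the idea, not necessarily this workflow's exact procedure; all data are toy values):

```python
import math

def euclid(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def knn_domain_check(train, query, k=2, z=0.5):
    """Generic kNN applicability-domain test: the query is in-domain if
    its mean distance to its k nearest training neighbours does not
    exceed mean + z * std of the analogous within-training distances."""
    def knn_mean(point, pool):
        d = sorted(euclid(point, p) for p in pool if p is not point)
        return sum(d[:k]) / k
    ref = [knn_mean(t, train) for t in train]
    mu = sum(ref) / len(ref)
    sd = math.sqrt(sum((r - mu) ** 2 for r in ref) / len(ref))
    return knn_mean(query, train) <= mu + z * sd

# Toy training set of 2-descriptor compounds on the unit square.
train = [[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
inside = knn_domain_check(train, [0.5, 0.5])   # near the training data
outside = knn_domain_check(train, [5.0, 5.0])  # far outside it
```

Predictions for out-of-domain compounds would be withheld rather than reported, which is what keeps virtual-screening hit lists trustworthy.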

  12. Nonparametric model validations for hidden Markov models with applications in financial econometrics

    PubMed Central

    Zhao, Zhibiao

    2011-01-01

    We address the nonparametric model validation problem for hidden Markov models with partially observable variables and hidden states. We achieve this goal by constructing a nonparametric simultaneous confidence envelope for transition density function of the observable variables and checking whether the parametric density estimate is contained within such an envelope. Our specification test procedure is motivated by a functional connection between the transition density of the observable variables and the Markov transition kernel of the hidden states. Our approach is applicable for continuous time diffusion models, stochastic volatility models, nonlinear time series models, and models with market microstructure noise. PMID:21750601

  13. Continuum Fatigue Damage Modeling for Use in Life Extending Control

    NASA Technical Reports Server (NTRS)

    Lorenzo, Carl F.

    1994-01-01

    This paper develops a simplified continuum (continuous with respect to time, stress, etc.) fatigue damage model for use in Life Extending Control (LEC) studies. The work is based on zero-mean-stress local strain cyclic damage modeling. New nonlinear explicit equation forms of cyclic damage in terms of stress amplitude are derived to facilitate the continuum modeling. Stress-based continuum models are derived. Extension to plastic strain-strain rate models is also presented. Application of these models to LEC problems is considered. Progress toward a nonzero-mean-stress continuum model is presented, and new nonlinear explicit equation forms in terms of stress amplitude are derived for this case as well.
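As a hedged illustration of the continuum idea (damage accumulating continuously in time rather than by cycle-by-cycle bookkeeping), damage can be integrated as a rate driven by a stress-life law. The Basquin-type constants below are invented for the sketch and are not the paper's equation forms:

```python
def cycles_to_failure(sa, sf=2000.0, b=-0.1):
    # Basquin-type stress-life law: sa = sf * N**b  =>  N = (sa / sf)**(1 / b)
    return (sa / sf) ** (1.0 / b)

def integrate_damage(stress_amplitude, freq_hz, dt, steps):
    """Continuum damage accumulation: dD/dt = freq / N_f(sa),
    with D running from 0 (pristine) toward 1 (failure)."""
    d = 0.0
    for _ in range(steps):
        d += freq_hz * dt / cycles_to_failure(stress_amplitude)
    return d

# 1000 stress units cycled at 10 Hz over ten one-second steps.
damage = integrate_damage(1000.0, 10.0, 1.0, 10)
```

A life-extending controller can evaluate such a damage rate online and trade a little performance against the predicted damage increment at each step.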

  14. Impact of Climate Change on Soil and Groundwater Chemistry Subject to Process Waste Land Application

    NASA Astrophysics Data System (ADS)

    McNab, W. W.

    2013-12-01

    Nonhazardous aqueous process waste streams from food and beverage industry operations are often discharged via managed land application in a manner designed to minimize impacts to underlying groundwater. Process waste streams are typically characterized by elevated concentrations of solutes such as ammonium, organic nitrogen, potassium, sodium, and organic acids. Land application involves the mixing of process waste streams with irrigation water which is subsequently applied to crops. The combination of evapotranspiration and crop salt uptake reduces the downward mass fluxes of percolation water and salts. By carefully managing application schedules in the context of annual climatological cycles, growing seasons, and process requirements, potential adverse environmental impacts to groundwater can be mitigated. However, climate change poses challenges to future process waste land application efforts because the key factors that determine loading rates - temperature, evapotranspiration, seasonal changes in the quality and quantity of applied water, and various crop factors - are all likely to deviate from current averages. To assess the potential impact of future climate change on the practice of land application, coupled process modeling entailing transient unsaturated fluid flow, evapotranspiration, crop salt uptake, and multispecies reactive chemical transport was used to predict changes in salt loading if current practices are maintained in a warmer, drier setting. As a first step, a coupled process model (Hydrus-1D, combined with PHREEQC) was calibrated to existing data sets which summarize land application loading rates, soil water chemistry, and crop salt uptake for land disposal of process wastes from a food industry facility in the northern San Joaquin Valley of California. 
Model results quantify, for example, the impacts of evapotranspiration on both fluid flow and soil water chemistry at shallow depths, with secondary effects including carbonate mineral precipitation and ion exchange. The calibrated model was then re-run assuming different evapotranspiration and crop growth regimes, and different seasonally-adjusted applied water compositions, to elucidate possible impacts to salt loading and reactive chemistry. The results of the predictive modeling indicate the extent to which salts could be redistributed within the soil column as a consequence of climate change. The degree to which these findings are applicable to process waste land application operations at other sites was explored by varying the soil unsaturated flow parameters as a model sensitivity assessment. Taken together, the model results help to quantify operational changes to land application that may be necessary to avoid future adverse environmental impacts to soil and groundwater.

  15. Puget Sound Applications of the VELMA Ecohydrological Model

    EPA Science Inventory

    This seminar will present an overview of EPA’s Visualizing Ecosystem Land Management Assessments (VELMA) model and its applications in the Puget Sound Basin. Topics will include a description of how VELMA simulates the interaction of hydrological and biogeochemical processe...

  16. APPLICATION OF STABLE ISOTOPE TECHNIQUES TO AIR POLLUTION RESEARCH

    EPA Science Inventory

    Stable isotope techniques provide a robust, yet under-utilized tool for examining pollutant effects on plant growth and ecosystem function. Here, we survey a range of mixing model, physiological and system level applications for documenting pollutant effects. Mixing model examp...

  17. Survey of Fire Modeling Efforts with Application to Transportation Vehicles

    DOT National Transportation Integrated Search

    1981-07-01

    This report presents the results of a survey of analytical fire models with applications pertinent to fires in the compartments of transportation vehicles; a brief discussion of the background of fire phenomena and an overview of various fire modelin...

  18. 40 CFR 600.501-86 - General applicability.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Year 1978 Passenger Automobiles and for 1979 and Later Model Year Automobiles (Light Trucks and Passenger Automobiles)-Procedures for Determining Manufacturer's Average Fuel Economy and Manufacturer's... subpart are applicable to 1986 and later model year gasoline-fueled and diesel automobiles. (b)(1...

  19. AERIS - applications for the environment : real-time information synthesis : eco-signal operations modeling report.

    DOT National Transportation Integrated Search

    2014-12-01

    This report constitutes the detailed modeling and evaluation results of the Eco-Signal Operations Operational Scenario defined by the AERIS program. The Operational Scenario constitutes four applications that are designed to provide environmental ben...

  20. Administrator Training and Development: Conceptual Model.

    ERIC Educational Resources Information Center

    Boardman, Gerald R.

    A conceptual model for an individualized training program for school administrators integrates processes, characteristics, and tasks through theory training and application. Based on an application of contingency theory, it provides a system matching up administrative candidates' needs in three areas (administrative process, administrative…

  1. AERIS - applications for the environment : real-time information synthesis : eco-lanes operational scenario modeling report.

    DOT National Transportation Integrated Search

    2014-12-01

    This report constitutes the detailed modeling and evaluation results of the Eco-Lanes Operational Scenario defined by the Applications for the Environment: Real-Time Information Synthesis (AERIS) Program. The Operational Scenario constitutes six appl...

  2. A Web Application For Visualizing Empirical Models of the Space-Atmosphere Interface Region: AtModWeb

    NASA Astrophysics Data System (ADS)

    Knipp, D.; Kilcommons, L. M.; Damas, M. C.

    2015-12-01

    We have created a simple and user-friendly web application to visualize output from empirical atmospheric models that describe the lower atmosphere and the Space-Atmosphere Interface Region (SAIR). The Atmospheric Model Web Explorer (AtModWeb) is a lightweight, multi-user, Python-driven application which uses standard web technology (jQuery, HTML5, CSS3) to give an in-browser interface that can produce plots of modeled quantities such as temperature and the individual-species and total densities of the neutral and ionized upper atmosphere. Output may be displayed as: 1) a contour plot over a map projection, 2) a pseudo-color plot (heatmap) which allows visualization of a variable as a function of two spatial coordinates, or 3) a simple line plot of one spatial coordinate versus any number of desired model output variables. The application is designed around an abstraction of an empirical atmospheric model, essentially treating the model code as a black box, which makes it simple to add additional models without modifying the main body of the application. Currently implemented are the Naval Research Laboratory NRLMSISE00 model for the neutral atmosphere and the International Reference Ionosphere (IRI). These models are relevant to the Low Earth Orbit environment and the SAIR. The interface is simple and usable, allowing users (students and experts) to specify time and location, and choose between historical (i.e. the values for the given date) or manual specification of whichever solar or geomagnetic activity drivers are required by the model. We present a number of use-case examples from research and education: 1) How does atmospheric density between the surface and 1000 km vary with time of day, season and solar cycle? 2) How do ionospheric layers change with the solar cycle? 3) How does the composition of the SAIR vary between day and night at a fixed altitude?
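The black-box abstraction described above can be sketched in a few lines of Python; class and method names here are hypothetical stand-ins, not AtModWeb's actual code:

```python
class AtmosphericModel:
    """Black-box abstraction of an empirical model: each concrete model
    maps (time, location, drivers) to a dict of output variables, so the
    plotting layer never needs model-specific logic."""
    name = "base"
    outputs = ()

    def run(self, time, lat, lon, alt_km, drivers):
        raise NotImplementedError

class IsothermalModel(AtmosphericModel):
    # Trivial stand-in "model" so the interface can be exercised
    # without the real NRLMSISE00 or IRI code.
    name = "isothermal"
    outputs = ("temperature_K",)

    def run(self, time, lat, lon, alt_km, drivers):
        return {"temperature_K": 288.0}

def evaluate(model, time, lat, lon, alt_km, drivers=None):
    # The application only touches this uniform interface, so adding a
    # model means adding one subclass, not editing the main body.
    return model.run(time, lat, lon, alt_km, drivers or {})

result = evaluate(IsothermalModel(), "2015-12-01T00:00", 40.0, -105.0, 300.0)
```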

  3. Quantitative thermodynamic prediction of interactions between nucleic acid and non-nucleic acid species using Microsoft excel.

    PubMed

    Zou, Jiaqi; Li, Na

    2013-09-01

    Proper design of nucleic acid sequences is crucial for many applications. We have previously established a thermodynamics-based quantitative model to help design aptamer-based nucleic acid probes by predicting equilibrium concentrations of all interacting species. To facilitate customization of this thermodynamic model for different applications, here we present a generic and easy-to-use platform to implement the algorithm of the model with Microsoft(®) Excel formulas and VBA (Visual Basic for Applications) macros. Two Excel spreadsheets have been developed: one for the applications involving only nucleic acid species, the other for the applications involving both nucleic acid and non-nucleic acid species. The spreadsheets take the nucleic acid sequences and the initial concentrations of all species as input, guide the user to retrieve the necessary thermodynamic constants, and finally calculate equilibrium concentrations for all species in various bound and unbound conformations. The validity of both spreadsheets has been verified by comparing the modeling results with the experimental results on nucleic acid sequences reported in the literature. This Excel-based platform described here will allow biomedical researchers to rationalize the sequence design of nucleic acid probes using the thermodynamics-based modeling even without relevant theoretical and computational skills. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
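The simplest instance of such an equilibrium calculation, a single probe-target pair A + B ⇌ AB with dissociation constant Kd, has a closed-form solution; the spreadsheets generalize this to many coupled nucleic acid and non-nucleic acid species. A Python rendering of the two-species case (concentration values are illustrative):

```python
import math

def bound_concentration(a0, b0, kd):
    """Equilibrium concentration of the complex AB for the reaction
    A + B <=> AB with dissociation constant kd (all quantities in the
    same concentration units), from the usual quadratic mass balance."""
    s = a0 + b0 + kd
    return (s - math.sqrt(s * s - 4.0 * a0 * b0)) / 2.0

# 1 uM probe + 1 uM target with Kd = 1 uM.
ab = bound_concentration(1.0, 1.0, 1.0)
free_a = 1.0 - ab  # unbound probe remaining at equilibrium
```

A quick mass-action check, free_a * free_b / ab ≈ Kd, is exactly the kind of self-consistency the spreadsheet formulas enforce across all species simultaneously.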

  4. A topological framework for interactive queries on 3D models in the Web.

    PubMed

    Figueiredo, Mauro; Rodrigues, José I; Silvestre, Ivo; Veiga-Pires, Cristina

    2014-01-01

    Several technologies exist to create 3D content for the web. With X3D, WebGL, and X3DOM, it is possible to visualize and interact with 3D models in a web browser. Frequently, three-dimensional objects are stored using the X3D file format for the web. However, there is no explicit topological information, which makes it difficult to design fast algorithms for applications that require adjacency and incidence data. This paper presents a new open source toolkit TopTri (Topological model for Triangle meshes) for Web3D servers that builds the topological model for triangular meshes of manifold or nonmanifold models. Web3D client applications using this toolkit make queries to the web server to get adjacent and incidence information of vertices, edges, and faces. This paper shows the application of the topological information to get minimal local points and iso-lines in a 3D mesh in a web browser. As an application, we present also the interactive identification of stalactites in a cave chamber in a 3D web browser. Several tests show that even for large triangular meshes with millions of triangles, the adjacency and incidence information is returned in real time making the presented toolkit appropriate for interactive Web3D applications.
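The missing topological information can be recovered from the face list alone in one pass over the triangles; the sketch below shows the general technique for vertex-face incidence and edge-based face adjacency (it illustrates the idea, not TopTri's actual data layout):

```python
from collections import defaultdict

def build_topology(triangles):
    """Build explicit incidence/adjacency tables for a triangle mesh
    given only its face list (the information an X3D file omits)."""
    vertex_faces = defaultdict(set)  # vertex id -> incident face ids
    edge_faces = defaultdict(set)    # undirected edge -> incident face ids
    for f, (a, b, c) in enumerate(triangles):
        for v in (a, b, c):
            vertex_faces[v].add(f)
        for e in ((a, b), (b, c), (c, a)):
            edge_faces[tuple(sorted(e))].add(f)
    return vertex_faces, edge_faces

def adjacent_faces(edge_faces, face):
    # Faces sharing an edge with `face` (a nonmanifold edge may contribute
    # more than one neighbour, which this handles naturally).
    return {g for fs in edge_faces.values() if face in fs for g in fs} - {face}

# Two triangles sharing the edge (1, 2).
tris = [(0, 1, 2), (1, 3, 2)]
vf, ef = build_topology(tris)
```

With these tables precomputed on the server, an adjacency query is a dictionary lookup rather than a scan of the whole mesh, which is what makes real-time responses possible for meshes with millions of triangles.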

  6. Applying the scientific method to small catchment studies: A review of the Panola Mountain experience

    USGS Publications Warehouse

    Hooper, R.P.

    2001-01-01

    A hallmark of the scientific method is its iterative application to a problem to increase and refine the understanding of the underlying processes controlling it. A successful iterative application of the scientific method to catchment science (including the fields of hillslope hydrology and biogeochemistry) has been hindered by two factors. First, the scale at which controlled experiments can be performed is much smaller than the scale of the phenomenon of interest. Second, computer simulation models generally have not been used as hypothesis-testing tools as rigorously as they might have been. Model evaluation often has gone only so far as evaluation of goodness of fit, rather than a full structural analysis, which is more useful when treating the model as a hypothesis. An iterative application of a simple mixing model to the Panola Mountain Research Watershed is reviewed to illustrate the increase in understanding gained by this approach and to discern general principles that may be applicable to other studies. The lessons learned include the need for an explicitly stated conceptual model of the catchment, the definition of objective measures of its applicability, and a clear linkage between the scale of observations and the scale of predictions. Published in 2001 by John Wiley & Sons, Ltd.

  7. The effect of nitrogen loading on on-site system design: a model for determining land application area size.

    PubMed

    McCardell, A; Davison, L; Edwards, A

    2005-01-01

    Designers of on-site wastewater management systems have six opportunities to remove pollutants of concern from the aqueous waste stream before it reaches ground or surface waters. These opportunities occur at source, at point of collection (primary treatment), secondary treatment, tertiary treatment, land application and buffers. This paper presents a computer based model for the sizing of on-site system land application areas applicable to the Lismore area in Northern New South Wales, a region of high rainfall. Inputs to the model include daily climatic data, soil type, number of people loading the system and size of housing allotment. Constraints include allowable phosphorus export, nitrogen export and hydraulic percolation. In the Lismore area nitrogen is the nutrient of most concern. In areas close to environmentally sensitive waterways, and in dense developments, the allowable annual nitrogen export becomes the main factor determining the land application area size. The model offers system designers the opportunity to test various combinations of nitrogen attenuation strategies (source control, secondary treatment) in order to create a solution which offers an acceptable nitrogen export rate while meeting the client's household and financial needs. The model runs on an Excel spreadsheet and has been developed by Lismore City Council.
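When nitrogen is the binding constraint, the core sizing logic reduces to a mass balance: the area must be large enough to assimilate the post-treatment nitrogen load through crop uptake plus the allowable per-hectare export. A hypothetical Python reduction of that balance (the Lismore model's real water, phosphorus and climate accounting is far more detailed, and every constant here is invented for illustration):

```python
def land_application_area(persons, n_per_person_g_day, treatment_removal,
                          crop_uptake_kg_ha_yr, allowable_export_kg_ha_yr):
    """Nitrogen-limited sizing sketch: return the land application area
    (hectares) at which applied N, net of crop uptake, just meets the
    allowable per-hectare export."""
    n_load = persons * n_per_person_g_day * 365.0 / 1000.0  # kg N / yr
    # Each hectare assimilates crop uptake plus the allowable export.
    return n_load * (1.0 - treatment_removal) / (
        crop_uptake_kg_ha_yr + allowable_export_kg_ha_yr)

# 4 people, 12 g N/person/day, 30% removal in secondary treatment,
# 150 kg/ha/yr crop uptake, 25 kg/ha/yr allowable export.
area_ha = land_application_area(4, 12.0, 0.30, 150.0, 25.0)
```

Testing a source-control or secondary-treatment strategy then amounts to re-running the function with a higher `treatment_removal`, which is the design exploration the model enables.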

  8. Application of an IRT Polytomous Model for Measuring Health Related Quality of Life

    ERIC Educational Resources Information Center

    Tejada, Antonio J. Rojas; Rojas, Oscar M. Lozano

    2005-01-01

    Background: The Item Response Theory (IRT) has advantages for measuring Health Related Quality of Life (HRQOL) as opposed to the Classical Tests Theory (CTT). Objectives: To present the results of the application of a polytomous model based on IRT, specifically, the Rating Scale Model (RSM), to measure HRQOL with the EORTC QLQ-C30. Methods: 103…

  9. A synthesis of literature on evaluation of models for policy applications, with implications for forest carbon accounting

    Treesearch

    Stephen P. Prisley; Michael J. Mortimer

    2004-01-01

    Forest modeling has moved beyond the realm of scientific discovery into the policy arena. The example that motivates this review is the application of models for forest carbon accounting. As negotiations determine the terms under which forest carbon will be accounted, reported, and potentially traded, guidelines and standards are being developed to ensure consistency,...

  10. Industry Strength Tool and Technology for Automated Synthesis of Safety-Critical Applications from Formal Specifications

    DTIC Science & Technology

    2015-11-01

    2.3.4 Input/Output Automata ... various other modeling frameworks such as I/O Automata, Kahn Process Networks, Petri nets, multi-dimensional SDF, etc. are also used for designing... Table fragment: SDF (graphical, formal) is ideally suited to modeling DSP applications; Petri nets (graphical, formal) are used for modeling distributed systems; I/O Automata (both, formal)...

  11. First-order fire effects models for land Management: Overview and issues

    Treesearch

    Elizabeth D. Reinhardt; Matthew B. Dickinson

    2010-01-01

    We give an overview of the science application process at work in supporting fire management. First-order fire effects models, such as those discussed in accompanying papers, are the building blocks of software systems designed for application to landscapes over time scales from days to centuries. Fire effects may be modeled using empirical, rule based, or process...

  12. An Application of the PMI Model at the Project Level Evaluation of ESEA Title IV-C Projects.

    ERIC Educational Resources Information Center

    McBeath, Marcia

    All of the papers presented as part of a symposium concerned the application of the Planning, Monitoring, and Implementation Model (PMI) to the evaluation of the District of Columbia Public Schools' programs supported by the Elementary Secondary Education Act (ESEA) Title IV-C. PMI was developed to provide a model for systematic evaluation of…

  13. Applications of Multidimensional Item Response Theory Models with Covariates to Longitudinal Test Data. Research Report. ETS RR-16-21

    ERIC Educational Resources Information Center

    Fu, Jianbin

    2016-01-01

    The multidimensional item response theory (MIRT) models with covariates proposed by Haberman and implemented in the "mirt" program provide a flexible way to analyze data based on item response theory. In this report, we discuss applications of the MIRT models with covariates to longitudinal test data to measure skill differences at the…

  14. Latent Trait Theory in the Affective Domain--Applications of the Rasch Model.

    ERIC Educational Resources Information Center

    Curry, Allen R.; Riegel, N. Blyth

    The Rasch model of test theory is described in general terms, compared with latent trait theory, and shown to have interesting applications for the measurement of affective as well as cognitive traits. Three assumptions of the Rasch model are stated to support the conclusion that calibration of the items and tests is independent of the examinee…

  15. Application of tissue time course data to elucidate mechanistic details of carbon tetrachloride (CC14) transport using an updated physiologically based pharmacokinetic (PBPK) model in rats

    EPA Science Inventory

    CCl4 is a common environmental contaminant in water and superfund sites, and a model liver toxicant. One application of PBPK models used in risk assessment is simulation of internal dose for the metric involved with toxicity, particularly for different routes of exposure. Time-co...

  16. DEVELOPMENT AND REVIEW OF MONITORING METHODS AND RISK ASSESSMENT MODELS USED TO DETERMINE THE EFFECTS OF BIOSOLIDS LAND APPLICATION ON HUMAN HEALTH AND THE ENVIRONMENT

    EPA Science Inventory

    Development and Review of monitoring methods and risk assessment models for biosolids land application impacts on air and land

    Ronald F Herrmann (NRMRL), Mike Broder (NCEA), and Mike Ware (NERL)

    Science Questions .

    MYP Science Question: What additional model...

  17. The Sheperd equation and chaos identification.

    PubMed

    Gregson, Robert A M

    2010-04-01

    An equation created by Sheperd (1982) to model stability in exploited fish populations has been found to have a wider application, and it exhibits complicated internal dynamics, including phases of strict periodicity and of chaos. It may be potentially applicable to other psychophysiological contexts. The problems of determining goodness-of-fit, and the comparative performance of alternative models including the Sheperd model, are briefly addressed.
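One common form of the Shepherd (1982) stock-recruitment function (the standard spelling is "Shepherd"; the record does not reproduce the equation itself) is R = aS / (1 + (S/K)^b), and iterating it as a map is what produces the periodic and chaotic phases mentioned above. A Python sketch with illustrative parameters:

```python
def shepherd_map(s, a=10.0, k=1.0, b=6.0):
    """One-step update under the Shepherd-type stock-recruitment form
    R = a*S / (1 + (S/K)**b); steep density dependence (large b) lets
    the iterated map cycle or become chaotic."""
    return a * s / (1.0 + (s / k) ** b)

def trajectory(s0, n, **params):
    out = [s0]
    for _ in range(n):
        out.append(shepherd_map(out[-1], **params))
    return out

# Gentle density dependence (b = 1) settles to the stable fixed point
# S* = K * (a - 1), i.e. 9.0 here; larger b produces cycles and chaos.
traj = trajectory(0.5, 200, a=10.0, k=1.0, b=1.0)
```

Distinguishing such deterministic chaos from noise in fitted data is exactly the goodness-of-fit problem the article raises.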

  18. Analysis of a Thin Optical Lens Model

    ERIC Educational Resources Information Center

    Ivchenko, Vladimir V.

    2011-01-01

    In this article a thin optical lens model is considered. It is shown that the limits of its applicability are determined not only by the ratio between the thickness of the lens and the modules of the radii of curvature, but above all its geometric type. We have derived the analytical criteria for the applicability of the model for different types…

  19. Development of a GIS interface for WEPP Model application to Great Lakes forested watersheds

    Treesearch

    J. R. Frankenberger; S. Dun; D. C. Flanagan; J. Q. Wu; W. J. Elliot

    2011-01-01

    This presentation will highlight efforts on development of a new online WEPP GIS interface, targeted toward application in forested regions bordering the Great Lakes. The key components and algorithms of the online GIS system will be outlined. The general procedures used to provide input to the WEPP model and to display model output will be demonstrated.

  20. Software reliability: Application of a reliability model to requirements error analysis

    NASA Technical Reports Server (NTRS)

    Logan, J.

    1980-01-01

    The application of a software reliability model having a well defined correspondence of computer program properties to requirements error analysis is described. Requirements error categories which can be related to program structural elements are identified and their effect on program execution considered. The model is applied to a hypothetical B-5 requirement specification for a program module.
