Incorporation of Electrical Systems Models Into an Existing Thermodynamic Cycle Code
NASA Technical Reports Server (NTRS)
Freeh, Josh
2003-01-01
Integration of the entire system includes: fuel cells, motors, propulsors, thermal/power management, compressors, etc. Use of existing, pre-developed NPSS capabilities includes: 1) Optimization tools; 2) Gas turbine models for hybrid systems; 3) Increased interplay between subsystems; 4) Off-design modeling capabilities; 5) Altitude effects; and 6) Existing transient modeling architecture. Other factors include: 1) Easier transfer between users and groups of users; 2) General aerospace industry acceptance and familiarity; and 3) A flexible analysis tool that can also be used for ground power applications.
Existence of periodic solutions in a model of respiratory syncytial virus (RSV)
NASA Astrophysics Data System (ADS)
Arenas, Abraham J.; González, Gilberto; Jódar, Lucas
2008-08-01
In this paper we study the existence of positive periodic solutions for nested models of respiratory syncytial virus (RSV), using a continuation theorem based on coincidence degree theory. Conditions for the existence of periodic solutions in the model are given. Numerical simulations related to the transmission of respiratory syncytial virus in Madrid and Rio de Janeiro are included.
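The qualitative behaviour described above can be reproduced with a minimal seasonally forced SIRS simulation; the model structure and every parameter value below are illustrative stand-ins, not the nested models or the fitted Madrid/Rio de Janeiro parameters of the paper:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative SIRS model with seasonally forced transmission,
# beta(t) = b0 * (1 + b1*cos(2*pi*t)); all rates (per year) are invented.
def sirs(t, y, b0=60.0, b1=0.16, gamma=1.8, nu=36.0, mu=0.041):
    S, I, R = y
    beta = b0 * (1.0 + b1 * np.cos(2.0 * np.pi * t))
    dS = mu - mu * S - beta * S * I + gamma * R   # births, loss of immunity
    dI = beta * S * I - nu * I - mu * I           # infection, recovery
    dR = nu * I - gamma * R - mu * R              # recovered, waning immunity
    return [dS, dI, dR]

def simulate(years=20.0, y0=(0.95, 0.05, 0.0)):
    return solve_ivp(sirs, (0.0, years), y0, dense_output=True,
                     rtol=1e-8, atol=1e-10)

sol = simulate()
final = sol.sol(19.0)   # (S, I, R) fractions after transients decay
```

Because births balance deaths, S + I + R is conserved at 1, and the seasonal forcing drives the infective fraction onto a periodic annual cycle.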
NASA Astrophysics Data System (ADS)
Benjanirat, Sarun
Next-generation horizontal-axis wind turbines (HAWTs) will operate at very high wind speeds. Existing engineering approaches for modeling the flow phenomena are based on blade element theory and cannot adequately account for 3-D separated, unsteady flow effects. Therefore, researchers around the world are beginning to model these flows using first-principles-based computational fluid dynamics (CFD) approaches. In this study, an existing first-principles-based Navier-Stokes approach is enhanced to model HAWTs at high wind speeds. The enhancements include improved grid topology, implicit time-marching algorithms, and advanced turbulence models. The advanced turbulence models include the Spalart-Allmaras one-equation model and the k-epsilon, k-omega, and Shear Stress Transport (k-omega SST) models. These models are also integrated with detached eddy simulation (DES) models. Results are presented for a range of wind speeds for the National Renewable Energy Laboratory Phase VI rotor configuration, tested at NASA Ames Research Center. Grid sensitivity studies are also presented. Additionally, the effects of existing transition models on the predictions are assessed. Data presented include power/torque production, radial distributions of normal and tangential pressure forces, root bending moments, and surface pressure fields. Good agreement was obtained between the predictions and experiments for most of the conditions, particularly with the Spalart-Allmaras DES model.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-04
... resulted in modifications to the airplane Airworthiness Limitation Items (ALI), to include new inspection tasks or modification of existing ones and their respective thresholds ...
Theoretical models of parental HIV disclosure: a critical review.
Qiao, Shan; Li, Xiaoming; Stanton, Bonita
2013-01-01
This study critically examined three major theoretical models related to parental HIV disclosure (i.e., the Four-Phase Model [FPM], the Disclosure Decision Making Model [DDMM], and the Disclosure Process Model [DPM]), and the existing studies that could provide empirical support to these models or their components. For each model, we briefly reviewed its theoretical background, described its components and/or mechanisms, and discussed its strengths and limitations. The existing empirical studies supported most theoretical components in these models. However, hypotheses related to the mechanisms proposed in the models have not yet been tested due to a lack of empirical evidence. This study also synthesized alternative theoretical perspectives and new issues in disclosure research and clinical practice that may challenge the existing models. The current study underscores the importance of including components related to social and cultural contexts in theoretical frameworks, and calls for more adequately designed empirical studies in order to test and refine existing theories and to develop new ones.
Modeling of ETL-Processes and Processed Information in Clinical Data Warehousing.
Tute, Erik; Steiner, Jochen
2018-01-01
Literature describes a big potential for the reuse of clinical patient data, and a clinical data warehouse (CDWH) is a means to that end. The aim of this work was to support the management and maintenance of the processes extracting, transforming and loading (ETL) data into CDWHs, and to ease the reuse of metadata between regular IT-management, the CDWH and secondary data users, by providing a modeling approach. An expert survey and a literature review were conducted to identify requirements and existing modeling techniques. An ETL-modeling technique was developed by extending existing modeling techniques, and it was evaluated by exemplarily modeling an existing ETL process and by a second expert survey. Nine experts participated in the first survey. The literature review yielded 15 included publications, from which six existing modeling techniques were identified. A modeling technique extending 3LGM2 and combining it with openEHR information models was developed and evaluated; seven experts participated in the evaluation. The developed approach can help in the management and maintenance of ETL processes and could serve as an interface between regular IT-management, the CDWH and secondary data users.
Modeling the Environmental Impact of Air Traffic Operations
NASA Technical Reports Server (NTRS)
Chen, Neil
2011-01-01
There is increased interest to understand and mitigate the impacts of air traffic on the climate, since greenhouse gases, nitrogen oxides, and contrails generated by air traffic can have adverse impacts on the climate. The models described in this presentation are useful for quantifying these impacts and for studying alternative environmentally aware operational concepts. These models have been developed by leveraging and building upon existing simulation and optimization techniques developed for the design of efficient traffic flow management strategies. Specific enhancements to the existing simulation and optimization techniques include new models that simulate aircraft fuel flow, emissions and contrails. To ensure that these new models are beneficial to the larger climate research community, the outputs of these new models are compatible with existing global climate modeling tools like the FAA's Aviation Environmental Design Tool.
Innovating Method of Existing Mechanical Product Based on TRIZ Theory
NASA Astrophysics Data System (ADS)
Zhao, Cunyou; Shi, Dongyan; Wu, Han
The main modes of product development are adaptive design and variant design based on existing products. In this paper, a conceptual design framework and its flow model for innovating products are put forward by combining conceptual design methods with TRIZ theory. A process system model of innovative design is constructed, comprising requirement analysis, total function analysis and decomposition, engineering problem analysis, solution finding for the engineering problem, and preliminary design; this establishes the basis for the innovative redesign of existing products.
Cryogenic Wind Tunnel Models. Design and Fabrication
NASA Technical Reports Server (NTRS)
Young, C. P., Jr. (Compiler); Gloss, B. B. (Compiler)
1983-01-01
The principal motivating factor was the National Transonic Facility (NTF). Since the NTF can achieve significantly higher Reynolds numbers at transonic speeds than other wind tunnels in the world, and will therefore occupy a unique position among ground test facilities, every effort is being made to ensure that model design and fabrication technology exists to allow researchers to take advantage of this high Reynolds number capability. Since a great deal of experience in designing and fabricating cryogenic wind tunnel models does not exist, and since the experience that does exist is scattered over a number of organizations, there is a need to bring existing experience in these areas together and share it among all interested parties. Representatives from government, the airframe industry, and universities are included.
Numerical Simulation of Liquid Jet Atomization Including Turbulence Effects
NASA Technical Reports Server (NTRS)
Trinh, Huu P.; Chen, C. P.; Balasubramanyam, M. S.
2005-01-01
This paper describes the numerical implementation of a newly developed hybrid model, T-blob/T-TAB, into an existing computational fluid dynamics (CFD) program for primary and secondary breakup simulation of liquid jet atomization. This model extends two widely used models, the Kelvin-Helmholtz (KH) instability of Reitz (blob model) and the Taylor-Analogy-Breakup (TAB) secondary droplet breakup model of O'Rourke and Amsden, to include turbulence effects. In the primary breakup model, the level of the turbulence effect on the liquid breakup depends on the characteristic scales and the initial flow conditions. For the secondary breakup, an additional turbulence force acting on parent drops is modeled and integrated into the TAB governing equation. Several assessment studies are presented, and the results indicate that the existing KH and TAB models tend to under-predict the product drop size and spray angle, while the current model provides superior results when compared with the measured data.
SMP: A solid modeling program version 2.0
NASA Technical Reports Server (NTRS)
Randall, D. P.; Jones, K. H.; Vonofenheim, W. H.; Gates, R. L.; Matthews, C. G.
1986-01-01
The Solid Modeling Program (SMP) provides the capability to model complex solid objects through the composition of primitive geometric entities. In addition to the construction of solid models, SMP has extensive facilities for model editing, display, and analysis. The geometric model produced by the software system can be output in a format compatible with existing analysis programs such as PATRAN-G. The present version of the SMP software supports six primitives: boxes, cones, spheres, paraboloids, tori, and trusses. The details for creating each of the major primitive types are presented. The analysis capabilities of SMP, including interfaces to existing analysis programs, are discussed.
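The compose-primitives idea can be sketched with signed distance functions, one common representation for solid modeling: negative inside a solid, positive outside, with Boolean operations as min/max. This is an illustrative analogy, not SMP's actual internal representation, and all shapes below are invented:

```python
import numpy as np

# Signed distance functions (SDFs) for two primitives.
def sphere(center, radius):
    c = np.asarray(center, float)
    return lambda p: float(np.linalg.norm(np.asarray(p, float) - c)) - radius

def box(center, half):  # axis-aligned box with half-extents `half`
    c, h = np.asarray(center, float), np.asarray(half, float)
    def sdf(p):
        q = np.abs(np.asarray(p, float) - c) - h
        return float(np.linalg.norm(np.maximum(q, 0.0)) + min(q.max(), 0.0))
    return sdf

# Boolean composition of solids.
def union(f, g):     return lambda p: min(f(p), g(p))
def intersect(f, g): return lambda p: max(f(p), g(p))
def subtract(f, g):  return lambda p: max(f(p), -g(p))

# A box with a spherical bite taken out of one corner.
model = subtract(box((0, 0, 0), (1, 1, 1)), sphere((1, 1, 1), 0.5))
```

Evaluating `model(p)` classifies any point as inside (negative) or outside (positive) the composed solid, which is the essence of constructive solid modeling.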
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stevens, E.J.; McNeilly, G.S.
The existing National Center for Atmospheric Research (NCAR) code in the Hamburg Oceanic Carbon Cycle Circulation Model and the Hamburg Large-Scale Geostrophic Ocean General Circulation Model was modernized and reduced in size while still producing an equivalent end result. A reduction in the size of the existing code from more than 50,000 lines to approximately 7,500 lines in the new code has made the new code much easier to maintain. The existing code in the Hamburg model uses legacy NCAR graphics (including even emulated CALCOMP subroutines) to display graphical output. The new code uses only current (version 3.1) NCAR subroutines.
Propagation Effects of Importance to the NASA/JPL Deep Space Network (DSN)
NASA Technical Reports Server (NTRS)
Slobin, Steve
1999-01-01
This paper presents Propagation Effects of Importance To The NASA/JPL Deep Space Network (DSN). The topics include: 1) DSN Antennas; 2) Deep Space Telecom Link Basics; 3) DSN Propagation Region of Interest; 4) Ka-Band Weather Effects Models and Examples; 5) Existing Goldstone Ka-Band Atmosphere Attenuation Model; 6) Existing Goldstone Atmosphere Noise Temperature Model; and 7) Ka-Band delta (G/T) Relative to Vacuum Condition. This paper summarizes the topics above.
McLeod, Melissa; Blakely, Tony; Kvizhinadze, Giorgi; Harris, Ricci
2014-01-01
A critical first step toward incorporating equity into cost-effectiveness analyses is to appropriately model interventions by population subgroups. In this paper we use a standardized treatment intervention to examine the impact of using ethnic-specific (Māori and non-Māori) data in cost-utility analyses for three cancers. We estimate gains in health-adjusted life years (HALYs) for a simple intervention (20% reduction in excess cancer mortality) for lung, female breast, and colon cancers, using Markov modeling. Base models include ethnic-specific cancer incidence with other parameters either turned off or set to non-Māori levels for both groups. Subsequent models add ethnic-specific cancer survival, morbidity, and life expectancy. Costs include intervention and downstream health system costs. For the three cancers, including existing inequalities in background parameters (population mortality and comorbidities) for Māori attributes less value to a year of life saved compared to non-Māori and lowers the relative health gains for Māori. In contrast, ethnic inequalities in cancer parameters have less predictable effects. Despite Māori having higher excess mortality from all three cancers, modeled health gains for Māori were less from the lung cancer intervention than for non-Māori but higher for the breast and colon interventions. Cost-effectiveness modeling is a useful tool in the prioritization of health services. But there are important (and sometimes counterintuitive) implications of including ethnic-specific background and disease parameters. In order to avoid perpetuating existing ethnic inequalities in health, such analyses should be undertaken with care.
PMID: 24910540
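The interplay between background mortality and intervention gains described above can be illustrated with a deliberately tiny Markov cohort sketch; the states, rates, and utility weight are invented for illustration, not the paper's calibrated ethnic-specific parameters:

```python
import numpy as np

# Hypothetical Markov cohort model: an "alive" state subject to excess
# cancer mortality plus background (other-cause) mortality. Each cycle
# credits a utility-weighted year of life, summing to HALYs.
def halys(excess_cancer_mort, background_mort, utility=0.8, cycles=40):
    alive = 1.0
    total = 0.0
    for _ in range(cycles):
        p_die = 1.0 - np.exp(-(excess_cancer_mort + background_mort))
        total += alive * utility        # one cycle of quality-adjusted life
        alive *= (1.0 - p_die)
    return total

# A 20% cut in excess cancer mortality (0.10 -> 0.08) at two levels of
# background mortality: higher comorbidity burden shrinks the gain.
gain_low_bg = halys(0.08, 0.02) - halys(0.10, 0.02)
gain_high_bg = halys(0.08, 0.06) - halys(0.10, 0.06)
```

The second comparison reproduces the abstract's key point: with higher background mortality, fewer future life years remain to be saved, so the same relative intervention yields smaller modeled health gains.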
NASA Astrophysics Data System (ADS)
Suryanto, Agus; Darti, Isnani
2017-12-01
In this paper we discuss a fractional-order predator-prey model with ratio-dependent functional response. The dynamical properties of this model are analyzed. We determine all equilibrium points of the model, including their existence conditions and their stability properties. It is found that the model has two types of equilibria, namely the predator-free point and the co-existence point. If there is no co-existence equilibrium, i.e. when the coefficient of conversion from the functional response into the growth rate of the predator is less than the death rate of the predator, then the predator-free point is asymptotically stable. On the other hand, if the co-existence point exists then this equilibrium is conditionally stable. We also construct a nonstandard Grünwald-Letnikov (NSGL) numerical scheme for the proposed model. This scheme is a combination of the Grünwald-Letnikov approximation and the nonstandard finite difference scheme. The scheme is implemented in MATLAB and used to perform some simulations. It is shown that our numerical solutions are consistent with the dynamical properties of our fractional predator-prey model.
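The Grünwald-Letnikov approximation can be sketched on a scalar fractional logistic equation, D^alpha x = r x (1 - x); this explicit scheme is a simplified stand-in for the paper's NSGL predator-prey scheme, and the parameters are invented:

```python
import numpy as np

# Grünwald-Letnikov weights c_j = (-1)^j * binom(alpha, j),
# via the recurrence c_j = (1 - (alpha+1)/j) * c_{j-1}, c_0 = 1.
def gl_weights(alpha, n):
    c = np.empty(n + 1)
    c[0] = 1.0
    for j in range(1, n + 1):
        c[j] = (1.0 - (1.0 + alpha) / j) * c[j - 1]
    return c

# Explicit GL scheme for D^alpha x = r x (1 - x):
# x_n = h^alpha * f(x_{n-1}) - sum_{j=1}^{n} c_j * x_{n-j}
def fractional_logistic(alpha=0.9, r=0.5, x0=0.1, h=0.05, steps=400):
    c = gl_weights(alpha, steps)
    x = np.empty(steps + 1)
    x[0] = x0
    for n in range(1, steps + 1):
        mem = np.dot(c[1:n + 1], x[n - 1::-1])      # memory term
        x[n] = h**alpha * r * x[n - 1] * (1.0 - x[n - 1]) - mem
    return x

x = fractional_logistic()
```

With 0 < alpha < 1 all weights c_j for j >= 1 are negative, so the memory term keeps the iterates positive, and the trajectory approaches the stable equilibrium x* = 1 (more slowly than the integer-order logistic, reflecting the fractional memory).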
ERIC Educational Resources Information Center
Institute for Local Self Government, Berkeley, CA.
To meet the manpower needs of local governments, the model developed for this project redirects national and technical education toward new careers programs. Designed by task forces of professional personnel, the model utilizes existing local government resources, including funds for new career activities. Accomplishments of the project include:…
Prognosis model for stand development
Albert R. Stage
1973-01-01
Describes a set of computer programs for developing prognoses of the development of existing stands under alternative regimes of management. Calibration techniques, modeling procedures, and a procedure for including stochastic variation are described. Implementation of the system for lodgepole pine, including assessment of losses attributed to an infestation of mountain...
Ridgeway, Jennifer L; Wang, Zhen; Finney Rutten, Lila J; van Ryn, Michelle; Griffin, Joan M; Murad, M Hassan; Asiedu, Gladys B; Egginton, Jason S; Beebe, Timothy J
2017-08-04
There exists a paucity of work in the development and testing of theoretical models specific to childhood health disparities, even though they have been linked to the prevalence of adult health disparities, including high rates of chronic disease. We conducted a systematic review and thematic analysis of existing models of health disparities specific to children to inform the development of a unified conceptual framework. We systematically reviewed articles reporting theoretical or explanatory models of disparities on a range of outcomes related to child health. We searched Ovid Medline In-Process & Other Non-Indexed Citations, Ovid MEDLINE, Ovid Embase, Ovid Cochrane Central Register of Controlled Trials, Ovid Cochrane Database of Systematic Reviews, and Scopus (database inception to 9 July 2015). A metanarrative approach guided the analysis process. A total of 48 studies presenting 48 models were included. This systematic review found multiple models but no consensus on one approach. However, we did discover a fair amount of overlap, such that the 48 models reviewed converged into the unified conceptual framework. The majority of models included factors in three domains: individual characteristics and behaviours (88%), healthcare providers and systems (63%), and environment/community (56%). Only 38% of models included factors in the health and public policies domain. A disease-agnostic unified conceptual framework may inform integration of existing knowledge of child health disparities and guide future research. This multilevel framework can focus attention among clinical, basic and social science research on the relationships between policy, social factors, health systems and the physical environment that impact children's health outcomes. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Decentralized control of Markovian decision processes: Existence of sigma-admissible policies
NASA Technical Reports Server (NTRS)
Greenland, A.
1980-01-01
The problem of formulating and analyzing Markov decision models having decentralized information and decision patterns is examined. Included are basic examples as well as the mathematical preliminaries needed to understand Markov decision models and, further, to superimpose decentralized decision structures on them. The notion of a variance-admissible policy for the model is introduced, and it is proved that there exist (possibly nondeterministic) optimal policies within the class of variance-admissible policies. Directions for further research are explored.
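For readers new to Markov decision models, the existence of optimal stationary policies is easiest to see in the standard centralized, discounted case; the toy value iteration below (with invented transitions and rewards) illustrates that baseline, not the paper's decentralized, variance-admissible setting:

```python
import numpy as np

# Two-state, two-action MDP with hypothetical dynamics.
P = np.array([                      # P[a, s, s'] = transition probability
    [[0.9, 0.1], [0.2, 0.8]],       # action 0
    [[0.5, 0.5], [0.6, 0.4]],       # action 1
])
R = np.array([                      # R[a, s] = expected one-step reward
    [1.0, 0.0],
    [2.0, 0.5],
])
gamma = 0.95                        # discount factor

# Value iteration: a gamma-contraction, so it converges to the unique
# optimal value function, from which a deterministic optimal policy follows.
V = np.zeros(2)
for _ in range(1000):
    Q = R + gamma * (P @ V)         # Q[a, s]
    V_new = Q.max(axis=0)
    if np.max(np.abs(V_new - V)) < 1e-10:
        break
    V = V_new
policy = Q.argmax(axis=0)           # optimal action per state
```

The fixed point satisfies the Bellman optimality equation, which is exactly the existence guarantee the decentralized theory must generalize.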
NASA Astrophysics Data System (ADS)
Helmuth, Douglas B.; Bell, Raymond M.; Grant, David A.; Lentz, Christopher A.
2012-09-01
Architecting the operational next generation of earth monitoring satellites, based on matured climate modeling, reuse of existing sensor and satellite capabilities, attention to affordability, and evolutionary improvements integrated with constellation efficiencies, is our collective goal for an open architectural design forum. Understanding the earth's climate and collecting the requisite signatures over the next 30 years is a mandate shared by many of the world's governments. A daunting challenge remains, however, in bridging scientific missions to 'operational' systems that truly support the demands of decision makers, scientific investigators, and global users for trusted data. In this paper we suggest an architectural structure that takes advantage of current earth modeling examples, including cross-model verification and a first-order set of critical climate parameters and metrics, which are in turn matched with existing space-borne collection capabilities and sensors. The tools used and the frameworks offered are designed to allow collaborative overlays by other stakeholders nominating different critical parameters and their own threaded connections to existing international collection experience. These aggregate design suggestions will be held up to group review and prioritized as potential constellation solutions, including incremental and spiral developments, with attention to cost benefits and organizational opportunities. This Part IV effort is intended as an inclusive 'Next Gen Constellation' design discussion and is the natural extension of earlier papers.
Simulating Runoff from a Grid Based Mercury Model: Flow Comparisons
Several mercury cycling models, including general mass balance approaches, mixed-batch reactors in streams or lakes, or regional process-based models, exist to assess the ecological exposure risks associated with anthropogenically increased atmospheric mercury (Hg) deposition, so...
FARSITE: Fire Area Simulator-model development and evaluation
Mark A. Finney
1998-01-01
A computer simulation model, FARSITE, includes existing fire behavior models for surface, crown, spotting, point-source fire acceleration, and fuel moisture. The model's components and assumptions are documented. Simulations were run for simple conditions that illustrate the effect of individual fire behavior models on two-dimensional fire growth.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Christensen, Craig
Opportunities for combining energy efficiency, demand response, and energy storage with PV are often missed, because the required knowledge and expertise for these different technologies exist in separate organizations or individuals. Furthermore, there is a lack of quantitative tools to optimize energy efficiency, demand response and energy storage with PV, especially for existing buildings. Our goal is to develop a modeling tool, BEopt-CA (Ex), with capabilities to facilitate identification and implementation of a balanced integration of energy efficiency (EE), demand response (DR), and energy storage (ES) with photovoltaics (PV) within the residential retrofit market. To achieve this goal, we will adapt and extend an existing tool -- BEopt -- that is designed to identify optimal combinations of efficiency and PV in new home designs. In addition, we will develop multifamily residential modeling capabilities for use in California, to facilitate integration of distributed solar power into the grid in order to maximize its value to California ratepayers. The project is follow-on research that leverages previous California Solar Initiative RD&D investment in the BEopt software. BEopt facilitates finding the least-cost combination of energy efficiency and renewables to support integrated DSM (iDSM) and Zero Net Energy (ZNE) in California residential buildings. However, BEopt is currently focused on modeling single-family houses and does not include satisfactory capabilities for modeling multifamily homes. The project brings BEopt's existing modeling and optimization capabilities to multifamily buildings, including duplexes, triplexes, townhouses, flats, and low-rise apartment buildings.
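The "least-cost combination" search at the heart of BEopt can be caricatured by brute-force enumeration over a handful of retrofit measures; the measures, savings, and costs below are invented, and BEopt's actual sequential-search optimization is far more sophisticated:

```python
from itertools import combinations

# Hypothetical retrofit measures: name -> (annual kWh saved, installed cost $).
measures = {
    "attic insulation": (900, 1200),
    "air sealing":      (500, 600),
    "heat pump":        (2500, 6000),
    "LED lighting":     (400, 300),
    "PV 3 kW":          (4200, 9000),
}

def best_package(target_kwh):
    """Cheapest package of measures whose combined savings meet the target."""
    best = None
    names = list(measures)
    for r in range(1, len(names) + 1):
        for combo in combinations(names, r):
            saved = sum(measures[m][0] for m in combo)
            cost = sum(measures[m][1] for m in combo)
            if saved >= target_kwh and (best is None or cost < best[0]):
                best = (cost, combo)
    return best

cost, combo = best_package(3000)   # -> heat pump + air sealing at $6600
```

Enumeration is exponential in the number of measures, which is precisely why a tool like BEopt needs a smarter search once realistic measure libraries are involved.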
NASA Astrophysics Data System (ADS)
Vesselinov, V. V.; Harp, D.
2010-12-01
The process of decision making to protect groundwater resources requires a detailed estimation of uncertainties in model predictions. Various uncertainties associated with modeling a natural system contribute to the uncertainties in the model predictions: (1) measurement and computational errors; (2) uncertainties in the conceptual model and model-parameter estimates; and (3) simplifications in model setup and numerical representation of governing processes. Due to this combination of factors, the sources of predictive uncertainties are generally difficult to quantify individually. Decision support related to optimal design of monitoring networks requires (1) detailed analyses of existing uncertainties related to model predictions of groundwater flow and contaminant transport, and (2) optimization of the proposed monitoring network locations in terms of their efficiency to detect contaminants and provide early warning. We apply existing and newly-proposed methods to quantify predictive uncertainties and to optimize well locations. An important aspect of the analysis is the application of a newly-developed optimization technique based on the coupling of Particle Swarm and Levenberg-Marquardt optimization methods, which proved to be robust and computationally efficient. These techniques and algorithms are bundled in a software package called MADS. MADS (Model Analyses for Decision Support) is an object-oriented code that is capable of performing various types of model analyses and supporting model-based decision making. The code can be executed under different computational modes, which include (1) sensitivity analyses (global and local), (2) Monte Carlo analysis, (3) model calibration, (4) parameter estimation, (5) uncertainty quantification, and (6) model selection.
The code can be externally coupled with any existing model simulator through integrated modules that read/write input and output files using a set of template and instruction files (consistent with the PEST I/O protocol). MADS can also be internally coupled with a series of built-in analytical simulators. MADS provides functionality to work directly with existing control files developed for the code PEST (Doherty 2009). To perform the computational modes mentioned above, the code utilizes (1) advanced Latin-Hypercube sampling techniques (including Improved Distributed Sampling), (2) various gradient-based Levenberg-Marquardt optimization methods, (3) advanced global optimization methods (including Particle Swarm Optimization), and (4) a selection of alternative objective functions. The code has been successfully applied to perform various model analyses related to environmental management of real contamination sites. Examples include source identification problems, quantification of uncertainty, model calibration, and optimization of monitoring networks. The methodology and software codes are demonstrated using synthetic and real case studies where monitoring networks are optimized taking into account the uncertainty in model predictions of contaminant transport.
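The multistart strategy described, global samples seeding local Levenberg-Marquardt refinements, can be sketched with SciPy; the exponential-decay model and its synthetic data below are invented stand-ins for a real simulator coupled through MADS:

```python
import numpy as np
from scipy.stats import qmc
from scipy.optimize import least_squares

# Synthetic "observations" from a hypothetical decay model a*exp(-k*t).
rng = np.random.default_rng(0)
t = np.linspace(0.0, 5.0, 30)
data = 2.0 * np.exp(-0.7 * t) + 0.01 * rng.standard_normal(t.size)

def residuals(p):
    a, k = p
    return a * np.exp(-k * t) - data

# Latin-hypercube sample of starting points over the parameter box,
# each refined with Levenberg-Marquardt; keep the best local fit.
lo, hi = np.array([0.1, 0.1]), np.array([5.0, 3.0])
starts = qmc.scale(qmc.LatinHypercube(d=2, seed=1).random(8), lo, hi)

best = min((least_squares(residuals, s, method='lm') for s in starts),
           key=lambda r: r.cost)
a_hat, k_hat = best.x
```

The stratified starts guard against Levenberg-Marquardt stalling in a poor local minimum, which is the same robustness argument made for the coupled Particle Swarm/Levenberg-Marquardt technique in MADS.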
Adopting Internet Standards for Orbital Use
NASA Technical Reports Server (NTRS)
Wood, Lloyd; Ivancic, William; da Silva Curiel, Alex; Jackson, Chris; Stewart, Dave; Shell, Dave; Hodgson, Dave
2005-01-01
After a year of testing and demonstrating a Cisco mobile access router intended for terrestrial use onboard the low-Earth-orbiting UK-DMC satellite as part of a larger merged ground/space IP-based internetwork, we reflect on and discuss the benefits and drawbacks of integration and standards reuse for small satellite missions. Benefits include ease of operation and the ability to leverage existing systems and infrastructure designed for general use, as well as reuse of existing, known, and well-understood security and operational models. Drawbacks include cases where integration work was needed to bridge the gaps in assumptions between different systems, and where performance considerations outweighed the benefits of reuse of pre-existing file transfer protocols. We find similarities with the terrestrial IP networks whose technologies we have adopted and also some significant differences in operational models and assumptions that must be considered.
Beyond the Ego: Toward Transpersonal Models of the Person and Psychotherapy.
ERIC Educational Resources Information Center
Walsh, Roger N.; Vaughan, Frances
1980-01-01
Discusses a transpersonal model which, like a humanistic model, focuses on the human potential for growth, health, and well-being. It goes beyond existing models to include self-transcendence and emphasizes the centrality of consciousness in shaping experience and enhancing well-being. (Author)
Data-driven non-Markovian closure models
NASA Astrophysics Data System (ADS)
Kondrashov, Dmitri; Chekroun, Mickaël D.; Ghil, Michael
2015-03-01
This paper has two interrelated foci: (i) obtaining stable and efficient data-driven closure models by using a multivariate time series of partial observations from a large-dimensional system; and (ii) comparing these closure models with the optimal closures predicted by the Mori-Zwanzig (MZ) formalism of statistical physics. Multilayer stochastic models (MSMs) are introduced as both a generalization and a time-continuous limit of existing multilevel, regression-based approaches to closure in a data-driven setting; these approaches include empirical model reduction (EMR), as well as more recent multi-layer modeling. It is shown that the multilayer structure of MSMs can provide a natural Markov approximation to the generalized Langevin equation (GLE) of the MZ formalism. A simple correlation-based stopping criterion for an EMR-MSM model is derived to assess how well it approximates the GLE solution. Sufficient conditions are derived on the structure of the nonlinear cross-interactions between the constitutive layers of a given MSM to guarantee the existence of a global random attractor. This existence ensures that no blow-up can occur for a broad class of MSM applications, a class that includes non-polynomial predictors and nonlinearities that do not necessarily preserve quadratic energy invariants. The EMR-MSM methodology is first applied to a conceptual, nonlinear, stochastic climate model of coupled slow and fast variables, in which only slow variables are observed. It is shown that the resulting closure model with energy-conserving nonlinearities efficiently captures the main statistical features of the slow variables, even when there is no formal scale separation and the fast variables are quite energetic. Second, an MSM is shown to successfully reproduce the statistics of a partially observed, generalized Lotka-Volterra model of population dynamics in its chaotic regime. 
The challenges here include the rarity of strange attractors in the model's parameter space and the existence of multiple attractor basins with fractal boundaries. The positivity constraint on the solutions' components replaces here the quadratic-energy-preserving constraint of fluid-flow problems and it successfully prevents blow-up.
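The layered-regression idea behind EMR-MSMs, and the correlation-based stopping criterion, can be sketched on a scalar toy process; the AR(2) data and the two-level fit below are illustrative inventions, not the paper's climate or population models:

```python
import numpy as np

# Synthetic partial observations from a hypothetical AR(2)-like process.
rng = np.random.default_rng(42)
n = 5000
x = np.zeros(n)
for i in range(2, n):
    x[i] = 1.5 * x[i-1] - 0.6 * x[i-2] + 0.1 * rng.standard_normal()

# Level 0: fit dx_t ~ a * x_t (deliberately too simple, so the
# residual retains memory of the unresolved dynamics).
dx = x[1:] - x[:-1]
a = np.dot(x[:-1], dx) / np.dot(x[:-1], x[:-1])
r0 = dx - a * x[:-1]

# Level 1: regress the residual on its own past (a hidden "layer").
b = np.dot(r0[:-1], r0[1:]) / np.dot(r0[:-1], r0[:-1])
r1 = r0[1:] - b * r0[:-1]

# Correlation-based stopping criterion: add layers until the last
# residual is close to white noise (lag-1 autocorrelation near zero).
def lag1(z):
    return float(np.dot(z[:-1], z[1:]) / np.dot(z, z))
```

Here the level-0 residual is strongly autocorrelated while the level-1 residual is nearly white, so a two-layer model suffices; in the multivariate EMR-MSM setting the same test decides how many layers to stack.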
Alternative Models for the Co-operative Governance of Teacher Education Programs.
ERIC Educational Resources Information Center
Sagan, Edgar L.; Smith, Barbara G.
This paper reviews and criticizes existing models of governance of teacher education and proposes alternative ones. Chapter I defines three models of governance including a) a bureaucratic model; b) a collaborative model; and c) a systems analysis model which is used to plan new models in the final chapters. Chapter II deals with the current…
ERIC Educational Resources Information Center
Lin, Tony; Erfan, Sasan
2016-01-01
Mathematical modeling is an open-ended research subject with no definite answer to any problem. Math modeling enables thinking outside the box to connect different fields of study, including statistics, algebra, calculus, matrices, programming, and scientific writing. As an integral part of society, it is the foundation for many…
A Bayesian Semiparametric Latent Variable Model for Mixed Responses
ERIC Educational Resources Information Center
Fahrmeir, Ludwig; Raach, Alexander
2007-01-01
In this paper we introduce a latent variable model (LVM) for mixed ordinal and continuous responses, where covariate effects on the continuous latent variables are modelled through a flexible semiparametric Gaussian regression model. We extend existing LVMs with the usual linear covariate effects by including nonparametric components for nonlinear…
MTBE is a volatile organic compound used as an oxygenate additive to gasoline, added to comply with the 1990 Clean Air Act. Previous PBPK models for MTBE were reviewed and incorporated into the Exposure Related Dose Estimating Model (ERDEM) software. This model also included an e...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Kandler A; Santhanagopalan, Shriram; Yang, Chuanbo
Computer models are helping to accelerate the design and validation of next-generation batteries and provide valuable insights not possible through experimental testing alone. Validated 3-D physics-based models exist for predicting the electrochemical performance and the thermal and mechanical response of cells and packs under normal and abuse scenarios. The talk describes present efforts to make the models better suited for engineering design, including improving their computation speed, developing faster processes for model parameter identification (including under aging), and predicting the performance of a proposed electrode material recipe a priori using microstructure models.
Survey of Existing Indian Parenting Programs.
ERIC Educational Resources Information Center
Kellogg, Jeff B., Comp.
This survey of existing American Indian parenting programs is part of a National American Indian Court Judges Association project to design a model process by which social service providers can offer effective parent education programs that are culturally relevant to American Indians. The report also offers profiles, including addresses and names…
A review of methods for predicting air pollution dispersion
NASA Technical Reports Server (NTRS)
Mathis, J. J., Jr.; Grose, W. L.
1973-01-01
Air pollution modeling and problem areas in air pollution dispersion modeling are surveyed. Emission source inventories, meteorological data, and turbulent diffusion are discussed in terms of developing a dispersion model. Existing mathematical models of urban air pollution, along with highway and airport models, are discussed together with their limitations. Recommendations for improving modeling capabilities are included.
A model for evaluating stream temperature response to climate change scenarios in Wisconsin
Westenbroek, Stephen M.; Stewart, Jana S.; Buchwald, Cheryl A.; Mitro, Matthew G.; Lyons, John D.; Greb, Steven
2010-01-01
Global climate change is expected to alter temperature and flow regimes for streams in Wisconsin over the coming decades. Stream temperature will be influenced not only by the predicted increases in average air temperature, but also by changes in baseflow due to changes in precipitation patterns and amounts. To evaluate future stream temperature and flow regimes in Wisconsin, we integrated two existing models to generate a water temperature time series at a regional scale for thousands of stream reaches where site-specific temperature observations do not exist. The approach uses the US Geological Survey (USGS) Soil-Water-Balance (SWB) model, along with a recalibrated version of an existing artificial neural network (ANN) stream temperature model. The ANN model simulates stream temperatures on the basis of landscape variables such as land use and soil type, and also includes climate variables such as air temperature and precipitation amounts. The existing ANN model includes a landscape variable called DARCY designed to reflect the potential for groundwater recharge in the contributing area of a stream segment. SWB tracks soil moisture and potential recharge at a daily time step, providing a way to link changing climate patterns and precipitation amounts over time to baseflow volumes, and presumably to stream temperatures. The recalibrated ANN incorporates SWB-derived estimates of potential recharge to supplement the static estimates of groundwater flow potential derived from a topographically based model (DARCY). SWB and the recalibrated ANN will be supplied with climate drivers from a suite of general circulation models and emissions scenarios, enabling resource managers to evaluate possible changes in stream temperature regimes for Wisconsin.
2005-12-31
The MANPADS missile is modeled using LS-DYNA. It has 187,600 nodes, 52,802 shell elements with 13 shell materials, and 112,200 solid elements with 1,804 solid... model capability that includes impact, detonation, penetration, and wing flutter response. This work extends an existing body-on-body missile model... the missile as well as the expansion of the surrounding fluids was modeled in the Eulerian domain. The Jones-Wilkins-Lee (JWL) equation of state was...
An Eye Model for Computational Dosimetry Using A Multi-Scale Voxel Phantom
NASA Astrophysics Data System (ADS)
Caracappa, Peter F.; Rhodes, Ashley; Fiedler, Derek
2014-06-01
The lens of the eye is a radiosensitive tissue, with cataract formation being the major concern. Recently reduced recommended dose limits for the lens of the eye have made understanding the dose to this tissue increasingly important. Due to memory limitations, the voxel resolution of computational phantoms used for radiation dose calculations is too coarse to accurately represent the dimensions of the eye. A revised eye model is constructed using physiological data for the dimensions of radiosensitive tissues and is then transformed into a high-resolution voxel model. This eye model is combined with an existing set of whole-body models to form a multi-scale voxel phantom, which is used with the MCNPX code to calculate radiation dose from various exposure types. This phantom provides an accurate representation of the radiation transport through the structures of the eye. Two alternate methods of including a high-resolution eye model within an existing whole-body model are developed. The accuracy and performance of each method are compared against existing computational phantoms.
A microcomputer model for simulating pressurized flow in a storm sewer system : interim report.
DOT National Transportation Integrated Search
1988-01-01
A study is being conducted on the development of a microcomputer model for simulating storm sewer flow under surcharged or pressurized conditions. Several existing models, including the EPA Storm Water Management Model (SWMM) and the Illinois Urban D...
Balancing the Roles of a Family Medicine Residency Faculty: A Grounded Theory Study.
Reitz, Randall; Sudano, Laura; Siler, Anne; Trimble, Kristopher
2016-05-01
Great variety exists in the roles that family medicine residency faculty fill in the lives of their residents. A family medicine-specific model has never been created to describe and promote effective training relationships. This research aims to create a consensus model for faculty development, ethics education, and policy creation. Using a modified grounded theory method, researchers conducted phone interviews with 22 key informants from US family medicine residencies. Data were analyzed to delineate faculty roles, common role conflicts, and ethical principles for avoiding and managing role conflicts. Key informants were asked to apply their experience and preferences to adapt an existing model to fit family medicine residency settings. The primary result of this research is the creation of a family medicine-specific model that describes faculty roles and provides insight into how to manage role conflicts with residents. Primary faculty roles include Role Model, Advisor, Teacher, Supervisor, and Evaluator. Secondary faculty roles include Friendly Colleague, Wellness Supporter, and Helping Hand. The secondary roles exist on a continuum from disengaged to enmeshed. When not balanced, the secondary roles can detract from the primary roles. Differences were found between the role expectations of physician versus behavioral science faculty and of larger/university/urban versus smaller/community/rural residencies. Diversity of opinion exists regarding the types of roles that are appropriate for family medicine faculty to maintain with residents. This new model is a first attempt to build consensus in the field and has application to faculty development, ethics education, and policy creation.
Study of solid state photomultiplier
NASA Technical Reports Server (NTRS)
Hays, K. M.; Laviolette, R. A.
1987-01-01
Available solid state photomultiplier (SSPM) detectors were tested under low-background, low temperature conditions to determine the conditions producing optimal sensitivity in a space-based astronomy system such as a liquid cooled helium telescope in orbit. Detector temperatures varied between 6 and 9 K, with background flux ranging from 10 to the 13th power to less than 10 to the 6th power photons/square cm-s. Measured parameters included quantum efficiency, noise, dark current, and spectral response. Experimental data were reduced, analyzed, and combined with existing data to build the SSPM data base included herein. The results were compared to analytical models of SSPM performance where appropriate models existed. Analytical models presented here were developed to be as consistent with the data base as practicable. Significant differences between the theory and data are described. Some models were developed or updated as a result of this study.
Python scripting in the nengo simulator.
Stewart, Terrence C; Tripp, Bryan; Eliasmith, Chris
2009-01-01
Nengo (http://nengo.ca) is an open-source neural simulator that has been greatly enhanced by the recent addition of a Python script interface. Nengo provides a wide range of features that are useful for physiological simulations, including unique features that facilitate development of population-coding models using the neural engineering framework (NEF). This framework uses information theory, signal processing, and control theory to formalize the development of large-scale neural circuit models. Notably, it can also be used to determine the synaptic weights that underlie observed network dynamics and transformations of represented variables. Nengo provides rich NEF support, and includes customizable models of spike generation, muscle dynamics, synaptic plasticity, and synaptic integration, as well as an intuitive graphical user interface. All aspects of Nengo models are accessible via the Python interface, allowing for programmatic creation of models, inspection and modification of neural parameters, and automation of model evaluation. Since Nengo combines Python and Java, it can also be integrated with any existing Java or 100% Python code libraries. Current work includes connecting neural models in Nengo with existing symbolic cognitive models, creating hybrid systems that combine detailed neural models of specific brain regions with higher-level models of remaining brain areas. Such hybrid models can provide (1) more realistic boundary conditions for the neural components, and (2) more realistic sub-components for the larger cognitive models.
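The decoder computation at the heart of the NEF, solving for linear readout weights that reconstruct a transformation of the represented variable from population activity, can be sketched in plain NumPy. This illustrates the least-squares principle only; it is not Nengo's API, and the tuning-curve parameters below are made up:

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 200)          # the represented value

# Random rectified-linear tuning curves for 50 neurons (an NEF-style encoding;
# gains, encoders, and biases are arbitrary illustrative choices)
gains = rng.uniform(0.5, 2.0, 50)
encoders = rng.choice([-1.0, 1.0], 50)
biases = rng.uniform(-1.0, 1.0, 50)
activity = np.maximum(0.0, (gains * encoders)[:, None] * x[None, :]
                      + biases[:, None])  # shape (50 neurons, 200 samples)

# Least-squares decoders that read out f(x) = x**2 from the population
target = x ** 2
decoders, *_ = np.linalg.lstsq(activity.T, target, rcond=None)
estimate = activity.T @ decoders
```

The same least-squares step, applied between connected populations, is what lets the framework derive synaptic weights that implement a desired transformation.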
Ngwa, Gideon A; Teboh-Ewungkem, Miranda I
2016-01-01
A deterministic ordinary differential equation model for the dynamics and spread of Ebola Virus Disease is derived and studied. The model contains quarantine and nonquarantine states and can be used to evaluate transmission both in treatment centres and in the community. Possible sources of exposure to infection, including cadavers of Ebola Virus victims, are included in the model derivation and analysis. Our model's results show that there exists a threshold parameter, R0, with the property that when its value is above unity, an endemic equilibrium exists whose value and size are determined by the size of this threshold parameter, and when its value is less than unity, the infection does not spread into the community. The equilibrium state, when it exists, is locally asymptotically stable with oscillatory returns to the equilibrium point. The basic reproduction number, R0, is shown to be strongly dependent on the initial response of the emergency services to suspected cases of Ebola infection. When intervention measures such as quarantining are instituted fully at the beginning, the value of the reproduction number reduces and any further infections can only occur at the treatment centres. Effective control measures, to reduce R0 to values below unity, are discussed.
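The threshold behaviour described above can be reproduced with a minimal sketch: a generic SIR-type model with an extra quarantine removal rate q, not the paper's full quarantine/nonquarantine system, and with all rates invented for illustration:

```python
def simulate_sir_quarantine(beta=0.3, gamma=0.1, q=0.05,
                            days=365, dt=0.1, n0=1_000_000.0, i0=10.0):
    """Forward-Euler integration of a minimal SIR-type model in which
    infectious individuals are also removed to quarantine at rate q,
    so the basic reproduction number becomes R0 = beta / (gamma + q).
    All rates are illustrative, not fitted to Ebola data."""
    s, i, r = n0 - i0, i0, 0.0
    for _ in range(int(days / dt)):
        new_infections = beta * s * i / n0 * dt
        removals = (gamma + q) * i * dt
        s -= new_infections
        i += new_infections - removals
        r += removals
    return s, i, r

# R0 = 0.3 / 0.15 = 2.0 > 1: a large outbreak occurs
s_end, i_end, r_end = simulate_sir_quarantine()
# Stronger quarantine: R0 = 0.3 / 0.60 = 0.5 < 1, the infection dies out
s2, i2, r2 = simulate_sir_quarantine(q=0.5)
```

Raising q pushes R0 below unity, mirroring the paper's conclusion that early, full quarantining suppresses community spread.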
A nationwide survey of patient centered medical home demonstration projects.
Bitton, Asaf; Martin, Carina; Landon, Bruce E
2010-06-01
The patient centered medical home has received considerable attention as a potential way to improve primary care quality and limit cost growth. Little information exists that systematically compares PCMH pilot projects across the country. We conducted cross-sectional key-informant interviews with leaders from existing PCMH demonstration projects with external payment reform, using a semi-structured interview tool with the following domains: project history, organization and participants, practice requirements and selection process, medical home recognition, payment structure, practice transformation, and evaluation design. A total of 26 demonstrations in 18 states were interviewed. Current demonstrations include over 14,000 physicians caring for nearly 5 million patients. A majority of demonstrations are single payer, and most utilize a three-component payment model (traditional fee for service, per-person-per-month fixed payments, and bonus performance payments). The median incremental revenue per physician per year was $22,834 (range $720 to $91,146). Two major practice transformation models were identified: consultative and implementation of the chronic care model. A majority of demonstrations did not have well-developed evaluation plans. Current PCMH demonstration projects with external payment reform include large numbers of patients and physicians as well as a wide spectrum of implementation models. Key questions exist around the adequacy of current payment mechanisms and evaluation plans as public and policy interest in the PCMH model grows.
78 FR 60019 - General Motors, LLC, Receipt of Petition for Decision of Inconsequential Noncompliance
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-30
... comments received will be posted without change to http://www.regulations.gov, including any personal... controlled at the time it determined that the noncompliance existed. GM's petition, which was filed under... noncompliance existed. II. Vehicles Involved: Affected are approximately 23,910 model year 2013 Chevrolet Malibu...
Reduced-Order Modeling of Unsteady Aerodynamics Across Multiple Mach Regimes
2013-01-01
Aeroelasticity is the study of the interaction between fluids and structures when a feedback mechanism exists between the fluid and the...
Green Infrastructure Models and Tools
The objective of this project is to modify and refine existing models and develop new tools to support decision making for the complete green infrastructure (GI) project lifecycle, including the planning and implementation of stormwater control in urban and agricultural settings,...
A Dynamic Stimulus-Driven Model of Signal Detection
ERIC Educational Resources Information Center
Turner, Brandon M.; Van Zandt, Trisha; Brown, Scott
2011-01-01
Signal detection theory forms the core of many current models of cognition, including memory, choice, and categorization. However, the classic signal detection model presumes the a priori existence of fixed stimulus representations--usually Gaussian distributions--even when the observer has no experience with the task. Furthermore, the classic…
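The fixed Gaussian representations of the classic model make sensitivity easy to compute in closed form. As a quick sketch, using the standard equal-variance formulas with made-up hit and false-alarm rates:

```python
from statistics import NormalDist

def dprime(hit_rate, fa_rate):
    """Equal-variance Gaussian sensitivity: d' = z(H) - z(F)."""
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(fa_rate)

def criterion(hit_rate, fa_rate):
    """Response criterion c = -(z(H) + z(F)) / 2; zero means unbiased."""
    z = NormalDist().inv_cdf
    return -(z(hit_rate) + z(fa_rate)) / 2

d = dprime(0.84, 0.16)  # roughly 2.0 for these illustrative rates
c = criterion(0.84, 0.16)
```

The dynamic, stimulus-driven account in the paper replaces exactly these fixed distributions with representations that evolve with experience.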
A Framework for Understanding Physics Students' Computational Modeling Practices
ERIC Educational Resources Information Center
Lunk, Brandon Robert
2012-01-01
With the growing push to include computational modeling in the physics classroom, we are faced with the need to better understand students' computational modeling practices. While existing research on programming comprehension explores how novices and experts generate programming algorithms, little of this discusses how domain content…
Plant leaf traits, canopy processes, and global atmospheric chemistry interactions.
NASA Astrophysics Data System (ADS)
Guenther, A. B.
2017-12-01
Plants produce and emit a diverse array of volatile metabolites into the atmosphere that participate in chemical reactions that influence distributions of air pollutants and short-lived climate forcers including organic aerosol, ozone and methane. It is now widely accepted that accurate estimates of these emissions are required as inputs for regional air quality and global climate models. Predicting these emissions is complicated by the large number of volatile organic compounds, driving variables (e.g., temperature, solar radiation, abiotic and biotic stresses) and processes operating across a range of scales. Modeling efforts to characterize emission magnitude and variations will be described along with an assessment of the observations available for parameterizing and evaluating these models including discussion of the limitations and challenges associated with existing model approaches. A new approach for simulating canopy scale organic emissions on regional to global scales will be described and compared with leaf, canopy and regional scale flux measurements. The importance of including additional compounds and processes as well as improving estimates of existing ones will also be discussed.
Wakefield, Claire E.
2013-01-01
Adolescents and young adults (AYAs) with cancer must simultaneously navigate the challenges associated with their cancer experience, whilst striving to achieve a number of important developmental milestones at the cusp of adulthood. The disruption caused by their cancer experience at this critical life-stage is assumed to be responsible for significant distress among AYAs living with cancer. The quality and severity of psychological outcomes among AYAs remain poorly documented, however. This review examined the existing literature on psychological outcomes among AYAs living with cancer. All psychological outcomes (both distress and positive adjustment) were included, and AYAs were included across the cancer trajectory, ranging from newly-diagnosed patients, to long-term cancer survivors. Four key research questions were addressed. Section 1 answered the question, “What is the nature and prevalence of distress (and other psychological outcomes) among AYAs living with cancer?” and documented rates of clinical distress, as well as evidence for the trajectory of this distress over time. Section 2 examined the individual, cancer/treatment-related and socio-demographic factors that have been identified as predictors of these outcomes in this existing literature. Section 3 examined current theoretical models relevant to explaining psychological outcomes among AYAs, including developmental models, socio-cognitive and family-systems models, stress-coping frameworks, and cognitive appraisal models (including trauma and meaning making models). The mechanisms implicated in each model were discussed, as was the existing evidence for each model. Converging evidence implicating the potential role of autobiographical memory and future thinking systems in how AYAs process and integrate their cancer experience into their current sense of self and future goals are highlighted. 
Finally, Section 4 addressed the future of psycho-oncology in understanding and conceptualizing psychological outcomes among AYAs living with cancer, by discussing recent empirical advancements in adjacent, non-oncology fields that might improve our understanding of psychological outcomes in AYAs living with cancer. Included in these were models of memory and future thinking drawn from the broader psychology literature that identify important mechanisms involved in adjustment, as well as experimental paradigms for the study of these mechanisms within analogue, non-cancer AYA samples. PMID:26835313
Study of tethered satellite active attitude control
NASA Technical Reports Server (NTRS)
Colombo, G.
1982-01-01
Existing software was adapted for the study of tethered subsatellite rotational dynamics; an analytic solution for a stable configuration of a tethered subsatellite was developed; the analytic and numerical integrator (computer) solutions for this "test case" were compared in a two-mass tether model program (DUMBEL); the existing multiple-mass tether model (SKYHOOK) was modified to include subsatellite rotational dynamics; the analytic "test case" was verified; and the use of the SKYHOOK rotational dynamics capability was demonstrated with a computer run showing the effect of a single off-axis thruster on the behavior of the subsatellite. Subroutines for specific attitude control systems are developed and applied to the study of the behavior of the tethered subsatellite under realistic on-orbit conditions. The effects of all tether "inputs," including pendular oscillations, air drag, and electrodynamic interactions, on the dynamic behavior of the tether are included.
Toward Best Practice: An Analysis of the Efficacy of Curriculum Models in Gifted Education
ERIC Educational Resources Information Center
VanTassel-Baska, Joyce; Brown, Elissa F.
2007-01-01
This article provides an overview of existing research on 11 curriculum models in the field of gifted education, including the schoolwide enrichment model and the talent search model, and several others that have been used to shape high-level learning experiences for gifted students. The models are critiqued according to the key features they…
[Modeling in value-based medicine].
Neubauer, A S; Hirneiss, C; Kampik, A
2010-03-01
Modeling plays an important role in value-based medicine (VBM). It allows decision support by predicting potential clinical and economic consequences, frequently combining different sources of evidence. Based on relevant publications and examples focusing on ophthalmology, the key economic modeling methods are explained and definitions are given. The most frequently applied model types are decision trees, Markov models, and discrete event simulation (DES) models. Model validation includes, besides verifying internal validity, comparison with other models (external validity) and, ideally, validation of a model's predictive properties. The uncertainty inherent in any modeling should be clearly stated. This is true for economic modeling in VBM as well as when using disease risk models to support clinical decisions. In economic modeling, uni- and multivariate sensitivity analyses are usually applied; the key concepts here are tornado plots and cost-effectiveness acceptability curves. Given the existing uncertainty, modeling helps to make better informed decisions than would be possible without this additional information.
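As an illustration of the most common of these model types, a three-state Markov cohort model can be run in a few lines. All transition probabilities and quality weights below are invented for the example, not taken from the ophthalmology literature:

```python
import numpy as np

# Illustrative three-state Markov cohort model (states: well, ill, dead).
# Rows of P give the per-cycle transition probabilities out of each state.
P = np.array([[0.90, 0.08, 0.02],
              [0.00, 0.85, 0.15],
              [0.00, 0.00, 1.00]])
utility = np.array([1.0, 0.6, 0.0])   # quality weight per state per cycle

cohort = np.array([1.0, 0.0, 0.0])    # the whole cohort starts well
qalys = 0.0
for _ in range(40):                   # 40 yearly cycles
    qalys += cohort @ utility         # accumulate quality-adjusted life years
    cohort = cohort @ P               # advance the cohort one cycle
```

Running the same cohort through an alternative transition matrix (e.g. one reflecting a treatment effect) and differencing the accumulated QALYs and costs is the basic step behind the cost-effectiveness analyses described above.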
Combustion of Nitramine Propellants
1983-03-01
through development of a comprehensive analytical model. The ultimate goals are to enable prediction of deflagration rate over a wide pressure range... superior in burn rate prediction, both simple models fail in correlating existing temperature-sensitivity data. (2) In the second part, a... auxiliary condition to enable independent burn rate prediction; improved melt-phase model including decomposition-gas bubbles; model for far-field
NASA Technical Reports Server (NTRS)
Rubinstein, R. (Editor); Rumsey, C. L. (Editor); Salas, M. D. (Editor); Thomas, J. L. (Editor); Bushnell, Dennis M. (Technical Monitor)
2001-01-01
Advances in turbulence modeling are needed in order to calculate high Reynolds number flows near the onset of separation and beyond. To this end, the participants in this workshop made the following recommendations. (1) A national/international database and standards for turbulence modeling assessment should be established. Existing experimental data sets should be reviewed and categorized. Advantage should be taken of other efforts already underway, such as that of the European Research Community on Flow, Turbulence, and Combustion (ERCOFTAC) consortium. Carefully selected "unit" experiments will be needed, as well as advances in instrumentation, to fill the gaps in existing datasets. A high priority should be given to documenting existing turbulence model capabilities in a standard form, including numerical implementation issues such as grid quality and resolution. (2) NASA should support long-term research on Algebraic Stress Models and Reynolds Stress Models. The emphasis should be placed on improving the length-scale equation, since it is the least understood and is a key component of two-equation and higher models. Second priority should be given to the development of improved near-wall models. Direct Numerical Simulations (DNS) and Large Eddy Simulations (LES) would provide valuable guidance in developing and validating new Reynolds-averaged Navier-Stokes (RANS) models. Although not the focus of this workshop, DNS, LES, and hybrid methods currently represent viable approaches for analysis on a limited basis. Therefore, although computer limitations require the use of RANS methods for realistic configurations at high Reynolds number in the foreseeable future, a balanced effort in turbulence modeling development, validation, and implementation should include these approaches as well.
NASA Astrophysics Data System (ADS)
Pilz, Tobias; Francke, Till; Bronstert, Axel
2017-08-01
The characteristics of a landscape are essential factors for hydrological processes, so an adequate representation of a catchment's landscape in hydrological models is vital. However, many such models exist, differing, amongst other aspects, in spatial concept and discretisation. The latter constitutes an essential pre-processing step, for which many different algorithms and numerous software implementations exist. In that context, existing solutions are often model-specific, commercial, or dependent on commercial back-end software, and allow only limited workflow automation, or none at all. Consequently, a new package for the scientific software and scripting environment R, called lumpR, was developed. lumpR employs an algorithm for hillslope-based landscape discretisation aimed at large-scale application via a hierarchical multi-scale approach. The package addresses the existing limitations: it is free and open source, easily extendible to other hydrological models, and its workflow can be fully automated. Moreover, it is user-friendly, as direct coupling to a GIS allows for immediate visual inspection and manual adjustment. Sufficient control is retained via parameter specification and the option to include expert knowledge. Conversely, completely automatic operation also allows for extensive analysis of aspects related to landscape discretisation. In a case study, the application of the package is presented. A sensitivity analysis of the most important discretisation parameters demonstrates its efficient workflow automation. Considering multiple streamflow metrics, the employed model proved reasonably robust to the discretisation parameters. However, the parameters determining the sizes of subbasins and hillslopes proved more important than the others, which include the number of representative hillslopes, the number of attributes employed in the lumping algorithm, and the number of sub-discretisations of the representative hillslopes.
ERIC Educational Resources Information Center
Slayter, Elspeth M.
2017-01-01
Existing research suggests a majority of faculty include social justice content in research courses but not through the use of existing quantitative data for in-class activities that foster mastery of data analysis and interpretation and curiosity about social justice-related topics. By modeling data-driven dialogue and the deconstruction of…
Choe, Hyeyeong; Thorne, James H; Huber, Patrick R; Lee, Dongkun; Quinn, James F
2018-01-01
Protected areas (PAs) are often considered the most important biodiversity conservation areas in national plans, but PAs often do not represent national-scale biodiversity. We evaluate the current conservation status of plant biodiversity within existing PAs, and identify potential additional PAs for South Korea. We modeled species ranges for 2,297 plant species using Multivariate Adaptive Regression Splines and compared the level of mean range representation in South Korea's existing PAs, which comprise 5.7% of the country's mainland area, with an equal-area alternative PA strategy selected with the reserve algorithm Marxan. We also used Marxan to model two additional conservation scenarios that add lands to approach the Aichi Biodiversity Target objectives (17% of the country). Existing PAs in South Korea contain an average of 6.3% of each plant species' range, compared to 5.9% in the modeled equal-area alternative. However, existing PAs primarily represent a high percentage of the ranges of high-elevation and small-range-size species. The additional-PAs scenario that adds lands to the existing PAs covers 14,587.55 km2 and would improve overall plant range representation to a mean of 16.8% of every species' range. The other scenario, which selects new PAs from all lands and covers 13,197.35 km2, would improve overall plant range representation to a mean of 13.5%. Even though the scenario that includes existing PAs represents higher percentages of species' ranges, it misses many biodiversity hotspots in non-mountainous areas, whereas the additional PAs selected without locking in the existing PAs represent almost all species' ranges evenly, including low-elevation species with larger ranges. Some priority conservation areas we identified are expansions of, or near, existing PAs, especially in northeastern and southern South Korea.
However, lowland coastal areas and areas surrounding the capital city, Seoul, are also critical for biodiversity conservation in South Korea.
Multi-resolution voxel phantom modeling: a high-resolution eye model for computational dosimetry
NASA Astrophysics Data System (ADS)
Caracappa, Peter F.; Rhodes, Ashley; Fiedler, Derek
2014-09-01
Voxel models of the human body are commonly used for simulating radiation dose with a Monte Carlo radiation transport code. Due to memory limitations, the voxel resolution of these computational phantoms is typically too coarse to accurately represent the dimensions of small features such as the eye. Recently reduced recommended dose limits for the lens of the eye, a radiosensitive tissue with significant concern for cataract formation, have lent increased importance to understanding the dose to this tissue. A high-resolution eye model is constructed using physiological data for the dimensions of radiosensitive tissues, and combined with an existing set of whole-body models to form a multi-resolution voxel phantom, which is used with the MCNPX code to calculate radiation dose from various exposure types. This phantom provides an accurate representation of the radiation transport through the structures of the eye. Two alternate methods of including a high-resolution eye model within an existing whole-body model are developed. The accuracy and performance of each method are compared against existing computational phantoms.
ERIC Educational Resources Information Center
Dori, Yehudit Judy; Kaberman, Zvia
2012-01-01
Much knowledge in chemistry exists at a molecular level, inaccessible to direct perception. Chemistry instruction should therefore include multiple visual representations, such as molecular models and symbols. This study describes the implementation and assessment of a learning unit designed for 12th grade chemistry honors students. The organic…
Modeling carbon and nitrogen biogeochemistry in forest ecosystems
Changsheng Li; Carl Trettin; Ge Sun; Steve McNulty; Klaus Butterbach-Bahl
2005-01-01
A forest biogeochemical model, Forest-DNDC, was developed to quantify carbon sequestration in, and trace gas emissions from, forest ecosystems. Forest-DNDC was constructed by integrating two existing models, PnET and DNDC, with several new features including nitrification, a forest litter layer, and soil freezing and thawing. PnET is a forest physiological model predicting...
Wavefronts for a global reaction-diffusion population model with infinite distributed delay
NASA Astrophysics Data System (ADS)
Weng, Peixuan; Xu, Zhiting
2008-09-01
We consider a global reaction-diffusion population model with infinite distributed delay which includes models of Nicholson's blowflies and hematopoiesis derived by Gurney, Mackey and Glass, respectively. The existence of monotone wavefronts is derived by using the abstract settings of functional differential equations and Schauder fixed point theory.
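As a hedged illustration of the class of equations in question (the kernel and birth functions below are generic assumptions, not necessarily the authors' exact formulation), a reaction-diffusion population model with infinite distributed delay can be written as:

```latex
\frac{\partial u}{\partial t}(x,t) = d\,\Delta u(x,t) - \delta\, u(x,t)
  + \int_{0}^{\infty} f(s)\, b\bigl(u(x,\,t-s)\bigr)\, ds,
\qquad \int_{0}^{\infty} f(s)\, ds = 1 .
```

Here the birth function $b(u) = p\,u\,e^{-a u}$ recovers the diffusive Nicholson's blowflies equation, while $b(u) = p\,u/(1+u^{n})$ corresponds to the Mackey-Glass hematopoiesis model. A monotone wavefront is a solution of the form $u(x,t) = \phi(x + ct)$ with $\phi$ nondecreasing and connecting the zero equilibrium to the positive one.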
Carcinogenicity and Mutagenicity Data: New Initiatives to ...
Current models for prediction of chemical carcinogenicity and mutagenicity rely upon a relatively small number of publicly available data resources, where the data being modeled are highly summarized and aggregated representations of the actual experimental results. A number of new initiatives are underway to improve access to existing public carcinogenicity and mutagenicity data for use in modeling, as well as to encourage new approaches to the use of data in modeling. Rodent bioassay results from the NIEHS National Toxicology Program (NTP) and the Berkeley Carcinogenic Potency Database (CPDB) have provided the largest public data resources for building carcinogenicity prediction models to date. However, relatively few and limited representations of these data have actually informed existing models. Initiatives, such as EPA's DSSTox Database Network, offer elaborated and quality-reviewed presentations of the CPDB and expanded data linkages and coverage of chemical space for carcinogenicity and mutagenicity. In particular, the latest published DSSTox CPDBAS structure-data file includes a number of species-specific and summary activity fields, including a species-specific normalized score for carcinogenic potency (TD50) and various weighted summary activities. These data are being incorporated into PubChem to provide broad
Urban topography for flood modeling by fusion of OpenStreetMap, SRTM and local knowledge
NASA Astrophysics Data System (ADS)
Winsemius, Hessel; Donchyts, Gennadii; Eilander, Dirk; Chen, Jorik; Leskens, Anne; Coughlan, Erin; Mawanda, Shaban; Ward, Philip; Diaz Loaiza, Andres; Luo, Tianyi; Iceland, Charles
2016-04-01
Topography data are essential for understanding and modeling urban flood hazard. Within urban areas, much of the topography is defined by highly localized man-made features such as roads, channels, ditches, culverts and buildings. As a result, urban flood models require high-resolution topography in which water-conveying connections are represented. In recent years, more and more topography information has been collected through LIDAR surveys; however, there are still many cities in the world where high-resolution topography data are not available. Furthermore, information on connectivity is required for flood modelling even when LIDAR data are used. In this contribution, we demonstrate how high-resolution terrain data can be synthesized through a fusion of features in OpenStreetMap (OSM) data (including roads, culverts, channels and buildings) with existing low-resolution and noisy SRTM elevation data on the Google Earth Engine platform. Our method uses typical existing OSM properties to estimate heights and topology associated with the features, and uses these to correct noise and burn features on top of the existing low-resolution SRTM elevation data. The method has been set up in the Google Earth Engine platform so that local stakeholders and mapping teams can, on the fly, propose, include and visualize the effect of additional features and properties of features which are deemed important for topography and water conveyance. These features can be included in a workshop environment. We pilot our tool over Dar Es Salaam.
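The noise-correction and feature-burning step described above can be caricatured on a plain raster; this is a minimal sketch under stated assumptions (the function name, the 3x3 mean filter, and the offset convention are hypothetical — the actual workflow runs on Google Earth Engine, not NumPy):

```python
import numpy as np

def burn_features(dem, mask, offsets, smooth=True):
    """Damp noise in a low-resolution DEM, then impose feature elevations.

    dem     : 2-D elevation array (SRTM-like, noisy).
    mask    : boolean array marking feature cells (e.g., a channel).
    offsets : elevation offset(s) to burn into the masked cells.
    """
    out = dem.astype(float).copy()
    if smooth:
        # crude 3x3 mean filter to damp sensor noise (illustrative only)
        p = np.pad(out, 1, mode="edge")
        out = sum(p[i:i + out.shape[0], j:j + out.shape[1]]
                  for i in range(3) for j in range(3)) / 9.0
    out[mask] = out[mask] + offsets   # burn features on top of the surface
    return out
```

For example, burning a 2 m deep drainage channel means passing `offsets=-2.0` with the channel cells set in `mask`.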
Operating a terrestrial Internet router onboard and alongside a small satellite
NASA Astrophysics Data System (ADS)
Wood, L.; da Silva Curiel, A.; Ivancic, W.; Hodgson, D.; Shell, D.; Jackson, C.; Stewart, D.
2006-07-01
After twenty months of flying, testing and demonstrating a Cisco mobile access router, originally designed for terrestrial use, onboard the low-Earth-orbiting UK-DMC satellite as part of a larger merged ground/space IP-based internetwork, we use our experience to examine the benefits and drawbacks of integration and standards reuse for small satellite missions. Benefits include ease of operation and the ability to leverage existing systems and infrastructure designed for general use with a large set of latent capabilities to draw on when needed, as well as the familiarity that comes from reuse of existing, known, and well-understood security and operational models. Drawbacks include cases where integration work was needed to bridge the gaps in assumptions between different systems, and where performance considerations outweighed the benefits of reuse of pre-existing file transfer protocols. We find similarities with the terrestrial IP networks whose technologies have been taken to small satellites—and also some significant differences between the two in operational models and assumptions that must be borne in mind.
NASA geometry data exchange specification for computational fluid dynamics (NASA IGES)
NASA Technical Reports Server (NTRS)
Blake, Matthew W.; Kerr, Patricia A.; Thorp, Scott A.; Jou, Jin J.
1994-01-01
This document specifies a subset of an existing product data exchange specification that is widely used in industry and government. The existing document is called the Initial Graphics Exchange Specification. This document, a subset of IGES, is intended for engineers analyzing product performance using tools such as computational fluid dynamics (CFD) software. This document specifies how to define mathematically and exchange the geometric model of an object. The geometry is represented utilizing nonuniform rational B-splines (NURBS) curves and surfaces. Only surface models are represented; no solid model representation is included. This specification does not include most of the other types of product information available in IGES (e.g., no material properties or surface finish properties) and does not provide all the specific file format details of IGES. The data exchange protocol specified in this document is fully conforming to the American National Standard (ANSI) IGES 5.2.
Explore or Exploit? A Generic Model and an Exactly Solvable Case
NASA Astrophysics Data System (ADS)
Gueudré, Thomas; Dobrinevski, Alexander; Bouchaud, Jean-Philippe
2014-02-01
Finding a good compromise between the exploitation of known resources and the exploration of unknown, but potentially more profitable choices, is a general problem, which arises in many different scientific disciplines. We propose a stylized model for these exploration-exploitation situations, including population or economic growth, portfolio optimization, evolutionary dynamics, or the problem of optimal pinning of vortices or dislocations in disordered materials. We find the exact growth rate of this model for treelike geometries and prove the existence of an optimal migration rate in this case. Numerical simulations in the one-dimensional case confirm the generic existence of an optimum.
Probabilistic wind/tornado/missile analyses for hazard and fragility evaluations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Park, Y.J.; Reich, M.
Detailed analysis procedures and examples are presented for the probabilistic evaluation of hazard and fragility against high wind, tornado, and tornado-generated missiles. In the tornado hazard analysis, existing risk models are modified to incorporate various uncertainties including modeling errors. A significant feature of this paper is the detailed description of the Monte-Carlo simulation analyses of tornado-generated missiles. A simulation procedure, which includes the wind field modeling, missile injection, solution of flight equations, and missile impact analysis, is described with application examples.
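The simulation chain named above (wind-field sampling, missile injection, solution of the flight equations, impact analysis) can be sketched in miniature. Every distribution, coefficient, and the target geometry below is an illustrative assumption, not a value from the paper:

```python
import numpy as np

def missile_impact_fraction(n=2000, target_x=(80.0, 120.0), seed=1):
    """Schematic Monte Carlo over tornado-generated missiles:
    sample a wind speed, inject a missile aloft, integrate its flight
    with quadratic drag, and count impacts on a ground target strip."""
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(n):
        wind = rng.lognormal(mean=4.0, sigma=0.3)  # wind speed [m/s] (assumed)
        x = 0.0
        z = rng.uniform(5.0, 20.0)                 # injection height [m] (assumed)
        vx, vz = 0.8 * wind, 0.0                   # injected moving with the wind
        dt, k, g = 0.01, 0.01, 9.81                # step [s], drag/mass, gravity
        while z > 0.0:
            rel = wind - vx                        # wind-relative horizontal speed
            vx += dt * k * rel * abs(rel)          # quadratic drag toward wind speed
            vz -= dt * g                           # gravity; vertical drag ignored
            x += dt * vx
            z += dt * vz
        if target_x[0] <= x <= target_x[1]:
            hits += 1
    return hits / n
```

A fragility evaluation would replace the target strip with a structural response model at the impact point; the seed keeps runs reproducible.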
Atmospheric prediction model survey
NASA Technical Reports Server (NTRS)
Wellck, R. E.
1976-01-01
As part of the SEASAT satellite program of NASA, a survey of representative primitive equation atmospheric prediction models that exist in the world today was written for the Jet Propulsion Laboratory. Seventeen models developed by eleven different operational and research centers throughout the world are included in the survey. The survey is tutorial in nature, describing the features of the various models in a systematic manner.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-01
... cracks resulted in modifications of the airplane Airworthiness Limitation Items (ALI), to include new inspection tasks or modification of existing ones and their respective ...
Thai student existing understanding about the solar system model and the motion of the stars
NASA Astrophysics Data System (ADS)
Anantasook, Sakanan; Yuenyong, Chokchai
2018-01-01
The paper examined Thai students' existing understanding of the solar system model and the motion of the stars. The participants included 141 Grade 9 students in four different schools of Surin province, Thailand. The methodology followed an interpretive paradigm. The tools of interpretation included the Student Celestial Motion Conception Questionnaire (SCMCQ) and informal interviews. Responses to the SCMCQ were read through and categorized according to students' understandings, and students were then probed further in informal interviews. Students' understandings in each category were counted and percentages computed. Finally, students' understandings across the four schools were compared and contrasted using the percentage of student responses in each category. The findings revealed that most students understood the Sun-Moon-Earth (SME) system and the solar system model well; they could use scientific explanations to explain the celestial objects in the solar system and how they orbit. Unfortunately, most students (more than 70%) did not know about Polaris, the North Star, and 90.1% of them did not know about the ecliptic, and probably also the 12 zodiac constellations. These existing understandings suggest some ideas for teaching and learning about the solar system model and the motion of the stars. The paper then discusses some learning activities to help students further construct meaning about the solar system model and the motion of the stars.
Dynamics and control of the ERK signaling pathway: Sensitivity, bistability, and oscillations.
Arkun, Yaman; Yasemi, Mohammadreza
2018-01-01
Cell signaling is the process by which extracellular information is transmitted into the cell to perform useful biological functions. The ERK (extracellular-signal-regulated kinase) signaling controls several cellular processes such as cell growth, proliferation, differentiation and apoptosis. The ERK signaling pathway considered in this work starts with an extracellular stimulus and ends with activated (double phosphorylated) ERK which gets translocated into the nucleus. We model and analyze this complex pathway by decomposing it into three functional subsystems. The first subsystem spans the initial part of the pathway from the extracellular growth factor to the formation of the SOS complex, ShC-Grb2-SOS. The second subsystem includes the activation of Ras which is mediated by the SOS complex. This is followed by the MAPK subsystem (or the Raf-MEK-ERK pathway) which produces the double phosphorylated ERK upon being activated by Ras. Although separate models exist in the literature at the subsystems level, a comprehensive model for the complete system including the important regulatory feedback loops is missing. Our dynamic model combines the existing subsystem models and studies their steady-state and dynamic interactions under feedback. We establish conditions under which bistability and oscillations exist for this important pathway. In particular, we show how the negative and positive feedback loops affect the dynamic characteristics that determine the cellular outcome.
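As a toy stand-in for the tiered structure described above (not the authors' model — the rate constants are arbitrary and the feedback loops are omitted), a three-tier activation chain such as Raf -> MEK -> ERK can be simulated by forward Euler integration:

```python
def cascade_response(stimulus, t_end=100.0, dt=0.01):
    """Minimal three-tier kinase cascade: the stimulus activates the top
    tier, each active tier activates the one below, and every tier
    deactivates at a constant rate. States are active fractions in [0, 1].
    All rate constants are illustrative assumptions."""
    raf = mek = erk = 0.0
    for _ in range(int(t_end / dt)):
        raf += dt * (stimulus * (1.0 - raf) - 0.5 * raf)
        mek += dt * (2.0 * raf * (1.0 - mek) - 0.5 * mek)
        erk += dt * (2.0 * mek * (1.0 - erk) - 0.5 * erk)
    return erk  # steady-state active ERK fraction
```

Even this stripped-down chain shows the signature dose-response behavior: a stronger extracellular stimulus yields a higher steady-state level of active ERK. Bistability and oscillations require the feedback loops that the full model adds on top.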
Signal Partitioning Algorithm for Highly Efficient Gaussian Mixture Modeling in Mass Spectrometry
Polanski, Andrzej; Marczyk, Michal; Pietrowska, Monika; Widlak, Piotr; Polanska, Joanna
2015-01-01
Mixture modeling of mass spectra is an approach with many potential applications, including peak detection and quantification, smoothing, de-noising, feature extraction, and spectral signal compression. However, existing algorithms do not allow for automated analyses of whole spectra. Therefore, despite highlighting potential advantages of mixture modeling of mass spectra of peptide/protein mixtures and some preliminary results presented in several papers, the mixture modeling approach has so far not been developed to a stage enabling systematic comparisons with existing software packages for proteomic mass spectra analyses. In this paper we present an efficient algorithm for Gaussian mixture modeling of proteomic mass spectra of different types (e.g., MALDI-ToF profiling, MALDI-IMS). The main idea is automated partitioning of the protein mass spectral signal into fragments. The obtained fragments are separately decomposed into Gaussian mixture models. The parameters of the mixture models of fragments are then aggregated to form the mixture model of the whole spectrum. We compare the elaborated algorithm to existing algorithms for peak detection and demonstrate improvements in peak detection efficiency obtained by using Gaussian mixture modeling. We also show applications of the elaborated algorithm to real proteomic datasets of low and high resolution. PMID:26230717
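The per-fragment decomposition step can be sketched as an intensity-weighted EM fit. This is a minimal sketch, not the published algorithm: initialization (quantiles of the intensity mass), the fixed iteration count, and the variance floor are all simplifying assumptions:

```python
import numpy as np

def fit_weighted_gmm(mz, intensity, k, iters=300):
    """Fit a k-component 1-D Gaussian mixture to one (m/z, intensity)
    spectral fragment by EM, weighting each m/z bin by its intensity."""
    w = intensity / intensity.sum()
    cdf = np.cumsum(w)
    # initialize means at quantiles of the intensity distribution
    mu = np.interp((np.arange(k) + 0.5) / k, cdf, mz)
    sig = np.full(k, (mz.max() - mz.min()) / (4.0 * k))
    pi = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: responsibility of each component for each m/z bin
        d = (mz[:, None] - mu[None, :]) / sig[None, :]
        p = pi[None, :] * np.exp(-0.5 * d * d) / sig[None, :]
        r = p / p.sum(axis=1, keepdims=True)
        # M-step: intensity-weighted parameter updates
        nk = (w[:, None] * r).sum(axis=0)
        mu = (w[:, None] * r * mz[:, None]).sum(axis=0) / nk
        var = (w[:, None] * r * (mz[:, None] - mu[None, :]) ** 2).sum(axis=0) / nk
        sig = np.sqrt(np.maximum(var, 1e-12))   # floor avoids collapse
        pi = nk
    return mu, sig, pi
```

In the full pipeline, the spectrum would first be partitioned at low-intensity valleys, each fragment fitted as above, and the fragment mixtures concatenated (with weights rescaled by fragment mass) into one whole-spectrum model.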
Crisis Management: Research Summaries
ERIC Educational Resources Information Center
Brock, Stephen E., Ed.; Dorman, Sally; Anderson, Luke; McNair, Daniel
2013-01-01
This article presents summaries of three studies relevant to school crisis response. The first report, "A Framework for International Crisis Intervention" (Sally Dorman), is a review of how existing crisis intervention models (including the NASP PREPaRE model) have been adapted for international use. The second article, "Responding…
Estimating winter wheat phenological parameters: Implications for crop modeling
USDA-ARS?s Scientific Manuscript database
Crop parameters, such as the timing of developmental events, are critical for accurate simulation results in crop simulation models, yet uncertainty often exists in determining the parameters. Factors contributing to the uncertainty include: a) sources of variation within a plant (i.e., within diffe...
Numerical Modelling of Extended Leak-Off Test with a Pre-Existing Fracture
NASA Astrophysics Data System (ADS)
Lavrov, A.; Larsen, I.; Bauer, A.
2016-04-01
Extended leak-off test (XLOT) is one of the few techniques available for stress measurements in oil and gas wells. Interpretation of the test is often difficult since the results depend on a multitude of factors, including the presence of natural or drilling-induced fractures in the near-well area. Coupled numerical modelling of XLOT has been performed to investigate the pressure behaviour during the flowback phase as well as the effect of a pre-existing fracture on the test results in a low-permeability formation. Essential features of XLOT known from field measurements are captured by the model, including the saw-tooth shape of the pressure vs injected volume curve, and the change of slope in the pressure vs time curve during flowback used by operators as an indicator of the bottomhole pressure reaching the minimum in situ stress. Simulations with a pre-existing fracture running from the borehole wall in the radial direction have revealed that the results of XLOT are quite sensitive to the orientation of the pre-existing fracture. In particular, the fracture initiation pressure and the formation breakdown pressure increase steadily with decreasing angle between the fracture and the minimum in situ stress. Our findings seem to invalidate the use of the fracture initiation pressure and the formation breakdown pressure for stress measurements or rock strength evaluation purposes.
Dietterich, Hannah; Lev, Einat; Chen, Jiangzhi; Richardson, Jacob A.; Cashman, Katharine V.
2017-01-01
Numerical simulations of lava flow emplacement are valuable for assessing lava flow hazards, forecasting active flows, designing flow mitigation measures, interpreting past eruptions, and understanding the controls on lava flow behavior. Existing lava flow models vary in simplifying assumptions, physics, dimensionality, and the degree to which they have been validated against analytical solutions, experiments, and natural observations. In order to assess existing models and guide the development of new codes, we conduct a benchmarking study of computational fluid dynamics (CFD) models for lava flow emplacement, including VolcFlow, OpenFOAM, FLOW-3D, COMSOL, and MOLASSES. We model viscous, cooling, and solidifying flows over horizontal planes, sloping surfaces, and into topographic obstacles. We compare model results to physical observations made during well-controlled analogue and molten basalt experiments, and to analytical theory when available. Overall, the models accurately simulate viscous flow with some variability in flow thickness where flows intersect obstacles. OpenFOAM, COMSOL, and FLOW-3D can each reproduce experimental measurements of cooling viscous flows, and OpenFOAM and FLOW-3D simulations with temperature-dependent rheology match results from molten basalt experiments. We assess the goodness-of-fit of the simulation results and the computational cost. Our results guide the selection of numerical simulation codes for different applications, including inferring emplacement conditions of past lava flows, modeling the temporal evolution of ongoing flows during eruption, and probabilistic assessment of lava flow hazard prior to eruption. Finally, we outline potential experiments and desired key observational data from future flows that would extend existing benchmarking data sets.
Development of Accommodation Models for Soldiers in Vehicles: Squad
2014-09-01
Data from a previous study of Soldier posture and position were analyzed to develop statistical ... range of seat height and seat back angle. All of the models include the effects of body armor and body-borne gear. Subject terms: Anthropometry.
CD-ROM: Potential and Pitfalls.
ERIC Educational Resources Information Center
Dreiss, L. Jack; Bashir, Shahzad
1990-01-01
Examines issues surrounding CD-ROM as an organizational information management tool: (1) the CD-ROM market; (2) pitfalls, including compatibility, effect on existing information systems, fear of obsolescence, protection of sensitive information, and lack of successful role models; and (3) factors that will fuel growth, including greater…
Fast Component Pursuit for Large-Scale Inverse Covariance Estimation.
Han, Lei; Zhang, Yu; Zhang, Tong
2016-08-01
The maximum likelihood estimation (MLE) for the Gaussian graphical model, which is also known as the inverse covariance estimation problem, has gained increasing interest recently. Most existing works assume that inverse covariance estimators contain sparse structure and then construct models with ℓ1 regularization. In this paper, different from existing works, we study the inverse covariance estimation problem from another perspective by efficiently modeling the low-rank structure in the inverse covariance, which is assumed to be a combination of a low-rank part and a diagonal matrix. One motivation for this assumption is that the low-rank structure is common in many applications, including climate and financial analysis; another is that such an assumption can reduce the computational complexity of computing the inverse. Specifically, we propose an efficient COmponent Pursuit (COP) method to obtain the low-rank part, where each component can be sparse. For optimization, the COP method greedily learns a rank-one component in each iteration by maximizing the log-likelihood. Moreover, the COP algorithm enjoys several appealing properties, including the existence of an efficient solution in each iteration and a theoretical guarantee on the convergence of this greedy approach. Experiments on large-scale synthetic and real-world datasets with thousands of millions of variables show that the COP method is faster than the state-of-the-art techniques for the inverse covariance estimation problem while achieving comparable log-likelihood on test data.
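The computational payoff of the low-rank-plus-diagonal assumption comes from the Woodbury identity; the sketch below (not the COP algorithm itself, just the inversion step it exploits) inverts Ω = diag(d) + VVᵀ by solving only an r×r system instead of factorizing the full p×p matrix:

```python
import numpy as np

def lowrank_diag_inverse(d, V):
    """Invert Omega = diag(d) + V @ V.T via the Woodbury identity:
    Omega^{-1} = D^{-1} - D^{-1} V (I_r + V^T D^{-1} V)^{-1} V^T D^{-1}.
    Only an r x r linear system is solved (r = V.shape[1]), so for
    rank r << p the heavy algebra scales with r, not p."""
    Dinv = 1.0 / d                               # diagonal inverse, O(p)
    DV = Dinv[:, None] * V                       # D^{-1} V, O(p r)
    core = np.eye(V.shape[1]) + V.T @ DV         # I_r + V^T D^{-1} V
    return np.diag(Dinv) - DV @ np.linalg.solve(core, DV.T)
```

(The final dense p×p output is formed here only for illustration; in practice one keeps the factored form and applies it to vectors.)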
User's Guide To CHEAP0 II-Economic Analysis of Stand Prognosis Model Outputs
Joseph E. Horn; E. Lee Medema; Ervin G. Schuster
1986-01-01
CHEAP0 II provides supplemental economic analysis capability for users of version 5.1 of the Stand Prognosis Model, including recent regeneration and insect outbreak extensions. Although patterned after the old CHEAP0 model, CHEAP0 II has more features and analytic capabilities, especially for analysis of existing and uneven-aged stands....
Management of California Oak Woodlands: Uncertainties and Modeling
Jay E. Noel; Richard P. Thompson
1995-01-01
A mathematical policy model of oak woodlands is presented. The model illustrates the policy uncertainties that exist in the management of oak woodlands. These uncertainties include: (1) selection of a policy criterion function, (2) woodland dynamics, (3) initial and final state of the woodland stock. The paper provides a review of each of the uncertainty issues. The...
Automotive Maintenance Data Base for Model Years 1976-1979. Part I
DOT National Transportation Integrated Search
1980-12-01
An update of the existing data base was developed to include life cycle maintenance costs of representative vehicles for the model years 1976-1979. Repair costs as a function of time are also developed for a passenger car in each of the compact, subc...
Banking Structure and Monetary Policy: New Wine in Old Bottles.
ERIC Educational Resources Information Center
Hacche, John
1989-01-01
Provides an extension of the basic banking model used in introductory economics courses. This expanded model introduces the concept of banking capital and reserves, and includes the relationship existing between current issues and banking structure and money supply growth. Provides worksheet exercises and answers. (LS)
Tissue and Animal Models of Sudden Cardiac Death
Sallam, Karim; Li, Yingxin; Sager, Philip T.; Houser, Steven R.; Wu, Joseph C.
2015-01-01
Sudden cardiac death (SCD) is a common cause of death in patients with structural heart disease, genetic mutations, or acquired disorders affecting cardiac ion channels. A wide range of platforms exist to model and study disorders associated with SCD. Human clinical studies are cumbersome and are constrained by the extent of investigation that can be performed on human subjects. Animal models are limited by their degree of homology to human cardiac electrophysiology, including ion channel expression. The most commonly used cellular models are cellular transfection models, which are able to mimic the expression of a single ion channel, offering incomplete insight into changes of the action potential profile. Induced pluripotent stem cell-derived cardiomyocytes (iPSC-CMs) resemble, but are not identical to, adult human cardiomyocytes, and provide a new platform for studying arrhythmic disorders leading to SCD. A variety of platforms exist to phenotype cellular models, including conventional and automated patch clamp, multi-electrode array, and computational modeling. iPSC-CMs have been used to study long QT syndrome, catecholaminergic polymorphic ventricular tachycardia, hypertrophic cardiomyopathy, and other hereditary cardiac disorders. Although iPSC-CMs are distinct from adult cardiomyocytes, they provide a robust platform to advance the science and clinical care of SCD. PMID:26044252
A Solution to the Cosmic Conundrum including Cosmological Constant and Dark Energy Problems
NASA Astrophysics Data System (ADS)
Singh, A.
2009-12-01
A comprehensive solution to the cosmic conundrum is presented that also resolves key paradoxes of quantum mechanics and relativity. A simple mathematical model, the Gravity Nullification model (GNM), is proposed that integrates the missing physics of the spontaneous relativistic conversion of mass to energy into the existing physics theories, specifically a simplified general theory of relativity. Mechanistic mathematical expressions are derived for a relativistic universe expansion, which predict both the observed linear Hubble expansion in the nearby universe and the accelerating expansion exhibited by the supernova observations. The integrated model addresses the key questions haunting physics and Big Bang cosmology. It also provides a fresh perspective on the misconceived birth and evolution of the universe, especially the creation and dissolution of matter. The proposed model eliminates singularities from existing models and the need for the incredible and unverifiable assumptions including the superluminous inflation scenario, multiple universes, multiple dimensions, Anthropic principle, and quantum gravity. GNM predicts the observed features of the universe without any explicit consideration of time as a governing parameter.
NASA Technical Reports Server (NTRS)
MacLeod, Todd C.; Ho, Fat Duen
2006-01-01
All present ferroelectric transistors have been made on the micrometer scale. Existing models of these devices do not take into account effects of nanoscale ferroelectric transistors. Understanding the characteristics of these nanoscale devices is important in developing a strategy for building and using future devices. This paper takes an existing microscale ferroelectric field effect transistor (FFET) model and adds effects that become important at the nanoscale level, including electron velocity saturation and direct tunneling. The new model analyzed FFETs ranging in length from 40,000 nanometers to 4 nanometers and ferroelectric thickness from 200 nanometers to 1 nanometer. The results show that FFETs can operate on the nanoscale but have some undesirable characteristics at very small dimensions.
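The velocity-saturation effect mentioned above can be illustrated with a minimal sketch (this is not the authors' FFET model; the mobility and saturation-velocity values below are assumed purely for illustration):

```python
# Illustrative sketch (not the authors' actual FFET model): the standard
# two-region velocity-saturation expression v = mu*E / (1 + mu*E/v_sat),
# which limits carrier velocity as the lateral field E grows at short
# channel lengths.

MU = 0.05      # carrier mobility, m^2/(V*s)  (assumed value)
V_SAT = 1e5    # saturation velocity, m/s     (assumed value)

def carrier_velocity(e_field, mu=MU, v_sat=V_SAT):
    """Field-dependent carrier velocity with saturation."""
    return mu * e_field / (1.0 + mu * e_field / v_sat)

# At low field the response is ohmic (v ~ mu*E); at high field it
# approaches v_sat, which is one reason nanoscale devices deviate from
# long-channel models.
low = carrier_velocity(1e3)     # ohmic regime, close to mu*E
high = carrier_velocity(1e12)   # saturated regime, close to v_sat
```

At a fixed drain voltage, shrinking the channel raises the lateral field, so short devices spend more of their operating range in the saturated regime.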
Improving the accuracy of energy baseline models for commercial buildings with occupancy data
Liang, Xin; Hong, Tianzhen; Shen, Geoffrey Qiping
2016-07-07
More than 80% of energy is consumed during the operation phase of a building's life cycle, so energy efficiency retrofit for existing buildings is considered a promising way to reduce energy use in buildings. The investment strategies of retrofit depend on the ability to quantify energy savings by “measurement and verification” (M&V), which compares actual energy consumption to how much energy would have been used without retrofit (called the “baseline” of energy use). Although numerous models exist for predicting the baseline of energy use, a critical limitation is that occupancy has not been included as a variable. However, occupancy rate is essential for energy consumption and was emphasized by previous studies. This study develops a new baseline model which is built upon the Lawrence Berkeley National Laboratory (LBNL) model but includes the use of building occupancy data. The study also proposes metrics to quantify the accuracy of prediction and the impacts of variables. However, the results show that including occupancy data does not significantly improve the accuracy of the baseline model, especially for HVAC load. The reasons are discussed further. In addition, sensitivity analysis is conducted to show the influence of parameters in baseline models. To conclude, the results from this study can help us understand the influence of occupancy on energy use, improve energy baseline prediction by including the occupancy factor, reduce risks of M&V, and facilitate investment strategies of energy efficiency retrofit.
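The idea of testing whether an occupancy regressor improves a baseline can be sketched as follows (a hypothetical illustration on synthetic data, not the LBNL model or the study's actual metrics):

```python
# Hypothetical sketch of the idea behind the study (not the LBNL model
# itself): fit an energy baseline with and without an occupancy regressor
# and compare in-sample error. All data below are synthetic.
import numpy as np

rng = np.random.default_rng(0)
n = 200
temp = rng.uniform(10, 35, n)          # outdoor temperature, deg C
occ = rng.uniform(0.2, 1.0, n)         # occupancy rate, fraction
energy = 50 + 3.0 * temp + 20.0 * occ + rng.normal(0, 2, n)  # kWh

def fit_rmse(X, y):
    """Ordinary least squares fit; return root-mean-square residual."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    return float(np.sqrt(np.mean((y - X1 @ beta) ** 2)))

rmse_no_occ = fit_rmse(temp[:, None], energy)
rmse_occ = fit_rmse(np.column_stack([temp, occ]), energy)
# With occupancy driving the synthetic data, including it lowers the error.
```

In the study, by contrast, adding real occupancy data did not significantly improve accuracy, which is exactly the kind of outcome this comparison framework is built to detect.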
L. D. Emberson; W. J. Massman; P. Buker; G. Soja; I. Van De Sand; G. Mills; C. Jacobs
2006-01-01
Currently, stomatal O3 flux and flux-response models exist only for wheat and potato (LRTAP Convention, 2004); as such, there is a need to extend these models to include additional crop types. The possibility of establishing robust stomatal flux models for five agricultural crops (tomato, grapevine, sugar beet, maize and sunflower) was investigated. These crops were...
Mathematical analysis of a sharp-diffuse interfaces model for seawater intrusion
NASA Astrophysics Data System (ADS)
Choquet, C.; Diédhiou, M. M.; Rosier, C.
2015-10-01
We consider a new model mixing sharp and diffuse interface approaches for seawater intrusion phenomena in free aquifers. More precisely, a phase field model is introduced in the boundary conditions on the virtual sharp interfaces. We thus include in the model the existence of diffuse transition zones but we preserve the simplified structure allowing front tracking. The three-dimensional problem then reduces to a two-dimensional model involving a strongly coupled system of partial differential equations of parabolic type describing the evolution of the depths of the two free surfaces, that is the interface between salt- and freshwater and the water table. We prove the existence of a weak solution for the model completed with initial and boundary conditions. We also prove that the depths of the two interfaces satisfy a coupled maximum principle.
What controls the low ice number concentration in the upper troposphere?
NASA Astrophysics Data System (ADS)
Zhou, Cheng; Penner, Joyce E.; Lin, Guangxing; Liu, Xiaohong; Wang, Minghuai
2016-10-01
Cirrus clouds in the tropical tropopause play a key role in regulating the moisture entering the stratosphere through their dehydrating effect. Low ice number concentrations (< 200 L-1) and high supersaturations (150-160 %) have been observed in these clouds. Different mechanisms have been proposed to explain these low ice number concentrations, including the inhibition of homogeneous freezing by the deposition of water vapour onto pre-existing ice crystals, heterogeneous ice formation on glassy organic aerosol ice nuclei (IN), and limiting the formation of ice number from high-frequency gravity waves. In this study, we examined the effect from three different representations of updraft velocities, the effect from pre-existing ice crystals, the effect from different water vapour deposition coefficients (α = 0.1 or 1), and the effect of 0.1 % of the total secondary organic aerosol (SOA) particles acting as IN. Model-simulated ice crystal numbers are compared against an aircraft observational dataset. Including the effect from water vapour deposition on pre-existing ice particles can effectively reduce simulated in-cloud ice number concentrations for all model setups. A larger water vapour deposition coefficient (α = 1) can also efficiently reduce ice number concentrations at temperatures below 205 K, but less so at higher temperatures. SOA acting as IN is most effective at reducing ice number concentrations when the effective updraft velocities are moderate (˜ 0.05-0.2 m s-1). However, the effects of including SOA as IN and using α = 1 are diminished when the effect from pre-existing ice is included. When a grid-resolved large-scale updraft velocity (< 0.1 m s-1) is used, the ice nucleation parameterization with homogeneous freezing only or with both homogeneous freezing and heterogeneous nucleation is able to generate low ice number concentrations in good agreement with observations for temperatures below 205 K as long as the pre-existing ice effect is included.
For the moderate updraft velocity (˜ 0.05-0.2 m s-1), simulated ice number concentrations in good agreement with observations at temperatures below 205 K can be achieved if effects from pre-existing ice, a larger water vapour deposition coefficient (α = 1), and SOA IN are all included. Using the sub-grid-scale turbulent kinetic energy (TKE)-based updraft velocity (˜ 0-2 m s-1) always overestimates the ice number concentrations at temperatures below 205 K but compares well with observations at temperatures above 205 K when the pre-existing ice effect is included.
What controls the low ice number concentration in the upper troposphere?
NASA Astrophysics Data System (ADS)
Zhou, C.; Penner, J. E.; Lin, G.; Liu, X.; Wang, M.
2015-12-01
Cirrus clouds in the tropical tropopause play a key role in regulating the moisture entering the stratosphere through their dehydrating effect. Low ice number concentrations (< 200 L-1) and high supersaturations (150-160 %) have been observed in these clouds. Different mechanisms have been proposed to explain these low ice number concentrations, including the inhibition of homogeneous freezing by the deposition of water vapour onto pre-existing ice crystals, heterogeneous ice formation on glassy organic aerosol ice nuclei (IN), and limiting the formation of ice number from high frequency gravity waves. In this study, we examined the effect from three different representations of updraft velocities, the effect from pre-existing ice crystals, the effect from different water vapour deposition coefficients (α = 0.1 or 1), and the effect of 0.1 % of the total secondary organic aerosol (SOA) particles acting as IN. Model simulated ice crystal numbers are compared against an aircraft observational dataset. Including the effect from water vapour deposition on pre-existing ice particles can effectively reduce simulated in-cloud ice number concentrations for all model set-ups. A larger water vapour deposition coefficient (α = 1) can also efficiently reduce ice number concentrations at temperatures below 205 K but less so at higher temperatures. SOA acting as IN are most effective at reducing ice number concentrations when the effective updraft velocities are moderate (∼ 0.05-0.2 m s-1). However, the effects of including SOA as IN and using (α = 1) are diminished when the effect from pre-existing ice is included. When a grid resolved large-scale updraft velocity (< 0.1 m s-1) is used, the ice nucleation parameterization with homogeneous freezing only or with both homogeneous freezing and heterogeneous nucleation is able to generate low ice number concentrations in good agreement with observations for temperatures below 205 K as long as the pre-existing ice effect is included. 
For the moderate updraft velocity (∼ 0.05-0.2 m s-1) simulated ice number concentrations in good agreement with observations at temperatures below 205 K can be achieved if effects from pre-existing ice, a larger water vapour deposition coefficient (α = 1) and SOA IN are all included. Using the sub-grid scale turbulent kinetic energy based updraft velocity (∼ 0-2 m s-1) always overestimates the ice number concentrations at temperatures below 205 K but compares well with observations at temperatures above 205 K when the pre-existing ice effect is included.
NASA Astrophysics Data System (ADS)
Battistella, C.; Robinson, D.; McQuarrie, N.; Ghoshal, S.
2017-12-01
Multiple valid balanced cross sections can be produced from mapped surface and subsurface data. By integrating low-temperature thermochronologic data, we are better able to predict subsurface geometries. Existing valid balanced cross sections for far western Nepal are few (Robinson et al., 2006) and do not incorporate thermochronologic data because the data did not exist. Data published since then along the Simikot cross section along the Karnali River include muscovite Ar, zircon U-Th/He, and apatite fission track. We present new mapping and a new valid balanced cross section that takes into account the new field data as well as the limitations that thermochronologic data place on the kinematics of the cross section. Additional constraints include new geomorphology data acquired since 2006 that indicate areas of increased vertical uplift, which in turn indicate locations of buried ramps in the Main Himalayan thrust and guide the locations of Lesser Himalayan ramps in the balanced cross section. Future work will include flexural modeling, new low-temperature thermochronometric data, and 2-D thermokinematic models from sequentially forward modeled balanced cross sections in far western Nepal.
Theory, Guidance, and Flight Control for High Maneuverability Projectiles
2014-01-01
Table-of-contents fragments only (abstract not recovered from report documentation boilerplate): 2.8 Linear System Modeling with Time Delay; 2.9 Linear System Modeling Without Time Delay; 3. Guidance and Flight Control; 3.1 Proportional Navigation Guidance Law.
Automotive Maintenance Data Base for Model Years 1976-1979. Part II : Appendix E and F
DOT National Transportation Integrated Search
1980-12-01
An update of the existing data base was developed to include life cycle maintenance costs of representative vehicles for the model years 1976-1979. Repair costs as a function of time are also developed for a passenger car in each of the compact, subc...
Teacher Learning in the Digital Age: Online Professional Development in STEM Education
ERIC Educational Resources Information Center
Dede, Chris, Ed.; Eisenkraft, Arthur, Ed.; Frumin, Kim, Ed.; Hartley, Alex, Ed.
2016-01-01
With an emphasis on science, technology, engineering, and mathematics (STEM) training, "Teacher Learning in the Digital Age" examines exemplary models of online and blended teacher professional development, including information on the structure and design of each model, intended audience, and existing research and evaluation data. From…
Examining, Documenting, and Modeling the Problem Space of a Variable Domain
2002-06-14
Methods informing the development of this proposed process include: Feature-Oriented Domain Analysis (FODA) [3,4], Organization Domain Modeling (ODM) [2,5,6], and Family-Oriented...configuration knowledge using generators [2]. Existing Methods of Domain Engineering: FODA is a domain
Designing Corporate Databases to Support Technology Innovation
ERIC Educational Resources Information Center
Gultz, Michael Jarett
2012-01-01
Based on a review of the existing literature on database design, this study proposed a unified database model to support corporate technology innovation. This study assessed potential support for the model based on the opinions of 200 technology industry executives, including Chief Information Officers, Chief Knowledge Officers and Chief Learning…
Rethinking Validation in Complex High-Stakes Assessment Contexts
ERIC Educational Resources Information Center
Koch, Martha J.; DeLuca, Christopher
2012-01-01
In this article we rethink validation within the complex contexts of high-stakes assessment. We begin by considering the utility of existing models for validation and argue that these models tend to overlook some of the complexities inherent to assessment use, including the multiple interpretations of assessment purposes and the potential…
Indigenous Models of Therapy in Traditional Asian Societies.
ERIC Educational Resources Information Center
Das, Ajit K.
1987-01-01
Presents an overview of some indigenous ways of understanding and dealing with psychological disorders in the traditional societies of Asia. Indigenous approaches to healing and psychotherapy existing in India, China, and Japan are included. Models of healing in these three societies are classified as folk traditions, mystical traditions, and…
CometQuest: A Rosetta Adventure
NASA Technical Reports Server (NTRS)
Leon, Nancy J.; Fisher, Diane K.; Novati, Alexander; Chmielewski, Artur B.; Fitzpatrick, Austin J.; Angrum, Andrea
2012-01-01
NASA Technical Reports Server (NTRS)
Plesea, Lucian
2012-01-01
This software is a higher-performance implementation of tiled WMS, with integral support for KML and time-varying data. This software is compliant with the Open Geospatial WMS standard, and supports KML natively as a WMS return type, including support for the time attribute. Regionated KML wrappers are generated that match the existing tiled WMS dataset. Ping and JPG formats are supported, and the software is implemented as an Apache 2.0 module that supports a threading execution model that is capable of supporting very high request rates. The module intercepts and responds to WMS requests that match certain patterns and returns the existing tiles. If a KML format that matches an existing pyramid and tile dataset is requested, regionated KML is generated and returned to the requesting application. In addition, KML requests that do not match the existing tile datasets generate a KML response that includes the corresponding JPG WMS request, effectively adding KML support to a backing WMS server.
Understanding and Improving Modifiable Cardiovascular Risks within the Air Force
2013-10-04
Promotion Model (HPM). Findings: The definition of health included exercise, proper eating, sleep, and a spiritual connection, as well as the absence of...to health behaviors, including what it takes to be healthy, knowing oneself, and existing Air Force policies. The HPM did not fully address all of the...was used to arrange the data into data-driven themes. These themes were then compared to the elements of the Health Promotion Model (HPM
The flow of power law fluids in elastic networks and porous media.
Sochi, Taha
2016-02-01
The flow of power-law fluids, which include shear-thinning and shear-thickening fluids as well as Newtonian fluids as a special case, in networks of interconnected elastic tubes is investigated using a residual-based pore-scale network modeling method with the employment of newly derived formulae. Two relations describing the mechanical interaction between the local pressure and local cross-sectional area in distensible tubes of elastic nature are considered in the derivation of these formulae. The model can be used to describe shear-dependent flows of mainly viscous nature. The behavior of the proposed model is vindicated by several tests in a number of special and limiting cases where the results can be verified quantitatively or qualitatively. The model, which is the first of its kind, incorporates more than one major nonlinearity corresponding to the fluid rheology and conduit mechanical properties, that is, non-Newtonian effects and tube distensibility. The formulation, implementation, and performance indicate that the model enjoys certain advantages over the existing models, such as being exact within the restricting assumptions on which the model is based, easy implementation, low computational costs, reliability, and smooth convergence. The proposed model can, therefore, be used as an alternative to the existing Newtonian distensible models; moreover, it stretches the capabilities of the existing modeling approaches to reach non-Newtonian rheologies.
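For context, the classical closed form for power-law flow through a single rigid circular tube (a standard textbook result, not the paper's network formulae) reduces to Hagen-Poiseuille flow when the behavior index n equals 1:

```python
# Standard result consistent with the paper's setting (though not taken
# from it): volumetric flow of a power-law fluid through a rigid circular
# tube, Q = (pi*n*R^3/(3n+1)) * (dP*R/(2*k*L))**(1/n).
# n < 1: shear thinning; n > 1: shear thickening; n = 1: Newtonian.
import math

def power_law_flow(radius, length, dp, k, n):
    """Flow rate [m^3/s] for consistency index k and behavior index n."""
    return (math.pi * n * radius**3 / (3.0 * n + 1.0)) * \
           (dp * radius / (2.0 * k * length)) ** (1.0 / n)

def poiseuille(radius, length, dp, mu):
    """Newtonian Hagen-Poiseuille flow rate for comparison."""
    return math.pi * radius**4 * dp / (8.0 * mu * length)

# Sanity check: with n = 1 the power-law expression reduces to Poiseuille.
q_pl = power_law_flow(1e-3, 0.1, 100.0, 1e-3, 1.0)
q_np = poiseuille(1e-3, 0.1, 100.0, 1e-3)
```

Verifying such limiting cases is the same style of validation the abstract describes for the network model, where pressure-dependent cross-sectional area adds a second nonlinearity on top of the rheology.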
Model Uncertainty Quantification Methods In Data Assimilation
NASA Astrophysics Data System (ADS)
Pathiraja, S. D.; Marshall, L. A.; Sharma, A.; Moradkhani, H.
2017-12-01
Data Assimilation involves utilising observations to improve model predictions in a seamless and statistically optimal fashion. Its applications are wide-ranging, from improving weather forecasts to tracking targets such as in the Apollo 11 mission. The use of Data Assimilation methods in high dimensional complex geophysical systems is an active area of research, where there exist many opportunities to enhance existing methodologies. One of the central challenges is in model uncertainty quantification; the outcome of any Data Assimilation study is strongly dependent on the uncertainties assigned to both observations and models. I focus on developing improved model uncertainty quantification methods that are applicable to challenging real world scenarios. These include developing methods for cases where the system states are only partially observed, where there is little prior knowledge of the model errors, and where the model error statistics are likely to be highly non-Gaussian.
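As one generic illustration of the kind of assimilation described here, a minimal stochastic ensemble Kalman filter update for a directly observed scalar state can be sketched as follows (this is one common flavor of data assimilation, not the author's methods):

```python
# Minimal stochastic ensemble Kalman filter (EnKF) update: scalar state,
# directly observed. Illustrative only; real geophysical systems are high
# dimensional and the update uses sample covariances between states.
import numpy as np

rng = np.random.default_rng(1)
ens = rng.normal(0.0, 1.0, 50)      # prior ensemble, mean near 0
obs, obs_var = 2.0, 0.25            # observation and its error variance

# Kalman gain from the ensemble variance; perturb the observation once
# per member so the analysis ensemble has the correct spread.
prior_var = ens.var(ddof=1)
gain = prior_var / (prior_var + obs_var)
perturbed = obs + rng.normal(0.0, np.sqrt(obs_var), ens.size)
analysis = ens + gain * (perturbed - ens)
# The analysis mean lies between the prior mean and the observation,
# and the ensemble spread shrinks.
```

The uncertainties assigned to `obs_var` and to the prior ensemble are exactly the quantities the abstract says dominate the outcome, which is why model uncertainty quantification is central.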
“That model is sooooo last millennium!” Residential long term care as a system, not a place
Ziemba, Rosemary; Perry, Tam E.; Takahashi, Beverly; Algase, Donna
2010-01-01
The current quandary with the design of existing long term care (LTC) settings results from a focus on structures (“institutions”) instead of on a system of supports and services that transcends physical and traditional boundaries across settings, including nursing homes, assisted living residences and the home. Supported by analysis of the commonalities, socio-historical and political contexts, core values and fallacies of social and medical models in existing and emerging LTC options, a holistic model is proposed, based on new core values, which facilitates community and family integration and asserts dignity and personhood as universal attributes in an array of settings. PMID:20640176
Warming Up to STS. Activities to Encourage Environmental Awareness.
ERIC Educational Resources Information Center
Rosenthal, Dorothy B.
1990-01-01
Developed is an interdisciplinary unit that deals with global warming and the greenhouse effect. Included are 10 lessons that can be used to supplement existing plans or used as a basis for developing a new unit. Included are modeling, laboratory, graphing, role-playing, and discussion activities. (KR)
Investigation and Development of Data-Driven D-Region Model for HF Systems Impacts
NASA Technical Reports Server (NTRS)
Eccles, J. V.; Rice, D.; Sojka, J. J.; Hunsucker, R. D.
2002-01-01
Space Environment Corporation (SEC) and RP Consultants (RPC) are to develop and validate a weather-capable D-region model for making High Frequency (HF) absorption predictions in support of the HF communications and radar communities. The weather-capable model will assimilate solar and earth space observations from NASA satellites. The model will account for solar-induced impacts on HF absorption, including X-rays, Solar Proton Events (SPEs), and auroral precipitation. The work plan includes: 1. Optimize the D-region model to quickly obtain ion and electron densities for proper HF absorption calculations. 2. Develop indices-driven modules for D-region ionization sources for low, mid, and high latitudes, including X-rays, cosmic rays, auroral precipitation, and solar protons. (Note: solar spectrum and auroral modules already exist.) 3. Set up low-cost monitors of existing HF beacons and add one single-frequency beacon. 4. Use the PENEX HF-link database with HF monitor data to validate the D-region/HF absorption model using climatological ionization drivers. 5. Develop algorithms to assimilate NASA satellite data of solar, interplanetary, and auroral observations into ionization source modules. 6. Use PENEX HF-link and HF-beacon data for skill score comparison of the assimilation versus climatological D-region/HF absorption model. Only some satellites are available for the PENEX time period; thus, HF-beacon data is necessary. 7. Use HF beacon monitors to develop HF-link data assimilation algorithms for regional improvement to the D-region/HF absorption model.
Including Finite Surface Span Effects in Empirical Jet-Surface Interaction Noise Models
NASA Technical Reports Server (NTRS)
Brown, Clifford A.
2016-01-01
The effect of finite span on the jet-surface interaction noise source and the jet mixing noise shielding and reflection effects is considered using recently acquired experimental data. First, the experimental setup and resulting data are presented with particular attention to the role of surface span on far-field noise. These effects are then included in existing empirical models that have previously assumed that all surfaces are semi-infinite. This extended abstract briefly describes the experimental setup and data leaving the empirical modeling aspects for the final paper.
A Research Roadmap for Computation-Based Human Reliability Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boring, Ronald; Mandelli, Diego; Joe, Jeffrey
2015-08-01
The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermal-hydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.
Modeling and Analysis of a Nonlinear Age-Structured Model for Tumor Cell Populations with Quiescence
NASA Astrophysics Data System (ADS)
Liu, Zijian; Chen, Jing; Pang, Jianhua; Bi, Ping; Ruan, Shigui
2018-05-01
We present a nonlinear first-order hyperbolic partial differential equation model to describe age-structured tumor cell populations with proliferating and quiescent phases at the avascular stage in vitro. The division rate of the proliferating cells is assumed to be nonlinear due to the limitation of the nutrient and space. The model includes a proportion of newborn cells that enter directly the quiescent phase with age zero. This proportion can reflect the effect of treatment by drugs such as erlotinib. The existence and uniqueness of solutions are established. The local and global stabilities of the trivial steady state are investigated. The existence and local stability of the positive steady state are also analyzed. Numerical simulations are performed to verify the results and to examine the impacts of parameters on the nonlinear dynamics of the model.
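A generic form of such a proliferating/quiescent age-structured system, written here for orientation only (the paper's exact equations, rates, and boundary terms may differ), is:

```latex
% Generic age-structured proliferating/quiescent system consistent with
% the description above; p(a,t): proliferating cells of age a,
% q(a,t): quiescent cells, N(t): total population.
\begin{align*}
  \partial_t p(a,t) + \partial_a p(a,t) &= -\bigl(\mu_p + \beta(N(t))\bigr)\, p(a,t),\\
  \partial_t q(a,t) + \partial_a q(a,t) &= -\mu_q\, q(a,t),\\
  p(0,t) &= 2\,(1-\sigma) \int_0^\infty \beta(N(t))\, p(a,t)\, \mathrm{d}a,\\
  q(0,t) &= 2\,\sigma \int_0^\infty \beta(N(t))\, p(a,t)\, \mathrm{d}a.
\end{align*}
% Here \sigma is the proportion of newborn cells entering the quiescent
% phase directly at age zero (the quantity modulated by drugs such as
% erlotinib), and the division rate \beta depends on N(t), making the
% system nonlinear due to nutrient and space limitation.
```

The boundary conditions at age zero carry the nonlinearity: each division produces two age-zero cells split between the two phases according to the proportion σ.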
Breaking barriers to novel analgesic drug development.
Yekkirala, Ajay S; Roberson, David P; Bean, Bruce P; Woolf, Clifford J
2017-08-01
Acute and chronic pain complaints, although common, are generally poorly served by existing therapies. This unmet clinical need reflects a failure to develop novel classes of analgesics with superior efficacy, diminished adverse effects and a lower abuse liability than those currently available. Reasons for this include the heterogeneity of clinical pain conditions, the complexity and diversity of underlying pathophysiological mechanisms, and the unreliability of some preclinical pain models. However, recent advances in our understanding of the neurobiology of pain are beginning to offer opportunities for developing novel therapeutic strategies and revisiting existing targets, including modulating ion channels, enzymes and G-protein-coupled receptors.
Breaking barriers to novel analgesic drug development
Yekkirala, Ajay S; Roberson, David P; Bean, Bruce P.; Woolf, Clifford J.
2017-01-01
Acute and chronic pain complaints, while very common, are generally poorly served by existing therapies. The unmet clinical need reflects the failure in developing novel classes of analgesics with superior efficacy, diminished adverse effects and a lower abuse liability than those currently available. Reasons for this include the heterogeneity of clinical pain conditions, the complexity and diversity of underlying pathophysiological mechanisms coupled with the unreliability of some preclinical pain models. However, recent advances in our understanding of the neurobiology of pain are beginning to offer opportunities to develop new therapeutic strategies and revisit existing targets, including modulating ion channels, enzymes and GPCRs. PMID:28596533
Existence and Stability of Viscoelastic Shock Profiles
NASA Astrophysics Data System (ADS)
Barker, Blake; Lewicka, Marta; Zumbrun, Kevin
2011-05-01
We investigate existence and stability of viscoelastic shock profiles for a class of planar models including the incompressible shear case studied by Antman and Malek-Madani. We establish that the resulting equations fall into the class of symmetrizable hyperbolic-parabolic systems, hence spectral stability implies linearized and nonlinear stability with sharp rates of decay. The new contributions are treatment of the compressible case, formulation of a rigorous nonlinear stability theory, including verification of stability of small-amplitude Lax shocks, and the systematic incorporation in our investigations of numerical Evans function computations determining stability of large-amplitude and nonclassical type shock profiles.
Prospects for rebuilding primary care using the patient-centered medical home.
Landon, Bruce E; Gill, James M; Antonelli, Richard C; Rich, Eugene C
2010-05-01
Existing research suggests that models of enhanced primary care lead to health care systems with better performance. What the research does not show is whether such an approach is feasible or likely to be effective within the U.S. health care system. Many commentators have adopted the model of the patient-centered medical home as policy shorthand to address the reinvention of primary care in the United States. We analyze potential barriers to implementing the medical home model for policy makers and practitioners. Among others, these include developing new payment models, as well as the need for up-front funding to assemble the personnel and infrastructure required by an enhanced non-visit-based primary care practice and methods to facilitate transformation of existing practices to functioning medical homes.
Mesoscopic and continuum modelling of angiogenesis
Spill, F.; Guerrero, P.; Alarcon, T.; Maini, P. K.; Byrne, H. M.
2016-01-01
Angiogenesis is the formation of new blood vessels from pre-existing ones in response to chemical signals secreted by, for example, a wound or a tumour. In this paper, we propose a mesoscopic lattice-based model of angiogenesis, in which processes that include proliferation and cell movement are considered as stochastic events. By studying the dependence of the model on the lattice spacing and the number of cells involved, we are able to derive the deterministic continuum limit of our equations and compare it to similar existing models of angiogenesis. We further identify conditions under which the use of continuum models is justified, and others for which stochastic or discrete effects dominate. We also compare different stochastic models for the movement of endothelial tip cells which have the same macroscopic, deterministic behaviour, but lead to markedly different behaviour in terms of production of new vessel cells. PMID:24615007
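A toy one-dimensional version of such a lattice-based tip-cell model can be sketched as follows (an illustrative simplification, not the authors' mesoscopic model; parameters are arbitrary):

```python
# Toy mesoscopic lattice sketch in the spirit of the paper (not the
# authors' model): tip cells random-walk on a 1-D lattice and leave a
# trail of vessel (stalk) cells behind them, with a small branching
# probability at each step.
import random

random.seed(42)
L, STEPS, P_BRANCH = 50, 200, 0.02
tips = [L // 2]                  # tip-cell positions on the lattice
vessel = {L // 2}                # lattice sites occupied by vessel

for _ in range(STEPS):
    new_tips = []
    for x in tips:
        x = min(L - 1, max(0, x + random.choice((-1, 1))))  # random move
        vessel.add(x)                                       # leave trail
        new_tips.append(x)
        if random.random() < P_BRANCH:                      # branch event
            new_tips.append(x)
    tips = new_tips

# The vessel network grows outward from the initial sprout site.
```

Averaging many such stochastic realisations over a fine lattice is the step that, in the paper, yields the deterministic continuum limit; when tip numbers are small, the stochastic fluctuations seen here are precisely what the continuum model misses.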
NASA Astrophysics Data System (ADS)
Ullrich, Paul A.; Jablonowski, Christiane; Kent, James; Lauritzen, Peter H.; Nair, Ramachandran; Reed, Kevin A.; Zarzycki, Colin M.; Hall, David M.; Dazlich, Don; Heikes, Ross; Konor, Celal; Randall, David; Dubos, Thomas; Meurdesoif, Yann; Chen, Xi; Harris, Lucas; Kühnlein, Christian; Lee, Vivian; Qaddouri, Abdessamad; Girard, Claude; Giorgetta, Marco; Reinert, Daniel; Klemp, Joseph; Park, Sang-Hun; Skamarock, William; Miura, Hiroaki; Ohno, Tomoki; Yoshida, Ryuji; Walko, Robert; Reinecke, Alex; Viner, Kevin
2017-12-01
Atmospheric dynamical cores are a fundamental component of global atmospheric modeling systems and are responsible for capturing the dynamical behavior of the Earth's atmosphere via numerical integration of the Navier-Stokes equations. These systems have existed in one form or another for over half of a century, with the earliest discretizations having now evolved into a complex ecosystem of algorithms and computational strategies. In essence, no two dynamical cores are alike, and their individual successes suggest that no perfect model exists. To better understand modern dynamical cores, this paper aims to provide a comprehensive review of 11 non-hydrostatic dynamical cores, drawn from modeling centers and groups that participated in the 2016 Dynamical Core Model Intercomparison Project (DCMIP) workshop and summer school. For each system, this review covers the choice of model grid, variable placement, vertical coordinate, prognostic equations, temporal discretization, and the diffusion, stabilization, filters, and fixers employed.
Mammographic density, breast cancer risk and risk prediction
Vachon, Celine M; van Gils, Carla H; Sellers, Thomas A; Ghosh, Karthik; Pruthi, Sandhya; Brandt, Kathleen R; Pankratz, V Shane
2007-01-01
In this review, we examine the evidence for mammographic density as an independent risk factor for breast cancer, describe the risk prediction models that have incorporated density, and discuss the current and future implications of using mammographic density in clinical practice. Mammographic density is a consistent and strong risk factor for breast cancer in several populations and across age at mammogram. Recently, this risk factor has been added to existing breast cancer risk prediction models, increasing the discriminatory accuracy with its inclusion, albeit slightly. With validation, these models may replace the existing Gail model for clinical risk assessment. However, absolute risk estimates resulting from these improved models are still limited in their ability to characterize an individual's probability of developing cancer. Promising new measures of mammographic density, including volumetric density, which can be standardized using full-field digital mammography, will likely result in a stronger risk factor and improve accuracy of risk prediction models. PMID:18190724
Comparison of Coupled Radiative Flow Solutions with Project Fire 2 Flight Data
NASA Technical Reports Server (NTRS)
Olynick, David R.; Henline, W. D.; Chambers, Lin Hartung; Candler, G. V.
1995-01-01
A nonequilibrium, axisymmetric, Navier-Stokes flow solver with coupled radiation has been developed for use in the design of thermal protection systems for vehicles where radiation effects are important. The present method has been compared with an existing flow and radiation solver and with the Project Fire 2 experimental data. Good agreement has been obtained over the entire Fire 2 trajectory with the experimentally determined values of the stagnation radiation intensity in the 0.2-6.2 eV range and with the total stagnation heating. The effects of a number of flow models are examined to determine which combination of physical models produces the best agreement with the experimental data. These models include radiation coupling, multitemperature thermal models, and finite rate chemistry. Finally, the computational efficiency of the present model is evaluated. The radiation properties model developed for this study is shown to offer significant computational savings compared to existing codes.
Mitigating Insider Sabotage and Espionage: A Review of the United States Air Force’s Current Posture
2009-03-01
published on insider threat, to include the variables that come into play and historical case studies. Existing insider threat models are discussed ...problem, including the initial development of a logical data model and a system dynamics model. This chapter also discusses the selection of the...Finally, Chapter V provides a summary of the research along with a discussion of its conclusions and impact. Recommendations for future research
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brooker, A.; Gonder, J.; Lopp, S.
The Automotive Deployment Option Projection Tool (ADOPT) is a light-duty vehicle consumer choice and stock model supported by the U.S. Department of Energy’s Vehicle Technologies Office. It estimates technology improvement impacts on U.S. light-duty vehicle sales, petroleum use, and greenhouse gas emissions. ADOPT uses techniques from the multinomial logit method and the mixed logit method to estimate sales. Specifically, it estimates sales based on the weighted value of key attributes including vehicle price, fuel cost, acceleration, range and usable volume. The average importance of several attributes changes nonlinearly across their ranges and changes with income. For several attributes, a distribution of importance around the average value is used to represent consumer heterogeneity. The majority of existing vehicle makes, models, and trims are included to fully represent the market. The Corporate Average Fuel Economy regulations are enforced. The sales feed into the ADOPT stock model, which captures key aspects for summing petroleum use and greenhouse gas emissions. This includes capturing the change in vehicle miles traveled by vehicle age, the creation of new model options based on the success of existing vehicles, new vehicle option introduction rate limits, and survival rates by vehicle age. ADOPT has been extensively validated with historical sales data. It matches in key dimensions including sales by fuel economy, acceleration, price, vehicle size class, and powertrain across multiple years. A graphical user interface provides easy and efficient use. It manages the inputs, simulation, and results.
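The multinomial logit machinery mentioned above converts weighted attribute scores into choice probabilities (sales shares). A minimal sketch, with made-up utility values rather than ADOPT's actual attribute weights:

```python
import math

def logit_shares(utilities):
    """Multinomial-logit choice probabilities: P_i = exp(V_i) / sum_j exp(V_j)."""
    m = max(utilities)                       # subtract max for numerical stability
    exps = [math.exp(v - m) for v in utilities]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical weighted utilities for three vehicle options, each notionally
# combining price, fuel cost, acceleration, range and usable volume into one score.
shares = logit_shares([1.2, 0.4, -0.3])
print(shares)
```

In a mixed-logit variant, the attribute weights would themselves be drawn from a distribution to represent consumer heterogeneity, and the shares averaged over those draws.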
Modeling a distribution of point defects as misfitting inclusions in stressed solids
NASA Astrophysics Data System (ADS)
Cai, W.; Sills, R. B.; Barnett, D. M.; Nix, W. D.
2014-05-01
The chemical equilibrium distribution of point defects modeled as non-overlapping, spherical inclusions with purely positive dilatational eigenstrain in an isotropically elastic solid is derived. The compressive self-stress inside existing inclusions must be excluded from the stress dependence of the equilibrium concentration of the point defects, because it does no work when a new inclusion is introduced. On the other hand, a tensile image stress field must be included to satisfy the boundary conditions in a finite solid. Through the image stress, existing inclusions promote the introduction of additional inclusions. This is contrary to the prevailing approach in the literature in which the equilibrium point defect concentration depends on a homogenized stress field that includes the compressive self-stress. The shear stress field generated by the equilibrium distribution of such inclusions is proved to be proportional to the pre-existing stress field in the solid, provided that the magnitude of the latter is small, so that a solid containing an equilibrium concentration of point defects can be described by a set of effective elastic constants in the small-stress limit.
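The equilibrium argument above can be summarized in Boltzmann form. This is a generic sketch, not the paper's exact expression; $c_0$ is a stress-free reference concentration and $\Delta V$ the defect's relaxation (eigenstrain) volume:

```latex
c_{\mathrm{eq}} = c_0 \exp\!\left( \frac{\sigma_h \, \Delta V}{k_B T} \right)
```

Per the abstract's argument, the hydrostatic stress $\sigma_h$ entering the exponent should exclude the compressive self-stress of existing inclusions but include their tensile image stress, so that existing inclusions raise $c_{\mathrm{eq}}$ rather than lower it.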
Collaborative Care in Schools: Enhancing Integration and Impact in Youth Mental Health
Lyon, Aaron R.; Whitaker, Kelly; French, William P.; Richardson, Laura P.; Wasse, Jessica Knaster; McCauley, Elizabeth
2016-01-01
Collaborative Care is an innovative approach to integrated mental health service delivery that focuses on reducing access barriers, improving service quality, and lowering healthcare expenditures. A large body of evidence supports the effectiveness of Collaborative Care models with adults and, increasingly, for youth. Although existing studies examining these models for youth have focused exclusively on primary care, the education sector is also an appropriate analog for the accessibility that primary care offers to adults. Collaborative Care aligns closely with the practical realities of the education sector and may represent a strategy to achieve some of the objectives of increasingly popular multi-tiered systems of supports frameworks. Unfortunately, no resources exist to guide the application of Collaborative Care models in schools. Based on the existing evidence for Collaborative Care models, the current paper (1) provides a rationale for the adaptation of Collaborative Care models to improve mental health service accessibility and effectiveness in the education sector; (2) presents a preliminary Collaborative Care model for use in schools; and (3) describes avenues for research surrounding school-based Collaborative Care, including the currently funded Accessible, Collaborative Care for Effective School-based Services (ACCESS) project. PMID:28392832
Speed and Delay Prediction Models for Planning Applications
DOT National Transportation Integrated Search
1999-01-01
Estimation of vehicle speed and delay is fundamental to many forms of transportation planning analyses, including air quality, long-range travel forecasting, major investment studies, and congestion management systems. However, existing planning...
Thermal Modeling and Cryogenic Design of a Helical Superconducting Undulator Cryostat
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shiroyanagi, Y.; Fuerst, J.; Hasse, Q.
A conceptual design for a helical superconducting undulator (HSCU) for the Advanced Photon Source (APS) at Argonne National Laboratory (ANL) has been completed. The device differs sufficiently from the existing APS planar superconducting undulator (SCU) design to warrant development of a new cryostat based on value engineering and lessons learned from the existing planar SCU. Changes include optimization of the existing cryocooler-based refrigeration system and thermal shield as well as cost reduction through the use of standard vacuum hardware. The end result is a design that provides significantly larger 4.2 K refrigeration margin in a smaller package for greater installation flexibility in the APS storage ring. This paper presents ANSYS-based thermal analysis of the cryostat, including estimated static and dynamic...
Forestry sector analysis for developing countries: issues and methods.
R.W. Haynes
1993-01-01
A satellite meeting of the 10th Forestry World Congress focused on the methods used for forest sector analysis and their applications in both developed and developing countries. The results of that meeting are summarized, and a general approach for forest sector modeling is proposed. The approach includes models derived from the existing...
Helping Working Parents: Child Care Options for Business.
ERIC Educational Resources Information Center
North Carolina State Dept. of Administration, Raleigh.
Seven models representing the existing range of options of employer involvement in day care are described in this paper. The range of options are grouped into two categories: (1) company owned, operated, or subsidized child day care; and (2) employee assistance services, benefits, and policies. The models included in the first category are the…
Research on Model of Student Engagement in Online Learning
ERIC Educational Resources Information Center
Peng, Wang
2017-01-01
In this study, online learning refers to organized learning undertaken by students under the guidance of teachers through an online learning platform. Based on the analysis of related research results and considering the existing problems, the main contents of this paper include the following aspects: (1) Analyze and study the current student engagement model.…
Development of a biorefinery optimized biofuel supply curve for the western United States
Nathan Parker; Peter Tittmann; Quinn Hart; Richard Nelson; Ken Skog; Anneliese Schmidt; Edward Gray; Bryan Jenkins
2010-01-01
A resource assessment and biorefinery siting optimization model was developed and implemented to assess potential biofuel supply across the Western United States from agricultural, forest, urban, and energy crop biomass. Spatial information including feedstock resources, existing and potential refinery locations and a transportation network model is provided to a mixed...
Surface-Charge-Based Micro-Models--A Solid Foundation for Learning about Direct Current Circuits
ERIC Educational Resources Information Center
Hirvonen, P. E.
2007-01-01
This study explores how the use of a surface-charge-based instructional approach affects introductory university level students' understanding of direct current (dc) circuits. The introduced teaching intervention includes electrostatics, surface-charge-based micro-models that explain the existence of an electric field inside the current-carrying…
The Impact of Childhood Cancer: A Two-Factor Model of Coping.
ERIC Educational Resources Information Center
Zevon, Michael A.; Armstrong, Gordon D.
A review of existing stress and coping models and an analysis of the distress caused by childhood cancer suggest that a broader conceptualization of coping that includes "pleasure management" is needed. Presently, successful coping is identified as the employment of strategies which allow the individual to adapt to stress. Traditional…
Video Modeling for Children and Adolescents with Autism Spectrum Disorder: A Meta-Analysis
ERIC Educational Resources Information Center
Thompson, Teresa Lynn
2014-01-01
The objective of this research was to conduct a meta-analysis to examine existing research studies on video modeling as an effective teaching tool for children and adolescents diagnosed with Autism Spectrum Disorder (ASD). Study eligibility criteria included (a) single case research design using multiple baselines, alternating treatment designs,…
From Children's Perspectives: A Model of Aesthetic Processing in Theatre
ERIC Educational Resources Information Center
Klein, Jeanne
2005-01-01
While several developmental models of aesthetic understanding, experience, and appreciation exist in the realms of visual art and music education, few examples have been proposed in regard to theatre, particularly for child audiences. This author argues that children gaze upon theatre in differential ways by including age as a variable…
ERIC Educational Resources Information Center
Hunt, Pete; Barrios, Lisa; Telljohann, Susan K.; Mazyck, Donna
2015-01-01
Background: The Whole School, Whole Community, Whole Child (WSCC) model shows the interrelationship between health and learning and the potential for improving educational outcomes by improving health outcomes. However, current descriptions do not explain how to implement the model. Methods: The existing literature, including scientific articles,…
Systematic Review of Model-Based Economic Evaluations of Treatments for Alzheimer's Disease.
Hernandez, Luis; Ozen, Asli; DosSantos, Rodrigo; Getsios, Denis
2016-07-01
Numerous economic evaluations using decision-analytic models have assessed the cost effectiveness of treatments for Alzheimer's disease (AD) in the last two decades. It is important to understand the methods used in the existing models of AD and how they could impact results, as they could inform new model-based economic evaluations of treatments for AD. The aim of this systematic review was to provide a detailed description on the relevant aspects and components of existing decision-analytic models of AD, identifying areas for improvement and future development, and to conduct a quality assessment of the included studies. We performed a systematic and comprehensive review of cost-effectiveness studies of pharmacological treatments for AD published in the last decade (January 2005 to February 2015) that used decision-analytic models, also including studies considering patients with mild cognitive impairment (MCI). The background information of the included studies and specific information on the decision-analytic models, including their approach and components, assumptions, data sources, analyses, and results, were obtained from each study. A description of how the modeling approaches and assumptions differ across studies, identifying areas for improvement and future development, is provided. At the end, we present our own view of the potential future directions of decision-analytic models of AD and the challenges they might face. The included studies present a variety of different approaches, assumptions, and scope of decision-analytic models used in the economic evaluation of pharmacological treatments of AD. 
The major areas for improvement in future models of AD are to include domains of cognition, function, and behavior, rather than cognition alone; include a detailed description of how data used to model the natural course of disease progression were derived; state and justify the economic model selected and structural assumptions and limitations; provide a detailed (rather than high-level) description of the cost components included in the model; and report on the face-, internal-, and cross-validity of the model to strengthen the credibility and confidence in model results. The quality scores of most studies were rated as fair to good (average 87.5, range 69.5-100, in a scale of 0-100). Despite the advancements in decision-analytic models of AD, there remain several areas of improvement that are necessary to more appropriately and realistically capture the broad nature of AD and the potential benefits of treatments in future models of AD.
Pranav, P K; Patel, Thaneswer
2016-04-07
Manual orange harvesting is a laborious, time-consuming and unsafe operation, and neither mechanical harvesting nor mechanized hand harvesting is possible in north-east India because of its hilly terrain. The awkward postures and repetitive nature of the work in orange harvesting demand a comfortable and appropriate hand harvester for the hilly region. The purpose of this study was to develop a manual orange harvester for hilly regions considering ergonomic parameters, and to compare its performance with existing models of the manual harvester. Twenty healthy, experienced orchard workers (10 male and 10 female) with no previous functional musculoskeletal disorders participated in the study. We developed a manual orange harvester by eliminating the problems associated with the existing harvesters. The developed model, along with the existing models, was evaluated extensively in the field. During the evaluations, the heart rate of the subjects was measured and oxygen consumption was predicted to calculate the energy expenditure rate (EER) from a relationship established in the laboratory before the field experiments. Further, performance parameters of the orange harvesters, i.e. plucking rate (PR), damaged quantity (DQ), plucking energy requirement (PER) and discomfort rating, were also observed. The PR was 425, 300 and 287 pieces per hour for the developed model (DM), the first existing model (EM1) and the second existing model (EM2), respectively. The DM showed the lowest PER (2.14 kJ/piece), followed by EM2 (2.95 kJ/piece) and EM1 (4.02 kJ/piece); the PER is regarded as an overall performance measure because it expresses energy per unit plucked. Further, the body part discomfort score revealed that the DM was more comfortable in use, followed by EM2 and EM1. The performance of the DM was better than that of the other existing models in terms of plucking rate, energy requirement and body part discomfort. Shoulders and neck were the most affected body parts, where all subjects felt severe discomfort.
Perceptions of Peer Sexual Behavior: Do Adolescents Believe in a Sexual Double Standard?
Young, Michael; Cardenas, Susan; Donnelly, Joseph; J Kittleson, Mark
2016-12-01
The purpose of the study was to (1) examine attitudes of adolescents toward peer models having sex or choosing abstinence, and (2) determine whether a "double standard" in perception existed concerning adolescent abstinence and sexual behavior. Adolescents (N = 173) completed questionnaires that included 1 of 6 randomly assigned vignettes that described male and female peer models 3 ways: (1) no information about the model's sexual behavior, (2) model in love but choosing abstinence, and (3) model in love and having sex. Participants read the vignette to which they had been assigned and responded to statements about the peer model. Data were analyzed using multivariate analysis of variance. Results did not show evidence of a sexual double standard among male participants, but did show some evidence of a sexual double standard among female participants. Additionally, both male and female participants evaluated peer models who were having sex more harshly than peer models who chose abstinence. Findings provide insight concerning the lack of a sexual double standard among male participants, the existence, to some degree, of a sexual double standard among female participants, and demonstrate the existence of a social cost to both young men and young women for choosing to have sex. © 2016, American School Health Association.
Learning Layouts for Single-Page Graphic Designs.
O'Donovan, Peter; Agarwala, Aseem; Hertzmann, Aaron
2014-08-01
This paper presents an approach for automatically creating graphic design layouts using a new energy-based model derived from design principles. The model includes several new algorithms for analyzing graphic designs, including the prediction of perceived importance, alignment detection, and hierarchical segmentation. Given the model, we use optimization to synthesize new layouts for a variety of single-page graphic designs. Model parameters are learned with Nonlinear Inverse Optimization (NIO) from a small number of example layouts. To demonstrate our approach, we show results for applications including generating design layouts in various styles, retargeting designs to new sizes, and improving existing designs. We also compare our automatic results with designs created using crowdsourcing and show that our approach performs slightly better than novice designers.
NASA Astrophysics Data System (ADS)
Rajib, M. A.; Merwade, V.; Song, C.; Zhao, L.; Kim, I. L.; Zhe, S.
2014-12-01
Setting up any hydrologic model requires a large amount of effort, including compilation of all the data, creation of input files, calibration and validation. Given the effort involved, it is possible that models for a watershed get created multiple times by multiple groups or organizations to accomplish different research, educational or policy goals. To reduce this duplication of effort and enable collaboration among different groups or organizations around an already existing hydrology model, a platform is needed where anyone can search for existing models, perform simple scenario analysis and visualize model results. The creator and users of a model on such a platform can then collaborate to accomplish new research or educational objectives. From this perspective, a prototype cyber-infrastructure (CI), called SWATShare, is developed for sharing, running and visualizing Soil and Water Assessment Tool (SWAT) models in an interactive GIS-enabled web environment. Users can utilize SWATShare to publish or upload their own models, search and download existing SWAT models developed by others, and run simulations, including calibration, using high performance resources provided by XSEDE and the Cloud. Besides running and sharing, SWATShare hosts a novel spatio-temporal visualization system for SWAT model outputs. At the temporal scale, the system creates time-series plots for all the hydrology and water quality variables available along each reach as well as at the watershed level. At the spatial scale, the system can dynamically generate sub-basin-level thematic maps for any variable at any user-defined date or date range, thereby allowing users to run animations or download the data for subsequent analyses. In addition to research, SWATShare can also be used within a classroom setting as an educational tool for modeling and comparing hydrologic processes under different geographic and climatic settings.
SWATShare is publicly available at https://www.water-hub.org/swatshare.
Hilkens, N A; Algra, A; Greving, J P
2016-01-01
ESSENTIALS: Prediction models may help to identify patients at high risk of bleeding on antiplatelet therapy. We identified existing prediction models for bleeding and validated them in patients with cerebral ischemia. Five prediction models were identified, all of which had some methodological shortcomings. Performance in patients with cerebral ischemia was poor. Background Antiplatelet therapy is widely used in secondary prevention after a transient ischemic attack (TIA) or ischemic stroke. Bleeding is the main adverse effect of antiplatelet therapy and is potentially life threatening. Identification of patients at increased risk of bleeding may help target antiplatelet therapy. This study sought to identify existing prediction models for intracranial hemorrhage or major bleeding in patients on antiplatelet therapy and evaluate their performance in patients with cerebral ischemia. We systematically searched PubMed and Embase for existing prediction models up to December 2014. The methodological quality of the included studies was assessed with the CHARMS checklist. Prediction models were externally validated in the European Stroke Prevention Study 2, comprising 6602 patients with a TIA or ischemic stroke. We assessed discrimination and calibration of included prediction models. Five prediction models were identified, of which two were developed in patients with previous cerebral ischemia. Three studies assessed major bleeding, one studied intracerebral hemorrhage and one gastrointestinal bleeding. None of the studies met all criteria of good quality. External validation showed poor discriminative performance, with c-statistics ranging from 0.53 to 0.64 and poor calibration. A limited number of prediction models is available that predict intracranial hemorrhage or major bleeding in patients on antiplatelet therapy. The methodological quality of the models varied, but was generally low. Predictive performance in patients with cerebral ischemia was poor. 
In order to reliably predict the risk of bleeding in patients with cerebral ischemia, development of a prediction model according to current methodological standards is needed. © 2015 International Society on Thrombosis and Haemostasis.
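The discriminative performance reported above is measured by the c-statistic. A minimal sketch of how it is computed from predicted risks and observed outcomes; the toy data here are invented for illustration, not taken from the study:

```python
def c_statistic(risks, outcomes):
    """Concordance (c) statistic: the fraction of event/non-event pairs in
    which the model assigns the higher predicted risk to the event case.
    Ties count as half-concordant; 0.5 means no discrimination."""
    events = [r for r, o in zip(risks, outcomes) if o == 1]
    nonevents = [r for r, o in zip(risks, outcomes) if o == 0]
    concordant = 0.0
    for e in events:
        for n in nonevents:
            if e > n:
                concordant += 1.0
            elif e == n:
                concordant += 0.5
    return concordant / (len(events) * len(nonevents))

# Toy data: higher predicted risks mostly, but not always, align with
# observed bleeding events.
print(c_statistic([0.9, 0.7, 0.4, 0.2], [1, 0, 1, 0]))  # 0.75
```

C-statistics of 0.53-0.64, as found in the external validation, sit close to the 0.5 no-discrimination floor, which is what "poor discriminative performance" means here.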
NREL Improves Building Energy Simulation Programs Through Diagnostic Testing (Fact Sheet)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2012-01-01
This technical highlight describes NREL research to develop Building Energy Simulation Test for Existing Homes (BESTEST-EX) to increase the quality and accuracy of energy analysis tools for the building retrofit market. Researchers at the National Renewable Energy Laboratory (NREL) have developed a new test procedure to increase the quality and accuracy of energy analysis tools for the building retrofit market. The Building Energy Simulation Test for Existing Homes (BESTEST-EX) is a test procedure that enables software developers to evaluate the performance of their audit tools in modeling energy use and savings in existing homes when utility bills are available for model calibration. Similar to NREL's previous energy analysis tests, such as HERS BESTEST and other BESTEST suites included in ANSI/ASHRAE Standard 140, BESTEST-EX compares software simulation findings to reference results generated with state-of-the-art simulation tools such as EnergyPlus, SUNREL, and DOE-2.1E. The BESTEST-EX methodology: (1) Tests software predictions of retrofit energy savings in existing homes; (2) Ensures building physics calculations and utility bill calibration procedures perform to a minimum standard; and (3) Quantifies impacts of uncertainties in input audit data and occupant behavior. BESTEST-EX includes building physics and utility bill calibration test cases. The diagram illustrates the utility bill calibration test cases. Participants are given input ranges and synthetic utility bills. Software tools use the utility bills to calibrate key model inputs and predict energy savings for the retrofit cases. Participant energy savings predictions using calibrated models are compared to NREL predictions using state-of-the-art building energy simulation programs.
Toward a descriptive model of galactic cosmic rays in the heliosphere
NASA Technical Reports Server (NTRS)
Mewaldt, R. A.; Cummings, A. C.; Adams, James H., Jr.; Evenson, Paul; Fillius, W.; Jokipii, J. R.; Mckibben, R. B.; Robinson, Paul A., Jr.
1988-01-01
Researchers review the elements that enter into phenomenological models of the composition, energy spectra, and the spatial and temporal variations of galactic cosmic rays, including the so-called anomalous cosmic ray component. Starting from an existing model, designed to describe the behavior of cosmic rays in the near-Earth environment, researchers suggest possible updates and improvements to this model, and then propose a quantitative approach for extending such a model into other regions of the heliosphere.
DSN system performance test Doppler noise models; noncoherent configuration
NASA Technical Reports Server (NTRS)
Bunce, R.
1977-01-01
The newer model for variance, the Allan technique, now adopted for testing, is analyzed in the subject mode. A model is generated (including a considerable contribution from the station secondary frequency standard) and rationalized with existing data. The variance model is definitely sound; the Allan technique mates theory and measurement. The mean-frequency model is an estimate; this problem is yet to be rigorously resolved. The unaltered defining expressions are nonconvergent, and the observed mean is quite erratic.
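The Allan (two-sample) variance referred to above can be sketched as follows; the data values are illustrative, not DSN measurements:

```python
import numpy as np

def allan_variance(y):
    """Two-sample (Allan) variance of averaged fractional-frequency data y,
    one reading per measurement interval:
        sigma_y^2 = 0.5 * <(y[k+1] - y[k])^2>.
    Unlike the classical variance about the mean, this converges for the
    drifting oscillators that make the ordinary mean erratic."""
    y = np.asarray(y, dtype=float)
    diffs = np.diff(y)
    return 0.5 * np.mean(diffs ** 2)

# A constant-frequency record has zero Allan variance.
print(allan_variance([5.0, 5.0, 5.0, 5.0]))  # 0.0
```

Because it differences adjacent readings before averaging, the statistic is insensitive to a constant frequency offset, which is precisely why it behaves where the "unaltered defining expressions" do not.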
ERIC Educational Resources Information Center
Grobmeier, Cynthia
2007-01-01
Relating knowledge management (KM) case studies in various organizational contexts to existing theoretical constructs of learning organizations, a new model, the MIKS (Member Integrated Knowledge System) Model is proposed to include the role of the individual in the process. Their degree of motivation as well as communication and learning…
Women at the Top: Powerful Leaders Define Success as Work + Family in a Culture of Gender
ERIC Educational Resources Information Center
Cheung, Fanny M.; Halpern, Diane F.
2010-01-01
How do women rise to the top of their professions when they also have significant family care responsibilities? This critical question has not been addressed by existing models of leadership. In a review of recent research, we explore an alternative model to the usual notion of a Western male as the prototypical leader. The model includes (a)…
A Community Terrain-Following Ocean Modeling System (ROMS/TOMS)
2011-09-30
DISTRIBUTION STATEMENT A. Approved for public release; distribution is unlimited. Email: arango@marine.rutgers.edu. Award Number: N00014-10-1-0322. http://ocean-modeling.org
NASA Technical Reports Server (NTRS)
Wray, S. T., Jr.
1975-01-01
Information necessary to use the LOVES computer program in its existing state or to modify the program to include studies not properly handled by the basic model is provided. A users guide, a programmers manual, and several supporting appendices are included.
75 FR 17887 - Airworthiness Directives; The Boeing Company Model 767 Airplanes
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-08
... torque to the nut and bolt of the main track downstop assembly. The corrective actions include: Installing a bolt and spacer with a new nut (including applying torque to make sure that it has been.... Tightening the existing nut. Boeing Special Attention Service Bulletin 767-57-0118, dated October 8, 2009...
NASA Astrophysics Data System (ADS)
Pei, Yangwen; Paton, Douglas A.; Wu, Kongyou; Xie, Liujuan
2017-08-01
The trishear algorithm, in which deformation occurs in a triangular zone in front of a propagating fault tip, is often used to understand fault-related folding. In comparison to kink-band methods, a key characteristic of the trishear algorithm is that non-uniform deformation within the triangular zone allows layer thickness and horizon length to change during deformation, which is commonly observed in natural structures. An example from the Lenghu5 fold-and-thrust belt (Qaidam Basin, Northern Tibetan Plateau) is interpreted to show how trishear forward modelling can improve the accuracy of seismic interpretation. High-resolution fieldwork data, including high-angle dips, 'dragging structures', a thinning hanging wall and a thickening footwall, are used to determine the best-fit trishear model for the deformation of the Lenghu5 fold-and-thrust belt. We also consider factors that increase the complexity of trishear models, including (a) fault-dip changes and (b) pre-existing faults. We integrate fault-dip changes and pre-existing faults to predict subsurface structures that are below seismic resolution. The analogue analysis using trishear models indicates that the Lenghu5 fold-and-thrust belt is controlled by an upward-steepening reverse fault above a pre-existing, oppositely thrusting fault in the deeper subsurface. The validity of the trishear model is confirmed by the close agreement between the model and the high-resolution fieldwork. The validated trishear forward model provides geometric constraints on the faults and horizons in the seismic section, e.g., fault cutoffs and fault tip positions, fault intersection relationships and horizon/fault cross-cutting relationships. Subsurface prediction using the trishear algorithm can significantly increase the accuracy of seismic interpretation, particularly in seismic sections with a low signal/noise ratio.
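The non-uniform deformation inside the triangular zone comes from a velocity field that grades from full hanging-wall slip to a fixed footwall. The sketch below is a simplified symmetric, linear-interpolation variant with arbitrary apex angle and slip values, not the specific algorithm used in the study; real trishear codes also derive the fault-normal velocity component so that area is conserved:

```python
import math

def trishear_vx(x, y, slip=1.0, apex_deg=30.0):
    """Fault-parallel velocity in a symmetric trishear zone (fault tip at the
    origin, fault along +x, hanging wall at positive y). The hanging wall
    moves at the full slip rate, the footwall is fixed, and velocity varies
    linearly across the triangular zone of half-angle apex_deg."""
    if x <= 0:
        return slip if y > 0 else 0.0        # behind the tip: rigid blocks
    half_width = x * math.tan(math.radians(apex_deg))
    if y >= half_width:
        return slip                           # hanging wall
    if y <= -half_width:
        return 0.0                            # footwall
    return slip * (y / half_width + 1.0) / 2.0

print(trishear_vx(10.0, 0.0))  # 0.5: halfway through the zone
```

Because points at different heights within the zone move at different rates, layers passing through it thin or thicken, which is the behaviour the field observations above record.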
NASA Technical Reports Server (NTRS)
Urquhart, Erin A.; Zaitchik, Benjamin F.; Waugh, Darryn W.; Guikema, Seth D.; Del Castillo, Carlos E.
2014-01-01
The effect that climate change and variability will have on waterborne bacteria is a topic of increasing concern for coastal ecosystems, including the Chesapeake Bay. Surface water temperature trends in the Bay indicate a warming pattern of roughly 0.3-0.4 °C per decade over the past 30 years. It is unclear what impact future warming will have on pathogens currently found in the Bay, including Vibrio spp. Using historical environmental data, combined with three different statistical models of Vibrio vulnificus probability, we explore the relationship between environmental change and predicted Vibrio vulnificus presence in the upper Chesapeake Bay. We find that the predicted response of V. vulnificus probability to high temperatures in the Bay differs systematically between models of differing structure. As existing publicly available datasets are inadequate to determine which model structure is most appropriate, the impact of climatic change on the probability of V. vulnificus presence in the Chesapeake Bay remains uncertain. This result points to the challenge of characterizing climate sensitivity of ecological systems in which data are sparse and only statistical models of ecological sensitivity exist.
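The abstract's key point, that models of differing structure diverge when extrapolated to high temperatures, can be illustrated with two hypothetical logistic models of presence probability. The functional forms and coefficients below are invented for illustration and are not the paper's fitted models.

```python
import numpy as np

def logistic(z):
    return 1.0 / (1.0 + np.exp(-z))

# Two hypothetical fitted models of V. vulnificus presence probability;
# coefficients are illustrative, not from the paper.
def model_linear(temp_c, sal_ppt):
    # monotone in temperature: probability keeps rising as waters warm
    return logistic(-8.0 + 0.35 * temp_c - 0.10 * sal_ppt)

def model_quadratic(temp_c, sal_ppt):
    # optimum-temperature form: probability falls off above ~28 C
    return logistic(2.0 - 0.04 * (temp_c - 28.0) ** 2 - 0.10 * sal_ppt)

temps = np.array([20.0, 26.0, 32.0, 35.0])
p_lin = model_linear(temps, 10.0)
p_quad = model_quadratic(temps, 10.0)
```

Both forms can fit the historical temperature range about equally well, yet at 35 °C one predicts near-certain presence and the other a sharp decline, which is exactly the structural uncertainty the abstract describes.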
Mixed raster content (MRC) model for compound image compression
NASA Astrophysics Data System (ADS)
de Queiroz, Ricardo L.; Buckley, Robert R.; Xu, Ming
1998-12-01
This paper will describe the Mixed Raster Content (MRC) method for compressing compound images, containing both binary text and continuous-tone images. A single compression algorithm that simultaneously meets the requirements for both text and image compression has been elusive. MRC takes a different approach. Rather than using a single algorithm, MRC uses a multi-layered imaging model for representing the results of multiple compression algorithms, including ones developed specifically for text and for images. As a result, MRC can combine the best of existing or new compression algorithms and offer different quality/compression-ratio tradeoffs. The algorithms used by MRC set the lower bound on its compression performance. Compared to existing algorithms, MRC has some image-processing overhead to manage multiple algorithms and the imaging model. This paper will develop the rationale for the MRC approach by describing the multi-layered imaging model in light of a rate-distortion trade-off. Results will be presented comparing images compressed using MRC, JPEG, and state-of-the-art wavelet algorithms such as SPIHT. MRC has been approved or proposed as an architectural model for several standards, including ITU Color Fax, IETF Internet Fax, and JPEG 2000.
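The multi-layered imaging model is easy to sketch: a binary selector mask chooses, pixel by pixel, between a foreground layer (text) and a background layer (contone), and each layer can then be handed to its own codec. The fixed-threshold segmenter below is a crude stand-in for a real MRC encoder's analysis stage.

```python
import numpy as np

def mrc_decompose(img, threshold=128):
    """Split a grayscale compound image into MRC's three layers: a binary
    selector mask plus foreground and background planes. A fixed threshold
    is a toy stand-in for a real segmenter."""
    mask = img < threshold                   # True where "text" pixels lie
    fg = np.where(mask, img, 0)              # dark text strokes
    # a real encoder would fill the text holes with surrounding color so the
    # background compresses smoothly; here we just fill with white
    bg = np.where(mask, 255, img)
    return mask, fg, bg

def mrc_compose(mask, fg, bg):
    """Imaging model: the mask selects, per pixel, between the two layers."""
    return np.where(mask, fg, bg)

page = np.full((4, 4), 200, dtype=np.uint8)  # light continuous-tone region
page[1, 1:3] = 10                            # dark "text" pixels
mask, fg, bg = mrc_decompose(page)
```

In a full codec, `mask` would go to a binary coder (e.g. JBIG), while `fg` and `bg` go to continuous-tone coders (e.g. JPEG); the composition rule above is what guarantees the layers reassemble into the page.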
Crayton, Elise; Wolfe, Charles; Douiri, Abdel
2018-01-01
Objective We aim to identify and critically appraise clinical prediction models of mortality and function following ischaemic stroke. Methods Electronic databases, reference lists, and citations were searched from inception to September 2015. Studies were selected for inclusion according to pre-specified criteria and critically appraised by independent, blinded reviewers. The discrimination of the prediction models was measured by the area under the receiver operating characteristic curve (c-statistic) in random-effects meta-analysis. Heterogeneity was measured using I2. Appropriate appraisal tools and reporting guidelines were used in this review. Results 31,395 references were screened, of which 109 articles were included in the review. These articles described 66 different predictive risk models. Appraisal identified poor methodological quality and a high risk of bias for most models. However, all models precede the development of reporting guidelines for prediction modelling studies. Generalisability of the models could be improved; fewer than half of the included models have been externally validated (n = 27/66). 152 predictors of mortality and 192 predictors of functional outcome were identified. No studies assessing the ability to improve patient outcomes (model impact studies) were identified. Conclusions Further external validation and model impact studies are required to confirm the utility of existing models in supporting decision-making. Existing models have much potential. Those wishing to predict stroke outcome are advised to build on previous work, updating and adapting validated models to their specific contexts as opposed to designing new ones. PMID:29377923
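Pooling c-statistics in a random-effects meta-analysis, as the Methods describe, is typically done on the logit scale. The sketch below implements DerSimonian-Laird pooling with illustrative inputs; it is a generic method sketch, not the review's actual computation.

```python
import numpy as np

def pool_cstatistics(c, se):
    """Random-effects (DerSimonian-Laird) pooling of c-statistics on the
    logit scale. c, se: per-study c-statistics and their standard errors."""
    c, se = np.asarray(c, dtype=float), np.asarray(se, dtype=float)
    y = np.log(c / (1 - c))                  # logit transform
    v = (se / (c * (1 - c))) ** 2            # delta-method variance of logit(c)
    w = 1.0 / v                              # fixed-effect weights
    y_fe = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fe) ** 2)          # Cochran's Q
    df = len(y) - 1
    tau2 = max(0.0, (q - df) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0   # I^2 heterogeneity, %
    w_re = 1.0 / (v + tau2)                  # random-effects weights
    y_re = np.sum(w_re * y) / np.sum(w_re)
    return 1.0 / (1.0 + np.exp(-y_re)), tau2, i2

# Hypothetical c-statistics from three validation studies of one model.
pooled, tau2, i2 = pool_cstatistics([0.72, 0.78, 0.81], [0.02, 0.03, 0.025])
```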
Capacity planning in a transitional economy: What issues? Which models?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mubayi, V.; Leigh, R.W.; Bright, R.N.
1996-03-01
This paper is devoted to an exploration of the important issues facing the Russian power generation system and its evolution in the foreseeable future, and the kinds of modeling approaches that capture those issues. These issues include, for example: (1) trade-offs between investments in upgrading and refurbishment of existing thermal (fossil-fired) capacity and safety enhancements in existing nuclear capacity versus investment in new capacity; (2) trade-offs between investment in completing unfinished (under-construction) projects based on their original design versus investment in new capacity with improved design; (3) incorporation of demand-side management options (investments in enhancing end-use efficiency, for example) within the planning framework; (4) consideration of the spatial dimensions of system planning, including investments in upgrading electric transmission networks or fuel shipment networks and incorporating hydroelectric generation; (5) incorporation of environmental constraints; and (6) assessment of uncertainty and evaluation of downside risk. Models for exploring these issues range from low power shutdown (LPS) models, which are computationally very efficient though approximate and can be used to perform extensive sensitivity analyses, to more complex models that can provide more detailed answers but are computationally cumbersome and can deal with only limited issues. The paper discusses which models can usefully treat a wide range of issues within the priorities facing decision makers in the Russian power sector and integrate the results with investment decisions in the wider economy.
Global models for synthetic fuels planning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lamontagne, J.
1983-10-01
This study was performed to identify the set of existing global models with the best potential for use in the US Synthetic Fuels Corporation's strategic planning process, and to recommend the most appropriate model. The study was limited to global models with representations that encompass time horizons beyond the year 2000, multiple fuel forms, and significant regional detail. Potential accessibility to the Synthetic Fuels Corporation and adequate documentation were also required. Four existing models (LORENDAS, WIM, IIASA, and IEA/ORAU) were judged to be the best candidates for the SFC's use at this time; none of the models appears to be ideal for the SFC's purposes. On the basis of currently available information, the most promising short-term option open to the SFC is the use of LORENDAS, with careful attention to definition of alternative energy demand scenarios. Longer-term options which deserve further study are coupling LORENDAS with an explicit model of energy demand, and modification of the IEA/ORAU model to include finer time-period definition and additional technological detail.
Extending radiative transfer models by use of Bayes rule. [in atmospheric science
NASA Technical Reports Server (NTRS)
Whitney, C.
1977-01-01
This paper presents a procedure that extends some existing radiative transfer modeling techniques to problems in atmospheric science where curvature and layering of the medium and dynamic range and angular resolution of the signal are important. Example problems include twilight and limb scan simulations. Techniques that are extended include successive orders of scattering, matrix operator, doubling, Gauss-Seidel iteration, discrete ordinates and spherical harmonics. The procedure for extending them is based on Bayes' rule from probability theory.
An open source hydroeconomic model for California's water supply system: PyVIN
NASA Astrophysics Data System (ADS)
Dogan, M. S.; White, E.; Herman, J. D.; Hart, Q.; Merz, J.; Medellin-Azuara, J.; Lund, J. R.
2016-12-01
Models help operators and decision makers explore and compare different management and policy alternatives, better allocate scarce resources, and predict the future behavior of existing or proposed water systems. Hydroeconomic models are useful tools to increase the benefits or decrease the costs of managing water. Bringing hydrology and economics together, these models provide a framework for different disciplines that share similar objectives. This work proposes a new model to evaluate operation and adaptation strategies under existing and future hydrologic conditions for California's interconnected water system. The model combines the network structure of CALVIN, a statewide optimization model for California's water infrastructure, with an open source solver written in the Python programming language. With the model's flexibility, reservoir operations (including water supply and hydropower), groundwater pumping, and Delta water operations and requirements can now be better represented. Given time series of hydrologic inputs to the model, typical outputs include urban, agricultural, and wildlife refuge water deliveries and shortage costs, conjunctive use of surface water and groundwater systems, and insights into policy and management decisions, such as capacity expansion and groundwater management policies. Water market operations are also represented in the model, reallocating water from lower-valued to higher-valued uses. PyVIN serves as a cross-platform, extensible model to evaluate systemwide water operations. PyVIN separates data from the model structure, enabling the model to be easily applied to other parts of the world where water is a scarce resource.
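The value-driven allocation such models perform can be illustrated as a minimum-cost network-flow problem. The toy network below (node names, values, and the networkx solver are all stand-ins for PyVIN's actual model and data) shows how negative edge weights encode economic value, so that minimizing cost maximizes benefit and scarcity falls on lower-valued uses.

```python
import networkx as nx

# Toy allocation: 80 units of reservoir water, two users each demanding 60.
# A "slack" source supplies the shortfall at zero value, so shortage lands
# on whichever user the optimization values least.
G = nx.DiGraph()
G.add_node("reservoir", demand=-80)     # negative demand = supply
G.add_node("slack", demand=-40)         # unmet-demand (shortage) source
G.add_node("agriculture", demand=60)
G.add_node("urban", demand=60)
G.add_edge("reservoir", "agriculture", capacity=60, weight=-50)   # value $/unit
G.add_edge("reservoir", "urban", capacity=60, weight=-200)
G.add_edge("slack", "agriculture", capacity=60, weight=0)
G.add_edge("slack", "urban", capacity=60, weight=0)

cost, flow = nx.network_simplex(G)
# Urban demand (valued at 200/unit) is met in full; agriculture takes the
# shortage, receiving 20 units of real water and 40 of "slack".
```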
Sakurai Prize: Extended Higgs Sectors--phenomenology and future prospects
NASA Astrophysics Data System (ADS)
Gunion, John
2017-01-01
The discovery of a spin-0 state at 125 GeV with properties close to those predicted for the single Higgs boson of the Standard Model does not preclude the existence of additional Higgs bosons. In this talk, models with extended Higgs sectors are reviewed, including two-Higgs-doublet models with and without an extra singlet Higgs field and supersymmetric models. Special emphasis is given to the limit in which the couplings and properties of one of the Higgs bosons of the extended Higgs sector are very close to those predicted for the single Standard Model Higgs boson while the other Higgs bosons are relatively light, perhaps even having masses close to or below the SM-like 125 GeV state. Constraints on this type of scenario given existing data are summarized and prospects for observing these non-SM-like Higgs bosons are discussed. Supported by the Department of Energy.
Predicted carbonation of existing concrete building based on the Indonesian tropical micro-climate
NASA Astrophysics Data System (ADS)
Hilmy, M.; Prabowo, H.
2018-03-01
This paper aims to predict carbonation progress based on a previously published mathematical model. It briefly explains the nature of carbonation, including its processes and effects. The environmental humidity and temperature of an existing concrete building are measured and compared with data from the local Meteorological, Climatological, and Geophysical Agency. The data obtained are expressed as annual hygrothermal values, which are used as input parameters in the carbonation model. The physical properties of the observed building, such as its location, dimensions, and structural materials, are quantified. These data are then used as important input parameters for the carbonation coefficients. The relationships between relative humidity and the rate of carbonation are established. The results can provide a basis for the repair and maintenance of existing concrete buildings and for their service-life analysis.
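The classical model behind most such predictions is the square-root-of-time law x(t) = K sqrt(t), where the carbonation coefficient K bundles the concrete and hygrothermal parameters the paper derives. A minimal sketch, with an assumed K value:

```python
import math

# Square-root-of-time carbonation model: depth x(t) = K * sqrt(t).
# K (mm/yr^0.5) is assumed here for illustration; the paper's contribution
# is deriving K from measured tropical hygrothermal conditions.
def carbonation_depth(k_mm_per_sqrt_yr, t_years):
    return k_mm_per_sqrt_yr * math.sqrt(t_years)

def years_to_reach_cover(k_mm_per_sqrt_yr, cover_mm):
    """Service-life estimate: time for the carbonation front to reach the
    reinforcement cover depth, i.e. t = (cover / K)^2."""
    return (cover_mm / k_mm_per_sqrt_yr) ** 2

K = 4.0                                    # mm/yr^0.5, assumed
depth_25 = carbonation_depth(K, 25.0)      # depth after 25 years
life = years_to_reach_cover(K, 30.0)       # years until a 30 mm cover is reached
```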
Business Models for Cost Sharing & Capability Sustainment
2012-08-18
digital technology into existing mechanical products and their supporting processes can only work correctly if the firm carrying it out changes its entire...
Elliptically Framed Tip-Tilt Mirror Optimized for Stellar Tracking
2015-01-01
a rotating frame. We used the same materials as the existing tracker; however, we light-weighted both the aluminum frame and the Zerodur® mirror. We generated a computer-aided design model, converted it... Components include an aluminum yoke and ring, a glass Zerodur® mirror, piezoelectric (PZT) actuators, and stainless steel flexure pivot bearings.
Building America Energy Renovations. A Business Case for Home Performance Contracting
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baechler, Michael C.; Antonopoulos, C. A.; Sevigny, M.
2012-10-01
This research report gives an overview of the needs and opportunities that exist in the U.S. home performance contracting industry. The report discusses industry trends, market drivers, different business models, and points of entry for existing and new businesses hoping to enter the home performance contracting industry. Case studies of eight companies who successfully entered the industry are provided, including business metrics, start-up costs, and marketing approaches.
An integrated decision support system for TRAC: A proposal
NASA Technical Reports Server (NTRS)
Mukkamala, Ravi
1991-01-01
Optimal allocation and usage of resources is a key to effective management. Resources of concern to TRAC are: Manpower (PSY), Money (Travel, contracts), Computing, Data, Models, etc. Management activities of TRAC include: Planning, Programming, Tasking, Monitoring, Updating, and Coordinating. Existing systems are insufficient, not completely automated, and manpower intensive, and the potential for data inconsistency exists. A system is proposed that would integrate all project management activities of TRAC through the development of sophisticated software and by utilizing the existing computing systems and network resources. The systems integration proposal is examined in detail.
Damage identification using inverse methods.
Friswell, Michael I
2007-02-15
This paper gives an overview of the use of inverse methods in damage detection and location, using measured vibration data. Inverse problems require the use of a model and the identification of uncertain parameters of this model. Damage is often local in nature and although the effect of the loss of stiffness may require only a small number of parameters, the lack of knowledge of the location means that a large number of candidate parameters must be included. This paper discusses a number of problems that exist with this approach to health monitoring, including modelling error, environmental effects, damage localization and regularization.
Colliding Stellar Wind Models with Orbital Motion
NASA Astrophysics Data System (ADS)
Wilkin, Francis P.; O'Connor, Brendan
2018-01-01
We present thin-shell models for the collision between two ballistic stellar winds, including orbital motion. The stellar orbits are assumed circular, so that steady-state solutions exist in the rotating frame, where we include centrifugal and Coriolis forces. Exact solutions for the pre-shock winds are incorporated. Here we discuss 2-D model results for equal wind momentum-loss rates, although we allow the winds to have distinct speeds and mass-loss rates. For these unequal wind conditions, we obtain a clear violation of skew-symmetry, despite equal momentum-loss rates, due to the Coriolis force.
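For orientation, the zeroth-order ingredient of any such model is the ram-pressure balance between the two winds. The sketch below gives the standard stagnation-point distance for two spherical ballistic winds with orbital motion neglected; parameter values are illustrative.

```python
import math

# Ram-pressure balance point for two spherically symmetric ballistic winds
# (orbital motion neglected), the usual starting point for thin-shell
# colliding-wind models.
def stagnation_distance(d_sep, mdot1, v1, mdot2, v2):
    """Distance of the stagnation point from star 1 along the line of
    centers: r1 = D / (1 + sqrt(eta)), with wind momentum-flux ratio
    eta = (Mdot2 * v2) / (Mdot1 * v1)."""
    eta = (mdot2 * v2) / (mdot1 * v1)
    return d_sep / (1.0 + math.sqrt(eta))

# Equal momentum fluxes but distinct speeds and mass-loss rates, as in the
# abstract: Mdot1*v1 == Mdot2*v2 puts the shell midway between the stars.
r1 = stagnation_distance(1.0, 2e-6, 1000.0, 1e-6, 2000.0)
```

With orbital motion included, as in the paper, the Coriolis force skews the shell away from this symmetric configuration even when the momentum fluxes are equal.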
Artificial Immune Algorithm for Subtask Industrial Robot Scheduling in Cloud Manufacturing
NASA Astrophysics Data System (ADS)
Suma, T.; Murugesan, R.
2018-04-01
The current generation of manufacturing industry requires an intelligent scheduling model to achieve effective utilization of distributed manufacturing resources, which motivated us to work on an artificial immune algorithm for subtask industrial robot scheduling in cloud manufacturing. This scheduling model enables collaborative work between industrial robots in different manufacturing centers. This paper discusses two optimization objectives: minimizing cost and balancing the load of industrial robots through scheduling. To solve these scheduling problems, we use an algorithm based on the artificial immune system. The parameters are simulated in MATLAB, and the results are compared with existing algorithms, showing better performance.
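A minimal clonal-selection loop, the core of many artificial immune algorithms, can be sketched as follows. The encoding, cost function, and parameters are illustrative stand-ins for the paper's subtask-to-robot scheduling model.

```python
import random

def clonal_selection(cost, dim, pop=20, n_best=5, clones=4, gens=100,
                     mut=0.3, seed=1):
    """Toy clonal-selection minimizer: clone the best antibodies with
    rank-dependent hypermutation, refresh the rest (receptor editing)."""
    rng = random.Random(seed)
    ab = [[rng.random() for _ in range(dim)] for _ in range(pop)]
    for _ in range(gens):
        ab.sort(key=cost)                       # best (lowest cost) first
        new = ab[:n_best]                       # elites survive unchanged
        for rank, a in enumerate(ab[:n_best]):
            for _ in range(clones):
                # hypermutation: better-ranked antibodies mutate less
                new.append([x + rng.gauss(0, mut * (rank + 1) / n_best)
                            for x in a])
        while len(new) < pop:                   # receptor editing: fresh cells
            new.append([rng.random() for _ in range(dim)])
        ab = new
    return min(ab, key=cost)

# Toy load-balance objective: squared deviation of per-robot loads from the
# mean (zero when the load is perfectly balanced).
def load_imbalance(x):
    m = sum(x) / len(x)
    return sum((xi - m) ** 2 for xi in x)

best = clonal_selection(load_imbalance, dim=4)
```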
Learning Aggregation Operators for Preference Modeling
NASA Astrophysics Data System (ADS)
Torra, Vicenç
Aggregation operators are useful tools for modeling preferences. Such operators include weighted means, OWA and WOWA operators, as well as some fuzzy integrals, e.g. Choquet and Sugeno integrals. To apply these operators in an effective way, their parameters have to be properly defined. In this chapter, we review some of the existing tools for learning these parameters from examples.
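A tiny sketch helps fix ideas: an OWA operator applies its weight vector to the sorted arguments, so a single weight vector interpolates between max, mean, and min. The example values are arbitrary.

```python
import numpy as np

def owa(values, weights):
    """Ordered weighted averaging: weights attach to positions in the
    descending sort of the inputs, not to particular arguments."""
    v = np.sort(np.asarray(values, dtype=float))[::-1]
    w = np.asarray(weights, dtype=float)
    assert len(v) == len(w) and np.isclose(w.sum(), 1.0)
    return float(v @ w)

x = [0.2, 0.9, 0.5]
top = owa(x, [1, 0, 0])            # all weight on the largest -> max
bottom = owa(x, [0, 0, 1])         # all weight on the smallest -> min
mean = owa(x, [1 / 3, 1 / 3, 1 / 3])   # uniform weights -> plain mean
```

A weighted mean, by contrast, attaches weights to particular arguments; WOWA combines both views, and learning any of these weight vectors from examples is the subject of the chapter.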
ERIC Educational Resources Information Center
Herman, Melissa R.
This paper describes the achievement patterns of a sample of 1,492 multiracial high school students and examines how their achievement fits into existing theoretical models that explain monoracial differences in achievement. These theoretical models include status attainment, parenting style, oppositional culture, and educational attitudes. The…
Cellular-based modeling of oscillatory dynamics in brain networks.
Skinner, Frances K
2012-08-01
Oscillatory, population activities have long been known to occur in our brains during different behavioral states. We know that many different cell types exist and that they contribute in distinct ways to the generation of these activities. I review recent papers that involve cellular-based models of brain networks, most of which include theta, gamma and sharp wave-ripple activities. To help organize the modeling work, I present it from a perspective of three different types of cellular-based modeling: 'Generic', 'Biophysical' and 'Linking'. Cellular-based modeling is taken to encompass the four features of experiment, model development, theory/analyses, and model usage/computation. The three modeling types are shown to include these features and interactions in different ways. Copyright © 2012 Elsevier Ltd. All rights reserved.
Towards a Framework for Modeling Space Systems Architectures
NASA Technical Reports Server (NTRS)
Shames, Peter; Skipper, Joseph
2006-01-01
Topics covered include: 1) Statement of the problem: a) Space system architecture is complex; b) Existing terrestrial approaches must be adapted for space; c) Need a common architecture methodology and information model; d) Need appropriate set of viewpoints. 2) Requirements on a space systems model. 3) Model Based Engineering and Design (MBED) project: a) Evaluated different methods; b) Adapted and utilized RASDS & RM-ODP; c) Identified useful set of viewpoints; d) Did actual model exchanges among selected subset of tools. 4) Lessons learned & future vision.
Bohme, Andrea; van Rienen, Ursula
2016-08-01
Computational modeling of the stimulating field distribution during Deep Brain Stimulation provides an opportunity to advance our knowledge of this neurosurgical therapy for Parkinson's disease. Several approaches exist to model the target region for Deep Brain Stimulation in hemi-parkinsonian rats with volume conductor models. We describe and compare the normalized mapping approach as well as modeling with three-dimensional structures, which include curvilinear coordinates to ensure an anatomically realistic conductivity tensor orientation.
1976-03-01
including the Advanced Prediction Model for the global atmosphere, as well as very fine grid cloud models and cloud probability models. Some of the new requirements that will be supported with this system are... with the mapping and gridding function (input and output)? Should the capability exist to interface raw ungridded data with the SID interface
Muon g - 2 in the aligned two Higgs doublet model
Han, Tao; Kang, Sin Kyu; Sayre, Joshua
2016-02-16
In this paper, we study the Two-Higgs-Doublet Model with the aligned Yukawa sector (A2HDM) in light of the observed excess measured in the muon anomalous magnetic moment. We take into account the existing theoretical and experimental constraints with up-to-date values and demonstrate that a phenomenologically interesting region of parameter space exists. With a detailed parameter scan, we show a much larger region of viable parameter space in this model beyond the limiting case of the Type X 2HDM as obtained before. It features the existence of light scalar states with masses 3 GeV ≲ m_H ≲ 50 GeV, or 10 GeV ≲ m_A ≲ 130 GeV, with enhanced couplings to tau leptons. The charged Higgs boson is typically heavier, with 200 GeV ≲ m_H+ ≲ 630 GeV. The surviving parameter space is forced into the CP-conserving limit by EDM constraints. Some Standard Model observables may be significantly modified, including a possible new decay mode of the SM-like Higgs boson to four taus. Lastly, we comment on future measurements and direct searches for those effects at the LHC as tests of the model.
Allstadt, Kate E.; Thompson, Eric M.; Hearne, Mike; Nowicki Jessee, M. Anna; Zhu, J.; Wald, David J.; Tanyas, Hakan
2017-01-01
The U.S. Geological Survey (USGS) has made significant progress toward the rapid estimation of shaking and shaking-related losses through their Did You Feel It? (DYFI), ShakeMap, ShakeCast, and PAGER products. However, quantitative estimates of the extent and severity of secondary hazards (e.g., landsliding, liquefaction) are not currently included in scenarios and real-time post-earthquake products despite their significant contributions to hazard and losses for many events worldwide. We are currently running parallel global statistical models for landslides and liquefaction developed with our collaborators in testing mode, but much work remains in order to operationalize these systems. We are expanding our efforts in this area by not only improving the existing statistical models, but also by (1) exploring more sophisticated, physics-based models where feasible; (2) incorporating uncertainties; and (3) identifying and undertaking research and product development to provide useful landslide and liquefaction estimates and their uncertainties. Although our existing models use standard predictor variables that are accessible globally or regionally, including peak ground motions, topographic slope, and distance to water bodies, we continue to explore readily available proxies for rock and soil strength as well as other susceptibility terms. This work is based on the foundation of an expanding, openly available, case-history database we are compiling along with historical ShakeMaps for each event. The expected outcome of our efforts is a robust set of real-time secondary hazards products that meet the needs of a wide variety of earthquake information users. We describe the available datasets and models, developments currently underway, and anticipated products.
A Nationwide Survey of Patient Centered Medical Home Demonstration Projects
Bitton, Asaf; Martin, Carina
2010-01-01
Background The patient centered medical home has received considerable attention as a potential way to improve primary care quality and limit cost growth. Little information exists that systematically compares PCMH pilot projects across the country. Design Cross-sectional key-informant interviews. Participants Leaders from existing PCMH demonstration projects with external payment reform. Measurements We used a semi-structured interview tool with the following domains: project history, organization and participants, practice requirements and selection process, medical home recognition, payment structure, practice transformation, and evaluation design. Results A total of 26 demonstrations in 18 states were interviewed. Current demonstrations include over 14,000 physicians caring for nearly 5 million patients. A majority of demonstrations are single payer, and most utilize a three component payment model (traditional fee for service, per person per month fixed payments, and bonus performance payments). The median incremental revenue per physician per year was $22,834 (range $720 to $91,146). Two major practice transformation models were identified—consultative and implementation of the chronic care model. A majority of demonstrations did not have well-developed evaluation plans. Conclusion Current PCMH demonstration projects with external payment reform include large numbers of patients and physicians as well as a wide spectrum of implementation models. Key questions exist around the adequacy of current payment mechanisms and evaluation plans as public and policy interest in the PCMH model grows. Electronic supplementary material The online version of this article (doi:10.1007/s11606-010-1262-8) contains supplementary material, which is available to authorized users. PMID:20467907
Advanced Waveform Simulation for Seismic Monitoring
2008-09-01
velocity model. The method separates the main arrivals of the regional waveform into 5 windows: Pnl (vertical and radial components), Rayleigh (vertical and...ranges out to 10°, including extensive observations of crustal thinning and thickening and various Pnl complexities. Broadband modeling in 1D, 2D...existing models perform in predicting the various regional phases, Rayleigh waves, Love waves, and Pnl waves. Previous events from this Basin-and-Range
40 CFR 60.5095 - What must I include in the notifications of achievement of increments of progress?
Code of Federal Regulations, 2011 CFR
2011-07-01
... ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) STANDARDS OF PERFORMANCE FOR NEW STATIONARY SOURCES Emission Guidelines and Compliance Times for Existing Sewage Sludge Incineration Units Model Rule...
Worker experiences of accessibility in post-Katrina New Orleans.
DOT National Transportation Integrated Search
2013-06-01
Existing research has identified transportation challenges that low-income workers face, including a spatial mismatch between suburban entry-level jobs and urban low-income workers. These studies rely on travel models and secondary data and thus ...
Wang, Shu-Mi; Lai, Chien-Yu
2010-04-01
This article describes a nurse's experience using Neuman's Systems Model to care for a chronic psychiatric patient and his caregiver. The patient was diagnosed as suffering from neuroleptic malignant syndrome (NMS). The nursing care described in this article was administered from October 23 to December 4, 2007. The patient developed NMS in the third month of a three-month period of hospitalization, which endangered his life as well as the health of his caregiver. Nursing care was provided to the patient and his caregiver based on Neuman's Systems Model, which included assessments of intrapersonal, interpersonal, and extra-personal forces as well as of environmental factors affecting the health of the patient and his caregiver. The four nursing care issues identified were: self-care deficit, sensory/perceptual alteration, sleep pattern disturbance, and caregiver role strain. Following Neuman's Systems Model, primary, secondary, and tertiary prevention were used to strengthen the flexible lines of defense and internal lines of resistance, support the existing strengths of both patient and caregiver, and conserve client system energy. Significant improvements in patient and caregiver abilities were apparent in the nursing intervention outcomes. This experience shows Neuman's Systems Model to be an effective model for psychiatric nursing care.
Sallam, Karim; Li, Yingxin; Sager, Philip T; Houser, Steven R; Wu, Joseph C
2015-06-05
Sudden cardiac death is a common cause of death in patients with structural heart disease, genetic mutations, or acquired disorders affecting cardiac ion channels. A wide range of platforms exist to model and study disorders associated with sudden cardiac death. Human clinical studies are cumbersome and are thwarted by the extent of investigation that can be performed on human subjects. Animal models are limited by their degree of homology to human cardiac electrophysiology, including ion channel expression. Most commonly used cellular models are cellular transfection models, which are able to mimic the expression of a single-ion channel offering incomplete insight into changes of the action potential profile. Induced pluripotent stem cell-derived cardiomyocytes resemble, but are not identical, adult human cardiomyocytes and provide a new platform for studying arrhythmic disorders leading to sudden cardiac death. A variety of platforms exist to phenotype cellular models, including conventional and automated patch clamp, multielectrode array, and computational modeling. Induced pluripotent stem cell-derived cardiomyocytes have been used to study long QT syndrome, catecholaminergic polymorphic ventricular tachycardia, hypertrophic cardiomyopathy, and other hereditary cardiac disorders. Although induced pluripotent stem cell-derived cardiomyocytes are distinct from adult cardiomyocytes, they provide a robust platform to advance the science and clinical care of sudden cardiac death. © 2015 American Heart Association, Inc.
Welbourn, Richard; Dixon, John; Barth, Julian H; Finer, Nicholas; Hughes, Carly A; le Roux, Carel W; Wass, John
2016-03-01
Despite increasing prevalence of obesity, no country has successfully implemented comprehensive pathways to provide advice to all the severely obese patients that seek treatment. We aimed to formulate pathways for referral into and out of weight assessment and management clinics (WAMCs) that include internal medicine/primary care physicians as part of a multidisciplinary team that could provide specialist advice and interventions, including referral for bariatric surgery. Using a National Institute of Health and Care Excellence (NICE)-accredited process, a Guidance Development Group conducted a literature search identifying existing WAMCs. As very few examples of effective structures and clinical pathways existed, the current evidence base for optimal assessment and management of bariatric surgery patients was used to reach a consensus. The model we describe could be adopted internationally by health services to manage severely obese patients.
NASA Astrophysics Data System (ADS)
Adrich, Przemysław
2016-05-01
In Part I of this work existing methods and problems in dual foil electron beam forming system design are presented. On this basis, a new method of designing these systems is introduced. The motivation behind this work is to eliminate the shortcomings of the existing design methods and improve overall efficiency of the dual foil design process. The existing methods are based on approximate analytical models applied in an unrealistically simplified geometry. Designing a dual foil system with these methods is a rather labor intensive task as corrections to account for the effects not included in the analytical models have to be calculated separately and accounted for in an iterative procedure. To eliminate these drawbacks, the new design method is based entirely on Monte Carlo modeling in a realistic geometry and using physics models that include all relevant processes. In our approach, an optimal configuration of the dual foil system is found by means of a systematic, automatized scan of the system performance in function of parameters of the foils. The new method, while being computationally intensive, minimizes the involvement of the designer and considerably shortens the overall design time. The results are of high quality as all the relevant physics and geometry details are naturally accounted for. To demonstrate the feasibility of practical implementation of the new method, specialized software tools were developed and applied to solve a real life design problem, as described in Part II of this work.
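The systematic, automated scan the new method performs can be skeletonized as follows. The Gaussian `simulate_profile` below is a toy stand-in for the real Monte Carlo transport run, and the min/max flatness figure of merit is one simple choice; in practice the merit function would also weigh transmitted beam energy and foil heating, and each evaluation would be a full simulation.

```python
import numpy as np

def simulate_profile(t1_mm, t2_mm, x_cm):
    """Toy stand-in for a Monte Carlo dose-profile calculation: thicker
    foils scatter more, broadening a Gaussian profile. Coefficients are
    arbitrary placeholders, not transport physics."""
    sigma = 2.0 + 8.0 * t1_mm + 20.0 * t2_mm
    return np.exp(-0.5 * (x_cm / sigma) ** 2)

def flatness(profile):
    """Dose uniformity over the scan window: min/max ratio (1 = flat)."""
    return profile.min() / profile.max()

x = np.linspace(-10, 10, 101)                 # irradiation window, cm
# The scan itself: evaluate the figure of merit over a grid of
# (primary, secondary) foil thicknesses and keep the best pair.
best = max(
    ((t1, t2) for t1 in np.linspace(0.1, 1.0, 10)
              for t2 in np.linspace(0.1, 2.0, 10)),
    key=lambda p: flatness(simulate_profile(p[0], p[1], x)),
)
```

With this toy merit function thicker is always flatter, so the scan trivially picks the thickest pair; a realistic merit function introduces the trade-off that makes the scan worthwhile.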
Dynamics of a Class of HIV Infection Models with Cure of Infected Cells in Eclipse Stage.
Maziane, Mehdi; Lotfi, El Mehdi; Hattaf, Khalid; Yousfi, Noura
2015-12-01
In this paper, we propose two HIV infection models with a specific nonlinear incidence rate by including a class of infected cells in the eclipse phase. The first model is described by ordinary differential equations (ODEs) and generalizes a set of previously existing models and their results. The second model extends our ODE model by taking into account the diffusion of the virus. Furthermore, the global stability of both models is investigated by constructing suitable Lyapunov functionals. Finally, we verify our theoretical results with numerical simulations.
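A minimal numerical sketch of an ODE model in this family follows; the compartments (uninfected cells x, eclipse-stage cells e that can be "cured" back to x at rate rho, productively infected cells y, free virus v) are generic, and all parameter values are illustrative assumptions, not those of the paper:

```python
def step(state, p, dt):
    """One forward-Euler step of the four-compartment system."""
    x, e, y, v = state
    dx = p["lam"] - p["d"] * x - p["beta"] * x * v + p["rho"] * e
    de = p["beta"] * x * v - (p["s"] + p["rho"] + p["m"]) * e
    dy = p["s"] * e - p["a"] * y
    dv = p["k"] * y - p["c"] * v
    return (x + dt * dx, e + dt * de, y + dt * dy, v + dt * dv)

def simulate(state, p, dt=0.01, t_end=100.0):
    for _ in range(int(t_end / dt)):
        state = step(state, p, dt)
    return state

# Illustrative rates: lam = cell production, d = natural death,
# beta = infection, rho = cure of eclipse-stage cells, s = transition to
# productive infection, m = eclipse-stage death, a = infected-cell death,
# k = virion production, c = viral clearance.
params = dict(lam=10.0, d=0.1, beta=0.001, rho=0.05, s=0.3, m=0.02,
              a=0.5, k=20.0, c=3.0)
final = simulate((100.0, 0.0, 1.0, 5.0), params)
print(final)   # (uninfected, eclipse, infected, virus) after 100 days
```

A production analysis would use a stiff ODE integrator and the paper's nonlinear incidence term; the Euler loop is only to make the compartment structure concrete.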
A survey of Applied Psychological Services' models of the human operator
NASA Technical Reports Server (NTRS)
Siegel, A. I.; Wolf, J. J.
1979-01-01
A historical perspective is presented on the major features and status of two families of computer simulation models in which the human operator plays the primary role. Both task-oriented and message-oriented models are included. Two other recent efforts are summarized which deal with visual information processing. These involve not the development of whole models but families of subroutines customized to add human aspects to existing models. A global diagram of the generalized model development/validation process is presented and related to 15 criteria for model evaluation.
Moral judgment as information processing: an integrative review.
Guglielmo, Steve
2015-01-01
How do humans make moral judgments about others' behavior? This article reviews dominant models of moral judgment, organizing them within an overarching framework of information processing. This framework poses two distinct questions: (1) What input information guides moral judgments? and (2) What psychological processes generate these judgments? Information Models address the first question, identifying critical information elements (including causality, intentionality, and mental states) that shape moral judgments. A subclass of Biased Information Models holds that perceptions of these information elements are themselves driven by prior moral judgments. Processing Models address the second question, and existing models have focused on the relative contribution of intuitive versus deliberative processes. This review organizes existing moral judgment models within this framework and critically evaluates them on empirical and theoretical grounds; it then outlines a general integrative model grounded in information processing, and concludes with conceptual and methodological suggestions for future research. The information-processing framework provides a useful theoretical lens through which to organize extant and future work in the rapidly growing field of moral judgment.
Models and techniques for evaluating the effectiveness of aircraft computing systems
NASA Technical Reports Server (NTRS)
Meyer, J. F.
1978-01-01
The development of system models that can provide a basis for the formulation and evaluation of aircraft computer system effectiveness, the formulation of quantitative measures of system effectiveness, and the development of analytic and simulation techniques for evaluating the effectiveness of a proposed or existing aircraft computer are described. Specific topics covered include: system models; performability evaluation; capability and functional dependence; computation of trajectory set probabilities; and hierarchical modeling of an air transport mission.
Computational Fluid Dynamics Modeling of Nickel Hydrogen Batteries
NASA Technical Reports Server (NTRS)
Cullion, R.; Gu, W. B.; Wang, C. Y.; Timmerman, P.
2000-01-01
An electrochemical Ni-H2 battery model has been expanded to include thermal effects. A thermal energy conservation equation was derived from first principles. An electrochemical and thermal coupled model was created by the addition of this equation to an existing multiphase, electrochemical model. Charging at various rates was investigated and the results validated against experimental data. Reaction currents, pressure changes, temperature profiles, and concentration variations within the cell are predicted numerically and compared with available data and theory.
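The coupling of heat generation and loss can be illustrated with a lumped-capacitance energy balance; the actual model resolves spatial temperature profiles from first principles, and every number below (current, resistance, reversible heat, heat-transfer coefficient, heat capacity) is an assumed placeholder rather than Ni-H2 cell data:

```python
def temperature_history(i_amp=5.0, r_ohm=0.02, q_rev=0.1, h_a=0.5,
                        t_amb=25.0, m_cp=800.0, dt=1.0, steps=3600):
    """Explicit integration of m*cp*dT/dt = I^2*R + Q_rev - h*A*(T - T_amb).
    All parameter values are illustrative assumptions."""
    t = t_amb
    temps = [t]
    for _ in range(steps):
        q_gen = i_amp ** 2 * r_ohm + q_rev      # Joule + reversible heat, W
        q_loss = h_a * (t - t_amb)              # convective loss, W
        t += dt * (q_gen - q_loss) / m_cp       # dt in s, m_cp in J/K
        temps.append(t)
    return temps

temps = temperature_history()
# Temperature rises toward the steady value T_amb + (I^2*R + Q_rev)/(h*A).
print(temps[-1])
```

Adding such an energy equation to an electrochemical model is what couples the predicted reaction currents to the temperature profiles mentioned in the abstract.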
Mejlholm, Ole; Bøknæs, Niels; Dalgaard, Paw
2015-02-01
A new stochastic model for the simultaneous growth of Listeria monocytogenes and lactic acid bacteria (LAB) was developed and validated on data from naturally contaminated samples of cold-smoked Greenland halibut (CSGH) and cold-smoked salmon (CSS). Acetic and/or lactic acid was added to these samples during industrial processing. The stochastic model was developed from an existing deterministic model including the effect of 12 environmental parameters and microbial interaction (O. Mejlholm and P. Dalgaard, Food Microbiology, submitted for publication). Observed maximum population density (MPD) values of L. monocytogenes in naturally contaminated samples of CSGH and CSS were accurately predicted by the stochastic model based on measured variability in product characteristics and storage conditions. Results comparable to those from the stochastic model were obtained when product characteristics of the least and most preserved samples of CSGH and CSS were used as input for the existing deterministic model. For both modelling approaches, it was shown that lag time and the effect of microbial interaction need to be included to accurately predict MPD values of L. monocytogenes. Addition of organic acids to CSGH and CSS was confirmed as a suitable mitigation strategy against the risk of growth of L. monocytogenes, as both types of products were in compliance with the EU regulation on ready-to-eat foods. Copyright © 2014 Elsevier Ltd. All rights reserved.
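The stochastic approach, propagating measured variability in product characteristics through a growth model, can be sketched with a two-factor toy secondary model standing in for the authors' 12-parameter model; the cardinal values, distributions, and units below are illustrative assumptions only:

```python
import random

def gamma_term(value, minimum, optimum):
    """Cardinal-parameter term: 0 at the growth limit, rising toward 1."""
    if value <= minimum:
        return 0.0
    return min((value - minimum) / (optimum - minimum), 1.0)

def growth_rate(temp_c, ph, mu_opt=0.5):
    """Toy two-factor secondary model (log10 CFU/g per day, toy units)."""
    return mu_opt * gamma_term(temp_c, -1.2, 37.0) * gamma_term(ph, 4.4, 7.0)

def simulate_final_counts(n=5000, days=21, n0=1.0, cap=7.5, seed=7):
    """Monte Carlo over assumed variability in storage and product."""
    rng = random.Random(seed)
    finals = []
    for _ in range(n):
        temp = rng.gauss(5.0, 1.0)   # storage temperature, deg C
        ph = rng.gauss(6.1, 0.15)    # product pH
        finals.append(min(n0 + growth_rate(temp, ph) * days, cap))
    return finals

finals = simulate_final_counts()
print(f"mean {sum(finals) / len(finals):.2f}, "
      f"worst case {max(finals):.2f} log10 CFU/g")
```

The distribution of simulated final counts, rather than a single point prediction, is what allows a stochastic model to be compared against the spread of observed MPD values.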
NASA Astrophysics Data System (ADS)
Riley, W. J.; Maggi, F. M.; Kleber, M.; Torn, M. S.; Tang, J. Y.; Dwivedi, D.; Guerry, N.
2014-01-01
Accurate representation of soil organic matter (SOM) dynamics in Earth System Models is critical for future climate prediction, yet large uncertainties exist regarding how, and to what extent, the suite of proposed relevant mechanisms should be included. To investigate how various mechanisms interact to influence SOM storage and dynamics, we developed a SOM reaction network integrated in a one-dimensional, multi-phase, and multi-component reactive transport solver. The model includes representations of bacterial and fungal activity, multiple archetypal polymeric and monomeric carbon substrate groups, aqueous chemistry, aqueous advection and diffusion, gaseous diffusion, and adsorption (and protection) and desorption from the soil mineral phase. The model predictions reasonably matched observed depth-resolved SOM and dissolved organic carbon (DOC) stocks in grassland ecosystems as well as lignin content and fungi to aerobic bacteria ratios. We performed a suite of sensitivity analyses under equilibrium and dynamic conditions to examine the role of dynamic sorption, microbial assimilation rates, and carbon inputs. To our knowledge, observations do not exist to fully test such a complicated model structure or to test the hypotheses used to explain observations of substantial storage of very old SOM below the rooting depth. Nevertheless, we demonstrated that a reasonable combination of sorption parameters, microbial biomass and necromass dynamics, and advective transport can match observations without resorting to an arbitrary depth-dependent decline in SOM turnover rates, as is often done. We conclude that, contrary to assertions derived from existing turnover time based model formulations, observed carbon content and δ14C vertical profiles are consistent with a representation of SOM dynamics consisting of (1) carbon compounds without designated intrinsic turnover times, (2) vertical aqueous transport, and (3) dynamic protection on mineral surfaces.
NASA Astrophysics Data System (ADS)
Riley, W. J.; Maggi, F.; Kleber, M.; Torn, M. S.; Tang, J. Y.; Dwivedi, D.; Guerry, N.
2014-07-01
Accurate representation of soil organic matter (SOM) dynamics in Earth system models is critical for future climate prediction, yet large uncertainties exist regarding how, and to what extent, the suite of proposed relevant mechanisms should be included. To investigate how various mechanisms interact to influence SOM storage and dynamics, we developed an SOM reaction network integrated in a one-dimensional, multi-phase, and multi-component reactive transport solver. The model includes representations of bacterial and fungal activity, multiple archetypal polymeric and monomeric carbon substrate groups, aqueous chemistry, aqueous advection and diffusion, gaseous diffusion, and adsorption (and protection) and desorption from the soil mineral phase. The model predictions reasonably matched observed depth-resolved SOM and dissolved organic matter (DOM) stocks and fluxes, lignin content, and fungi to aerobic bacteria ratios. We performed a suite of sensitivity analyses under equilibrium and dynamic conditions to examine the role of dynamic sorption, microbial assimilation rates, and carbon inputs. To our knowledge, observations do not exist to fully test such a complicated model structure or to test the hypotheses used to explain observations of substantial storage of very old SOM below the rooting depth. Nevertheless, we demonstrated that a reasonable combination of sorption parameters, microbial biomass and necromass dynamics, and advective transport can match observations without resorting to an arbitrary depth-dependent decline in SOM turnover rates, as is often done. We conclude that, contrary to assertions derived from existing turnover time based model formulations, observed carbon content and Δ14C vertical profiles are consistent with a representation of SOM consisting of carbon compounds with relatively fast reaction rates, vertical aqueous transport, and dynamic protection on mineral surfaces.
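The central claim above, that vertical transport plus a constant, depth-independent turnover rate reproduces declining carbon with depth, can be illustrated with a one-dimensional upwind advection-decay scheme; the grid spacing, velocity, rate constant, and input flux are all assumed values, and sorption is omitted for brevity:

```python
def advect_decay_step(c, dz, dt, v, k, input_top):
    """One explicit upwind step of dC/dt = -v dC/dz - k*C on a vertical
    grid, with a constant carbon input flux into the top cell."""
    new = []
    for i, ci in enumerate(c):
        upstream = c[i - 1] if i > 0 else 0.0
        adv = -v * (ci - upstream) / dz
        new.append(ci + dt * (adv - k * ci))
    new[0] += dt * input_top / dz
    return new

# Constant turnover rate k at every depth; the decline of carbon with
# depth emerges from transport alone, not from depth-dependent rates.
dz, dt, v, k = 0.1, 1.0, 0.001, 0.01      # m, yr, m/yr, 1/yr (assumed)
profile = [0.0] * 20
for _ in range(5000):
    profile = advect_decay_step(profile, dz, dt, v, k, input_top=1.0)
print([round(x, 1) for x in profile[:5]])  # monotonically declining
```

At steady state each cell holds a fixed fraction (v/dz)/(v/dz + k) of the cell above it, so the profile decays roughly exponentially with depth even though the intrinsic turnover rate never changes.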
Numerical Modeling of River Ice Processes on the Lower Nelson River
NASA Astrophysics Data System (ADS)
Malenchak, Jarrod Joseph
Water resource infrastructure in cold regions of the world can be significantly impacted by the existence of river ice. Major engineering concerns related to river ice include ice jam flooding, the design and operation of hydropower facilities and other hydraulic structures, water supplies, as well as ecological, environmental, and morphological effects. The use of numerical simulation models has been identified as one of the most efficient means by which river ice processes can be studied and the effects of river ice evaluated. The continued advancement of these simulation models will help to develop new theories and evaluate potential mitigation alternatives for these ice issues. In this thesis, a literature review of existing river ice numerical models, of anchor ice formation and modeling studies, and of aufeis formation and modeling studies is conducted. A high-level summary of the two-dimensional CRISSP numerical model is presented, as well as the developed freeze-up model, with a focus specifically on the anchor ice and aufeis growth processes. This model includes development in the detailed heat transfer calculations, an improved surface ice mass exchange model which includes the rapids entrainment process, and an improved dry bed treatment model along with the expanded anchor ice and aufeis growth model. The developed sub-models are tested in an ideal channel setting as a partial confirmation of the model. A case study of significant anchor ice and aufeis growth on the Nelson River in northern Manitoba, Canada, will be the primary field test case for the anchor ice and aufeis model. A second case study on the same river will be used to evaluate the surface ice components of the model in a field setting. The results from these case studies will be used to highlight the capabilities and deficiencies in the numerical model and to identify areas of further research and model development.
Aerosol Modeling for the Global Model Initiative
NASA Technical Reports Server (NTRS)
Weisenstein, Debra K.; Ko, Malcolm K. W.
2001-01-01
The goal of this project is to develop an aerosol module to be used within the framework of the Global Modeling Initiative (GMI). The model development work will be performed jointly by the University of Michigan and AER, using existing aerosol models at the two institutions as starting points. The GMI aerosol model will be tested, evaluated against observations, and then applied to assessment of the effects of aircraft sulfur emissions as needed by the NASA Subsonic Assessment in 2001. The work includes the following tasks: 1. Implementation of the sulfur cycle within GMI, including sources, sinks, and aqueous conversion of sulfur. Aerosol modules will be added as they are developed and the GMI schedule permits. 2. Addition of aerosol types other than sulfate particles, including dust, soot, organic carbon, and black carbon. 3. Development of new and more efficient parameterizations for treating sulfate aerosol nucleation, condensation, and coagulation among different particle sizes and types.
Community models for wildlife impact assessment: a review of concepts and approaches
Schroeder, Richard L.
1987-01-01
The first two sections of this paper are concerned with defining and bounding communities, and describing those attributes of the community that are quantifiable and suitable for wildlife impact assessment purposes. Prior to the development or use of a community model, it is important to have a clear understanding of the concept of a community and a knowledge of the types of community attributes that can serve as outputs for the development of models. Clearly defined, unambiguous model outputs are essential for three reasons: (1) to ensure that the measured community attributes relate to the wildlife resource objectives of the study; (2) to allow testing of the outputs in experimental studies, to determine accuracy, and to allow for improvements based on such testing; and (3) to enable others to clearly understand the community attribute that has been measured. The third section of this paper describes input variables that may be used to predict various community attributes. These input variables do not include direct measures of wildlife populations. Most impact assessments involve projects that result in drastic changes in habitat, such as changes in land use, vegetation, or available area. Therefore, the model input variables described in this section deal primarily with habitat-related features. Several existing community models are described in the fourth section of this paper. A general description of each model is provided, including the nature of the input variables and the model output. The logic and assumptions of each model are discussed, along with data requirements needed to use the model. The fifth section provides guidance on the selection and development of community models. Identification of the community attribute that is of concern will determine the type of model most suitable for a particular application.
This section provides guidelines on selecting an existing model, as well as a discussion of the major steps to be followed in modifying an existing model or developing a new model. Considerations associated with the use of community models with the Habitat Evaluation Procedures are also discussed. The final section of the paper summarizes major findings of interest to field biologists and provides recommendations concerning the implementation of selected concepts in wildlife community analyses.
Sources of Uncertainty in Predicting Land Surface Fluxes Using Diverse Data and Models
NASA Technical Reports Server (NTRS)
Dungan, Jennifer L.; Wang, Weile; Michaelis, Andrew; Votava, Petr; Nemani, Ramakrishna
2010-01-01
In the domain of predicting land surface fluxes, models are used to bring data from large observation networks and satellite remote sensing together to make predictions about present and future states of the Earth. Characterizing the uncertainty about such predictions is a complex process and one that is not yet fully understood. Uncertainty exists about initialization, measurement and interpolation of input variables; model parameters; model structure; and mixed spatial and temporal supports. Multiple models or structures often exist to describe the same processes. Uncertainty about structure is currently addressed by running an ensemble of different models and examining the distribution of model outputs. To illustrate structural uncertainty, a multi-model ensemble experiment we have been conducting using the Terrestrial Observation and Prediction System (TOPS) will be discussed. TOPS uses public versions of process-based ecosystem models that use satellite-derived inputs along with surface climate data and land surface characterization to produce predictions of ecosystem fluxes including gross and net primary production and net ecosystem exchange. Using the TOPS framework, we have explored the uncertainty arising from the application of models with different assumptions, structures, parameters, and variable definitions. With a small number of models, this only begins to capture the range of possible spatial fields of ecosystem fluxes. Few attempts have been made to systematically address the components of uncertainty in such a framework. We discuss the characterization of uncertainty for this approach including both quantifiable and poorly known aspects.
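A common way to quantify the structural component of such uncertainty is the per-cell spread of the ensemble of model outputs; the GPP maps below are invented illustrations, not TOPS results:

```python
import statistics

def ensemble_stats(model_outputs):
    """Per-cell mean and sample standard deviation across an ensemble of
    model output maps, each a flat list over the same grid."""
    return [(statistics.mean(vals), statistics.stdev(vals))
            for vals in zip(*model_outputs)]

# Three hypothetical GPP maps (gC m-2 day-1) on a four-cell grid.
gpp_maps = [
    [5.1, 3.2, 0.8, 6.0],
    [4.7, 3.9, 1.1, 5.2],
    [5.6, 2.8, 0.7, 6.9],
]
for mean, spread in ensemble_stats(gpp_maps):
    print(f"ensemble mean {mean:.2f}, structural spread {spread:.2f}")
```

The spread map locates where the models' differing assumptions matter most, though, as the abstract notes, a small ensemble only begins to capture the true range of possible flux fields.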
Shippee, Nathan D; Shah, Nilay D; May, Carl R; Mair, Frances S; Montori, Victor M
2012-10-01
To design a functional, patient-centered model of patient complexity with practical applicability to analytic design and clinical practice. Existing literature on patient complexity has mainly identified its components descriptively and in isolation, lacking clarity as to their combined functions in disrupting care or as to how complexity changes over time. The authors developed a cumulative complexity model, which integrates existing literature and emphasizes how clinical and social factors accumulate and interact to complicate patient care. A narrative literature review is used to explicate the model. The model emphasizes a core, patient-level mechanism whereby complicating factors impact care and outcomes: the balance between patient workload of demands and patient capacity to address demands. Workload encompasses the demands on the patient's time and energy, including demands of treatment, self-care, and life in general. Capacity concerns the ability to handle work (e.g., functional morbidity, financial/social resources, literacy). Workload-capacity imbalances comprise the mechanism driving patient complexity. Treatment and illness burdens serve as feedback loops, linking negative outcomes to further imbalances, such that complexity may accumulate over time. With its components largely supported by existing literature, the model has implications for analytic design, clinical epidemiology, and clinical practice. Copyright © 2012 Elsevier Inc. All rights reserved.
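A toy iteration of the workload-capacity mechanism shows how the feedback loops can make complexity accumulate; the coefficients and starting values are arbitrary illustrations of the concept, not clinical quantities:

```python
def accumulate(workload, capacity, years=5, burden=0.3):
    """When demands exceed capacity, outcomes worsen; treatment and
    illness burden feed back as extra workload and reduced capacity."""
    history = []
    for _ in range(years):
        imbalance = max(0.0, workload - capacity)
        workload += burden * imbalance          # added treatment/self-care demands
        capacity -= 0.5 * burden * imbalance    # e.g., functional decline
        history.append((round(workload, 2), round(capacity, 2)))
    return history

print(accumulate(workload=6.0, capacity=5.0))   # imbalance compounds year on year
print(accumulate(workload=4.0, capacity=5.0))   # balanced patient stays stable
```

The qualitative point is the asymmetry: once workload exceeds capacity, the feedback loops widen the gap, whereas a patient who starts in balance stays there.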
Yobbi, D.K.
2000-01-01
A nonlinear least-squares regression technique for estimation of ground-water flow model parameters was applied to an existing model of the regional aquifer system underlying west-central Florida. The regression technique minimizes the differences between measured and simulated water levels. Regression statistics, including parameter sensitivities and correlations, were calculated for reported parameter values in the existing model. Optimal parameter values for selected hydrologic variables of interest are estimated by nonlinear regression. Optimal parameter estimates range from about 140 times greater than to about 0.01 times the reported values. Independently estimating all parameters by nonlinear regression was impossible, given the existing zonation structure and number of observations, because of parameter insensitivity and correlation. Although the model yields parameter values similar to those estimated by other methods and reproduces the measured water levels reasonably accurately, a simpler parameter structure should be considered. Some possible ways of improving model calibration are to: (1) modify the defined parameter-zonation structure by omitting and/or combining parameters to be estimated; (2) carefully eliminate observation data based on evidence that they are likely to be biased; (3) collect additional water-level data; (4) assign values to insensitive parameters; and (5) estimate the most sensitive parameters first, then, using the optimized values for these parameters, estimate the entire data set.
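The core idea, minimizing squared differences between measured and simulated water levels using parameter sensitivities, can be sketched with a damped Gauss-Newton iteration on a hypothetical two-parameter head model; the exponential `heads` function and its "true" parameters are invented for illustration and have nothing to do with the Florida model:

```python
import math

def sse(model, p, xs, obs):
    """Sum of squared residuals between observations and simulation."""
    return sum((o - s) ** 2 for o, s in zip(obs, model(p, xs)))

def gauss_newton_2p(model, p0, xs, obs, n_iter=50, h=1e-7):
    """Damped Gauss-Newton for a two-parameter model: finite-difference
    sensitivities, 2x2 normal equations, and step halving so the sum of
    squared residuals never increases."""
    p = list(p0)
    for _ in range(n_iter):
        sim = model(p, xs)
        r = [o - s for o, s in zip(obs, sim)]
        cols = []
        for j in range(2):                       # sensitivity columns
            pj = list(p)
            pj[j] += h
            cols.append([(b - a) / h for a, b in zip(sim, model(pj, xs))])
        a11 = sum(c * c for c in cols[0])
        a12 = sum(c * d for c, d in zip(cols[0], cols[1]))
        a22 = sum(d * d for d in cols[1])
        b1 = sum(c * ri for c, ri in zip(cols[0], r))
        b2 = sum(d * ri for d, ri in zip(cols[1], r))
        det = a11 * a22 - a12 * a12
        if abs(det) < 1e-30:                     # insensitive/correlated
            break
        dp = [(a22 * b1 - a12 * b2) / det, (a11 * b2 - a12 * b1) / det]
        base, step = sse(model, p, xs, obs), 1.0
        while step > 1e-6:                       # step halving
            trial = [p[0] + step * dp[0], p[1] + step * dp[1]]
            if sse(model, trial, xs, obs) < base:
                p = trial
                break
            step /= 2.0
        else:
            break                                # no improving step found
    return p

def heads(p, xs):
    """Hypothetical head model h(x) = a * exp(-b * x)."""
    return [p[0] * math.exp(-p[1] * x) for x in xs]

xs = [0.0, 1.0, 2.0, 3.0, 4.0]
observed = [2.0 * math.exp(-0.5 * x) for x in xs]   # 'measured' levels
fit = gauss_newton_2p(heads, [1.0, 1.0], xs, observed)
print(fit)   # close to the generating values a = 2.0, b = 0.5
```

The near-singular `det` check is the toy analogue of the insensitivity and correlation problem the abstract describes: when sensitivities are tiny or proportional, the normal equations cannot resolve the parameters independently.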
Archetype modeling methodology.
Moner, David; Maldonado, José Alberto; Robles, Montserrat
2018-03-01
Clinical Information Models (CIMs) expressed as archetypes play an essential role in the design and development of current Electronic Health Record (EHR) information structures. Although many experiences of using archetypes have been reported in the literature, a comprehensive and formal methodology for archetype modeling does not exist. Having a modeling methodology is essential to develop quality archetypes, in order to guide the development of EHR systems and to allow the semantic interoperability of health data. In this work, an archetype modeling methodology is proposed. This paper describes its phases, the inputs and outputs of each phase, and the participants and tools involved. It also includes a description of the possible strategies to organize the modeling process. The proposed methodology is inspired by existing best practices of CIM, software, and ontology development. The methodology has been applied and evaluated in regional and national EHR projects. The application of the methodology provided useful feedback and improvements, and confirmed its advantages. The conclusion of this work is that having a formal methodology for archetype development facilitates the definition and adoption of interoperable archetypes, improves their quality, and facilitates their reuse among different information systems and EHR projects. Moreover, the proposed methodology can also be a reference for CIM development using any other formalism. Copyright © 2018 Elsevier Inc. All rights reserved.
Containment Sodium Chemistry Models in MELCOR.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Louie, David; Humphries, Larry L.; Denman, Matthew R
To meet regulatory needs for sodium fast reactors’ future development, including licensing requirements, Sandia National Laboratories is modernizing MELCOR, a severe accident analysis computer code developed for the U.S. Nuclear Regulatory Commission (NRC). Specifically, Sandia is modernizing MELCOR to include the capability to model sodium reactors. However, Sandia’s modernization effort primarily focuses on the containment response aspects of sodium reactor accidents. Sandia began modernizing MELCOR in 2013 to allow a sodium coolant, rather than the water of conventional light water reactors. In the past three years, Sandia has been implementing the sodium chemistry containment models from CONTAIN-LMR, a legacy NRC code, into MELCOR. These chemistry models include spray fire, pool fire, and atmosphere chemistry models. Only the first two chemistry models have been implemented so far, though the intent is to implement all of these models in MELCOR. A new package called “NAC” has been created to manage the sodium chemistry models more efficiently. In 2017 Sandia began validating the implemented models in MELCOR by simulating available experiments. The CONTAIN-LMR sodium models include sodium atmosphere chemistry and sodium-concrete interaction models. This paper presents sodium property models, the implemented models, implementation issues, and a path toward validation against existing experimental data.
Modeling the Restraint of Liquid Jets by Surface Tension in Microgravity
NASA Technical Reports Server (NTRS)
Chato, David J.; Jacqmin, David A.
2001-01-01
An axisymmetric phase field model is developed and used to model surface tension forces on liquid jets in microgravity. The previous work in this area is reviewed and a baseline drop tower experiment selected for model comparison. A mathematical model is developed which includes a free surface, a symmetric centerline, and wall boundaries with given contact angles. The model is solved numerically with a compact fourth-order stencil on an equally spaced axisymmetric grid. After grid convergence studies, a grid is selected and all drop tower tests modeled. Agreement was assessed by comparing predicted and measured free surface rise. Trendwise agreement is good, but agreement in magnitude is only fair. Sources of disagreement are suspected to be the lack of a turbulence model and the existence of slosh baffles in the experiment which were not included in the model.
NASA Technical Reports Server (NTRS)
Zwick, H.; Ward, V.; Beaudette, L.
1973-01-01
A critical evaluation of existing optical remote sensors for HCl vapor detection in solid propellant rocket plumes is presented. The P branch of the fundamental vibration-rotation band was selected as the most promising spectral feature to sense. A computation of transmittance for HCl vapor, an estimation of interferent spectra, the application of these spectra to computer modelled remote sensors, and a trade-off study for instrument recommendation are also included.
REVIEWS OF TOPICAL PROBLEMS: Physical aspects of cryobiology
NASA Astrophysics Data System (ADS)
Zhmakin, A. I.
2008-03-01
Physical phenomena during biological freezing and thawing processes at the molecular, cellular, tissue, and organ levels are examined. The basics of cryosurgery and cryopreservation of cells and tissues are presented. Existing cryobiological models, including numerical ones, are reviewed.
ADOT state-specific crash prediction models : an Arizona needs study.
DOT National Transportation Integrated Search
2016-12-01
The predictive method in the Highway Safety Manual (HSM) includes a safety performance function (SPF), crash modification factors (CMFs), and a local calibration factor (C), if available. Two alternatives exist for applying the HSM prediction method...
ERIC Educational Resources Information Center
Shipman, Jean; Homan, Michael
2003-01-01
Discusses how librarians in the new role of "informationist" can help doctors and researchers with medical information. Describes existing models of the informationist; potential benefits of working across professional boundaries outside the library; professional requirements; and unresolved issues for the new role, including potential…
Hopf-link topological nodal-loop semimetals
NASA Astrophysics Data System (ADS)
Zhou, Yao; Xiong, Feng; Wan, Xiangang; An, Jin
2018-04-01
We construct a generic two-band model which can describe topological semimetals with multiple closed nodal loops. All the existing multi-nodal-loop semimetals, including the nodal-net, nodal-chain, and Hopf-link states, can be examined within the same framework. Based on a two-nodal-loop model, the corresponding drumhead surface states for these topologically different bulk states are studied and compared with each other. The connection of our model with Hopf insulators is also discussed. Furthermore, to identify experimentally these topologically different semimetal states, especially to distinguish the Hopf-link from unlinked ones, we also investigate their Landau levels. It is found that the Hopf-link state can be characterized by the existence of a quadruply degenerate zero-energy Landau band, regardless of the direction of the magnetic field.
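A minimal single-loop cousin of such a two-band model (not the multi-loop Hamiltonian of the paper) makes the nodal-loop band touching easy to check numerically; the d-vector below is an assumed textbook form:

```python
import math

def bands(kx, ky, kz, m=1.0):
    """Band energies of H(k) = dx*sigma_x + dz*sigma_z with
    dx = kx^2 + ky^2 + kz^2 - m and dz = kz.  The two bands touch where
    both components vanish: the circle kx^2 + ky^2 = m in the kz = 0
    plane, i.e., a single nodal loop."""
    dx = kx * kx + ky * ky + kz * kz - m
    dz = kz
    e = math.hypot(dx, dz)
    return -e, e

theta = 0.37                      # any point on the loop
kx, ky = math.cos(theta), math.sin(theta)
lo, hi = bands(kx, ky, 0.0)
print(hi - lo)                    # gap closes on the loop
lo0, hi0 = bands(0.0, 0.0, 0.0)
print(hi0 - lo0)                  # gapped away from the loop
```

Multi-loop, chain, and Hopf-link states arise from richer choices of the two d-components; the degeneracy condition remains the simultaneous zero of both.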
Research on Computer Aided Innovation Model of Weapon Equipment Requirement Demonstration
NASA Astrophysics Data System (ADS)
Li, Yong; Guo, Qisheng; Wang, Rui; Li, Liang
Firstly, in order to overcome the shortcomings of using AD or TRIZ alone, and to solve the problems currently existing in weapon equipment requirement demonstration, the paper constructs a method system for weapon equipment requirement demonstration combining QFD, AD, TRIZ, and FA. Then, we construct a CAI model framework for weapon equipment requirement demonstration, which includes a requirement decomposition model, a requirement mapping model, and a requirement plan optimization model. Finally, we construct the computer-aided innovation model of weapon equipment requirement demonstration and develop CAI software for equipment requirement demonstration.
A toolbox and a record for scientific model development
NASA Technical Reports Server (NTRS)
Ellman, Thomas
1994-01-01
Scientific computation can benefit from software tools that facilitate construction of computational models, control the application of models, and aid in revising models to handle new situations. Existing environments for scientific programming provide only limited means of handling these tasks. This paper describes a two-pronged approach for handling them: (1) designing a 'Model Development Toolbox' that includes a basic set of model-constructing operations; and (2) designing a 'Model Development Record' that is automatically generated during model construction. The record is subsequently exploited by tools that control the application of scientific models and revise models to handle new situations. Our two-pronged approach is motivated by our belief that the model development toolbox and record should be highly interdependent. In particular, a suitable model development record can be constructed only when models are developed using a well-defined set of operations. We expect this research to facilitate rapid development of new scientific computational models, to help ensure appropriate use of such models, and to facilitate sharing of such models among working computational scientists. We are testing this approach by extending SIGMA, an existing knowledge-based scientific software design tool.
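The interdependence of toolbox and record, where a record can be generated automatically only because models are built from a well-defined set of operations, can be sketched with a builder that logs every operation; the class and method names are hypothetical, not SIGMA's API:

```python
class ModelBuilder:
    """Toy 'toolbox' whose operations automatically build a development
    record that downstream tools could inspect or replay."""

    def __init__(self):
        self.parts = {}
        self.record = []          # the automatically generated record

    def add_component(self, name, expr):
        self.parts[name] = expr
        self.record.append(("add_component", name, expr))

    def replace_component(self, name, expr):
        old = self.parts[name]
        self.parts[name] = expr
        self.record.append(("replace_component", name, old, expr))

b = ModelBuilder()
b.add_component("drag", "0.5*rho*Cd*A*v**2")      # quadratic drag term
b.replace_component("drag", "c*v")                # revise to linear drag
print(b.record)
```

Because every change passes through a known operation, the record captures not just the final model but the construction history, which is exactly what revision and application-control tools would need.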
A scoping review of indirect comparison methods and applications using individual patient data.
Veroniki, Areti Angeliki; Straus, Sharon E; Soobiah, Charlene; Elliott, Meghan J; Tricco, Andrea C
2016-04-27
Several indirect comparison methods, including network meta-analyses (NMAs), using individual patient data (IPD) have been developed to synthesize evidence from a network of trials. Although IPD indirect comparisons are published with increasing frequency in health care literature, there is no guidance on selecting the appropriate methodology and on reporting the methods and results. In this paper we examine the methods and reporting of indirect comparison methods using IPD. We searched MEDLINE, Embase, the Cochrane Library, and CINAHL from inception until October 2014. We included published and unpublished studies reporting a method, application, or review of indirect comparisons using IPD and at least three interventions. We identified 37 papers, including a total of 33 empirical networks. Of these, only 9 (27 %) IPD-NMAs reported the existence of a study protocol, whereas 3 (9 %) studies mentioned that protocols existed without providing a reference. The 33 empirical networks included 24 (73 %) IPD-NMAs and 9 (27 %) matching adjusted indirect comparisons (MAICs). Of the 21 (64 %) networks with at least one closed loop, 19 (90 %) were IPD-NMAs, 13 (68 %) of which evaluated the prerequisite consistency assumption, and only 5 (38 %) of the 13 IPD-NMAs used statistical approaches. The median number of trials included per network was 10 (IQR 4-19) (IPD-NMA: 15 [IQR 8-20]; MAIC: 2 [IQR 3-5]), and the median number of IPD trials included in a network was 3 (IQR 1-9) (IPD-NMA: 6 [IQR 2-11]; MAIC: 2 [IQR 1-2]). Half of the networks (17; 52 %) applied Bayesian hierarchical models (14 one-stage, 1 two-stage, 1 used IPD as an informative prior, 1 unclear-stage), including either IPD alone or with aggregated data (AD). Models for dichotomous and continuous outcomes were available (IPD alone or combined with AD), as were models for time-to-event data (IPD combined with AD). 
One in three of the indirect comparison methods modeling IPD adjusted results from different trials to estimate effects as if they had come from the same randomized population. Key methodological and reporting elements (e.g., evaluation of consistency, existence of a study protocol) were often missing from the indirect comparison papers.
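For networks with closed loops, the consistency assumption evaluated in these reviews has a compact standard form (generic NMA notation, not the authors'): for any three treatments A, B and C, the direct and indirect relative effects must agree,

```latex
d^{\mathrm{dir}}_{BC} \;=\; d^{\mathrm{ind}}_{BC} \;=\; d_{AC} - d_{AB},
```

which is the relation that the statistical consistency checks in the reviewed IPD-NMAs test in each loop.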
NASA Astrophysics Data System (ADS)
Clark, Martyn; Essery, Richard
2017-04-01
When faced with the complex and interdisciplinary challenge of building process-based land models, different modelers make different decisions at different points in the model development process. These modeling decisions are generally based on several considerations, including fidelity (e.g., what approaches faithfully simulate observed processes), complexity (e.g., which processes should be represented explicitly), practicality (e.g., what is the computational cost of the model simulations; are there sufficient resources to implement the desired modeling concepts), and data availability (e.g., is there sufficient data to force and evaluate models). Consequently, the research community, comprising modelers of diverse background, experience, and modeling philosophy, has amassed a wide range of models, which differ in almost every aspect of their conceptualization and implementation. Model comparison studies have been undertaken to explore model differences, but have not been able to meaningfully attribute inter-model differences in predictive ability to individual model components because there are often too many structural and implementation differences among the different models considered. As a consequence, model comparison studies to date have provided limited insight into the causes of differences in model behavior, and model development has often relied on the inspiration and experience of individual modelers rather than on a systematic analysis of model shortcomings. This presentation will summarize the use of "multiple-hypothesis" modeling frameworks to understand differences in process-based snow models. Multiple-hypothesis frameworks define a master modeling template, and include a wide variety of process parameterizations and spatial configurations that are used in existing models.
Such frameworks provide the capability to decompose complex models into the individual decisions that are made as part of model development, and evaluate each decision in isolation. It is hence possible to attribute differences in system-scale model predictions to individual modeling decisions, providing scope to mimic the behavior of existing models, understand why models differ, characterize model uncertainty, and identify productive pathways to model improvement. Results will be presented applying multiple hypothesis frameworks to snow model comparison projects, including PILPS, SnowMIP, and the upcoming ESM-SnowMIP project.
Modeling evaporation from spent nuclear fuel storage pools: A diffusion approach
NASA Astrophysics Data System (ADS)
Hugo, Bruce Robert
Accurate prediction of evaporative losses from light water reactor nuclear power plant (NPP) spent fuel storage pools (SFPs) is important for activities ranging from sizing of water makeup systems during NPP design to predicting the time available to supply emergency makeup water following severe accidents. Existing correlations for predicting evaporation from water surfaces are optimized only for conditions typical of swimming pools. A new approach that models evaporation as a diffusion process has yielded an evaporation rate model that fits published high-temperature evaporation data and measurements from two SFPs better than other published evaporation correlations. Insights from treating evaporation as a diffusion process include correcting for the effects of air flow and solutes on evaporation rate. Accurate modeling of the effects of air flow on evaporation rate is required to explain the observed temperature data from the Fukushima Daiichi Unit 4 SFP during the 2011 loss-of-cooling event; the diffusion model of evaporation fits these data significantly better than existing evaporation models.
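As an illustration of the diffusion-style driving force described above, evaporation can be sketched as proportional to the vapor-pressure difference between the pool surface and the ambient air. The transfer coefficient `k` and the Magnus saturation-pressure formula below are stand-ins for illustration, not the thesis's fitted model:

```python
import math

def p_sat(t_c):
    """Saturation vapor pressure (Pa) via the Magnus approximation."""
    return 610.94 * math.exp(17.625 * t_c / (t_c + 243.04))

def evaporation_flux(t_surface_c, t_air_c, rel_humidity, k=2.0e-8):
    """Evaporative mass flux (kg/m^2/s) from a diffusion-style driving force.

    k is a hypothetical mass-transfer coefficient (kg/m^2/s/Pa); in a real
    model it would absorb air-flow and solute corrections.
    """
    dp = p_sat(t_surface_c) - rel_humidity * p_sat(t_air_c)
    return k * max(dp, 0.0)
```

In this form, a hotter pool surface raises the surface vapor pressure and hence the flux, while humid air over the pool suppresses it.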
Molenaar, Dylan; Tuerlinckx, Francis; van der Maas, Han L J
2015-01-01
A generalized linear modeling framework for the analysis of responses and response times is outlined. In this framework, referred to as bivariate generalized linear item response theory (B-GLIRT), separate generalized linear measurement models are specified for the responses and the response times, which are subsequently linked by cross-relations. The cross-relations can take various forms. Here, we focus on cross-relations with a linear or interaction term for ability tests, and cross-relations with a curvilinear term for personality tests. In addition, we discuss how popular existing models from the psychometric literature are special cases in the B-GLIRT framework depending on restrictions in the cross-relation. This allows us to compare existing models conceptually and empirically. We discuss various extensions of the traditional models motivated by practical problems. We also illustrate the applicability of our approach using various real data examples, including data on personality and cognitive ability.
Size effects in non-linear heat conduction with flux-limited behaviors
NASA Astrophysics Data System (ADS)
Li, Shu-Nan; Cao, Bing-Yang
2017-11-01
Size effects are discussed for several non-linear heat conduction models with flux-limited behaviors, including the phonon hydrodynamic, Lagrange multiplier, hierarchy moment, nonlinear phonon hydrodynamic, tempered diffusion, thermon gas and generalized nonlinear models. For the phonon hydrodynamic, Lagrange multiplier and tempered diffusion models, heat flux will not exist in problems with sufficiently small scale. The existence of heat flux needs the sizes of heat conduction larger than their corresponding critical sizes, which are determined by the physical properties and boundary temperatures. The critical sizes can be regarded as the theoretical limits of the applicable ranges for these non-linear heat conduction models with flux-limited behaviors. For sufficiently small scale heat conduction, the phonon hydrodynamic and Lagrange multiplier models can also predict the theoretical possibility of violating the second law and multiplicity. Comparisons are also made between these non-Fourier models and non-linear Fourier heat conduction in the type of fast diffusion, which can also predict flux-limited behaviors.
A Model of Consumer Decision Making in the Selection of a Long-Term Care Facility.
ERIC Educational Resources Information Center
Neugroschel, William J.; Notzon, Linda R.
Since nursing home placement is frequently the last choice for families of elderly people who need long-term care, little literature exists which delineates a model for consumer decision making in the selection of a specific long-term care facility. Critical issues include the following: (1) who actually makes the selection; (2) what other…
Scalable Database Design of End-Game Model with Decoupled Countermeasure and Threat Information
2017-11-01
by Decetria Akole and Michael Chen. Approved for public release; distribution is unlimited.
An improved canopy wind model for predicting wind adjustment factors and wildland fire behavior
W. J. Massman; J. M. Forthofer; M. A. Finney
2017-01-01
The ability to rapidly estimate wind speed beneath a forest canopy or near the ground surface in any vegetation is critical to practical wildland fire behavior models. The common metric of this wind speed is the "mid-flame" wind speed, UMF. However, the existing approach for estimating UMF has some significant shortcomings. These include the assumptions that...
Assessment of a Solar System Walk
ERIC Educational Resources Information Center
LoPresto, Michael C.; Murrell, Steven R.; Kirchner, Brian
2010-01-01
The idea of sending students and the general public on a walk through a scale model of the solar system in an attempt to instill an appreciation of the relative scales of the sizes of the objects compared to the immense distances between them is certainly not new. A good number of such models exist, including one on the National Mall in…
Cybernetic integration of experiments into the CVT system
NASA Technical Reports Server (NTRS)
Helvey, T. C.
1972-01-01
The research to develop a cybernetic model which is a static aggregate of the existing interaction in the CVT is reported. The experiments involving man considered necessary for cybernetic integration are listed. Topics discussed include: the modeling dynamic interactions for two competing systems; aspects of man-man integration in the CVT; and establishment of optimum number of research crew for the CVT.
A critique of the cross-lagged panel model.
Hamaker, Ellen L; Kuiper, Rebecca M; Grasman, Raoul P P P
2015-03-01
The cross-lagged panel model (CLPM) is believed by many to overcome the problems associated with the use of cross-lagged correlations as a way to study causal influences in longitudinal panel data. The current article, however, shows that if stability of constructs is to some extent of a trait-like, time-invariant nature, the autoregressive relationships of the CLPM fail to adequately account for this. As a result, the lagged parameters that are obtained with the CLPM do not represent the actual within-person relationships over time, and this may lead to erroneous conclusions regarding the presence, predominance, and sign of causal influences. In this article we present an alternative model that separates the within-person process from stable between-person differences through the inclusion of random intercepts, and we discuss how this model is related to existing structural equation models that include cross-lagged relationships. We derive the analytical relationship between the cross-lagged parameters from the CLPM and the alternative model, and use simulations to demonstrate the spurious results that may arise when using the CLPM to analyze data that include stable, trait-like individual differences. We also present a modeling strategy to avoid this pitfall and illustrate this using an empirical data set. The implications for both existing and future cross-lagged panel research are discussed. (c) 2015 APA, all rights reserved.
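The contrast between the two models can be written out; the notation below is a standard paraphrase of random-intercept CLPM formulations, not the authors' exact specification. The classic CLPM places the lagged structure directly on observed scores, whereas the random-intercept variant first splits each score into a stable between-person part and a temporal deviation:

```latex
% Classic CLPM: lagged structure on observed scores
x_{i,t} = \mu_t + \beta_x\, x_{i,t-1} + \gamma_x\, y_{i,t-1} + u_{i,t}

% Random-intercept CLPM: stable trait part \kappa_i separated out first
x_{i,t} = \mu_t + \kappa_i + x^{*}_{i,t}, \qquad
x^{*}_{i,t} = \beta_x\, x^{*}_{i,t-1} + \gamma_x\, y^{*}_{i,t-1} + u_{i,t}
```

Only in the second form do the lagged coefficients operate on within-person deviations, which is why they can represent within-person relationships over time.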
2017-03-20
Keywords: computation, prime implicates, Boolean abstraction, real-time embedded software, software synthesis, correct-by-construction software design, model... Related publication: "Types for time-dependent data-flow networks", J.-P. Talpin, P. Jouvelot, S. Shukla, ACM-IEEE Conference on Methods and Models for System Design.
The Impact of Prior Deployment Experience on Civilian Employment After Military Service
2013-03-21
...covariates mentioned. Given the exploratory nature of this study, all defined variables were included. Model diagnostic tests were conducted and we assessed model fit using the Hosmer-Lemeshow goodness-of-fit test. To identify the existence of collinearity, we examined all variance inflation factors... separation, and reason for separation and service branch were tested. Both interactions were significant at p < .10. Three models were built to examine...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burrows, Susannah M.; Ogunro, O.; Frossard, Amanda
2014-12-19
The presence of a large fraction of organic matter in primary sea spray aerosol (SSA) can strongly affect its cloud condensation nuclei activity and interactions with marine clouds. Global climate models require new parameterizations of the SSA composition in order to improve the representation of these processes. Existing proposals for such a parameterization use remotely-sensed chlorophyll-a concentrations as a proxy for the biogenic contribution to the aerosol. However, both observations and theoretical considerations suggest that existing relationships with chlorophyll-a, derived from observations at only a few locations, may not be representative for all ocean regions. We introduce a novel framework for parameterizing the fractionation of marine organic matter into SSA based on a competitive Langmuir adsorption equilibrium at bubble surfaces. Marine organic matter is partitioned into classes with differing molecular weights, surface excesses, and Langmuir adsorption parameters. The classes include a lipid-like mixture associated with labile dissolved organic carbon (DOC), a polysaccharide-like mixture associated primarily with semi-labile DOC, a protein-like mixture with concentrations intermediate between lipids and polysaccharides, a processed mixture associated with recalcitrant surface DOC, and a deep abyssal humic-like mixture. Box model calculations have been performed for several cases of organic adsorption to illustrate the underlying concepts. We then apply the framework to output from a global marine biogeochemistry model, by partitioning total dissolved organic carbon into several classes of macromolecule. Each class is represented by model compounds with physical and chemical properties based on existing laboratory data. This allows us to globally map the predicted organic mass fraction of the nascent submicron sea spray aerosol.
Predicted relationships between chlorophyll-a and organic fraction are similar to existing empirical parameterizations, but can vary between biologically productive and non-productive regions, and seasonally within a given region. Major uncertainties include the bubble film thickness at bursting and the variability of organic surfactant activity in the ocean, which is poorly constrained. In addition, marine colloids and cooperative adsorption of polysaccharides may make important contributions to the aerosol, but are not included here. This organic fractionation framework is an initial step towards a closer linking of ocean biogeochemistry and aerosol chemical composition in Earth system models. Future work should focus on improving constraints on model parameters through new laboratory experiments or through empirical fitting to observed relationships in the real ocean and atmosphere, as well as on atmospheric implications of the variable composition of organic matter in sea spray.
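The competitive Langmuir equilibrium underlying the framework has a standard closed form; the symbols below follow the usual isotherm convention rather than the paper's notation. The fractional surface coverage of organic class i, with Langmuir coefficient \alpha_i and bulk concentration C_i, competing with all other classes j at the bubble surface, is

```latex
\theta_i \;=\; \frac{\alpha_i\, C_i}{\,1 + \sum_j \alpha_j\, C_j\,}
```

so that the organic mass fraction of the nascent aerosol follows from these coverages together with each class's molecular weight and the assumed bubble film thickness.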
Abdominal surgery process modeling framework for simulation using spreadsheets.
Boshkoska, Biljana Mileva; Damij, Talib; Jelenc, Franc; Damij, Nadja
2015-08-01
We provide a continuation of the existing Activity Table Modeling methodology with a modular spreadsheet simulation. The simulation model developed comprises 28 modeling elements for the abdominal surgery cycle process. The simulation of a two-week patient flow in an abdominal clinic with 75 beds demonstrates the applicability of the methodology. The simulation does not include macros, so programming experience is not essential for replicating or upgrading the model. Unlike existing methods, the proposed solution employs a modular approach to modeling the activities that ensures better readability, the possibility of easily upgrading the model with other activities, and easy extension and connectivity with other similar models. We propose a first-in-first-served approach for simulating the servicing of multiple patients. The uncertain time duration of the activities is modeled using the function "rand()". The patients' movement from one activity to the next is tracked with nested "if()" functions, allowing easy re-creation of the process without the need for complex programming. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
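The first-in-first-served logic and rand()-style durations described above can be mimicked outside a spreadsheet. The sketch below uses hypothetical activity names and uniform durations for illustration, not the paper's 28 modeling elements:

```python
import random

# Hypothetical activities with (min, max) durations in minutes, mimicking
# spreadsheet cells that draw times with rand().
ACTIVITIES = [("admission", (10, 30)), ("surgery", (60, 180)), ("recovery", (30, 120))]

def simulate(n_patients, seed=0):
    """First-in-first-served flow: each activity serves one patient at a time."""
    rng = random.Random(seed)
    free_at = {name: 0.0 for name, _ in ACTIVITIES}  # when each activity is next free
    finish_times = []
    for _ in range(n_patients):
        t = 0.0
        for name, (lo, hi) in ACTIVITIES:
            start = max(t, free_at[name])        # wait if the activity is busy
            t = start + rng.uniform(lo, hi)      # uncertain duration, like rand()
            free_at[name] = t
        finish_times.append(t)
    return finish_times
```

Because every activity serves patients in arrival order, the completion times come out in the same order the patients entered, which is the first-in-first-served property the paper relies on.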
Adaptive Modeling of the International Space Station Electrical Power System
NASA Technical Reports Server (NTRS)
Thomas, Justin Ray
2007-01-01
Software simulations provide NASA engineers the ability to experiment with spacecraft systems in a computer-imitated environment. Engineers currently develop software models that encapsulate spacecraft system behavior. These models can be inaccurate due to invalid assumptions, erroneous operation, or system evolution. Increasing accuracy requires manual calibration and domain-specific knowledge. This thesis presents a method for automatically learning system models without any assumptions regarding system behavior. Data stream mining techniques are applied to learn models for critical portions of the International Space Station (ISS) Electrical Power System (EPS). We also explore a knowledge fusion approach that uses traditional engineered EPS models to supplement the learned models. We observed that these engineered EPS models provide useful background knowledge to reduce predictive error spikes when confronted with making predictions in situations that are quite different from the training scenarios used when learning the model. Evaluations using ISS sensor data and existing EPS models demonstrate the success of the adaptive approach. Our experimental results show that adaptive modeling provides reductions in model error anywhere from 80% to 96% over these existing models. Final discussions include impending use of adaptive modeling technology for ISS mission operations and the need for adaptive modeling in future NASA lunar and Martian exploration.
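The thesis's specific data-stream mining algorithms are not given here; the following is a minimal sketch of the adaptive idea, pairing a least-mean-squares online learner with a simple averaging form of knowledge fusion against an engineered model. Both the fusion rule and the parameter values are illustrative assumptions:

```python
class AdaptiveModel:
    """Online least-mean-squares learner, optionally fused with an engineered model."""

    def __init__(self, n_features, engineered=None, lr=0.1):
        self.w = [0.0] * n_features
        self.b = 0.0
        self.engineered = engineered  # optional physics-based fallback, f(x) -> y
        self.lr = lr

    def predict(self, x):
        learned = sum(wi * xi for wi, xi in zip(self.w, x)) + self.b
        if self.engineered is None:
            return learned
        # crude knowledge fusion: average learned and engineered predictions
        return 0.5 * (learned + self.engineered(x))

    def update(self, x, y):
        """Adapt to one new telemetry sample; returns the absolute error."""
        err = y - self.predict(x)
        self.w = [wi + self.lr * err * xi for wi, xi in zip(self.w, x)]
        self.b += self.lr * err
        return abs(err)
```

Feeding the learner a stream of sensor samples drives its error down without any prior assumptions about system behavior, which is the core of the adaptive approach the thesis evaluates.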
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wnek, W.J.; Ramshaw, J.D.; Trapp, J.A.
1975-11-01
A mathematical model and a numerical solution scheme for thermal-hydraulic analysis of fuel rod arrays are given. The model alleviates the two major deficiencies of existing rod array analysis models by providing a correct transverse momentum equation and the capability of handling reversing and circulatory flows. Possible applications of the model include steady state and transient subchannel calculations as well as analysis of flows in heat exchangers, other engineering equipment, and porous media. (auth)
Nonconvex Model of Material Growth: Mathematical Theory
NASA Astrophysics Data System (ADS)
Ganghoffer, J. F.; Plotnikov, P. I.; Sokolowski, J.
2018-06-01
The model of volumetric material growth is introduced in the framework of finite elasticity. The new results obtained for the model are presented with complete proofs. The state variables include the deformations, temperature, and the growth factor matrix function. The existence of global-in-time solutions for the quasistatic deformation boundary value problem, coupled with the energy balance and the evolution of the growth factor, is shown. The mathematical results can be applied to a wide class of growth models in mechanics and biology.
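A common formal setting for volumetric growth models of this kind, stated here only as background rather than as the paper's exact formulation, is the multiplicative split of the deformation gradient into an elastic part and a growth part:

```latex
F = F_e\, F_g, \qquad J_g = \det F_g > 0,
```

where the growth factor (here the matrix F_g) evolves in time according to its own evolution law, coupled to the elastic deformation and the energy balance.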
Liu, Ye; Gill, Elisabeth; Shery Huang, Yan Yan
2017-01-01
A plethora of 3D and microfluidics-based culture models have been demonstrated in the recent years with the ultimate aim to facilitate predictive in vitro models for pharmaceutical development. This article summarizes to date the progress in the microfluidics-based tissue culture models, including organ-on-a-chip and vasculature-on-a-chip. Specific focus is placed on addressing the question of what kinds of 3D culture and system complexities are deemed desirable by the biological and biomedical community. This question is addressed through analysis of a research survey to evaluate the potential use of microfluidic cell culture models among the end users. Our results showed a willingness to adopt 3D culture technology among biomedical researchers, although a significant gap still exists between the desired systems and existing 3D culture options. With these results, key challenges and future directions are highlighted. PMID:28670465
The Dark Matter Crisis: Falsification of the Current Standard Model of Cosmology
NASA Astrophysics Data System (ADS)
Kroupa, P.
2012-06-01
The current standard model of cosmology (SMoC) requires The Dual Dwarf Galaxy Theorem to be true, according to which two types of dwarf galaxies must exist: primordial dark-matter (DM) dominated (type A) dwarf galaxies, and tidal-dwarf and ram-pressure-dwarf (type B) galaxies void of DM. Type A dwarfs surround the host approximately spherically, while type B dwarfs are typically correlated in phase-space. Type B dwarfs must exist in any cosmological theory in which galaxies interact. Only one type of dwarf galaxy is observed to exist on the baryonic Tully-Fisher plot and in the radius-mass plane. The Milky Way satellite system forms a vast phase-space-correlated structure that includes globular clusters and stellar and gaseous streams. Other galaxies also have phase-space correlated satellite systems. Therefore, The Dual Dwarf Galaxy Theorem is falsified by observation, and dynamically relevant cold or warm DM cannot exist. It is shown that the SMoC is incompatible with a large set of other extragalactic observations. Other theoretical solutions to cosmological observations exist. In particular, the empirical mass-discrepancy-acceleration correlation alone constitutes convincing evidence that galactic-scale dynamics must be Milgromian. Major problems with inflationary big bang cosmologies remain unresolved.
Choice Rules and Accumulator Networks
2015-01-01
This article presents a preference accumulation model that can be used to implement a number of different multi-attribute heuristic choice rules, including the lexicographic rule, the majority of confirming dimensions (tallying) rule and the equal weights rule. The proposed model differs from existing accumulators in terms of attribute representation: Leakage and competition, typically applied only to preference accumulation, are also assumed to be involved in processing attribute values. This allows the model to perform a range of sophisticated attribute-wise comparisons, including comparisons that compute relative rank. The ability of a preference accumulation model composed of leaky competitive networks to mimic symbolic models of heuristic choice suggests that these 2 approaches are not incompatible, and that a unitary cognitive model of preferential choice, based on insights from both these approaches, may be feasible. PMID:28670592
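The heuristic rules that the accumulator network mimics can be stated compactly; the sketch below implements three of them directly on attribute vectors, as symbolic versions of the rules rather than the leaky competitive network itself:

```python
def tallying(a, b):
    """Majority-of-confirming-dimensions: count attributes on which each option wins."""
    wins_a = sum(x > y for x, y in zip(a, b))
    wins_b = sum(y > x for x, y in zip(a, b))
    return "a" if wins_a > wins_b else "b" if wins_b > wins_a else "tie"

def equal_weights(a, b):
    """Equal-weights rule: compare unweighted sums of attribute values."""
    return "a" if sum(a) > sum(b) else "b" if sum(b) > sum(a) else "tie"

def lexicographic(a, b, importance):
    """Inspect attributes in importance order; the first discriminating one decides."""
    for i in importance:
        if a[i] != b[i]:
            return "a" if a[i] > b[i] else "b"
    return "tie"
```

The article's point is that a single accumulator network with leakage and competition in attribute processing can reproduce each of these qualitatively different decision policies.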
ENKI - An Open Source environmental modelling platform
NASA Astrophysics Data System (ADS)
Kolberg, S.; Bruland, O.
2012-04-01
The ENKI software framework for implementing spatio-temporal models is now released under the LGPL license. Originally developed for evaluation and comparison of distributed hydrological model compositions, ENKI can be used for simulating any time-evolving process over a spatial domain. The core approach is to connect a set of user-specified subroutines into a complete simulation model, and provide all administrative services needed to calibrate and run that model. This includes functionality for geographical region setup, all file I/O, calibration and uncertainty estimation, etc. The approach makes it easy for students, researchers and other model developers to implement, exchange, and test single routines and various model compositions in a fixed framework. The open-source license and modular design of ENKI will also facilitate rapid dissemination of new methods to institutions engaged in operational water resource management. ENKI uses a plug-in structure to invoke separately compiled subroutines built as dynamic-link libraries (DLLs). The source code of an ENKI routine is highly compact, with a narrow framework-routine interface allowing the main program to recognise the number, types, and names of the routine's variables. The framework then exposes these variables to the user within the proper context, ensuring that distributed maps coincide spatially, that time series exist for input variables, that states are initialised, that GIS data sets exist for static map data, that manually or automatically calibrated values exist for parameters, etc. By using function calls and memory data structures to invoke routines and facilitate information flow, ENKI provides good performance. For a typical distributed hydrological model setup in a spatial domain of 25000 grid cells, 3-4 time steps simulated per second should be expected. Future adaptation to parallel processing may further increase this speed.
New modifications to ENKI include a full separation of API and user interface, making it possible to run ENKI from GIS programs and other software environments. ENKI currently compiles under Windows and Visual Studio only, but ambitions exist to remove the platform and compiler dependencies.
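ENKI itself is C++ with routines compiled as DLLs; as a language-neutral sketch of the plug-in idea (routines declare their variables, and the framework discovers them and wires them into a shared state), with a hypothetical degree-day melt routine standing in for a real module:

```python
class Routine:
    """A model routine declares its variables; the framework wires them up."""
    inputs, outputs, params = (), (), ()

    def step(self, state):
        raise NotImplementedError

class DegreeDayMelt(Routine):
    # Illustrative snowmelt routine, not an actual ENKI module.
    inputs = ("temp",)
    outputs = ("melt",)
    params = ("ddf",)

    def step(self, state):
        state["melt"] = max(0.0, state["temp"]) * state["ddf"]

def run_model(routines, state, n_steps, forcing):
    """Minimal framework loop: apply forcing, then call each routine in order."""
    for t in range(n_steps):
        state.update(forcing[t])
        for r in routines:
            r.step(state)  # framework guarantees declared inputs exist before the call
    return state
```

The narrow interface (declared `inputs`, `outputs`, `params`) is what lets a framework of this kind check that every input variable has a time series and every parameter a calibrated value before running.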
NASA Astrophysics Data System (ADS)
Bloch, Jean-Jacques
The Arabsat and Eutelsat systems are described. Arabsat belongs to an organization which includes 20 countries of the Arab League. Eutelsat belongs to the European telecommunication system which includes 29 countries, and is based on the Intelsat model. The current use of their payload is reviewed and compared with their respective planning stage predictions. From this perspective, some lessons are drawn which could be profitable for emerging Pacific basin regional networks, now in the planning stage. In the Pacific basin several private and governmental regional satellite networks, either newly existing or in the design phase, are vying to deliver services to potential customers. These services include national television, commercial television, VSAT (Very Small Aperture Terminal) networks, and regional or domestic telephony.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-25
... includes the operation of 355 existing wind turbine generators during the first three phases of the project... of up to 448 wind turbines. The project would also include all associated collector lines, access... areas and concrete batch plants, if applicable. Up to five different models of wind turbines may be in...
NASA Astrophysics Data System (ADS)
Masson, V.; Le Moigne, P.; Martin, E.; Faroux, S.; Alias, A.; Alkama, R.; Belamari, S.; Barbu, A.; Boone, A.; Bouyssel, F.; Brousseau, P.; Brun, E.; Calvet, J.-C.; Carrer, D.; Decharme, B.; Delire, C.; Donier, S.; Essaouini, K.; Gibelin, A.-L.; Giordani, H.; Habets, F.; Jidane, M.; Kerdraon, G.; Kourzeneva, E.; Lafaysse, M.; Lafont, S.; Lebeaupin Brossier, C.; Lemonsu, A.; Mahfouf, J.-F.; Marguinaud, P.; Mokhtari, M.; Morin, S.; Pigeon, G.; Salgado, R.; Seity, Y.; Taillefer, F.; Tanguy, G.; Tulet, P.; Vincendon, B.; Vionnet, V.; Voldoire, A.
2013-07-01
SURFEX is a new externalized land and ocean surface platform that describes the surface fluxes and the evolution of four types of surfaces: nature, town, inland water and ocean. It is mostly based on pre-existing, well-validated scientific models that are continuously improved. The motivation for the building of SURFEX is to use strictly identical scientific models in a high range of applications in order to mutualise the research and development efforts. SURFEX can be run in offline mode (0-D or 2-D runs) or in coupled mode (from mesoscale models to numerical weather prediction and climate models). An assimilation mode is included for numerical weather prediction and monitoring. In addition to momentum, heat and water fluxes, SURFEX is able to simulate fluxes of carbon dioxide, chemical species, continental aerosols, sea salt and snow particles. The main principles of the organisation of the surface are described first. Then, a survey is made of the scientific module (including the coupling strategy). Finally, the main applications of the code are summarised. The validation work undertaken shows that replacing the pre-existing surface models by SURFEX in these applications is usually associated with improved skill, as the numerous scientific developments contained in this community code are used to good advantage.
Sharma, Nripen S.; Jindal, Rohit; Mitra, Bhaskar; Lee, Serom; Li, Lulu; Maguire, Tim J.; Schloss, Rene; Yarmush, Martin L.
2014-01-01
Skin sensitization remains a major environmental and occupational health hazard. Animal models have been used as the gold standard method of choice for estimating chemical sensitization potential. However, a growing international drive and consensus for minimizing animal usage have prompted the development of in vitro methods to assess chemical sensitivity. In this paper, we examine existing approaches including in silico models, cell and tissue based assays for distinguishing between sensitizers and irritants. The in silico approaches that have been discussed include Quantitative Structure Activity Relationships (QSAR) and QSAR based expert models that correlate chemical molecular structure with biological activity and mechanism based read-across models that incorporate compound electrophilicity. The cell and tissue based assays rely on an assortment of mono and co-culture cell systems in conjunction with 3D skin models. Given the complexity of allergen induced immune responses, and the limited ability of existing systems to capture the entire gamut of cellular and molecular events associated with these responses, we also introduce a microfabricated platform that can capture all the key steps involved in allergic contact sensitivity. Finally, we describe the development of an integrated testing strategy comprised of two or three tier systems for evaluating sensitization potential of chemicals. PMID:24741377
Investigating FP Tau’s protoplanetary disk structure through modeling
NASA Astrophysics Data System (ADS)
Brinjikji, Marah; Espaillat, Catherine
2017-01-01
This project presents a study aiming to understand the structure of the protoplanetary disk around FP Tau, a very young, very low mass star in the Taurus star-forming region. We have gathered existing optical, Spitzer, Herschel and submillimeter observations to construct the spectral energy distribution (SED) of FP Tau. We have used the D’Alessio et al (2006) physically self-consistent irradiated accretion disk model including dust settling to model the disk of FP Tau. Using this method, the best fit for the SED of FP Tau is a model that includes a gap located 10-20 AU away from the star. This gap is filled with optically thin dust that separates the optically thick dust in the outer disk from the optically thick dust in the inner disk. These characteristics indicate that FP Tau’s protostellar system is best classified as a pre-transitional disk. Near-infrared interferometry in the K-Band from Willson et al 2016 indicates that FP Tau has a small gap located 10-20 AU from the star, which is consistent with the model we produced, lending further support to the pre-transitional disk interpretation. The most likely explanation for the existence of a gap in the disk is a forming planet.
Modeling the Galaxy-Halo Connection: An open-source approach with Halotools
NASA Astrophysics Data System (ADS)
Hearin, Andrew
2016-03-01
Although the modern form of galaxy-halo modeling has been in place for over ten years, there exists no common code base for carrying out large-scale structure calculations. Considering, for example, the advances in CMB science made possible by Boltzmann-solvers such as CMBFast, CAMB and CLASS, there are clear precedents for how theorists working in a well-defined subfield can mutually benefit from such a code base. Motivated by these and other examples, I present Halotools: an open-source, object-oriented python package for building and testing models of the galaxy-halo connection. Halotools is community-driven, and already includes contributions from over a dozen scientists spread across numerous universities. Designed with high-speed performance in mind, the package generates mock observations of synthetic galaxy populations with sufficient speed to conduct expansive MCMC likelihood analyses over a diverse and highly customizable set of models. The package includes an automated test suite and extensive web-hosted documentation and tutorials (halotools.readthedocs.org). I conclude the talk by describing how Halotools can be used to analyze existing datasets to obtain robust and novel constraints on galaxy evolution models, and by outlining the Halotools program to prepare the field of cosmology for the arrival of Stage IV dark energy experiments.
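Galaxy-halo models of the kind Halotools implements can be illustrated with a toy halo occupation distribution (HOD): the mean number of central galaxies is an erf step in halo mass, and satellites follow a power law above a cutoff. The sketch below is a self-contained illustration under assumed parameter values; it is not the Halotools API, and the numbers are not fits from the package.

```python
import math

# Toy HOD occupation functions (Zheng et al. 2007-style form).
# All parameter values below are illustrative assumptions.

def mean_ncen(log_mass, log_mmin=12.0, sigma=0.25):
    """Mean number of central galaxies in a halo of given log10 mass."""
    return 0.5 * (1.0 + math.erf((log_mass - log_mmin) / sigma))

def mean_nsat(log_mass, log_m0=12.0, log_m1=13.3, alpha=1.0):
    """Mean number of satellite galaxies (zero below the cutoff mass)."""
    mass, m0, m1 = 10 ** log_mass, 10 ** log_m0, 10 ** log_m1
    if mass <= m0:
        return 0.0
    return mean_ncen(log_mass) * ((mass - m0) / m1) ** alpha

def mean_ngal(log_mass):
    """Total mean galaxy occupation for one halo."""
    return mean_ncen(log_mass) + mean_nsat(log_mass)
```

In a package like Halotools, functions of this form are evaluated over a halo catalog to populate a mock galaxy sample fast enough for MCMC analyses.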
Comparing mechanistic and empirical approaches to modeling the thermal niche of almond
NASA Astrophysics Data System (ADS)
Parker, Lauren E.; Abatzoglou, John T.
2017-09-01
Delineating locations that are thermally viable for cultivating high-value crops can help to guide land use planning, agronomics, and water management. Three modeling approaches were used to identify the potential distribution and key thermal constraints on almond cultivation across the southwestern United States (US), including two empirical species distribution models (SDMs)—one using commonly used bioclimatic variables (traditional SDM) and the other using more physiologically relevant climate variables (nontraditional SDM)—and a mechanistic model (MM) developed using published thermal limitations from field studies. While models showed comparable results over the majority of the domain, including over existing croplands with high almond density, the MM suggested the greatest potential for the geographic expansion of almond cultivation, with frost susceptibility and insufficient heat accumulation being the primary thermal constraints in the southwestern US. The traditional SDM over-predicted almond suitability in locations shown by the MM to be limited by frost, whereas the nontraditional SDM showed greater agreement with the MM in these locations, indicating that incorporating physiologically relevant variables in SDMs can improve predictions. Finally, opportunities for geographic expansion of almond cultivation under current climatic conditions in the region may be limited, suggesting that increasing production may rely on agronomical advances and densifying existing almond plantations.
DOT National Transportation Integrated Search
2003-01-01
This study evaluated existing traffic signal optimization programs including Synchro,TRANSYT-7F, and genetic algorithm optimization using real-world data collected in Virginia. As a first step, a microscopic simulation model, VISSIM, was extensively ...
A Mathematical Model of Anthrax Transmission in Animal Populations.
Saad-Roy, C M; van den Driessche, P; Yakubu, Abdul-Aziz
2017-02-01
A general mathematical model of anthrax (caused by Bacillus anthracis) transmission is formulated that includes live animals, infected carcasses and spores in the environment. The basic reproduction number [Formula: see text] is calculated, and existence of a unique endemic equilibrium is established for [Formula: see text] above the threshold value 1. Using data from the literature, elasticity indices for [Formula: see text] and type reproduction numbers are computed to quantify anthrax control measures. Including only herbivorous animals, anthrax is eradicated if [Formula: see text]. For these animals, oscillatory solutions arising from Hopf bifurcations are numerically shown to exist for certain parameter values with [Formula: see text] and to have periodicity as observed from anthrax data. Including carnivores and assuming no disease-related death, anthrax again goes extinct below the threshold. Local stability of the endemic equilibrium is established above the threshold; thus, periodic solutions are not possible for these populations. It is shown numerically that oscillations in spore growth may drive oscillations in animal populations; however, the total number of infected animals remains about the same as with constant spore growth.
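The anthrax model's equations are elided in this abstract ("[Formula: see text]"), so the threshold behavior it describes (eradication below the threshold value 1, a unique endemic equilibrium above it) can only be illustrated generically. The sketch below uses a minimal SIS model, not the paper's anthrax model; beta/gamma plays the role of the basic reproduction number.

```python
# Generic illustration of an R0 = 1 threshold: a simple SIS model
# integrated with forward-Euler steps. This is NOT the anthrax model
# from the paper, only a minimal demonstration of the same threshold
# phenomenon (extinction below R0 = 1, endemic equilibrium above).

def simulate_sis(beta, gamma, i0=0.01, dt=0.01, steps=20000):
    """Integrate dI/dt = beta*(1-I)*I - gamma*I; return final infected fraction."""
    i = i0
    for _ in range(steps):
        i += dt * (beta * (1.0 - i) * i - gamma * i)
    return i
```

For beta/gamma > 1 the infected fraction settles at the endemic equilibrium 1 - gamma/beta; for beta/gamma < 1 it decays to zero.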
Observations and modeling of San Diego beaches during El Niño
NASA Astrophysics Data System (ADS)
Doria, André; Guza, R. T.; O'Reilly, William C.; Yates, M. L.
2016-08-01
Subaerial sand levels were observed at five southern California beaches for 16 years, including notable El Niños in 1997-98 and 2009-10. An existing, empirical shoreline equilibrium model, driven with wave conditions estimated using a regional buoy network, simulates well the seasonal changes in subaerial beach width (e.g. the cross-shore location of the MSL contour) during non-El Niño years, similar to previous results with a 5-year time series lacking an El Niño winter. The existing model correctly identifies the 1997-98 El Niño winter conditions as more erosive than 2009-10, but overestimates shoreline erosion during both El Niños. The good skill of the existing equilibrium model in typical conditions does not necessarily extrapolate to extreme erosion on these beaches, where a sand layer a few meters thick often overlies more resistant layers. The modest over-prediction of the 2009-10 El Niño is reduced by gradually decreasing the model mobility of highly eroded shorelines (simulating cobbles, kelp wrack, shell hash, or other stabilizing layers). Over-prediction during the more severe 1997-98 El Niño is corrected by stopping model erosion when resilient surfaces (identified with aerial imagery) are reached. The trained model provides a computationally simple (a nonlinear first-order differential equation) representation of the observed relationship between incident waves and shoreline change.
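The "computationally simple" model class described above can be sketched as an equilibrium shoreline-change ODE: the shoreline moves at a rate proportional to the disequilibrium between incident wave energy and the equilibrium energy for the current shoreline position, with separate erosion and accretion rate constants. All coefficients below are illustrative assumptions, not the paper's fitted values.

```python
# Minimal sketch of an equilibrium shoreline-change model: a nonlinear
# first-order ODE driven by wave energy. Parameter values are
# illustrative only, not fits from the study.

def equilibrium_energy(shoreline, a=-0.01, b=2.0):
    """Hypothetical linear equilibrium wave energy for a shoreline position."""
    return a * shoreline + b

def step(shoreline, wave_energy, dt=1.0, c_erode=-0.5, c_accrete=-0.2):
    """Advance the shoreline one time step.

    dS/dt = C * sqrt(E) * (E - Eeq(S)); erosion and accretion use
    different rate constants, as in typical equilibrium models.
    """
    delta_e = wave_energy - equilibrium_energy(shoreline)
    rate = c_erode if delta_e > 0 else c_accrete
    return shoreline + rate * (wave_energy ** 0.5) * delta_e * dt

def simulate(initial, energies):
    """Integrate the shoreline position through a wave-energy time series."""
    s = initial
    trajectory = [s]
    for e in energies:
        s = step(s, e)
        trajectory.append(s)
    return trajectory
```

The paper's trained model additionally caps erosion when resistant surfaces are reached; that refinement is omitted here.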
Automated Measurement and Verification and Innovative Occupancy Detection Technologies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Price, Phillip; Nordman, Bruce; Piette, Mary Ann
In support of DOE’s sensors and controls research, the goal of this project is to move toward integrated building-to-grid systems by building on previous work to develop and demonstrate a set of load characterization measurement and evaluation tools, envisioned as part of a suite of applications for transactive efficient buildings built upon data-driven load characterization and prediction models. This includes the ability to incorporate occupancy data in the models, plus data collection and archival methods for handling different types of occupancy data with existing networks, and a taxonomy for naming these data within a Volttron agent platform.
Building occupancy simulation and data assimilation using a graph-based agent-oriented model
NASA Astrophysics Data System (ADS)
Rai, Sanish; Hu, Xiaolin
2018-07-01
Building occupancy simulation and estimation simulates the dynamics of occupants and estimates their real-time spatial distribution in a building. It requires a simulation model and an algorithm for data assimilation that assimilates real-time sensor data into the simulation model. Existing building occupancy simulation models include agent-based models and graph-based models. The agent-based models suffer from high computation costs when simulating large numbers of occupants, and graph-based models overlook the heterogeneity and detailed behaviors of individuals. Recognizing the limitations of existing models, this paper presents a new graph-based agent-oriented model which can efficiently simulate large numbers of occupants in various kinds of building structures. To support real-time occupancy dynamics estimation, a data assimilation framework based on Sequential Monte Carlo methods is also developed and applied to the graph-based agent-oriented model to assimilate real-time sensor data. Experimental results show the effectiveness of the developed model and the data assimilation framework. The major contributions of this work are to provide an efficient model for building occupancy simulation that can accommodate large numbers of occupants and an effective data assimilation framework that can provide real-time estimations of building occupancy from sensor data.
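The Sequential Monte Carlo data assimilation described above can be sketched as a bootstrap particle filter: propagate particles through the occupancy model, weight them by the likelihood of the sensor reading, and resample. The two-room "building", movement rule, and Gaussian sensor model below are illustrative assumptions, not the paper's model.

```python
import math
import random

# Sketch of Sequential Monte Carlo (particle filter) assimilation of
# noisy occupancy-sensor data. The building layout (two rooms), the
# movement rule, and the sensor noise model are assumptions for
# illustration, not the paper's graph-based agent-oriented model.

def move(state, rng):
    """One occupant may move between room 0 and room 1."""
    a, b = state
    r = rng.random()
    if r < 0.3 and a > 0:
        return (a - 1, b + 1)
    if r < 0.6 and b > 0:
        return (a + 1, b - 1)
    return (a, b)

def likelihood(observed, actual, sigma=1.0):
    """Gaussian likelihood of a noisy sensor count."""
    return math.exp(-((observed - actual) ** 2) / (2 * sigma ** 2))

def particle_filter(observations, total=10, n_particles=500, seed=1):
    """Estimate room-0 occupancy from a series of noisy room-0 counts."""
    rng = random.Random(seed)
    # Initialize particles uniformly over possible splits of occupants.
    particles = [(k, total - k) for k in
                 (rng.randint(0, total) for _ in range(n_particles))]
    for obs0 in observations:
        # Predict: propagate each particle through the movement model.
        particles = [move(p, rng) for p in particles]
        # Update: weight by sensor likelihood, then resample.
        weights = [likelihood(obs0, p[0]) for p in particles]
        wsum = sum(weights)
        particles = rng.choices(particles,
                                weights=[w / wsum for w in weights],
                                k=n_particles)
    return sum(p[0] for p in particles) / n_particles
```

Repeated observations near a true count pull the particle cloud toward it, giving a real-time occupancy estimate.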
LDEF data: Comparisons with existing models
NASA Astrophysics Data System (ADS)
Coombs, Cassandra R.; Watts, Alan J.; Wagner, John D.; Atkinson, Dale R.
1993-04-01
The relationship between the observed cratering impact damage on the Long Duration Exposure Facility (LDEF) and the existing models for both the natural micrometeoroid environment and man-made debris was investigated. Experimental data were provided by several LDEF Principal Investigators, Meteoroid and Debris Special Investigation Group (M&D SIG) members, and Kennedy Space Center Analysis Team (KSC A-Team) members. These data were collected from various aluminum materials around the LDEF satellite. A personal computer (PC) program, SPENV, was written that incorporates the existing models of the Low Earth Orbit (LEO) environment. This program calculates the expected number of impacts per unit area as a function of altitude, orbital inclination, time in orbit, and direction of the spacecraft surface relative to the velocity vector, for both micrometeoroids and man-made debris. Since both particle models are couched in terms of impact fluxes versus impactor particle size, and much of the LDEF data is in the form of crater production rates, scaling laws have been used to relate the two. Many hydrodynamic impact simulations of various impact events were also conducted using CTH; these identified certain modes of response, including simple metallic target cratering, perforations, and delamination effects of coatings.
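The flux-to-crater bookkeeping described above can be sketched as follows: given a cumulative particle flux model (impacts per unit area per year above a given particle size), expected impact counts scale linearly with exposed area and time, and a crater-to-particle scaling law relates observed crater sizes back to particle sizes. The power-law flux and the scaling factor below are illustrative assumptions, not the SPENV environment models.

```python
# Hedged sketch of relating environment flux models to observed crater
# counts. The power-law flux parameters and the crater-to-particle
# scaling factor are illustrative assumptions only.

def cumulative_flux(particle_diameter_m, f0=1e-5, ref=1e-4, slope=2.5):
    """Illustrative power-law flux: impacts/m^2/yr above a given size."""
    return f0 * (particle_diameter_m / ref) ** (-slope)

def expected_impacts(particle_diameter_m, area_m2, years):
    """Expected impact count on a surface of given area and exposure time."""
    return cumulative_flux(particle_diameter_m) * area_m2 * years

def particle_from_crater(crater_diameter_m, scaling=5.0):
    """Simple scaling law: crater diameter ~ scaling * particle diameter."""
    return crater_diameter_m / scaling
```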
NASA Astrophysics Data System (ADS)
Bagherinejad, Jafar; Niknam, Azar
2018-03-01
In this paper, a leader-follower competitive facility location problem that considers the reactions of competitors is studied. A model for locating new facilities and determining quality levels for the facilities of the leader firm is proposed. Moreover, changes in the location and quality of existing facilities in a competitive market where a competitor offers the same goods or services are taken into account. The competitor can react by opening new facilities, closing existing ones, and adjusting the quality levels of its existing facilities. The market share captured by each facility depends on its distance to the customer and its quality, and is calculated using the probabilistic Huff model. Each firm aims to maximize its profit subject to constraints on quality levels and the budget for setting up new facilities. This problem is formulated as a bi-level mixed-integer non-linear model and solved using a combination of Tabu Search and an exact method. The performance of the proposed algorithm is compared with an upper bound obtained by applying Karush-Kuhn-Tucker conditions. Computational results show that our algorithm finds solutions near the upper bound in reasonable time.
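The probabilistic Huff model mentioned above assigns each customer to facility j with probability proportional to quality_j / distance_j^lambda; a facility's market share is that probability averaged over customers. A minimal sketch, with the distance exponent as an illustrative assumption:

```python
# Sketch of the probabilistic Huff model for market-share capture.
# The distance exponent lam=2.0 is an illustrative assumption.

def huff_shares(qualities, distances, lam=2.0):
    """Return each facility's capture probability for one customer."""
    attractions = [q / (d ** lam) for q, d in zip(qualities, distances)]
    total = sum(attractions)
    return [a / total for a in attractions]

def market_share(facility_index, customers, qualities, distance_fn, lam=2.0):
    """Average capture probability of one facility over all customers.

    distance_fn(customer, j) gives the customer-to-facility-j distance.
    """
    share = 0.0
    for c in customers:
        distances = [distance_fn(c, j) for j in range(len(qualities))]
        share += huff_shares(qualities, distances, lam)[facility_index]
    return share / len(customers)
```

With equal qualities, a facility half as far away captures four times the attraction (at lambda = 2), hence an 80/20 split between the two facilities.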
LDEF data: Comparisons with existing models
NASA Technical Reports Server (NTRS)
Coombs, Cassandra R.; Watts, Alan J.; Wagner, John D.; Atkinson, Dale R.
1993-01-01
The relationship between the observed cratering impact damage on the Long Duration Exposure Facility (LDEF) and the existing models for both the natural micrometeoroid environment and man-made debris was investigated. Experimental data were provided by several LDEF Principal Investigators, Meteoroid and Debris Special Investigation Group (M&D SIG) members, and Kennedy Space Center Analysis Team (KSC A-Team) members. These data were collected from various aluminum materials around the LDEF satellite. A personal computer (PC) program, SPENV, was written that incorporates the existing models of the Low Earth Orbit (LEO) environment. This program calculates the expected number of impacts per unit area as a function of altitude, orbital inclination, time in orbit, and direction of the spacecraft surface relative to the velocity vector, for both micrometeoroids and man-made debris. Since both particle models are couched in terms of impact fluxes versus impactor particle size, and much of the LDEF data is in the form of crater production rates, scaling laws have been used to relate the two. Many hydrodynamic impact simulations of various impact events were also conducted using CTH; these identified certain modes of response, including simple metallic target cratering, perforations, and delamination effects of coatings.
The Dynamical Behaviors for a Class of Immunogenic Tumor Model with Delay
Muthoni, Mutei Damaris; Pang, Jianhua
2017-01-01
This paper aims at studying the model proposed by Kuznetsov and Taylor in 1994. Inspired by Mayer et al., time delay is introduced in the general model. The dynamic behaviors of this model are studied, which include the existence and stability of the equilibria and Hopf bifurcation of the model with discrete delays. The properties of the bifurcated periodic solutions are studied by using the normal form on the center manifold. Numerical examples and simulations are given to illustrate the bifurcation analysis and the obtained results. PMID:29312457
Aspect-Oriented Business Process Modeling with AO4BPMN
NASA Astrophysics Data System (ADS)
Charfi, Anis; Müller, Heiko; Mezini, Mira
Many crosscutting concerns in business processes, such as compliance, auditing, billing, and separation of duties, need to be addressed already at the business process modeling level. However, existing business process modeling languages, including OMG's Business Process Modeling Notation (BPMN), lack appropriate means for expressing such concerns in a modular way. In this paper, we motivate the need for aspect-oriented concepts in business process modeling languages and propose an aspect-oriented extension to BPMN called AO4BPMN. We also present a graphical editor supporting that extension.
Gas Hydrate Petroleum System Modeling in western Nankai Trough Area
NASA Astrophysics Data System (ADS)
Tanaka, M.; Aung, T. T.; Fujii, T.; Wada, N.; Komatsu, Y.
2017-12-01
Since 2003, we have been conducting gas hydrate (GH) petroleum system modeling covering the eastern Nankai Trough, Japan, and the resource potential from the regional model shows a good match with values derived from seismic and log data. Here, we apply this method to explore GH potential in a new study area. In our study area, GH prospects have been identified with the aid of the bottom simulating reflector (BSR) and the presence of high-velocity anomalies above the BSR, interpreted from 3D migration seismic and high-density velocity cubes. To understand the pathways of biogenic methane from source to GH prospects, 1D, 2D, and 3D GH petroleum system models were built and investigated. The study interval comprises lower Miocene to Pleistocene, deep- to shallow-marine sedimentary successions, with Pliocene and Pleistocene layers overlying the basement. The BSR was interpreted in the Pliocene and Pleistocene layers. Based on 6 sequence boundaries interpreted from the 3D migration seismic and velocity data, a 3D depth framework model was constructed and populated with a conceptual submarine-fan depositional facies model derived from seismic facies analysis and an existing geological report. 1D models were created to analyze lithology sensitivity to temperature and vitrinite data from an exploratory well drilled in the vicinity of the study area. The petroleum system modeling (PSM) parameters were then applied in the 2D and 3D modeling and simulation. An existing report on the exploratory well indicates that hydrocarbons of thermogenic origin are also present; for this reason, simulation scenarios including source formations for both biogenic and thermogenic reaction models were investigated. Simulation results show that the lower boundary of the GH saturation zone at pseudo-wells is reproduced to within a few tens of meters of the interpreted BSR. Sensitivity analysis shows that the simulated temperature was controlled by the different peak-generation-temperature models and geochemical parameters.
Progressive folding and updip layers, including paleostructure, can effectively assist the upward migration of biogenic gas. The biogenic-thermogenic mixing model shows that only the kitchen center has the potential for generating thermogenic hydrocarbons. Our prospect based on seismic interpretation is consistent with the high-GH-saturation area indicated by the 3D modeling results.
NASA Technical Reports Server (NTRS)
Goett, Harry J; Delaney, Noel K
1944-01-01
Report presents the results of tests of a model of a single-engine airplane with two different tilts of the propeller axis. The results indicate that on a typical design a 5 degree downward tilt of the propeller axis will considerably reduce the destabilization effects of power. A comparison of the experimental results with those computed by use of existing theory is included. It is shown that the results can be predicted with an accuracy acceptable for preliminary design purposes, particularly at the higher powers where the effects are of significant magnitude.
A geographic data model for representing ground water systems.
Strassberg, Gil; Maidment, David R; Jones, Norm L
2007-01-01
The Arc Hydro ground water data model is a geographic data model for representing spatial and temporal ground water information within a geographic information system (GIS). The data model is a standardized representation of ground water systems within a spatial database that provides a public domain template for GIS users to store, document, and analyze commonly used spatial and temporal ground water data sets. This paper describes the data model framework, a simplified version of the complete ground water data model that includes two-dimensional and three-dimensional (3D) object classes for representing aquifers, wells, and borehole data, and the 3D geospatial context in which these data exist. The framework data model also includes tabular objects for representing temporal information such as water levels and water quality samples that are related with spatial features.
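The framework's structure — spatial object classes for aquifers, wells, and boreholes, plus tabular objects for temporal records such as water levels — can be sketched as plain data classes. The field names below are illustrative assumptions, not the actual Arc Hydro schema.

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Hedged sketch of the data model framework's object classes as plain
# Python dataclasses. Field names are illustrative, not the actual
# Arc Hydro ground water schema.

@dataclass
class WaterLevelRecord:
    """Tabular object: one temporal water-level observation."""
    time: str
    level_m: float

@dataclass
class Well:
    """Spatial feature with attached temporal records."""
    well_id: str
    x: float
    y: float
    ground_elevation_m: float
    levels: List[WaterLevelRecord] = field(default_factory=list)

    def latest_level(self) -> Optional[float]:
        return self.levels[-1].level_m if self.levels else None

@dataclass
class Aquifer:
    """Volume object grouping the wells that penetrate it."""
    name: str
    wells: List[Well] = field(default_factory=list)
```

In a GIS, these classes would additionally carry geometry and 3D geospatial context; only the relational skeleton is shown here.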
Integrated workflows for spiking neuronal network simulations
Antolík, Ján; Davison, Andrew P.
2013-01-01
The increasing availability of computational resources is enabling more detailed, realistic modeling in computational neuroscience, resulting in a shift toward more heterogeneous models of neuronal circuits, and employment of complex experimental protocols. This poses a challenge for existing tool chains, as the set of tools involved in a typical modeler's workflow is expanding concomitantly, with growing complexity in the metadata flowing between them. For many parts of the workflow, a range of tools is available; however, numerous areas lack dedicated tools, while integration of existing tools is limited. This forces modelers to either handle the workflow manually, leading to errors, or to write substantial amounts of code to automate parts of the workflow, in both cases reducing their productivity. To address these issues, we have developed Mozaik: a workflow system for spiking neuronal network simulations written in Python. Mozaik integrates model, experiment and stimulation specification, simulation execution, data storage, data analysis and visualization into a single automated workflow, ensuring that all relevant metadata are available to all workflow components. It is based on several existing tools, including PyNN, Neo, and Matplotlib. It offers a declarative way to specify models and recording configurations using hierarchically organized configuration files. Mozaik automatically records all data together with all relevant metadata about the experimental context, allowing automation of the analysis and visualization stages. Mozaik has a modular architecture, and the existing modules are designed to be extensible with minimal programming effort. Mozaik increases the productivity of running virtual experiments on highly structured neuronal networks by automating the entire experimental cycle, while increasing the reliability of modeling studies by relieving the user from manual handling of the flow of metadata between the individual workflow stages. 
PMID:24368902
Integrated workflows for spiking neuronal network simulations.
Antolík, Ján; Davison, Andrew P
2013-01-01
The increasing availability of computational resources is enabling more detailed, realistic modeling in computational neuroscience, resulting in a shift toward more heterogeneous models of neuronal circuits, and employment of complex experimental protocols. This poses a challenge for existing tool chains, as the set of tools involved in a typical modeler's workflow is expanding concomitantly, with growing complexity in the metadata flowing between them. For many parts of the workflow, a range of tools is available; however, numerous areas lack dedicated tools, while integration of existing tools is limited. This forces modelers to either handle the workflow manually, leading to errors, or to write substantial amounts of code to automate parts of the workflow, in both cases reducing their productivity. To address these issues, we have developed Mozaik: a workflow system for spiking neuronal network simulations written in Python. Mozaik integrates model, experiment and stimulation specification, simulation execution, data storage, data analysis and visualization into a single automated workflow, ensuring that all relevant metadata are available to all workflow components. It is based on several existing tools, including PyNN, Neo, and Matplotlib. It offers a declarative way to specify models and recording configurations using hierarchically organized configuration files. Mozaik automatically records all data together with all relevant metadata about the experimental context, allowing automation of the analysis and visualization stages. Mozaik has a modular architecture, and the existing modules are designed to be extensible with minimal programming effort. Mozaik increases the productivity of running virtual experiments on highly structured neuronal networks by automating the entire experimental cycle, while increasing the reliability of modeling studies by relieving the user from manual handling of the flow of metadata between the individual workflow stages.
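The hierarchically organized configuration files described above can be sketched as recursive dictionary merging: a base model configuration is overridden by a more specific experiment configuration, level by level. The merge rule and the configuration keys below are illustrative assumptions, not Mozaik's actual implementation.

```python
# Sketch of declarative, hierarchically organized configuration of the
# kind Mozaik uses. The merge rule and the keys are illustrative
# assumptions, not Mozaik's actual configuration format.

def merge_config(base, override):
    """Recursively merge override into base, returning a new dict."""
    merged = dict(base)
    for key, value in override.items():
        if key in merged and isinstance(merged[key], dict) and isinstance(value, dict):
            merged[key] = merge_config(merged[key], value)
        else:
            merged[key] = value
    return merged

base = {
    "model": {"sheets": {"V1": {"neurons": 1000, "cell_type": "exc"}}},
    "recording": {"spikes": True, "membrane_potential": False},
}
experiment = {
    "model": {"sheets": {"V1": {"neurons": 250}}},   # smaller test run
    "recording": {"membrane_potential": True},
}
config = merge_config(base, experiment)
```

Keeping overrides as small deltas against a shared base is what lets the same model specification drive many experiments while all metadata stay machine-readable.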
New chiral fermions, a new gauge interaction, Dirac neutrinos, and dark matter
de Gouvea, Andre; Hernandez, Daniel
2015-10-07
Here, we propose that all light fermionic degrees of freedom, including the Standard Model (SM) fermions and all possible light beyond-the-standard-model fields, are chiral with respect to some spontaneously broken abelian gauge symmetry. Hypercharge, for example, plays this role for the SM fermions. We introduce a new symmetry, U(1)_ν, for all new light fermionic states. Anomaly cancellations mandate the existence of several new fermion fields with nontrivial U(1)_ν charges. We develop a concrete model of this type, for which we show that (i) some fermions remain massless after U(1)_ν breaking, similar to SM neutrinos, and (ii) accidental global symmetries translate into stable massive particles, similar to SM protons. These ingredients provide a solution to the dark matter and neutrino mass puzzles, assuming one also postulates the existence of heavy degrees of freedom that act as “mediators” between the two sectors. The neutrino mass mechanism described here leads to parametrically small Dirac neutrino masses, and the model also requires the existence of at least four Dirac sterile neutrinos. Finally, we describe a general technique to write down chiral-fermions-only models that are at least anomaly-free under a U(1) gauge symmetry.
New chiral fermions, a new gauge interaction, Dirac neutrinos, and dark matter
DOE Office of Scientific and Technical Information (OSTI.GOV)
de Gouvea, Andre; Hernandez, Daniel
Here, we propose that all light fermionic degrees of freedom, including the Standard Model (SM) fermions and all possible light beyond-the-standard-model fields, are chiral with respect to some spontaneously broken abelian gauge symmetry. Hypercharge, for example, plays this role for the SM fermions. We introduce a new symmetry, U(1)_ν, for all new light fermionic states. Anomaly cancellations mandate the existence of several new fermion fields with nontrivial U(1)_ν charges. We develop a concrete model of this type, for which we show that (i) some fermions remain massless after U(1)_ν breaking, similar to SM neutrinos, and (ii) accidental global symmetries translate into stable massive particles, similar to SM protons. These ingredients provide a solution to the dark matter and neutrino mass puzzles, assuming one also postulates the existence of heavy degrees of freedom that act as “mediators” between the two sectors. The neutrino mass mechanism described here leads to parametrically small Dirac neutrino masses, and the model also requires the existence of at least four Dirac sterile neutrinos. Finally, we describe a general technique to write down chiral-fermions-only models that are at least anomaly-free under a U(1) gauge symmetry.
Sociodemographic Factors Associated With Changes in Successful Aging in Spain: A Follow-Up Study.
Domènech-Abella, Joan; Perales, Jaime; Lara, Elvira; Moneta, Maria Victoria; Izquierdo, Ana; Rico-Uribe, Laura Alejandra; Mundó, Jordi; Haro, Josep Maria
2017-06-01
Successful aging (SA) refers to maintaining well-being in old age. Several definitions or models of SA exist (biomedical, psychosocial, and mixed). We examined the longitudinal association between various SA models and sociodemographic factors, and analyzed the patterns of change within these models. This was a nationally representative follow-up in Spain including 3,625 individuals aged ≥50 years. Some 1,970 individuals were interviewed after 3 years. Linear regression models were used to analyze the survey data. Age, sex, and occupation predicted SA in the biomedical model, while marital status, educational level, and urbanicity predicted SA in the psychosocial model. The remaining models included different sets of these predictors as significant. In the psychosocial model, individuals tended to improve over time but this was not the case in the biomedical model. The biomedical and psychosocial components of SA need to be addressed specifically to achieve the best aging trajectories.
Interplay of the Glass Transition and the Liquid-Liquid Phase Transition in Water
NASA Astrophysics Data System (ADS)
Giovambattista, Nicolas
2013-03-01
Most liquids can form a single glass or amorphous state when cooled sufficiently fast (in order to prevent crystallization). However, there are a few substances that are relevant to scientific and technological applications which can exist in at least two different amorphous states, a property known as polyamorphism. Examples include silicon, silica, and in particular, water. In the case of water, experiments show the existence of a low-density (LDA) and high-density (HDA) amorphous ice that are separated by a dramatic, first-order like phase transition. It has been argued that the LDA-HDA transformation evolves into a first-order liquid-liquid phase transition (LLPT) at temperatures above the glass transition temperature Tg. However, obtaining direct experimental evidence of the LLPT has been challenging since the LLPT occurs at conditions where water rapidly crystallizes. In this talk, I will (i) discuss the general phenomenology of polyamorphism in water and its implications, and (ii) explore the effects of a LLPT on the pressure dependence of Tg(P) for LDA and HDA. Our study is based on computer simulations of two water models - one with a LLPT (ST2 model), and one without (SPC/E model). In the absence of a LLPT, Tg(P) for all glasses nearly coincide. Instead, when there is a LLPT, different glasses exhibit dramatically different Tg(P) loci which are directly linked with the LLPT. Available experimental data for Tg(P) are only consistent with the scenario that includes a LLPT (ST2 model) and hence, our results support the view that a LLPT may exist for the case of water.
NASA Technical Reports Server (NTRS)
Leonard, J. I.
1974-01-01
Included in the report are: (1) review of the erythropoietic mechanisms; (2) an evaluation of existing models for the control of erythropoiesis; (3) a computer simulation of the model's response to hypoxia; (4) an hypothesis to explain observed decreases in red blood cell mass during weightlessness; (5) suggestions for further research; and (6) an assessment of the role that systems analysis can play in the Skylab hematological program.
Efficacy of a surfactant-based wound dressing on biofilm control.
Percival, Steven L; Mayer, Dieter; Salisbury, Anne-Marie
2017-09-01
The aim of this study was to evaluate the efficacy of both a nonantimicrobial and an antimicrobial (1% silver sulfadiazine, SSD) surfactant-based wound dressing in the control of Pseudomonas aeruginosa, Enterococcus sp, Staphylococcus epidermidis, Staphylococcus aureus, and methicillin-resistant S. aureus (MRSA) biofilms. Anti-biofilm efficacy was evaluated in several adapted American Society for Testing and Materials (ASTM) standard biofilm models and other bespoke biofilm models. The ASTM standard models were the minimum biofilm eradication concentration (MBEC) biofilm model (ASTM E2799) and the Centers for Disease Control (CDC) biofilm reactor model (ASTM 2871); the bespoke models were the filter biofilm model and the chamber slide biofilm model. Results showed complete kill of microorganisms within a biofilm using the antimicrobial surfactant-based wound dressing. Interestingly, the nonantimicrobial surfactant-based dressing could disrupt existing biofilms by causing biofilm detachment. Prior to biofilm detachment, we demonstrated, using confocal laser scanning microscopy (CLSM), the dispersive effect of the nonantimicrobial surfactant-based wound dressing on the biofilm within 10 minutes of treatment. Furthermore, the nonantimicrobial surfactant-based wound dressing caused an increase in microbial flocculation/aggregation, important for microbial concentration. In conclusion, this nonantimicrobial surfactant-based wound dressing leads to the effective detachment and dispersion of in vitro biofilms. The use of surfactant-based wound dressings in a clinical setting may help to disrupt existing biofilm from wound tissue and may increase the action of antimicrobial treatment. © 2017 by the Wound Healing Society.
NASTRAN Modeling of Flight Test Components for UH-60A Airloads Program Test Configuration
NASA Technical Reports Server (NTRS)
Idosor, Florentino R.; Seible, Frieder
1993-01-01
Based upon the recommendations of the UH-60A Airloads Program Review Committee, work towards a NASTRAN remodeling effort has been conducted. This effort modeled and added the necessary structural/mass components to the existing UH-60A baseline NASTRAN model to reflect the addition of flight test components currently in place on the UH-60A Airloads Program Test Configuration used in NASA-Ames Research Center's Modern Technology Rotor Airloads Program. These components include necessary flight hardware such as instrument booms, movable ballast cart, equipment mounting racks, etc. Recent modeling revisions have also been included in the analyses to reflect the inclusion of new and updated primary and secondary structural components (i.e., tail rotor shaft service cover, tail rotor pylon) and improvements to the existing finite element mesh (i.e., revisions of material property estimates). Mode frequency and shape results have shown that components such as the Trimmable Ballast System baseplate and its respective payload ballast have caused a significant frequency change in a limited number of modes while only small percent changes in mode frequency are brought about with the addition of the other MTRAP flight components. With the addition of the MTRAP flight components, update of the primary and secondary structural model, and imposition of the final MTRAP weight distribution, modal results are computed representative of the 'best' model presently available.
Nowinski, Wieslaw L; Thaung, Thant Shoon Let; Chua, Beng Choon; Yi, Su Hnin Wut; Ngai, Vincent; Yang, Yili; Chrzan, Robert; Urbanik, Andrzej
2015-05-15
Although the adult human skull is a complex and multifunctional structure, its 3D, complete, realistic, and stereotactic atlas has not yet been created. This work addresses the construction of a 3D interactive atlas of the adult human skull spatially correlated with the brain, cranial nerves, and intracranial vasculature. The process of atlas construction included computed tomography (CT) high-resolution scan acquisition, skull extraction, skull parcellation, 3D disarticulated bone surface modeling, 3D model simplification, brain-skull registration, 3D surface editing, 3D surface naming and color-coding, integration of the CT-derived 3D bony models with the existing brain atlas, and validation. The virtual skull model created is complete with all 29 bones, including the auditory ossicles (being among the smallest bones). It contains all typical bony features and landmarks. The created skull model is superior to the existing skull models in terms of completeness, realism, and integration with the brain along with blood vessels and cranial nerves. This skull atlas is valuable for medical students and residents to easily get familiarized with the skull and surrounding anatomy with a few clicks. The atlas is also useful for educators to prepare teaching materials. It may potentially serve as a reference aid in the reading and operating rooms. Copyright © 2015 Elsevier B.V. All rights reserved.
Heat Exhaustion in a Rat Model: Lithium as a Biochemical Probe.
1988-11-08
thought to predispose to heat-induced illness include amount of exertion, prior conditioning, pre-existing cardiovascular disease, diabetes mellitus...results in altered distribution of body water, sodium depletion, potassium wasting, polyuria, and abnormal thermoregulation (2,3,8-17). Each of
Topological solitons in 8-spinor mie electrodynamics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rybakov, Yu. P., E-mail: soliton4@mail.ru
2013-10-15
We investigate the effective 8-spinor field model suggested earlier as a generalization of nonlinear Mie electrodynamics. We first study, in the pure spinor model, the existence of topological solitons endowed with a nontrivial Hopf invariant Q_H, which can be interpreted as the lepton number. With the electromagnetic field included as a perturbation, we estimate the energy and the spin of the localized charged configuration.
ERIC Educational Resources Information Center
Wu, Pai-Hsing; Wu, Hsin-Kai; Kuo, Che-Yu; Hsu, Ying-Shao
2015-01-01
Computer-based learning tools include design features to enhance learning, but learners may not always perceive the existence of these features or use them in desirable ways. There might be a gap between what the tool features are designed to offer (intended affordance) and how they are actually used (actual affordance). This study thus aims at…
Fake News, Conspiracy Theories, and Lies: An Information Laundering Model for Homeland Security
2018-03-01
THEORIES, AND LIES: AN INFORMATION LAUNDERING MODEL FOR HOMELAND SECURITY by Samantha M. Korta, March 2018. Co-Advisors: Rodrigo Nieto...
On a low-dimensional model for magnetostriction
NASA Astrophysics Data System (ADS)
Iyer, R. V.; Manservisi, S.
2006-02-01
In recent years, a low-dimensional model for thin magnetostrictive actuators that incorporated magneto-elastic coupling, inertial and damping effects, ferromagnetic hysteresis and classical eddy current losses was developed using energy-balance principles by Venkataraman and Krishnaprasad. This model, with the classical Preisach operator representing the hysteretic constitutive relation between the magnetic field and magnetization in the axial direction, proved to be very successful in capturing dynamic hysteresis effects with electrical inputs in the 0-50 Hz range and constant mechanical loading. However, it is well known that for soft ferromagnetic materials there exist excess losses in addition to the classical eddy current losses. In this work, we propose to extend the above mentioned model for a magnetostrictive rod actuator by including excess losses via a nonlinear resistive element in the actuator circuit. We then show existence and uniqueness of solutions for the proposed model for electrical voltage input in the space L2(0,T)∩L∞(0,T) and mechanical force input in the space L2(0,T).
NASA Technical Reports Server (NTRS)
Pinho, Silvestre T.; Davila, C. G.; Camanho, P. P.; Iannucci, L.; Robinson, P.
2005-01-01
A set of three-dimensional failure criteria for laminated fiber-reinforced composites, denoted LaRC04, is proposed. The criteria are based on physical models for each failure mode and take into consideration non-linear matrix shear behaviour. The model for matrix compressive failure is based on the Mohr-Coulomb criterion and it predicts the fracture angle. Fiber kinking is triggered by an initial fiber misalignment angle and by the rotation of the fibers during compressive loading. The plane of fiber kinking is predicted by the model. LaRC04 consists of 6 expressions that can be used directly for design purposes. Several applications involving a broad range of load combinations are presented and compared to experimental data and other existing criteria. Predictions using LaRC04 correlate well with the experimental data, arguably better than most existing criteria. The good correlation seems to be attributable to the physical soundness of the underlying failure models.
Kurylyk, Barret L.; McKenzie, Jeffrey M; MacQuarrie, Kerry T. B.; Voss, Clifford I.
2014-01-01
Numerous cold regions water flow and energy transport models have emerged in recent years. Dissimilarities often exist in their mathematical formulations and/or numerical solution techniques, but few analytical solutions exist for benchmarking flow and energy transport models that include pore water phase change. This paper presents a detailed derivation of the Lunardini solution, an approximate analytical solution for predicting soil thawing subject to conduction, advection, and phase change. Fifteen thawing scenarios are examined by considering differences in porosity, surface temperature, Darcy velocity, and initial temperature. The accuracy of the Lunardini solution is shown to be proportional to the Stefan number. The analytical solution results obtained for soil thawing scenarios with water flow and advection are compared to those obtained from the finite element model SUTRA. Three problems, two involving the Lunardini solution and one involving the classic Neumann solution, are recommended as standard benchmarks for future model development and testing.
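As a minimal illustration of the closed-form benchmarks discussed above, the quasi-steady Stefan approximation (a simpler relative of the Lunardini and Neumann solutions; conduction only, sensible heat neglected) predicts a thaw depth that grows with the square root of time. The parameter values below are illustrative assumptions, not values from the paper.

```python
import math

def stefan_thaw_depth(t_seconds, k_thawed=1.0, delta_t=5.0,
                      porosity=0.35, rho_ice=917.0, latent=334000.0):
    """Quasi-steady Stefan approximation of thaw depth:
    X(t) = sqrt(2 * k * dT * t / Lv), where Lv = porosity * rho_ice * latent
    is the volumetric latent heat of the pore ice. Illustrative values only."""
    lv = porosity * rho_ice * latent
    return math.sqrt(2.0 * k_thawed * delta_t * t_seconds / lv)
```

Quadrupling the elapsed time doubles the predicted thaw depth, the square-root-of-time behavior that conduction-dominated analytical solutions share.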
Brian hears: online auditory processing using vectorization over channels.
Fontaine, Bertrand; Goodman, Dan F M; Benichoux, Victor; Brette, Romain
2011-01-01
The human cochlea includes about 3000 inner hair cells which filter sounds at frequencies between 20 Hz and 20 kHz. This massively parallel frequency analysis is reflected in models of auditory processing, which are often based on banks of filters. However, existing implementations do not exploit this parallelism. Here we propose algorithms to simulate these models by vectorizing computation over frequency channels, which are implemented in "Brian Hears," a library for the spiking neural network simulator package "Brian." This approach allows us to use high-level programming languages such as Python, because with vectorized operations, the computational cost of interpretation represents a small fraction of the total cost. This makes it possible to define and simulate complex models in a simple way, while all previous implementations were model-specific. In addition, we show that these algorithms can be naturally parallelized using graphics processing units, yielding substantial speed improvements. We demonstrate these algorithms with several state-of-the-art cochlear models, and show that they compare favorably with existing, less flexible, implementations.
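The vectorization-over-channels idea can be sketched as follows: the time loop is scalar, but every filter-state update is a single NumPy operation across all frequency channels at once. This is a generic biquad band-pass bank (standard audio-EQ-cookbook coefficients), a sketch of the strategy rather than the actual Brian Hears implementation.

```python
import numpy as np

def bandpass_bank(x, fs, cfs, q=9.26):
    """Filter signal x through one band-pass biquad per center frequency,
    vectorized over channels: each state update touches all channels at once."""
    cfs = np.asarray(cfs, dtype=float)
    w0 = 2.0 * np.pi * cfs / fs
    alpha = np.sin(w0) / (2.0 * q)
    # Audio-EQ-cookbook band-pass (constant 0 dB peak gain) coefficients
    b0, b1, b2 = alpha, np.zeros_like(alpha), -alpha
    a0, a1, a2 = 1.0 + alpha, -2.0 * np.cos(w0), 1.0 - alpha
    b0, b1, b2, a1, a2 = b0 / a0, b1 / a0, b2 / a0, a1 / a0, a2 / a0
    y = np.zeros((len(x), len(cfs)))
    x1 = x2 = y1 = y2 = np.zeros(len(cfs))
    for n, xn in enumerate(x):          # scalar loop over time samples
        yn = b0 * xn + b1 * x1 + b2 * x2 - a1 * y1 - a2 * y2
        y[n] = yn
        x2, x1 = x1, np.full(len(cfs), xn)
        y2, y1 = y1, yn
    return y
```

A 1 kHz tone fed through a four-channel bank concentrates its output energy in the channel centered at 1 kHz, regardless of how many channels are simulated in parallel.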
NASA Astrophysics Data System (ADS)
Saleh, F.; Ramaswamy, V.; Georgas, N.; Blumberg, A. F.; Wang, Y.
2016-12-01
Advances in computational resources and modeling techniques are opening the path to effectively integrating existing complex models. In the context of flood prediction, recent extreme events have demonstrated the importance of integrating components of the hydrosystem to better represent the interactions amongst different physical processes and phenomena. As such, there is a pressing need to develop holistic and cross-disciplinary modeling frameworks that effectively integrate existing models and better represent the operative dynamics. This work presents a novel Hydrologic-Hydraulic-Hydrodynamic Ensemble (H3E) flood prediction framework that operationally integrates existing predictive models representing coastal (New York Harbor Observing and Prediction System, NYHOPS), hydrologic (US Army Corps of Engineers Hydrologic Modeling System, HEC-HMS) and hydraulic (2-dimensional River Analysis System, HEC-RAS) components. The state-of-the-art framework is forced with 125 ensemble meteorological inputs from numerical weather prediction models including the Global Ensemble Forecast System, the European Centre for Medium-Range Weather Forecasts (ECMWF), the Canadian Meteorological Centre (CMC), the Short Range Ensemble Forecast (SREF) and the North American Mesoscale Forecast System (NAM). The framework produces, within a 96-hour forecast horizon, on-the-fly Google Earth flood maps that provide critical information for decision makers and emergency preparedness managers. The utility of the framework was demonstrated by retrospectively forecasting an extreme flood event, Hurricane Sandy, in the Passaic and Hackensack watersheds (New Jersey, USA). Hurricane Sandy caused significant damage to a number of critical facilities in this area, including the New Jersey Transit's main storage and maintenance facility.
The results of this work demonstrate that ensemble-based frameworks provide improved flood predictions and useful information about the associated uncertainties, thus improving the assessment of risk when compared to a deterministic forecast. The work offers perspectives for short-term flood forecasts, flood mitigation strategies and best management practices for climate change scenarios.
Colour Model for Outdoor Machine Vision for Tropical Regions and its Comparison with the CIE Model
NASA Astrophysics Data System (ADS)
Sahragard, Nasrolah; Ramli, Abdul Rahman B.; Hamiruce Marhaban, Mohammad; Mansor, Shattri B.
2011-02-01
Accurate models of daylight and surface reflectance are very useful for most outdoor machine vision applications, specifically those based on color recognition. The existing CIE daylight model has drawbacks that limit its ability to predict the color of incident light; these limitations include the neglect of ambient light, of light reflected off the ground, and of context-specific information. A previously developed color model has only been tested for a few geographical locations in North America, and its applicability elsewhere in the world is questionable. Moreover, existing surface reflectance models are not easily applied to outdoor images. A reflectance model combining diffuse and specular reflection in a normalized HSV color space can be used to predict color. In this paper, a new daylight color model describing the color of daylight for a broad range of sky conditions is developed, suited to the weather conditions of tropical regions such as Malaysia. A comparison of this daylight color model with the CIE daylight model is discussed. The colors of matte and specular surfaces are estimated using the developed color model and a surface reflection function. The results are shown to be highly reliable.
Lepton-number-charged scalars and neutrino beamstrahlung
DOE Office of Scientific and Technical Information (OSTI.GOV)
Berryman, Jeffrey M.; de Gouvea, Andre; Kelly, Kevin J.
Experimentally, baryon number minus lepton number, $B-L$, appears to be a good global symmetry of nature. We explore the consequences of the existence of gauge-singlet scalar fields charged under $B-L$ - dubbed lepton-number-charged scalars, LeNCS - and postulate that these couple to the standard model degrees of freedom in such a way that $B-L$ is conserved even at the non-renormalizable level. In this framework, neutrinos are Dirac fermions. Including only the lowest mass-dimension effective operators, some of the LeNCS couple predominantly to neutrinos and may be produced in terrestrial neutrino experiments. We examine several existing constraints from particle physics, astrophysics, and cosmology to the existence of a LeNCS carrying $B-L$ charge equal to two, and discuss the emission of LeNCS's via "neutrino beamstrahlung," which occurs every once in a while when neutrinos scatter off of ordinary matter. In conclusion, we identify regions of the parameter space where existing and future neutrino experiments, including the Deep Underground Neutrino Experiment, are at the frontier of searches for such new phenomena.
Lepton-number-charged scalars and neutrino beamstrahlung
NASA Astrophysics Data System (ADS)
Berryman, Jeffrey M.; de Gouvêa, André; Kelly, Kevin J.; Zhang, Yue
2018-04-01
Experimentally, baryon number minus lepton number, B -L , appears to be a good global symmetry of nature. We explore the consequences of the existence of gauge-singlet scalar fields charged under B -L -dubbed lepton-number-charged scalars (LeNCSs)—and postulate that these couple to the standard model degrees of freedom in such a way that B -L is conserved even at the nonrenormalizable level. In this framework, neutrinos are Dirac fermions. Including only the lowest mass-dimension effective operators, some of the LeNCSs couple predominantly to neutrinos and may be produced in terrestrial neutrino experiments. We examine several existing constraints from particle physics, astrophysics, and cosmology to the existence of a LeNCS carrying B -L charge equal to two, and discuss the emission of LeNCSs via "neutrino beamstrahlung," which occurs every once in a while when neutrinos scatter off of ordinary matter. We identify regions of the parameter space where existing and future neutrino experiments, including the Deep Underground Neutrino Experiment, are at the frontier of searches for such new phenomena.
Parallel equilibrium current effect on existence of reversed shear Alfvén eigenmodes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xie, Hua-sheng, E-mail: huashengxie@gmail.com; Xiao, Yong, E-mail: yxiao@zju.edu.cn
2015-02-15
A new fast global eigenvalue code, where the terms are segregated according to their physics contents, is developed to study Alfvén modes in tokamak plasmas, particularly the reversed shear Alfvén eigenmode (RSAE). Numerical calculations show that the parallel equilibrium current corresponding to the kink term is strongly unfavorable for the existence of the RSAE. An improved criterion for RSAE existence is given both with and without the parallel equilibrium current. In the limits of ideal magnetohydrodynamics (MHD) and zero pressure, the toroidicity effect is the main possible favorable factor for the existence of the RSAE, but it is usually small. This suggests that it is necessary to include additional physics, such as kinetic terms, in the MHD model to overcome the strong unfavorable effect of the parallel current and enable the existence of the RSAE.
Crystal study and econometric model
NASA Technical Reports Server (NTRS)
1975-01-01
An econometric model was developed that can be used to predict demand and supply figures for crystals over a time horizon roughly concurrent with that of NASA's Space Shuttle Program - that is, 1975 through 1990. The model includes an equation to predict the impact on investment in the crystal-growing industry. Two models are in fact presented. The first is a theoretical model that follows strictly the standard economic concepts of supply and demand analysis; the second is a modified version which, though not quite as theoretically sound, can be tested using existing data sources.
Models Required to Mitigate Impacts of Space Weather on Space Systems
NASA Technical Reports Server (NTRS)
Barth, Janet L.
2003-01-01
This viewgraph presentation attempts to develop a model of factors which need to be considered in the design and construction of spacecraft to lessen the effects of space weather on these vehicles. Topics considered include: space environments and effects, radiation environments and effects, space weather drivers, space weather models, climate models, solar proton activity and mission design for the GOES mission. The authors conclude that space environment models need to address issues from mission planning through operations, and that no program currently exists to develop and validate authoritative space environment models for application to spacecraft design.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grassberger, C; Paganetti, H
Purpose: To develop a model that includes the process of resistance development into the treatment optimization process for schedules that include targeted therapies. Further, to validate the approach using clinical data and to apply the model to assess the optimal induction period with targeted agents before curative treatment with chemo-radiation in stage III lung cancer. Methods: Growth of the tumor and its subpopulations is modeled by Gompertzian growth dynamics, resistance induction as a stochastic process. Chemotherapy induced cell kill is modeled by log-cell kill dynamics, targeted agents similarly but restricted to the sensitive population. Radiation induced cell kill is assumed to follow the linear-quadratic model. The validation patient data consist of a cohort of lung cancer patients treated with tyrosine kinase inhibitors that had longitudinal imaging data available. Results: The resistance induction model was successfully validated using clinical trial data from 49 patients treated with targeted agents. The observed recurrence kinetics, with tumors progressing from 1.4–63 months, result in tumor growth equaling a median volume doubling time of 92 days [34–248] and a median fraction of pre-existing resistance of 0.035 [0–0.22], in agreement with previous clinical studies. The model revealed widely varying optimal time points for the use of curative therapy, reaching from ∼1m to >6m depending on the patient's growth rate and amount of pre-existing resistance. This demonstrates the importance of patient-specific treatment schedules when targeted agents are incorporated into the treatment. Conclusion: We developed a model including evolutionary dynamics of resistant sub-populations with traditional chemotherapy and radiation cell kill models. Fitting to clinical data yielded patient specific growth rates and resistant fraction in agreement with previous studies.
Further application of the model demonstrated how proper timing of chemo-radiation could minimize the probability of resistance, increasing tumor control significantly.
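The dynamics described above (Gompertzian growth plus a targeted agent that kills only the sensitive subpopulation) can be sketched with a toy two-compartment model. All parameter values below are illustrative assumptions, not fits from the study.

```python
import math

def simulate(days, n0=1e9, k_max=1e12, growth=0.01,
             resist_frac0=0.035, drug_kill=0.03, dt=1.0):
    """Toy two-population Gompertz model under a targeted agent.
    The drug adds a log-cell-kill term to the sensitive compartment only,
    so the resistant fraction rises over time. Parameters are illustrative."""
    sens = n0 * (1.0 - resist_frac0)
    res = n0 * resist_frac0
    for _ in range(int(days / dt)):
        total = sens + res
        g = growth * math.log(k_max / total)   # shared Gompertz growth rate
        sens *= math.exp((g - drug_kill) * dt)  # drug hits sensitive cells
        res *= math.exp(g * dt)                 # resistant cells are spared
    return sens, res
```

Because both compartments share the same Gompertz rate while only the sensitive one is killed, the resistant fraction grows monotonically under treatment, which is why the timing of a switch to curative chemo-radiation matters in such models.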
BinQuasi: a peak detection method for ChIP-sequencing data with biological replicates.
Goren, Emily; Liu, Peng; Wang, Chao; Wang, Chong
2018-04-19
ChIP-seq experiments that are aimed at detecting DNA-protein interactions require biological replication to draw inferential conclusions; however, there is no current consensus on how to analyze ChIP-seq data with biological replicates. Very few methodologies exist for the joint analysis of replicated ChIP-seq data, with approaches ranging from combining the results of analyzing replicates individually to joint modeling of all replicates. Combining the results of individual replicates analyzed separately can lead to reduced peak classification performance compared to joint modeling. Currently available methods for joint analysis may fail to control the false discovery rate at the nominal level. We propose BinQuasi, a peak caller for replicated ChIP-seq data, that jointly models biological replicates using a generalized linear model framework and employs a one-sided quasi-likelihood ratio test to detect peaks. When applied to simulated data and real datasets, BinQuasi performs favorably compared to existing methods, including better control of the false discovery rate than existing joint modeling approaches. BinQuasi offers a flexible approach to joint modeling of replicated ChIP-seq data which is preferable to combining the results of replicates analyzed individually. Source code is freely available for download at https://cran.r-project.org/package=BinQuasi, implemented in R. pliu@iastate.edu or egoren@iastate.edu. Supplementary material is available at Bioinformatics online.
Clinical modeling--a critical analysis.
Blobel, Bernd; Goossen, William; Brochhausen, Mathias
2014-01-01
Modeling clinical processes (and their informational representation) is a prerequisite for optimally enabling and supporting high quality and safe care through information and communication technology and meaningful use of gathered information. The paper investigates existing approaches to clinical modeling, systematically analyzing the underlying principles, the consistency with and the opportunity for integration with other existing or emerging projects, as well as the correctness of representing the reality of health and health services. The analysis is performed using an architectural framework for modeling real-world systems. In addition, fundamental work on the representation of facts, relations, and processes in the clinical domain by ontologies is applied, including the integration of advanced methodologies such as translational and systems medicine. The paper demonstrates fundamental weaknesses, different levels of maturity, and differing evolutionary potential in the approaches considered. It offers a development process starting with the business domain and its ontologies, continuing with the Reference Model-Open Distributed Processing (RM-ODP) related conceptual models in the ICT ontology space, the information and the computational view, and concluding with the implementation details represented as engineering and technology view, respectively. The existing approaches reflect the clinical domain at different levels, put the main focus on different phases of the development process instead of first establishing a representation of the real business process, and therefore enable domain experts' involvement to quite different and partly limited degrees. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
Tomàs-Gamisans, Màrius; Ferrer, Pau; Albiol, Joan
2016-01-01
Motivation: Genome-scale metabolic models (GEMs) are tools that allow predicting a phenotype from a genotype under certain environmental conditions. GEMs have been developed in the last ten years for a broad range of organisms, and are used for multiple purposes such as discovering new properties of metabolic networks, predicting new targets for metabolic engineering, as well as optimizing the cultivation conditions for biochemicals or recombinant protein production. Pichia pastoris is one of the most widely used organisms for heterologous protein expression. There are different GEMs for this methylotrophic yeast, of which the most relevant and complete in the published literature are iPP668, PpaMBEL1254 and iLC915. However, these three models differ regarding certain pathways, terminology for metabolites and reactions, and annotations. Moreover, GEMs for some species are typically built based on the reconstructed models of related model organisms. In these cases, some organism-specific pathways could be missing or misrepresented. Results: In order to provide an updated and more comprehensive GEM for P. pastoris, we have reconstructed and validated a consensus model integrating and merging all three existing models. In this step a comprehensive review and integration of the metabolic pathways included in each of these three versions was performed. In addition, the resulting iMT1026 model includes a new description of some metabolic processes. In particular, new information described in recently published literature is included, mainly related to fatty acid and sphingolipid metabolism, glycosylation and cell energetics. Finally, the reconstructed model was tested and validated by comparing the simulation results with empirical physiological data obtained under a wide range of experimental conditions, such as different carbon sources and distinct oxygen availability conditions, as well as the production of two different recombinant proteins.
In these simulations, the iMT1026 model showed better performance than the previously existing models. PMID:26812499
Vibration study of a vehicle suspension assembly with the finite element method
NASA Astrophysics Data System (ADS)
Cătălin Marinescu, Gabriel; Castravete, Ştefan-Cristian; Dumitru, Nicolae
2017-10-01
This work presents a methodology for analysing various vibration effects on the suspension components of a vehicle. A McPherson-type suspension from an existing vehicle was created using CAD software. Using the CAD model as input, a finite element model of the suspension assembly was developed. The Abaqus finite element analysis software was used to pre-process, solve, and post-process the results. Geometric nonlinearities are included in the model, as are severe sources of nonlinearity such as friction and contact. The McPherson spring is modelled as a linear spring. The analysis includes several steps: preload, modal analysis, reduction of the model to 200 generalized coordinates, a deterministic external excitation, and a random excitation representing different types of roads. The vibration data used as input for the simulation were previously obtained by experimental means. The mathematical expressions used for the simulation are also presented in the paper.
Let's Go Off the Grid: Subsurface Flow Modeling With Analytic Elements
NASA Astrophysics Data System (ADS)
Bakker, M.
2017-12-01
Subsurface flow modeling with analytic elements has the major advantage that no grid or time stepping is needed. Analytic element formulations exist for steady state and transient flow in layered aquifers and for unsaturated flow in the vadose zone. Analytic element models are vector-based and consist of points, lines and curves that represent specific features in the subsurface. Recent advances allow for the simulation of partially penetrating wells and multi-aquifer wells, including skin effect and wellbore storage, horizontal wells of poly-line shape including skin effect, sharp changes in subsurface properties, and surface water features with leaky beds. Input files for analytic element models are simple, short and readable, and can easily be generated from, for example, GIS databases. Future plans include the incorporation of analytic elements in parts of grid-based models where additional detail is needed. This presentation will give an overview of advanced flow features that can be modeled, many of which are implemented in free and open-source software.
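A minimal flavor of the grid-free, vector-based approach: each well is an analytic element whose drawdown is known in closed form and superposed at any point, with no mesh involved. The sketch below uses a Thiem-type steady confined-flow element with illustrative parameter values; it is not the API of any particular analytic element code.

```python
import math

def drawdown(x, y, wells, transmissivity):
    """Superpose Thiem well elements in a confined aquifer.
    wells: list of (xw, yw, Q, R) with pumping rate Q and radius of
    influence R; each contributes s = Q / (2*pi*T) * ln(R / r).
    Illustrative sketch, evaluated directly at (x, y) with no grid."""
    s = 0.0
    for xw, yw, q, radius in wells:
        r = max(math.hypot(x - xw, y - yw), 1e-10)  # avoid log(0) at the well
        s += q / (2.0 * math.pi * transmissivity) * math.log(radius / r)
    return s
```

Because the solution is evaluated analytically at arbitrary points, refinement near a well costs nothing: one simply evaluates the same expression closer to it.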
Elastic and inelastic scattering of neutrons on 238U nucleus
NASA Astrophysics Data System (ADS)
Capote, R.; Trkov, A.; Sin, M.; Herman, M. W.; Soukhovitskiĩ, E. Sh.
2014-04-01
Advanced modelling of neutron induced reactions on the 238U nucleus is aimed at improving our knowledge of neutron scattering. The capture and fission channels are well constrained by available experimental data and the neutron standards evaluation. The focus of this contribution is on elastic and inelastic scattering cross sections. The employed nuclear reaction model includes (i) a new rotational-vibrational dispersive optical model potential coupling the low-lying collective bands of vibrational character observed in even-even actinides; (ii) the Engelbrecht-Weidenmüller transformation, allowing for the inclusion of compound-direct interference effects; and (iii) a multi-humped fission barrier with absorption in the secondary well, described within the optical model for fission. The impact of the advanced modelling on elastic and inelastic scattering cross sections, including angular distributions and emission spectra, is assessed both by comparison with selected microscopic experimental data and by integral criticality benchmarks including measured reaction rates (e.g. JEMIMA, FLAPTOP and BIG TEN). Benchmark calculations provided feedback to improve the reaction modelling. Improvement of existing libraries will be discussed.
NASA Technical Reports Server (NTRS)
Hogan, John; Kang, Sukwon; Cavazzoni, Jim; Levri, Julie; Finn, Cory; Luna, Bernadette (Technical Monitor)
2000-01-01
The objective of this study is to compare incineration and composting in a Mars-based advanced life support (ALS) system. The variables explored include waste pre-processing requirements, reactor sizing and buffer capacities. The study incorporates detailed mathematical models of biomass production and waste processing into an existing dynamic ALS system model. The ALS system and incineration models (written in MATLAB/SIMULINK(c)) were developed at the NASA Ames Research Center. The composting process is modeled using first-order kinetics, with different degradation rates for individual waste components (carbohydrates, proteins, fats, cellulose and lignin). The biomass waste streams are generated using modified "Energy Cascade" crop models, which use light- and dark-cycle temperatures, irradiance, photoperiod, [CO2], planting density, and relative humidity as model inputs. The study also includes an evaluation of equivalent system mass (ESM).
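The first-order kinetic form used for the composting model can be sketched as follows; the per-component rate constants here are illustrative placeholders, not values from the NASA Ames model.

```python
import math

# Per-component first-order rate constants (1/day); illustrative
# placeholders, not values from the NASA Ames ALS model.
RATES = {"carbohydrates": 0.15, "proteins": 0.10, "fats": 0.05,
         "cellulose": 0.02, "lignin": 0.005}

def compost_step(masses, dt=1.0):
    """Advance component masses (kg) by dt days of first-order decay."""
    return {c: m * math.exp(-RATES[c] * dt) for c, m in masses.items()}

def simulate(masses, days):
    """Integrate whole days of composting, one step per day."""
    for _ in range(days):
        masses = compost_step(masses)
    return masses
```

The key modeling point is that labile components (carbohydrates) disappear quickly while refractory ones (lignin) persist, which drives reactor sizing and buffer capacity.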
NASA Astrophysics Data System (ADS)
Gasparini, N. M.; Hobley, D. E. J.; Tucker, G. E.; Istanbulluoglu, E.; Adams, J. M.; Nudurupati, S. S.; Hutton, E. W. H.
2014-12-01
Computational models are important tools for quantitatively understanding the evolution of real landscapes. Commonalities exist among most landscape evolution models, yet they are also idiosyncratic: they are coded in different languages, require different input values, and are designed to tackle a unique set of questions. These differences can make applying a landscape evolution model challenging, especially for novice programmers. In this study, we compare and contrast two landscape evolution models that are designed to tackle similar questions but differ considerably in design. The first model, CHILD, is over a decade old and is relatively well tested, well developed and widely used. It is coded in C++, operates on an irregular grid and was designed with function rather than user experience in mind. In contrast, the second model, Landlab, is relatively new and was designed to be accessible to a wide range of scientists, including those who have not previously used or developed a numerical model. Landlab is coded in Python, a relatively easy language for the non-proficient programmer, and can model landscapes described on both regular and irregular grids. We present landscape simulations from both modeling platforms. Our goal is to illustrate best practices for implementing a new process module in a landscape evolution model, and the simulations are therefore applicable regardless of the modeling platform. We contrast differences and highlight similarities between the use of the two models, including setting up the model and input file for different evolutionary scenarios, computational time, and model output. Whenever possible, we compare model output with analytical solutions and illustrate the effects, or lack thereof, of a uniform vs. non-uniform grid.
Our simulations focus on implementing a single process, including detachment-limited or transport-limited fluvial bedrock incision and linear or non-linear diffusion of material on hillslopes. We also illustrate the steps necessary to couple processes together, for example, detachment-limited fluvial bedrock incision with linear diffusion on hillslopes. Trade-offs exist between the two modeling platforms, primarily in speed and ease of use.
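As an illustration of the simplest process mentioned above, linear hillslope diffusion on a regular grid can be written in a few lines, independent of either CHILD or Landlab:

```python
def diffuse(z, dx, dt, diffusivity, steps):
    """Explicit linear hillslope diffusion, dz/dt = D * d2z/dx2, on a
    regular 1-D grid of elevations z. Boundary nodes are held fixed
    (baselevel). Stable for dt <= dx**2 / (2 * diffusivity)."""
    z = list(z)
    for _ in range(steps):
        curv = [0.0] * len(z)
        for i in range(1, len(z) - 1):
            curv[i] = (z[i + 1] - 2.0 * z[i] + z[i - 1]) / dx ** 2
        z = [zi + dt * diffusivity * ci for zi, ci in zip(z, curv)]
    return z
```

A production model wraps exactly this kind of update in a grid abstraction (regular or irregular) plus input handling, which is where the two platforms differ most.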
Collaborative Drug Therapy Management: Case Studies of Three Community-Based Models of Care
Snyder, Margie E.; Earl, Tara R.; Greenberg, Michael; Heisler, Holly; Revels, Michelle; Matson-Koffman, Dyann
2015-01-01
Collaborative drug therapy management agreements are a strategy for expanding the role of pharmacists in team-based care with other providers. However, these agreements have not been widely implemented. This study describes the features of existing provider–pharmacist collaborative drug therapy management practices and identifies the facilitators and barriers to implementing such services in community settings. We conducted in-depth, qualitative interviews in 2012 in a federally qualified health center, an independent pharmacy, and a retail pharmacy chain. Facilitators included 1) ensuring pharmacists were adequately trained; 2) obtaining stakeholder (eg, physician) buy-in; and 3) leveraging academic partners. Barriers included 1) lack of pharmacist compensation; 2) hesitation among providers to trust pharmacists; 3) lack of time and resources; and 4) existing informal collaborations that resulted in reduced interest in formal agreements. The models described in this study could be used to strengthen clinical–community linkages through team-based care, particularly for chronic disease prevention and management. PMID:25811494
Assessment of existing Sierra/Fuego capabilities related to grid-to-rod-fretting (GTRF).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Turner, Daniel Zack; Rodriguez, Salvador B.
2011-06-01
The following report presents an assessment of existing capabilities in Sierra/Fuego applied to modeling several aspects of grid-to-rod-fretting (GTRF) including: fluid dynamics, heat transfer, and fluid-structure interaction. We compare the results of a number of Fuego simulations with relevant sources in the literature to evaluate the accuracy, efficiency, and robustness of using Fuego to model the aforementioned aspects. Comparisons between flow domains that include the full fuel rod length vs. a subsection of the domain near the spacer show that tremendous efficiency gains can be obtained by truncating the domain without loss of accuracy. Thermal analysis reveals the extent to which heat transfer from the fuel rods to the coolant is improved by the swirling flow created by the mixing vanes. Lastly, coupled fluid-structure interaction analysis shows that the vibrational modes of the fuel rods filter out high frequency turbulent pressure fluctuations. In general, these results allude to interesting phenomena for which further investigation could be quite fruitful.
DOE Office of Scientific and Technical Information (OSTI.GOV)
TESP combines existing domain simulators in the electric power grid with new transactive agents, growth models and evaluation scripts. The existing domain simulators include GridLAB-D for the distribution grid and single-family residential buildings, MATPOWER for transmission and bulk generation, and EnergyPlus for large buildings. More are planned for subsequent versions of TESP. The new elements are: TEAgents - simulate market participants and transactive systems for market clearing; some of this functionality was extracted from GridLAB-D and implemented in Python for customization by PNNL and others. Growth Model - a means for simulating system changes over a multiyear period, including both normal load growth and specific investment decisions; customizable in Python code. Evaluation Script - a means of evaluating different transactive systems through customizable post-processing in Python code. TESP provides a method for other researchers and vendors to design transactive systems and test them in a virtual environment. It allows customization of the key components by modifying Python code.
A Review of the Statistical and Quantitative Methods Used to Study Alcohol-Attributable Crime.
Fitterer, Jessica L; Nelson, Trisalyn A
2015-01-01
Modelling the relationship between alcohol consumption and crime generates new knowledge for crime prevention strategies. Advances in data, particularly data with spatial and temporal attributes, have led to a growing suite of applied methods for modelling. In support of alcohol and crime researchers, we synthesized and critiqued existing methods of spatially and quantitatively modelling the effects of alcohol exposure on crime to aid method selection and identify new opportunities for analysis strategies. We searched the alcohol-crime literature from 1950 to January 2014. Analyses that statistically evaluated or mapped the association between alcohol and crime were included. For modelling purposes, crime data were most often derived from generalized police reports, aggregated to large spatial units such as census tracts or postal codes, and standardized by residential population data. Sixty-eight of the 90 selected studies included geospatial data, of which 48 used cross-sectional datasets. Regression was the prominent modelling choice (n = 78), though, depending on the data, many variations existed. There are opportunities to improve information for alcohol-attributable crime prevention by using alternative population data to standardize crime rates, sourcing crime information from non-traditional platforms (social media), increasing the number of panel studies, and conducting analysis at the local level (neighbourhood, block, or point). Due to the spatio-temporal advances in crime data, we expect a continued uptake of flexible Bayesian hierarchical modelling, a greater inclusion of spatial-temporal point pattern analysis, and a shift toward prospective (forecast) modelling over small areas (e.g., blocks).
Bi-level Multi-Source Learning for Heterogeneous Block-wise Missing Data
Xiang, Shuo; Yuan, Lei; Fan, Wei; Wang, Yalin; Thompson, Paul M.; Ye, Jieping
2013-01-01
Bio-imaging technologies allow scientists to collect large amounts of high-dimensional data from multiple heterogeneous sources for many biomedical applications. In the study of Alzheimer's Disease (AD), neuroimaging data, gene/protein expression data, etc., are often analyzed together to improve predictive power. Joint learning from multiple complementary data sources is advantageous, but feature-pruning and data source selection are critical to learn interpretable models from high-dimensional data. Often, the data collected has block-wise missing entries. In the Alzheimer’s Disease Neuroimaging Initiative (ADNI), most subjects have MRI and genetic information, but only half have cerebrospinal fluid (CSF) measures, a different half has FDG-PET; only some have proteomic data. Here we propose how to effectively integrate information from multiple heterogeneous data sources when data is block-wise missing. We present a unified “bi-level” learning model for complete multi-source data, and extend it to incomplete data. Our major contributions are: (1) our proposed models unify feature-level and source-level analysis, including several existing feature learning approaches as special cases; (2) the model for incomplete data avoids imputing missing data and offers superior performance; it generalizes to other applications with block-wise missing data sources; (3) we present efficient optimization algorithms for modeling complete and incomplete data. We comprehensively evaluate the proposed models including all ADNI subjects with at least one of four data types at baseline: MRI, FDG-PET, CSF and proteomics. Our proposed models compare favorably with existing approaches. PMID:23988272
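The block-wise missing structure described above can be made concrete with a small sketch that groups subjects by which data sources are present, rather than imputing missing blocks. The grouping step below is illustrative only; it does not reproduce the bi-level learning model itself.

```python
def partition_by_availability(subjects, sources):
    """Group subjects into blocks by which data sources they have.

    subjects: dict id -> dict source -> feature value (missing sources
    simply absent). Returns a dict mapping each availability pattern
    (a frozenset of source names) to the list of subject ids in that block.
    This mirrors the block-wise structure discussed above; the bi-level
    feature/source selection itself is not reproduced here.
    """
    blocks = {}
    for sid, data in subjects.items():
        pattern = frozenset(s for s in sources if s in data)
        blocks.setdefault(pattern, []).append(sid)
    return blocks
```

A model can then be fit per block using only observed sources, avoiding imputation, which is the key design choice highlighted in the abstract.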
Straddling Interdisciplinary Seams: Working Safely in the Field, Living Dangerously With a Model
NASA Astrophysics Data System (ADS)
Light, B.; Roberts, A.
2016-12-01
Many excellent proposals for observational work have included language detailing how the proposers will appropriately archive their data and publish their results in peer-reviewed literature so that they may be readily available to the modeling community for parameterization development. While such division of labor may be both practical and inevitable, the assimilation of observational results and the development of observationally-based parameterizations of physical processes require care and feeding. Key questions include: (1) Is an existing parameterization accurate, consistent, and general? If not, it may be ripe for additional physics. (2) Do there exist functional working relationships between human modeler and human observationalist? If not, one or more may need to be initiated and cultivated. (3) If empirical observation and model development are a chicken/egg problem, how, given our lack of prescience and foreknowledge, can we better design observational science plans to meet the eventual demands of model parameterization? (4) Will the addition of new physics "break" the model? If so, then the addition may be imperative. In the context of these questions, we will make retrospective and forward-looking assessments of a now-decade-old numerical parameterization to treat the partitioning of solar energy at the Earth's surface where sea ice is present. While this so-called "Delta-Eddington Albedo Parameterization" is currently employed in the widely-used Los Alamos Sea Ice Model (CICE) and appears to be standing the tests of accuracy, consistency, and generality, we will highlight some ideas for its ongoing development and improvement.
A Standard Kinematic Model for Flight Simulation at NASA Ames
NASA Technical Reports Server (NTRS)
Mcfarland, R. E.
1975-01-01
A standard kinematic model for aircraft simulation exists at NASA-Ames on a variety of computer systems, one of which is used to control the flight simulator for advanced aircraft (FSAA). The derivation of the kinematic model is given and various mathematical relationships are presented as a guide. These include descriptions of standardized simulation subsystems such as the atmospheric turbulence model and the generalized six-degrees-of-freedom trim routine, as well as an introduction to the emulative batch-processing system which enables this facility to optimize its real-time environment.
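The core of any such aircraft kinematic model is the standard mapping from body-axis angular rates to Euler-angle rates. The sketch below states these textbook relationships; it is not code from the NASA Ames implementation.

```python
import math

def euler_rates(phi, theta, p, q, r):
    """Standard aircraft Euler-angle kinematic equations (rad, rad/s).

    Maps body-axis angular rates (p, q, r) to roll/pitch/yaw angle rates.
    Singular at theta = +/-90 deg, as in any Euler-angle formulation.
    """
    phi_dot = p + (q * math.sin(phi) + r * math.cos(phi)) * math.tan(theta)
    theta_dot = q * math.cos(phi) - r * math.sin(phi)
    psi_dot = (q * math.sin(phi) + r * math.cos(phi)) / math.cos(theta)
    return phi_dot, theta_dot, psi_dot
```

A full six-degrees-of-freedom simulation integrates these rates alongside the translational equations each frame of the real-time loop.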
User's Manual for Data for Validating Models for PV Module Performance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marion, W.; Anderberg, A.; Deline, C.
2014-04-01
This user's manual describes performance data measured for flat-plate photovoltaic (PV) modules installed in Cocoa, Florida; Eugene, Oregon; and Golden, Colorado. The data include PV module current-voltage curves and associated meteorological data for approximately one-year periods. These publicly available data are intended to facilitate the validation of existing models for predicting the performance of PV modules, and the development of new and improved models. For comparing different modeling approaches, these public data will provide transparency and more meaningful comparisons of the relative benefits.
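A typical use of such validation data is to score a model's predicted module output against the measurements, for example with a root-mean-square error. The helper below is a generic sketch, not part of the dataset's tooling.

```python
def rmse(predicted, measured):
    """Root-mean-square error between modeled and measured module power (W).

    A common figure of merit when validating a PV performance model
    against measured I-V curve data such as the dataset described above.
    """
    if len(predicted) != len(measured):
        raise ValueError("series must be the same length")
    n = len(measured)
    return (sum((p - m) ** 2 for p, m in zip(predicted, measured)) / n) ** 0.5
```

Reporting the same metric for each candidate model against the same public data is what makes the comparisons transparent.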
NASA Technical Reports Server (NTRS)
Blotzer, Michael J.; Woods, Jody L.
2009-01-01
This viewgraph presentation reviews computational fluid dynamics as a tool for modelling the dispersion of carbon monoxide at the Stennis Space Center's A3 Test Stand. The contents include: 1) Constellation Program; 2) Constellation Launch Vehicles; 3) J2X Engine; 4) A-3 Test Stand; 5) Chemical Steam Generators; 6) Emission Estimates; 7) Located in Existing Test Complex; 8) Computational Fluid Dynamics; 9) Computational Tools; 10) CO Modeling; 11) CO Model results; and 12) Next steps.
Statistical Modeling for Radiation Hardness Assurance
NASA Technical Reports Server (NTRS)
Ladbury, Raymond L.
2014-01-01
We cover the models and statistics associated with single event effects (and total ionizing dose), why we need them, and how to use them: what models are used, what errors exist in real test data, and what the models allow us to say about the DUT. In addition, we cover how to use other sources of data, such as historical, heritage, and similar-part data, and how to apply experience, physics, and expert opinion to the analysis. Also included are concepts of Bayesian statistics, data fitting, and bounding rates.
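As one concrete example of bounding rates, the textbook one-sided Poisson limit gives an upper bound on the event cross-section when zero events are observed in a test fluence. The sketch below illustrates that standard bound; it is not the specific procedure of the presentation.

```python
import math

def poisson_zero_event_upper_limit(fluence, cl=0.95):
    """Upper confidence bound on event cross-section when 0 events are seen.

    For a Poisson process, observing zero events in a fluence F gives the
    upper limit sigma_UL = -ln(1 - cl) / F (about 3/F at 95% CL). This is
    a textbook bound, not the specific method of the presentation above.
    """
    return -math.log(1.0 - cl) / fluence
```

For example, a clean run at a fluence of 1e7 particles/cm^2 still only bounds the cross-section at roughly 3e-7 cm^2 at 95% confidence, which is why test fluence planning matters.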
Multicriteria decision model for retrofitting existing buildings
NASA Astrophysics Data System (ADS)
Bostenaru Dan, B.
2003-04-01
In this paper a model to decide which buildings in an urban area should be retrofitted is presented. The model has been situated among existing ones by choosing the decision rule, criterion weighting and decision support system types most suitable for the spatial problem of reducing earthquake risk in urban areas, considering existing spatial multi-attributive and multi-objective decision methods and especially collaborative issues. Due to the participative character of the group decision problem "retrofitting existing buildings", the decision making model is based on interactivity. Buildings have been modeled following the criteria of spatial decision support systems. This includes identifying the corresponding spatial elements of buildings according to the information needs of actors from different spheres, such as architects, construction engineers and economists. The decision model aims to facilitate collaboration between these actors. The way of setting priorities interactively will be shown by detailing the two phases, judgemental and computational: in this case site analysis, collection and evaluation of the unmodified data, and converting survey data to information with computational methods using additional expert support. Buildings have been divided into spatial elements which are characteristic for the survey, present typical damages in case of an earthquake, and are decisive for better seismic behaviour in case of retrofitting. The paper describes the architectural and engineering characteristics as well as the structural damage for constructions of different building ages, on the example of building types in Bucharest, Romania, in comprehensible and interdependent charts, based on field observation, reports from the 1977 earthquake, and detailed studies made by the author together with a local engineer for the EERI Web Housing Encyclopedia. On this basis, criteria for setting priorities flow into the expert information contained in the system.
Computer code for off-design performance analysis of radial-inflow turbines with rotor blade sweep
NASA Technical Reports Server (NTRS)
Meitner, P. L.; Glassman, A. J.
1983-01-01
The analysis procedure of an existing computer program was extended to include rotor blade sweep, to model the flow more accurately at the rotor exit, and to provide more detail to the loss model. The modeling changes are described and all analysis equations and procedures are presented. Program input and output are described and are illustrated by an example problem. Results obtained from this program and from a previous program are compared with experimental data.
Study of an engine flow diverter system for a large scale ejector powered aircraft model
NASA Technical Reports Server (NTRS)
Springer, R. J.; Langley, B.; Plant, T.; Hunter, L.; Brock, O.
1981-01-01
Requirements were established for a conceptual design study to analyze and design an engine flow diverter system and to include accommodations for an ejector system in an existing 3/4 scale fighter model equipped with YJ-79 engines. Model constraints were identified and cost-effective limited modification was proposed to accept the ejectors, ducting and flow diverter valves. Complete system performance was calculated and a versatile computer program capable of analyzing any ejector system was developed.
Rapid Automated Aircraft Simulation Model Updating from Flight Data
NASA Technical Reports Server (NTRS)
Brian, Geoff; Morelli, Eugene A.
2011-01-01
Techniques to identify aircraft aerodynamic characteristics from flight measurements and compute corrections to an existing simulation model of a research aircraft were investigated. The purpose of the research was to develop a process enabling rapid automated updating of aircraft simulation models using flight data and apply this capability to all flight regimes, including flight envelope extremes. The process presented has the potential to improve the efficiency of envelope expansion flight testing, revision of control system properties, and the development of high-fidelity simulators for pilot training.
NASA Astrophysics Data System (ADS)
Savitri, D.
2018-01-01
This article discusses a predator-prey model with anti-predator behaviour in the intermediate predator, using ratio-dependent functional responses. Dynamical analysis performed on the model includes determination of equilibrium points, stability and simulation. Three kinds of equilibrium points are discussed, namely the prey extinction point, the intermediate predator extinction point and the predator extinction point, which exist under certain conditions. It can be shown that the results of the numerical simulations are in accordance with the analytical results.
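A minimal numerical sketch of a ratio-dependent predator-prey system, of the general class discussed in the article, is shown below; the equations and parameter values are illustrative two-species stand-ins, not the article's exact three-species model.

```python
def step(x, y, dt=0.001, r=1.0, K=10.0, a=1.5, b=1.0, e=0.5, d=0.3):
    """One Euler step of a two-species ratio-dependent predator-prey model.

    The functional response a*x/(x + b*y) depends on the prey/predator
    ratio, as in the class of models discussed above. All parameter
    values are illustrative, not taken from the article.
    """
    response = a * x / (x + b * y) if (x + b * y) > 0 else 0.0
    dx = r * x * (1 - x / K) - response * y       # logistic prey growth minus predation
    dy = e * response * y - d * y                 # conversion efficiency minus mortality
    return x + dt * dx, y + dt * dy
```

The extinction equilibria mentioned in the abstract appear here directly: for instance (0, 0) is a fixed point of the map, and stability can be probed by simulating from nearby initial conditions.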
Integrable Time-Dependent Quantum Hamiltonians
NASA Astrophysics Data System (ADS)
Sinitsyn, Nikolai A.; Yuzbashyan, Emil A.; Chernyak, Vladimir Y.; Patra, Aniket; Sun, Chen
2018-05-01
We formulate a set of conditions under which the nonstationary Schrödinger equation with a time-dependent Hamiltonian is exactly solvable analytically. The main requirement is the existence of a non-Abelian gauge field with zero curvature in the space of system parameters. Known solvable multistate Landau-Zener models satisfy these conditions. Our method provides a strategy to incorporate time dependence into various quantum integrable models while maintaining their integrability. We also validate some prior conjectures, including the solution of the driven generalized Tavis-Cummings model.
Microeconomics of 300-mm process module control
NASA Astrophysics Data System (ADS)
Monahan, Kevin M.; Chatterjee, Arun K.; Falessi, Georges; Levy, Ady; Stoller, Meryl D.
2001-08-01
Simple microeconomic models that directly link metrology, yield, and profitability are rare or non-existent. In this work, we validate and apply such a model. Using a small number of input parameters, we explain current yield management practices in 200 mm factories. The model is then used to extrapolate requirements for 300 mm factories, including the impact of simultaneous technology transitions to 130 nm lithography and integrated metrology. To support our conclusions, we use examples relevant to factory-wide photo module control.
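The flavor of such a microeconomic model can be conveyed with a toy calculation linking yield to profit; all names and numbers below are hypothetical and far simpler than the validated model described above.

```python
def monthly_profit(wafer_starts, dies_per_wafer, yield_fraction,
                   revenue_per_die, cost_per_wafer):
    """Toy fab economics: revenue from yielded dies minus wafer cost.

    A deliberately simple, hypothetical stand-in for the microeconomic
    model referenced above.
    """
    revenue = wafer_starts * dies_per_wafer * yield_fraction * revenue_per_die
    cost = wafer_starts * cost_per_wafer
    return revenue - cost

def value_of_yield_gain(wafer_starts, dies_per_wafer, delta_yield,
                        revenue_per_die):
    """Incremental monthly profit from a yield improvement, e.g. one
    achieved through tighter process module control."""
    return wafer_starts * dies_per_wafer * delta_yield * revenue_per_die
```

Even in this toy form, the structure shows why metrology spending can be justified: a small yield delta multiplies across every die started.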
Parameterization of Model Validating Sets for Uncertainty Bound Optimizations. Revised
NASA Technical Reports Server (NTRS)
Lim, K. B.; Giesy, D. P.
2000-01-01
Given measurement data, a nominal model and a linear fractional transformation uncertainty structure with an allowance on unknown but bounded exogenous disturbances, easily computable tests for the existence of a model validating uncertainty set are given. Under mild conditions, these tests are necessary and sufficient for the case of complex, nonrepeated, block-diagonal structure. For the more general case which includes repeated and/or real scalar uncertainties, the tests are only necessary but become sufficient if a collinearity condition is also satisfied. With the satisfaction of these tests, it is shown that a parameterization of all model validating sets of plant models is possible. The new parameterization is used as a basis for a systematic way to construct or perform uncertainty tradeoff with model validating uncertainty sets which have specific linear fractional transformation structure for use in robust control design and analysis. An illustrative example which includes a comparison of candidate model validating sets is given.
Aging, Breast Cancer and the Mouse Model
2005-05-01
...of normal human cells in culture (Hayflick, 1965)... the surrounding tissue and stimulate (or inhibit) the... mammary cancers in the mouse, and that these tumors have strikingly similar histology. Nonetheless, several limitations exist in this model system...
Force on Force Modeling with Formal Task Structures and Dynamic Geometry
2017-03-24
...task framework, derived using the MMF methodology to structure a complex mission. It further demonstrated the integration of effects from a range of... The application methodology was intended to support a combined developmental testing (DT) and operational testing (OT) strategy for selected systems under test... a methodology to develop new or modify existing Models and Simulations (M&S) to apply data from multiple, distributed sources (including test...
A Model for Real-Time Data Reputation Via Cyber Telemetry
2016-06-01
Master's thesis by Beau M. Houser, June 2016. Thesis Advisor: Dorothy E. Denning; Co-Advisor: Phyllis Schneck.
First-Principles Modeling of Hydrogen Storage in Metal Hydride Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
J. Karl Johnson
The objective of this project is to complement experimental efforts of MHoCE partners by using state-of-the-art theory and modeling to study the structure, thermodynamics, and kinetics of hydrogen storage materials. Specific goals include prediction of the heats of formation and other thermodynamic properties of alloys from first principles methods, identification of new alloys that can be tested experimentally, calculation of surface and energetic properties of nanoparticles, and calculation of kinetics involved with hydrogenation and dehydrogenation processes. Discovery of new metal hydrides with enhanced properties compared with existing materials is a critical need for the Metal Hydride Center of Excellence. New materials discovery can be aided by the use of first principles (ab initio) computational modeling in two ways: (1) the properties, including mechanisms, of existing materials can be better elucidated through a combined modeling/experimental approach; (2) the thermodynamic properties of novel materials that have not been made can, in many cases, be quickly screened with ab initio methods. We have used state-of-the-art computational techniques to explore millions of possible reaction conditions consisting of different element spaces, compositions, and temperatures. We have identified potentially promising single- and multi-step reactions that can be explored experimentally.
Publishing and sharing of hydrologic models through WaterHUB
NASA Astrophysics Data System (ADS)
Merwade, V.; Ruddell, B. L.; Song, C.; Zhao, L.; Kim, J.; Assi, A.
2011-12-01
Most hydrologists use hydrologic models to simulate hydrologic processes and to understand hydrologic pathways and fluxes for research, decision making and engineering design. Once these tasks are complete, including publication of results, the models generally are not published or made available to the public for further use and improvement. Although publication or sharing of models is not required for journal publications, sharing models may open doors for new collaborations and avoids duplication of effort if other researchers are interested in simulating a particular watershed for which a model already exists. For researchers who are interested in sharing models, there are limited avenues for publishing their models to the wider community. Toward filling this gap, a prototype cyberinfrastructure (CI), called WaterHUB, is developed for sharing hydrologic data and modeling tools in an interactive environment. To test the utility of WaterHUB for sharing hydrologic models, a system to publish and share SWAT (Soil and Water Assessment Tool) models is developed. Users can utilize WaterHUB to search and download existing SWAT models, and also upload new SWAT models. Metadata such as the name of the watershed, the name of the person or agency who developed the model, the simulation period, the time step, and the list of calibrated parameters are also published with each model.
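The metadata listed above can be captured in a simple machine-readable record; the schema below is hypothetical, not WaterHUB's actual format, and the example watershed and parameter names are illustrative.

```python
# Hypothetical metadata schema for a shared hydrologic model record;
# WaterHUB's actual fields may differ.
REQUIRED_FIELDS = {"watershed", "developer", "simulation_period",
                   "time_step", "calibrated_parameters"}

def validate_metadata(record):
    """Check that a shared-model record carries the required metadata."""
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        raise ValueError(f"missing metadata fields: {sorted(missing)}")
    return True

# Illustrative record (names are examples, not a real WaterHUB entry).
example = {
    "watershed": "Upper Wabash",
    "developer": "Example University",
    "simulation_period": "1995-2005",
    "time_step": "daily",
    "calibrated_parameters": ["CN2", "ESCO", "SURLAG"],
}
```

Validating uploads against even a minimal schema like this is what makes downloaded models discoverable and reusable.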
An investigation of modelling and design for software service applications.
Anjum, Maria; Budgen, David
2017-01-01
Software services offer the opportunity to use a component-based approach for the design of applications. However, this needs a deeper understanding of how to develop service-based applications in a systematic manner, and of the set of properties that need to be included in the 'design model'. We have used a realistic application to explore systematically how service-based designs can be created and described. We first identified the key properties of an SOA (service oriented architecture) and then undertook a single-case case study to explore its use in the development of a design for a large-scale application in energy engineering, modelling this with existing notations wherever possible. We evaluated the resulting design model using two walkthroughs with both domain and application experts. We were able to successfully develop a design model around the ten properties identified, and to describe it by adapting existing design notations. A component-based approach to designing such systems does appear to be feasible. However, it needs the assistance of a more integrated set of notations for describing the resulting design model.
Coarse-Graining Polymer Field Theory for Fast and Accurate Simulations of Directed Self-Assembly
NASA Astrophysics Data System (ADS)
Liu, Jimmy; Delaney, Kris; Fredrickson, Glenn
To design effective manufacturing processes using polymer directed self-assembly (DSA), the semiconductor industry benefits greatly from having a complete picture of stable and defective polymer configurations. Field-theoretic simulations are an effective way to study these configurations and predict defect populations. Self-consistent field theory (SCFT) is a particularly successful theory for studies of DSA. Although other models exist that are faster to simulate, these models are phenomenological or derived through asymptotic approximations, often leading to a loss of accuracy relative to SCFT. In this study, we employ our recently developed method to produce an accurate coarse-grained field theory for diblock copolymers. The method uses a force- and stress-matching strategy to map output from SCFT simulations into parameters for an optimized phase field model. This optimized phase field model is just as fast as existing phenomenological phase field models, but makes more accurate predictions of polymer self-assembly, both in bulk and in confined systems. We study the performance of this model under various conditions, including its predictions of domain spacing, morphology, and defect formation energies. This work was supported by Samsung Electronics.
Planetary Boundary Layer Simulation Using TASS
NASA Technical Reports Server (NTRS)
Schowalter, David G.; DeCroix, David S.; Lin, Yuh-Lang; Arya, S. Pal; Kaplan, Michael
1996-01-01
Boundary conditions to an existing large-eddy simulation model have been changed in order to simulate turbulence in the atmospheric boundary layer. Several options are now available, including the use of a surface energy balance. In addition, we compare convective boundary layer simulations with the Wangara and Minnesota field experiments as well as with other model results. We find excellent agreement of modelled mean profiles of wind and temperature with observations and good agreement for velocity variances. Neutral boundary layer simulation results are compared with theory and with previously used models. Agreement with theory is reasonable, while agreement with previous models is excellent.
Comparing Blast Effects on Human Torso Finite Element Model against Existing Lethality Curves
2010-07-15
The human torso finite element model (HTFEM) skeletal structures include the vertebrae, intervertebral discs, ribs, cartilage, sternum, scapula, and clavicle; the internal organs include the heart and aorta, and the lungs and trachea, meshed with ten-noded tetrahedral elements.
Ask Questions to Encourage Questions Asked
ERIC Educational Resources Information Center
belcastro, sarah-marie
2017-01-01
We delineate some types of structured practice (modeling, requests, feedback, and space-making) that help students learn to pose appropriate questions and to initiate exploration of those questions. Developing skills requires practice, so we suggest ways to embed structured practice into existing class sessions. Including structured practice is…
Community Shared Solar: Policy and Regulatory Considerations (Brochure)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2014-09-01
This brochure explores the ways in which the shared solar business model interacts with existing policy and regulations, including net metering, tax credits, and securities regulation. It presents some of the barriers that shared solar projects may face, and provides options for creating a supportive policy environment.
AE9/AP9/SPM Radiation Environment Model: User’s Guide
2014-02-18
Agent And Component Object Framework For Concept Design Modeling Of Mobile Cyber Physical Systems
2018-03-01
Uncorrelated Encounter Model of the National Airspace System, Version 2.0
2013-08-19
can exist to certify avoidance systems for operational use. Evaluations typically include flight tests, operational impact studies, and simulation of... appropriate for large-scale air traffic impact studies, for example, examination of sector loading or conflict rates. The focus here includes two types of... between two IFR aircraft in oceanic airspace. The reason for this is that one cannot observe encounters of sufficient fidelity in the available data
Periodic and chaotic oscillations in a tumor and immune system interaction model with three delays
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bi, Ping; Center for Partial Differential Equations, East China Normal University, 500 Dongchuan Rd., Shanghai 200241; Ruan, Shigui, E-mail: ruan@math.miami.edu
2014-06-15
In this paper, a tumor and immune system interaction model consisting of two differential equations with three time delays is considered, in which the delays describe the proliferation of tumor cells, the process of effector cell growth stimulated by tumor cells, and the differentiation of immune effector cells, respectively. Conditions for the asymptotic stability of equilibria and the existence of Hopf bifurcations are obtained by analyzing the roots of a second-degree exponential polynomial characteristic equation with delay-dependent coefficients. It is shown that the positive equilibrium is asymptotically stable if all three delays are less than their corresponding critical values, and Hopf bifurcations occur if any one of these delays passes through its critical value. Numerical simulations are carried out to illustrate the rich dynamical behavior of the model with different delay values, including the existence of regular and irregular long-periodic oscillations.
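A system of this form can be explored numerically with a fixed-step Euler scheme and a constant pre-history. The sketch below is illustrative only: the two equations, the clipping, and every parameter value are assumptions for demonstration, not the paper's actual model.

```python
# Illustrative sketch (not the paper's exact equations or parameters): a
# two-equation tumor (T) / effector (E) system with three discrete delays,
# integrated with fixed-step Euler and a constant history before t = 0.
def simulate_tumor_immune(tau=(0.5, 1.0, 1.5), dt=0.001, t_end=20.0):
    lag = [int(t / dt) for t in tau]
    n = int(t_end / dt)
    T, E = [0.5], [0.3]                            # constant history assumed
    for i in range(n):
        Td1 = T[max(i - lag[0], 0)]                # delayed tumor proliferation
        Td2 = T[max(i - lag[1], 0)]                # delayed effector stimulation
        Ed3 = E[max(i - lag[2], 0)]                # delayed effector differentiation
        dT = Td1 * (1.0 - T[i]) - T[i] * E[i]      # logistic growth minus kill term
        dE = Td2 * E[i] / (1.0 + Td2) - 0.3 * Ed3  # stimulated growth minus turnover
        T.append(max(T[i] + dt * dT, 0.0))         # clip at zero for numerical safety
        E.append(max(E[i] + dt * dE, 0.0))
    return T, E
```

Sweeping the delays in `tau` past their critical values is how one would look for the onset of the periodic oscillations described above.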
NASA Technical Reports Server (NTRS)
1990-01-01
Lunar base projects, including a reconfigurable lunar cargo launcher, a thermal and micrometeorite protection system, a versatile lifting machine with robotic capabilities, a cargo transport system, the design of a road construction system for a lunar base, and the design of a device for removing lunar dust from material surfaces, are discussed. The emphasis of the Gulf of Mexico project was on the development of a computer simulation model for predicting vessel station keeping requirements. An existing code, used in predicting station keeping requirements for oil drilling platforms operating in North Shore (Alaska) waters, was used as a basis for the computer simulation. Modifications were made to the existing code. The input into the model consists of satellite altimeter readings and water velocity readings from buoys stationed in the Gulf of Mexico. The satellite data consist of altimeter readings (wave height) taken during the spring of 1989. The simulation model predicts water velocity and direction, and wind velocity.
Characterization of Orbital Debris Via Hyper-Velocity Ground-Based Tests
NASA Technical Reports Server (NTRS)
Cowardin, Heather
2015-01-01
The goal is to replicate a hyper-velocity fragmentation event using modern-day spacecraft materials and construction techniques in order to improve the existing DoD and NASA breakup models. DebriSat is intended to be representative of modern LEO satellites. Major design decisions were reviewed and approved by Aerospace subject matter experts from different disciplines. DebriSat includes seven major subsystems: attitude determination and control system (ADCS), command and data handling (C&DH), electrical power system (EPS), payload, propulsion, telemetry tracking and command (TT&C), and thermal management. To reduce cost, most components are emulated based on existing designs of flight hardware and fabricated with the same materials. A key laboratory-based test supporting the development of the DoD and NASA satellite breakup models, the Satellite Orbital debris Characterization Impact Test (SOCIT), was conducted at AEDC in 1992. Breakup models based on SOCIT have supported many applications and matched on-orbit events reasonably well over the years.
Stochastic Simulation of Actin Dynamics Reveals the Role of Annealing and Fragmentation
Fass, Joseph; Pak, Chi; Bamburg, James; Mogilner, Alex
2008-01-01
Recent observations of F-actin dynamics call for theoretical models to interpret and understand the quantitative data. A number of existing models rely on simplifications and do not take into account F-actin fragmentation and annealing. We use Gillespie’s algorithm for stochastic simulations of the F-actin dynamics including fragmentation and annealing. The simulations vividly illustrate that fragmentation and annealing have little influence on the shape of the polymerization curve and on nucleotide profiles within filaments but drastically affect the F-actin length distribution, making it exponential. We find that recent surprising measurements of high length diffusivity at the critical concentration cannot be explained by fragmentation and annealing events unless both fragmentation rates and frequency of undetected fragmentation and annealing events are greater than previously thought. The simulations compare well with experimentally measured actin polymerization data and lend additional support to a number of existing theoretical models. PMID:18279896
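The core of such a simulation is Gillespie's direct method: draw an exponential waiting time from the total propensity, then pick one reaction in proportion to its share. The sketch below is a heavily reduced illustration of that loop with fragmentation and annealing included; the rate constants and the two-reaction elongation picture are made up for demonstration, and nucleotide states (ATP/ADP profiles) tracked in the study are omitted.

```python
import random

# Minimal Gillespie-style sketch of filament length dynamics with
# fragmentation and annealing; all rates are illustrative assumptions.
def gillespie_actin(monomers=2000, k_on=1e-3, k_off=0.5,
                    k_frag=1e-4, k_ann=1e-5, t_end=10.0, seed=1):
    rng = random.Random(seed)
    fil = [50] * 20                                   # initial filament lengths
    t = 0.0
    while t < t_end:
        bonds = sum(max(L - 1, 0) for L in fil)       # severable internal bonds
        props = [k_on * monomers * len(fil),          # add monomer to an end
                 k_off * len(fil),                    # lose monomer from an end
                 k_frag * bonds,                      # sever at a random bond
                 k_ann * len(fil) * (len(fil) - 1)]   # join two filaments
        total = sum(props)
        if total == 0:
            break
        t += rng.expovariate(total)                   # exponential waiting time
        r = rng.uniform(0, total)
        if r < props[0]:
            fil[rng.randrange(len(fil))] += 1
            monomers -= 1
        elif r < props[0] + props[1]:
            i = rng.randrange(len(fil))
            fil[i] -= 1
            monomers += 1
            if fil[i] == 0:
                fil.pop(i)
        elif r < props[0] + props[1] + props[2]:
            i = rng.randrange(len(fil))               # crude: uniform, not bond-weighted
            if fil[i] > 1:
                cut = rng.randrange(1, fil[i])
                fil.append(fil[i] - cut)
                fil[i] = cut
        elif len(fil) >= 2:
            i, j = rng.sample(range(len(fil)), 2)
            fil[i] += fil[j]
            fil.pop(j)
    return fil, monomers
```

Note that every reaction conserves total subunit mass, which is a useful invariant to check; the exponential length distribution described above would emerge from the fragmentation/annealing balance in longer runs.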
Hypersurface-deformation algebroids and effective spacetime models
NASA Astrophysics Data System (ADS)
Bojowald, Martin; Büyükçam, Umut; Brahma, Suddhasattwa; D'Ambrosio, Fabio
2016-11-01
In canonical gravity, covariance is implemented by brackets of hypersurface-deformation generators forming a Lie algebroid. Lie-algebroid morphisms, therefore, allow one to relate different versions of the brackets that correspond to the same spacetime structure. An application to examples of modified brackets found mainly in models of loop quantum gravity can, in some cases, map the spacetime structure back to the classical Riemannian form after a field redefinition. For one type of quantum corrections (holonomies), signature change appears to be a generic feature of effective spacetime, and it is shown here to be a new quantum spacetime phenomenon which cannot be mapped to an equivalent classical structure. In low-curvature regimes, our constructions not only prove the existence of classical spacetime structures assumed elsewhere in models of loop quantum cosmology, they also show the existence of additional quantum corrections that have not always been included.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zuo, Peng; Fan, Zheng, E-mail: ZFAN@ntu.edu.sg; Zhou, Yu
2016-07-15
Nonlinear guided waves have been investigated widely in simple geometries, such as plates, pipes, and shells, where analytical solutions have been developed. This paper extends the application of nonlinear guided waves to waveguides with arbitrary cross sections. The criteria for the existence of nonlinear guided waves were summarized based on the finite deformation theory and nonlinear material properties. Numerical models were developed for the analysis of nonlinear guided waves in complex geometries, including a nonlinear Semi-Analytical Finite Element (SAFE) method to identify internal resonant modes in complex waveguides, and Finite Element (FE) models to simulate the nonlinear wave propagation at resonant frequencies. Two examples, an aluminum plate and a steel rectangular bar, were studied using the proposed numerical models, demonstrating the existence of nonlinear guided waves in such structures and the energy transfer from primary to secondary modes.
Numerical simulation for the air entrainment of aerated flow with an improved multiphase SPH model
NASA Astrophysics Data System (ADS)
Wan, Hang; Li, Ran; Pu, Xunchi; Zhang, Hongwei; Feng, Jingjie
2017-11-01
Aerated flow is a complex hydraulic phenomenon that exists widely in the field of environmental hydraulics. It is generally characterised by large deformation and violent fragmentation of the free surface. Compared to Euler methods (the volume of fluid (VOF) method or the rigid-lid hypothesis method), the existing single-phase Smoothed Particle Hydrodynamics (SPH) method has performed well for solving particle motion. A lack of research on interphase interaction and air concentration, however, has limited the application of the SPH model. In our study, an improved multiphase SPH model is presented to simulate aerated flows. A drag force is included in the momentum equation to ensure accuracy of the air-particle slip velocity. Furthermore, a calculation method for air concentration is developed to analyse the air entrainment characteristics. Two case studies were used to simulate the hydraulic and air entrainment characteristics, and the simulation results agree well with the experimental results.
Monaural and binaural processing of complex waveforms
NASA Astrophysics Data System (ADS)
Trahiotis, Constantine; Bernstein, Leslie R.
1992-01-01
Our research concerned the manners by which the monaural and binaural auditory systems process information in complex sounds. Substantial progress was made in three areas, consistent with the objectives outlined in the original proposal. (1) New electronic equipment, including a NeXT computer, was purchased, installed, and interfaced with the existing laboratory. Software was developed for generating the necessary complex digital stimuli and for running behavioral experiments utilizing those stimuli. (2) Monaural experiments showed that the CMR is not obtained successively and is reduced or non-existent when the flanking bands are pulsed rather than presented continuously. Binaural investigations revealed that the detectability of a tonal target in a masking level difference paradigm could be degraded by the presence of a spectrally remote interfering tone. (3) In collaboration with Dr. Richard Stern, theoretical efforts included the explication and evaluation of a weighted-image model of binaural hearing, attempts to extend the Stern-Colburn position-variable model to account for many crucial lateralization and localization data gathered over the past 50 years, and the continuation of efforts to incorporate into a general model notions that lateralization and localization of spectrally-rich stimuli depend upon the patterns of neural activity within a plane defined by frequency and interaural delay.
On accuracy, privacy, and complexity in the identification problem
NASA Astrophysics Data System (ADS)
Beekhof, F.; Voloshynovskiy, S.; Koval, O.; Holotyak, T.
2010-02-01
This paper presents recent advances in the identification problem taking into account the accuracy, complexity, and privacy leak of different decoding algorithms. Using a model of different actors from the literature, we show that it is possible to use more accurate decoding algorithms based on reliability information without increasing the privacy leak relative to algorithms that only use binary information. Existing algorithms from the literature have been modified to take advantage of reliability information, and we show that a proposed branch-and-bound algorithm can outperform existing work, including the enhanced variants.
Air Pollution Source/receptor Relationships in South Coast Air Basin, CA
NASA Astrophysics Data System (ADS)
Gao, Ning
This research project includes the application of existing receptor models to study the air pollution source/receptor relationships in the South Coast Air Basin (SoCAB) of southern California, the development of a new receptor model, and the testing and modification of some existing models. The existing receptor models used include principal component factor analysis (PCA), potential source contribution function (PSCF) analysis, Kohonen's neural network combined with Prim's minimal spanning tree (TREE-MAP), and direct trilinear decomposition followed by a matrix reconstruction. The ambient concentration measurements used in this study are a subset of the data collected during the 1987 field exercise of the Southern California Air Quality Study (SCAQS). The subset consists of a number of gaseous and particulate pollutants analyzed from samples collected by SCAQS samplers at eight sampling sites: Anaheim, Azusa, Burbank, Claremont, Downtown Los Angeles, Hawthorne, Long Beach, and Rubidoux. Based on information from emission inventories, meteorology, and ambient concentrations, this receptor modeling study has revealed several mechanisms that influence the air quality in SoCAB. The SO2 collected at sampling sites is mainly contributed by refineries in the coastal area and by ships equipped with oil-fired boilers offshore. Combustion of fossil fuel by automobiles dominates the emission of NOx that is subsequently transformed and collected at sampling sites. Electric power plants also contribute HNO3 to the sampling sites. A large feedlot in the eastern region of SoCAB has been identified as the major source of NH3. Possible contributions from other industrial sources, such as smelters and incinerators, were also revealed.
The results of this study also suggest the possibility of DMS (dimethylsulfide) and NH3 emissions from offshore sediments that have been contaminated by waste sludge disposal. The study also discovered that non-anthropogenic sources account for many of the chemical components observed at the sampling sites, such as sea-salt particles, soil particles, and Cl emission from the Mojave Desert. The potential and limitations of the receptor models have been evaluated, and some modifications have been made to improve the value of the models. A source apportionment method has been developed based on the application results of the potential source contribution function (PSCF) analysis.
Luo, Gang; Stone, Bryan L; Johnson, Michael D; Nkoy, Flory L
2016-03-07
In young children, bronchiolitis is the most common illness resulting in hospitalization. For children less than age 2, bronchiolitis incurs an annual total inpatient cost of $1.73 billion. Each year in the United States, 287,000 emergency department (ED) visits occur because of bronchiolitis, with a hospital admission rate of 32%-40%. Due to a lack of evidence and objective criteria for managing bronchiolitis, ED disposition decisions (hospital admission or discharge to home) are often made subjectively, resulting in significant practice variation. Studies reviewing admission need suggest that up to 29% of admissions from the ED are unnecessary. About 6% of ED discharges for bronchiolitis result in ED returns with admission. These inappropriate dispositions waste limited health care resources, increase patient and parental distress, expose patients to iatrogenic risks, and worsen outcomes. Existing clinical guidelines for bronchiolitis offer limited improvement in patient outcomes. Methodological shortcomings include that the guidelines provide no specific thresholds for ED decisions to admit or to discharge, have an insufficient level of detail, and do not account for differences in patient and illness characteristics including co-morbidities. Predictive models are frequently used to complement clinical guidelines, reduce practice variation, and improve clinicians' decision making. Used in real time, predictive models can present objective criteria supported by historical data for an individualized disease management plan and guide admission decisions. However, existing predictive models for ED patients with bronchiolitis have limitations, including low accuracy and the assumption that the actual ED disposition decision was appropriate. To date, no operational definition of appropriate admission exists. No model has been built based on appropriate admissions, which include both actual admissions that were necessary and actual ED discharges that were unsafe. 
The goal of this study is to develop a predictive model to guide appropriate hospital admission for ED patients with bronchiolitis. This study will: (1) develop an operational definition of appropriate hospital admission for ED patients with bronchiolitis, (2) develop and test the accuracy of a new model to predict appropriate hospital admission for an ED patient with bronchiolitis, and (3) conduct simulations to estimate the impact of using the model on bronchiolitis outcomes. We are currently extracting administrative and clinical data from the enterprise data warehouse of an integrated health care system. Our goal is to finish this study by the end of 2019. This study will produce a new predictive model that can be operationalized to guide and improve disposition decisions for ED patients with bronchiolitis. Broad use of the model would reduce iatrogenic risk, patient and parental distress, health care use, and costs and improve outcomes for bronchiolitis patients.
NASA Astrophysics Data System (ADS)
Marsolat, F.; De Marzi, L.; Pouzoulet, F.; Mazal, A.
2016-01-01
In proton therapy, the relative biological effectiveness (RBE) depends on various types of parameters such as linear energy transfer (LET). An analytical model for LET calculation exists (Wilkens' model), but secondary particles are not included in this model. In the present study, we propose a correction factor, L_sec, for Wilkens' model in order to take into account the LET contributions of certain secondary particles. This study includes secondary protons and deuterons, since the effects of these two types of particles can be described by the same RBE-LET relationship. L_sec was evaluated by Monte Carlo (MC) simulations using the GATE/GEANT4 platform and was defined as the ratio of the LET_d distribution of all protons and deuterons to that of only primary protons. This method was applied to the innovative Pencil Beam Scanning (PBS) delivery systems, and L_sec was evaluated along the beam axis. This correction factor indicates the high contribution of secondary particles in the entrance region, with L_sec values higher than 1.6 for a 220 MeV clinical pencil beam. MC simulations showed the impact of pencil beam parameters, such as mean initial energy, spot size, and depth in water, on L_sec. The variation of L_sec with these different parameters was integrated into a polynomial function of the L_sec factor in order to obtain a model universally applicable to all PBS delivery systems. The validity of this correction factor applied to Wilkens' model was verified along the beam axis of various pencil beams in comparison with MC simulations. A good agreement was obtained between the corrected analytical model and the MC calculations, with mean-LET deviations along the beam axis less than 0.05 keV μm^-1. These results demonstrate the efficacy of our new correction of the existing LET model in order to take into account secondary protons and deuterons along the pencil beam axis.
Understanding the aetiology and resolution of chronic otitis media from animal and human studies
Thornton, Ruth B.; Kirkham, Lea-Ann S.; Kerschner, Joseph E.; Cheeseman, Michael T.
2017-01-01
ABSTRACT Inflammation of the middle ear, known clinically as chronic otitis media, presents in different forms, such as chronic otitis media with effusion (COME; glue ear) and chronic suppurative otitis media (CSOM). These are highly prevalent diseases, especially in childhood, and lead to significant morbidity worldwide. However, much remains unclear about this disease, including its aetiology, initiation and perpetuation, and the relative roles of mucosal and leukocyte biology, pathogens, and Eustachian tube function. Chronic otitis media is commonly modelled in mice but most existing models only partially mimic human disease and many are syndromic. Nevertheless, these models have provided insights into potential disease mechanisms, and have implicated altered immune signalling, mucociliary function and Eustachian tube function as potential predisposing mechanisms. Clinical studies of chronic otitis media have yet to implicate a particular molecular pathway or mechanism, and current human genetic studies are underpowered. We also do not fully understand how existing interventions, such as tympanic membrane repair, work, nor how chronic otitis media spontaneously resolves. This Clinical Puzzle article describes our current knowledge of chronic otitis media and the existing research models for this condition. It also identifies unanswered questions about its pathogenesis and treatment, with the goal of advancing our understanding of this disease to aid the development of novel therapeutic interventions. PMID:29125825
Evaluation of new collision-pair selection models in DSMC
NASA Astrophysics Data System (ADS)
Akhlaghi, Hassan; Roohi, Ehsan
2017-10-01
The current paper investigates new collision-pair selection procedures in a direct simulation Monte Carlo (DSMC) method. Collision partner selection based on the random procedure from nearest neighbor particles and deterministic selection of nearest neighbor particles have already been introduced as schemes that provide accurate results in a wide range of problems. In the current research, new collision-pair selections based on the time spacing and direction of the relative movement of particles are introduced and evaluated. Comparisons between the new and existing algorithms are made considering appropriate test cases including fluctuations in homogeneous gas, 2D equilibrium flow, and Fourier flow problem. Distribution functions for number of particles and collisions in cell, velocity components, and collisional parameters (collision separation, time spacing, relative velocity, and the angle between relative movements of particles) are investigated and compared with existing analytical relations for each model. The capability of each model in the prediction of the heat flux in the Fourier problem at different cell numbers, numbers of particles, and time steps is examined. For new and existing collision-pair selection schemes, the effect of an alternative formula for the number of collision-pair selections and avoiding repetitive collisions are investigated via the prediction of the Fourier heat flux. The simulation results demonstrate the advantages and weaknesses of each model in different test cases.
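The baseline schemes being compared can be sketched in a few lines: classic random partner selection within a cell versus deterministic nearest-neighbor selection. The sketch below shows only the partner-selection step in 2D; the NTC acceptance test, post-collision velocities, and the paper's new time-spacing and relative-movement criteria are omitted.

```python
import random

# Two DSMC collision-partner selection rules within a single cell.
# Positions are (x, y) tuples; this is a sketch of the selection step only.
def random_partner(particles, i, rng):
    """Classic rule: any other particle in the cell, uniformly at random."""
    j = rng.randrange(len(particles))
    while j == i:
        j = rng.randrange(len(particles))
    return j

def nearest_partner(particles, i):
    """Deterministic rule: the geometrically nearest neighbor in the cell."""
    xi, yi = particles[i]
    best, best_d2 = None, float("inf")
    for j, (x, y) in enumerate(particles):
        if j == i:
            continue
        d2 = (x - xi) ** 2 + (y - yi) ** 2
        if d2 < best_d2:
            best, best_d2 = j, d2
    return best
```

In practice the nearest-neighbor rule is usually combined with a check against repeating the immediately preceding collision pair, which is one of the refinements evaluated in the paper.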
NASA Astrophysics Data System (ADS)
De Lellis, Giovanni
2018-05-01
The discovery of the Higgs boson has fully confirmed the Standard Model of particles and fields. Nevertheless, there are still fundamental phenomena, like the existence of dark matter and the baryon asymmetry of the Universe, which deserve an explanation that could come from the discovery of new particles. The SHiP experiment at CERN, meant to search for very weakly coupled particles in the few-GeV mass domain, has recently been proposed. The existence of such particles, foreseen in different theoretical models beyond the Standard Model, is largely unexplored. A beam dump facility using high-intensity 400 GeV protons is a copious source of such unknown particles in the GeV mass range. The beam dump is also a copious source of neutrinos; in particular, it is an ideal source of tau neutrinos, the least known particle in the Standard Model. Indeed, tau anti-neutrinos have not been directly observed so far. We report the physics potential of such an experiment, including the tau neutrino magnetic moment.
Impersonating the Standard Model Higgs boson: Alignment without decoupling
Carena, Marcela; Low, Ian; Shah, Nausheen R.; ...
2014-04-03
In models with an extended Higgs sector there exists an alignment limit, in which the lightest CP-even Higgs boson mimics the Standard Model Higgs. The alignment limit is commonly associated with the decoupling limit, where all non-standard scalars are significantly heavier than the Z boson. However, alignment can occur irrespective of the mass scale of the rest of the Higgs sector. In this work we discuss the general conditions that lead to “alignment without decoupling”, therefore allowing for the existence of additional non-standard Higgs bosons at the weak scale. The values of tan β for which this happens are derived in terms of the effective Higgs quartic couplings in general two-Higgs-doublet models as well as in supersymmetric theories, including the MSSM and the NMSSM. In addition, we study the information encoded in the variations of the SM Higgs-fermion couplings to explore regions in the m_A – tan β parameter space.
A practical model for pressure probe system response estimation (with review of existing models)
NASA Astrophysics Data System (ADS)
Hall, B. F.; Povey, T.
2018-04-01
The accurate estimation of the unsteady response (bandwidth) of pneumatic pressure probe systems (probe, line and transducer volume) is a common practical problem encountered in the design of aerodynamic experiments. Understanding the bandwidth of the probe system is necessary to capture unsteady flow features accurately. Where traversing probes are used, the desired traverse speed and spatial gradients in the flow dictate the minimum probe system bandwidth required to resolve the flow. Existing approaches for bandwidth estimation are either complex or inaccurate in implementation, so probes are often designed based on experience. Where probe system bandwidth is characterized, it is often done experimentally, requiring careful experimental set-up and analysis. There is a need for a relatively simple but accurate model for estimation of probe system bandwidth. A new model is presented for the accurate estimation of pressure probe bandwidth for simple probes commonly used in wind tunnel environments; experimental validation is provided. An additional, simple graphical method for air is included for convenience.
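For a rough first-cut estimate of where a probe-line-cavity system resonates, the classical Helmholtz-resonator formula f = (c/2π)·sqrt(A/(V·L_eff)) is a common textbook starting point. The sketch below implements that formula only; it is not the model proposed in the paper (which accounts for the full line dynamics), and the end-correction factor and example dimensions are illustrative assumptions.

```python
import math

# First-cut natural-frequency estimate for a pneumatic probe system using
# the classical Helmholtz-resonator approximation (textbook formula, not
# the paper's model): a tube of given bore and length feeding a cavity.
def helmholtz_frequency(bore_diameter, line_length, cavity_volume, c=343.0):
    """All lengths in metres, volume in m^3, speed of sound c in m/s."""
    area = math.pi * (bore_diameter / 2.0) ** 2
    l_eff = line_length + 0.85 * bore_diameter   # simple end correction (assumed)
    return (c / (2.0 * math.pi)) * math.sqrt(area / (cavity_volume * l_eff))

# Example (illustrative dimensions): 0.5 mm bore, 100 mm line, 5 mm^3 cavity
f = helmholtz_frequency(0.5e-3, 0.1, 5e-9)
```

The formula makes the design trade-offs visible: a larger transducer cavity or a longer, narrower line lowers the resonant frequency and hence the usable bandwidth.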
Zoonotic Transmission of Waterborne Disease: A Mathematical Model.
Waters, Edward K; Hamilton, Andrew J; Sidhu, Harvinder S; Sidhu, Leesa A; Dunbar, Michelle
2016-01-01
Waterborne parasites that infect both humans and animals are common causes of diarrhoeal illness, but the relative importance of transmission between humans and animals and vice versa remains poorly understood. Transmission of infection from animals to humans via environmental reservoirs, such as water sources, has attracted attention as a potential source of endemic and epidemic infections, but existing mathematical models of waterborne disease transmission have limitations for studying this phenomenon, as they only consider contamination of environmental reservoirs by humans. This paper develops a mathematical model that represents the transmission of waterborne parasites within and between both animal and human populations. It also improves upon existing models by including animal contamination of water sources explicitly. Linear stability analysis and simulation results, using realistic parameter values to describe Giardia transmission in rural Australia, show that endemic infection of an animal host with zoonotic protozoa can result in endemic infection in human hosts, even in the absence of person-to-person transmission. These results imply that zoonotic transmission via environmental reservoirs is important.
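The structure described, two host populations coupled only through a shared water reservoir, can be sketched as a small ODE system. The sketch below is illustrative only: the SIS form, the absence of direct person-to-person transmission, and every parameter value are assumptions for demonstration, not the paper's calibrated Giardia model.

```python
# Illustrative compartment sketch (not the paper's exact system): susceptible/
# infected humans (Sh, Ih) and animals (Sa, Ia) share a water reservoir W that
# both species contaminate; parameter values are made up for demonstration.
def simulate_siwr(days=200, dt=0.01,
                  beta_h=0.3, beta_a=0.5,   # water-to-host transmission rates
                  xi_h=0.1, xi_a=0.4,       # host shedding rates into water
                  gamma=0.1, delta=0.2):    # recovery and pathogen decay rates
    Sh, Ih, Sa, Ia, W = 0.99, 0.01, 0.95, 0.05, 0.0
    for _ in range(int(days / dt)):
        new_h = beta_h * Sh * W              # humans infected only via water
        new_a = beta_a * Sa * W              # animals infected only via water
        Sh += dt * (-new_h + gamma * Ih)     # SIS dynamics in both hosts
        Ih += dt * (new_h - gamma * Ih)
        Sa += dt * (-new_a + gamma * Ia)
        Ia += dt * (new_a - gamma * Ia)
        W += dt * (xi_h * Ih + xi_a * Ia - delta * W)
    return Sh, Ih, Sa, Ia, W
```

Even with `beta_h` and person-to-person transmission absent, sustained animal shedding (`xi_a > 0`) keeps W, and hence human infection, above zero, which is the qualitative point of the stability analysis summarized above.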
Transforming GIS data into functional road models for large-scale traffic simulation.
Wilkie, David; Sewall, Jason; Lin, Ming C
2012-06-01
There exists a vast amount of geographic information system (GIS) data that model road networks around the world as polylines with attributes. In this form, the data are insufficient for applications such as simulation and 3D visualization, tools that will grow in power and demand as sensor data become more pervasive and as governments try to optimize their existing physical infrastructure. In this paper, we propose an efficient method for enhancing a road map from a GIS database to create a geometrically and topologically consistent 3D model to be used in real-time traffic simulation, interactive visualization of virtual worlds, and autonomous vehicle navigation. The resulting representation provides important road features for traffic simulations, including ramps, highways, overpasses, legal merge zones, and intersections with arbitrary states, and it is independent of the simulation methodology. We test the 3D models of road networks generated by our algorithm in real-time traffic simulation using both macroscopic and microscopic techniques.
Visualization of RNA structure models within the Integrative Genomics Viewer.
Busan, Steven; Weeks, Kevin M
2017-07-01
Analyses of the interrelationships between RNA structure and function are increasingly important components of genomic studies. The SHAPE-MaP strategy enables accurate RNA structure probing and realistic structure modeling of kilobase-length noncoding RNAs and mRNAs. Existing tools for visualizing RNA structure models are not suitable for efficient analysis of long, structurally heterogeneous RNAs. In addition, structure models are often advantageously interpreted in the context of other experimental data and gene annotation information, for which few tools currently exist. We have developed a module within the widely used and well supported open-source Integrative Genomics Viewer (IGV) that allows visualization of SHAPE and other chemical probing data, including raw reactivities, data-driven structural entropies, and data-constrained base-pair secondary structure models, in context with linear genomic data tracks. We illustrate the usefulness of visualizing RNA structure in the IGV by exploring structure models for a large viral RNA genome, comparing bacterial mRNA structure in cells with its structure under cell- and protein-free conditions, and comparing a noncoding RNA structure modeled using SHAPE data with a base-pairing model inferred through sequence covariation analysis. © 2017 Busan and Weeks; Published by Cold Spring Harbor Laboratory Press for the RNA Society.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Joe, Jeffrey Clark; Boring, Ronald Laurids; Herberger, Sarah Elizabeth Marie
The United States (U.S.) Department of Energy (DOE) Light Water Reactor Sustainability (LWRS) program has the overall objective to help sustain the existing commercial nuclear power plants (NPPs). To accomplish this program objective, there are multiple LWRS “pathways,” or research and development (R&D) focus areas. One LWRS focus area is called the Risk-Informed Safety Margin and Characterization (RISMC) pathway. Initial efforts under this pathway to combine probabilistic and plant multi-physics models to quantify safety margins and support business decisions also included HRA, but in a somewhat simplified manner. HRA experts at Idaho National Laboratory (INL) have been collaborating with other experts to develop a computational HRA approach, called the Human Unimodel for Nuclear Technology to Enhance Reliability (HUNTER), for inclusion into the RISMC framework. The basic premise of this research is to leverage applicable computational techniques, namely simulation and modeling, to develop and then, using RAVEN as a controller, seamlessly integrate virtual operator models (HUNTER) with 1) the dynamic computational MOOSE runtime environment that includes a full-scope plant model, and 2) the RISMC framework PRA models already in use. The HUNTER computational HRA approach is a hybrid approach that leverages past work from cognitive psychology, human performance modeling, and HRA, but it is also a significant departure from existing static and even dynamic HRA methods. This report is divided into five chapters that cover the development of an external flooding event test case and associated statistical modeling considerations.
Integration and segregation in auditory streaming
NASA Astrophysics Data System (ADS)
Almonte, Felix; Jirsa, Viktor K.; Large, Edward W.; Tuller, Betty
2005-12-01
We aim to capture the perceptual dynamics of auditory streaming using a neurally inspired model of auditory processing. Traditional approaches view streaming as a competition of streams, realized within a tonotopically organized neural network. In contrast, we view streaming to be a dynamic integration process which resides at locations other than the sensory specific neural subsystems. This process finds its realization in the synchronization of neural ensembles or in the existence of informational convergence zones. Our approach uses two interacting dynamical systems, in which the first system responds to incoming acoustic stimuli and transforms them into a spatiotemporal neural field dynamics. The second system is a classification system coupled to the neural field and evolves to a stationary state. These states are identified with a single perceptual stream or multiple streams. Several results in human perception are modelled including temporal coherence and fission boundaries [L.P.A.S. van Noorden, Temporal coherence in the perception of tone sequences, Ph.D. Thesis, Eindhoven University of Technology, The Netherlands, 1975], and crossing of motions [A.S. Bregman, Auditory Scene Analysis: The Perceptual Organization of Sound, MIT Press, 1990]. Our model predicts phenomena such as the existence of two streams with the same pitch, which cannot be explained by the traditional stream competition models. An experimental study is performed to provide proof of existence of this phenomenon. The model elucidates possible mechanisms that may underlie perceptual phenomena.
Calculating CO2 uptake for existing concrete structures during and after service life.
Andersson, Ronny; Fridh, Katja; Stripple, Håkan; Häglund, Martin
2013-10-15
This paper presents a model that can calculate the uptake of CO2 in all existing concrete structures, including uptake after the end of service life. This is important for calculating the total CO2 uptake in society and its time dependence. The model uses the well-documented cement use and knowledge of how the investments are distributed throughout the building sector to estimate the stock of concrete applications in a country. The depth of carbonation of these applications is estimated using two models, one theoretical and one based on field measurements. The maximum theoretical uptake potential is defined as the amount of CO2 that is emitted during calcination in the production of Portland cement, but the model can also, with some adjustments, be used for other cement types. The model has been applied to data from Sweden, and the results show a CO2 uptake in 2011 in all existing structures of about 300,000 tonnes, which corresponds to about 17% of the total emissions (calcination and fuel) from the production of new cement for use in Sweden in the same year. The study also shows that in the years 2030 and 2050, an increase in the uptake in crushed concrete, from 12,000 tonnes today to 200,000 and 500,000 tonnes of CO2, respectively, could be possible if the waste handling is redesigned.
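Carbonation-depth models of the kind described typically follow a square-root-of-time law, with uptake computed from the carbonated volume. The sketch below shows that generic form; the rate constant, cement content, and carbonation degree are illustrative assumptions, not the calibrated Swedish values.

```python
import math

def carbonation_depth_mm(k, years):
    """Carbonation front depth x = k * sqrt(t); k (mm/yr^0.5) depends on
    concrete quality and exposure (illustrative range roughly 1-8)."""
    return k * math.sqrt(years)

def co2_uptake_kg(exposed_area_m2, depth_mm, cement_kgm3=300,
                  cao_fraction=0.65, carbonation_degree=0.75):
    """Approximate CO2 bound in the carbonated layer.
    44/56 is the molar mass ratio of CO2 to CaO."""
    vol_m3 = exposed_area_m2 * depth_mm / 1000.0
    return vol_m3 * cement_kgm3 * cao_fraction * carbonation_degree * 44.0 / 56.0

depth = carbonation_depth_mm(k=2.0, years=50)
uptake = co2_uptake_kg(100.0, depth)
print(f"Depth after 50 yr: {depth:.1f} mm; uptake: {uptake:.0f} kg per 100 m2")
```

Crushing demolished concrete multiplies the exposed area, which is why the abstract projects a large uptake increase if waste handling is redesigned.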
Microfluidic System Simulation Including the Electro-Viscous Effect
NASA Technical Reports Server (NTRS)
Rojas, Eileen; Chen, C. P.; Majumdar, Alok
2007-01-01
This paper describes a practical approach using a general purpose lumped-parameter computer program, GFSSP (Generalized Fluid System Simulation Program), for calculating flow distribution in a network of micro-channels including electro-viscous effects due to the existence of an electrical double layer (EDL). In this study, an empirical formulation for calculating an effective viscosity of ionic solutions based on dimensional analysis is described to account for surface charge and bulk fluid conductivity, which give rise to the electro-viscous effect in microfluidic networks. Two-dimensional slit microflow data were used to determine the model coefficients. Geometry effects are then included through a Poiseuille number correlation in GFSSP. The bi-power model was used to calculate flow distribution of isotropically etched straight-channel and T-junction microflows involving ionic solutions. Performance of the proposed model is assessed against experimental test data.
The epidemiology of pelvic floor disorders and childbirth: an update
Hallock, Jennifer L.; Handa, Victoria L.
2015-01-01
SYNOPSIS Using a life span model, this article presents new scientific findings regarding risk factors for pelvic floor disorders (PFDs), with a focus on the role of childbirth in the development of single or multiple co-existing PFDs. Phase I of the life span model includes predisposing factors such as genetic predisposition and race. Phase II of the model includes inciting factors such as obstetric events. Prolapse, urinary incontinence (UI) and fecal incontinence (FI) are more common among vaginally parous women, although the impact of vaginal delivery on risk of FI is less dramatic than for prolapse and UI. Finally, Phase III includes intervening factors such as age and obesity. Both age and obesity are associated with prevalence of PFDs. The prevention and treatment of obesity is an important component to PFD prevention. PMID:26880504
Comparing functional responses in predator-infected eco-epidemics models.
Haque, Mainul; Rahman, Md Sabiar; Venturino, Ezio
2013-11-01
The current paper deals with mathematical models of a predator-prey system where a transmissible disease spreads among the predator species only. Four mathematical models are proposed and analysed with several popular predator functional responses in order to show the influence of functional response on eco-epidemic models. The existence, boundedness and uniqueness of solutions of all the models are established. Mathematical analysis including stability and bifurcation is carried out. Comparison among the results of these models allows the general conclusion that the relevant behaviour of the eco-epidemic predator-prey system, including switching of stability, extinction, persistence and oscillations for any species, depends on key parameters, viz. the rate of infection, predator interspecies competition and the attack rate on susceptible predators. The paper ends with a discussion of the biological implications of the analytical and numerical results. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
ITG: A New Global GNSS Tropospheric Correction Model
Yao, Yibin; Xu, Chaoqian; Shi, Junbo; Cao, Na; Zhang, Bao; Yang, Junjian
2015-01-01
Tropospheric correction models are receiving increasing attention, as they play a crucial role in Global Navigation Satellite System (GNSS) positioning. The most commonly used models to date include the GPT2 series and TropGrid2. In this study, we analyzed the advantages and disadvantages of existing models and developed a new model called the Improved Tropospheric Grid (ITG). ITG considers annual, semi-annual and diurnal variations, and includes multiple tropospheric parameters. The amplitude and initial phase of the diurnal variation are estimated as a periodic function. ITG provides temperature, pressure, the weighted mean temperature (Tm) and Zenith Wet Delay (ZWD). We conducted a performance comparison between the proposed ITG model and previous ones, using meteorological measurements from 698 observation stations, Zenith Total Delay (ZTD) products from 280 International GNSS Service (IGS) stations and Tm from Global Geodetic Observing System (GGOS) products. Results indicate that ITG offers the best performance on the whole. PMID:26196963
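Periodic tropospheric parameter models of this type are usually expressed as a mean value plus annual, semi-annual, and diurnal cosine terms. The sketch below shows that generic functional form; the function name and all coefficient values are placeholders, not ITG grid values.

```python
import math

def tropo_parameter(doy, hod, mean, annual=(0.0, 0.0), semiannual=(0.0, 0.0),
                    diurnal=(0.0, 0.0)):
    """Evaluate a tropospheric parameter (e.g. pressure, Tm, ZWD) as a mean
    plus periodic terms. Each term is (amplitude, initial phase);
    doy = day of year, hod = hour of day."""
    value = mean
    value += annual[0] * math.cos(2 * math.pi * (doy - annual[1]) / 365.25)
    value += semiannual[0] * math.cos(4 * math.pi * (doy - semiannual[1]) / 365.25)
    value += diurnal[0] * math.cos(2 * math.pi * (hod - diurnal[1]) / 24.0)
    return value

# Placeholder coefficients for a hypothetical mid-latitude grid point
zwd_mm = tropo_parameter(doy=200, hod=14, mean=150.0,
                         annual=(60.0, 28.0), semiannual=(10.0, 150.0),
                         diurnal=(5.0, 15.0))
print(f"Modeled ZWD: {zwd_mm:.1f} mm")
```

In a grid model such as ITG, a set of these coefficients would be stored per grid cell and per parameter, and the user interpolates to the receiver position before evaluating.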
Numerical modeling of consolidation processes in hydraulically deposited soils
NASA Astrophysics Data System (ADS)
Brink, Nicholas Robert
Hydraulically deposited soils are encountered in many common engineering applications, including mine tailings and geotextile tube fills, though the consolidation process for such soils is highly nonlinear and requires the use of advanced numerical techniques to provide accurate predictions. Several commercially available finite element codes possess the ability to model soil consolidation, and it was the goal of this research to assess the ability of two of these codes, ABAQUS and PLAXIS, to model the large-strain, two-dimensional consolidation processes which occur in hydraulically deposited soils. A series of one- and two-dimensionally drained rectangular models were first created to assess the limitations of ABAQUS and PLAXIS when modeling consolidation of highly compressible soils. Then, geotextile tube and tailings storage facility (TSF) models were created to represent actual scenarios which might be encountered in engineering practice. Several limitations were discovered, including the existence of a minimum preconsolidation stress below which numerical solutions become unstable.
Electric-hybrid-vehicle simulation
NASA Astrophysics Data System (ADS)
Pasma, D. C.
The simulation of electric hybrid vehicles is to be performed using experimental data to model propulsion system components. The performance of an existing ac propulsion system will be used as the baseline for comparative purposes. Hybrid components to be evaluated include electrically and mechanically driven flywheels, and an elastomeric regenerative braking system.
General Systems Theory and Instructional Systems Design.
ERIC Educational Resources Information Center
Salisbury, David F.
1990-01-01
Describes basic concepts in the field of general systems theory (GST) and identifies commonalities that exist between GST and instructional systems design (ISD). Models and diagrams that depict system elements in ISD are presented, and two matrices that show how GST has been used in ISD literature are included. (11 references) (LRW)
PDS Work at a Small University: Solutions to Common Problems
ERIC Educational Resources Information Center
Mills, Lynne
2010-01-01
Small universities deal with two primary issues when beginning to use the Professional Development School model: Adequate Funding and Faculty Support. Possible solutions are discussed, including ways to provide adequate funding through grants, enrichment/tutoring programs, reallocation of existing funds, and university priority money, as well as…
USING THE AIR QUALITY MODEL TO ANALYZE THE CONCENTRATIONS OF AIR TOXICS OVER THE CONTINENTAL U.S.
The U.S. Environmental Protection Agency is examining the concentrations and deposition of hazardous air pollutants (HAPs), which include a large number of chemicals, ranging from non-reactive (e.g., carbon tetrachloride) to reactive (e.g., formaldehyde), that exist in gas, aqueous, and...
Entering a community dialogue.
Britt, Teri; Player, Kathy; Parsons, Kathleen; Stover, Deanna
2004-01-01
Entering a new, unstructured community is facilitated when existing members embrace the thoughts, ideas, and experiences of new members. Watson's Caring Healing Model, including the caritas conscious and transpersonal caring components, provides a framework for understanding the experience of being a new community member (e.g., a "newbie") in the Global Nursing Exchange.
ERIC Educational Resources Information Center
Price, Thomas S.; Jaffee, Sara R.
2008-01-01
The classical twin study provides a useful resource for testing hypotheses about how the family environment influences children's development, including how genes can influence sensitivity to environmental effects. However, existing statistical models do not account for the possibility that children can inherit exposure to family environments…
Improvement of High-Resolution Tropical Cyclone Structure and Intensity Forecasts using COAMPS-TC
2013-09-30
We will use observations from the scientific community, including the recent T-PARC/TCS08, ITOP, and HS3 field campaigns, to build upon the existing modeling capabilities.
STEAM by Another Name: Transdisciplinary Practice in Art and Design Education
ERIC Educational Resources Information Center
Costantino, Tracie
2018-01-01
The recent movement to include art and design in Science, Technology, Engineering, and Mathematics (STEM) education has made Science, Technology, Engineering, Arts, and Mathematics (STEAM) an increasingly common acronym in the education lexicon. The STEAM movement builds on existing models of interdisciplinary curriculum, but what makes the union…
Examining Factors That Affect Students' Knowledge Sharing within Virtual Teams
ERIC Educational Resources Information Center
He, Jinxia; Gunter, Glenda
2015-01-01
The purpose of this study was to examine factors that might impact student knowledge sharing within virtual teams through online discussion boards. These factors include: trust, mutual influence, conflict, leadership, and cohesion. A path model was developed to determine whether relationships exist among knowledge sharing from asynchronous group…
Brain Evolution: The Origins of Social and Cognitive Behaviors.
ERIC Educational Resources Information Center
MacLean, Paul
1983-01-01
Argues that common anatomical and functional characteristics exist among the brains of reptiles, mammals, and man--the most significant commonality for educators being social behavior. Illustrates inherited behavior, including behavior observed in classroom and believed to be learned by placing it in context of a model "triune"…
The Artist's View of Points and Lines.
ERIC Educational Resources Information Center
Millman, Richard S.; Speranza, Ramona R.
1991-01-01
Presented is the idea that art can be used to present early concepts of geometry, including the notion of the infinite. Discussed is the symbiosis that exists between the artistic and mathematical views of points, lines, and planes. Geometric models in art and using art in the classroom are discussed. (KR)
Community-Based Participatory Study Abroad: A Proposed Model for Social Work Education
ERIC Educational Resources Information Center
Fisher, Colleen M.; Grettenberger, Susan E.
2015-01-01
Study abroad experiences offer important benefits for social work students and faculty, including global awareness, practice skill development, and enhanced multicultural competence. Short-term study abroad programs are most feasible but typically lack depth of engagement with host communities and may perpetuate existing systems of power and…
Integrated Workforce Modeling System
NASA Technical Reports Server (NTRS)
Moynihan, Gary P.
2000-01-01
There are several computer-based systems, currently in various phases of development at KSC, which encompass some component, aspect, or function of workforce modeling. These systems may offer redundant capabilities and/or incompatible interfaces. A systems approach to workforce modeling is necessary in order to identify and better address user requirements. This research has consisted of two primary tasks. Task 1 provided an assessment of existing and proposed KSC workforce modeling systems for their functionality and applicability to the workforce planning function. Task 2 resulted in the development of a proof-of-concept design for a systems approach to workforce modeling. The model incorporates critical aspects of workforce planning, including hires, attrition, and employee development.
A comprehensive surface-groundwater flow model
NASA Astrophysics Data System (ADS)
Arnold, Jeffrey G.; Allen, Peter M.; Bernhardt, Gilbert
1993-02-01
In this study, a simple groundwater flow and height model was added to an existing basin-scale surface water model. The linked model is: (1) watershed scale, allowing the basin to be subdivided; (2) designed to accept readily available inputs to allow general use over large regions; (3) continuous in time to allow simulation of land management, including such factors as climate and vegetation changes, pond and reservoir management, groundwater withdrawals, and stream and reservoir withdrawals. The model is described, and is validated on a 471 km² watershed near Waco, Texas. This linked model should provide a comprehensive tool for water resource managers in development and planning.
Predicting the sky from 30 MHz to 800 GHz: the extended Global Sky Model
NASA Astrophysics Data System (ADS)
Liu, Adrian
We propose to construct the extended Global Sky Model (eGSM), a software package and associated data products that are capable of generating maps of the sky at any frequency within a broad range (30 MHz to 800 GHz). The eGSM is constructed from archival data, and its outputs will include not only "best estimate" sky maps, but also accurate error bars and the ability to generate random realizations of missing modes in the input data. Such views of the sky are crucial in the practice of precision cosmology, where our ability to constrain cosmological parameters and detect new phenomena (such as B-mode signatures from primordial gravitational waves, or spectral distortions of the Cosmic Microwave Background; CMB) rests crucially on our ability to remove systematic foreground contamination. Doing so requires empirical measurements of the foreground sky brightness (such as that arising from Galactic synchrotron radiation, among other sources), which are typically performed only at select narrow wavelength ranges. We aim to transcend traditional wavelength limits by optimally combining existing data to provide a comprehensive view of the foreground sky at any frequency within the broad range of 30 MHz to 800 GHz. Previous efforts to interpolate between multi-frequency maps resulted in the Global Sky Model (GSM) of de Oliveira-Costa et al. (2008), a software package that outputs foreground maps at any frequency of the user's choosing between 10 MHz and 100 GHz. However, the GSM has a number of shortcomings. First and foremost, the GSM does not include the latest archival data from the Planck satellite. Multi-frequency models depend crucially on data from Planck, WMAP, and COBE to provide high-frequency "anchor" maps. Another crucial shortcoming is the lack of error bars in the output maps. Finally, the GSM is only able to predict temperature (i.e., total intensity) maps, and not polarization information. 
With the recent release of Planck's polarized data products, the time is ripe for the inclusion of polarization and a general update of the GSM. In its first two phases, our proposed eGSM project will incorporate new data and improve analysis methods to eliminate all of the aforementioned flaws. The eGSM will have broad implications for future cosmological probes, including surveys of the highly redshifted 21 cm line (such as the proposed Dark Ages Radio Explorer satellite mission) and CMB experiments (such as the Primordial Inflation Polarization Explorer and the Primordial Inflation Explorer) targeting primordial B-mode polarization or spectral distortions. Forecasting exercises for such future experiments must include polarized foregrounds below current detection limits. The third phase of the eGSM will result in a software package that provides random realizations of dim polarized foregrounds that are below the sensitivities of current instruments. This requires the quantification of non-Gaussian and non-isotropic statistics of existing foreground surveys, adding value to existing archival maps. eGSM data products will be publicly hosted on the Legacy Archive for Microwave Background Data Analysis (LAMBDA) archive, including a publicly released code that enables future foreground surveys (whether ground-based or space-based) to easily incorporate additional data into the existing archive, further refining our model and maximizing the impact of existing archives beyond the lifetime of this proposal.
Nuclear fuel requirements for the American economy - A model
NASA Astrophysics Data System (ADS)
Curtis, Thomas Dexter
A model is provided to determine the amounts of various fuel streams required to supply energy from planned and projected nuclear plant operations, including new builds. Flexible, user-defined scenarios can be constructed with respect to energy requirements, choices of reactors and choices of fuels. The model includes interactive effects and extends through 2099. Outputs include energy provided by reactors, the number of reactors, and masses of natural Uranium and other fuels used. Energy demand, including electricity and hydrogen, is obtained from US DOE historical data and projections, along with other studies of potential hydrogen demand. An option to include other energy demand to nuclear power is included. Reactor types modeled include (thermal reactors) PWRs, BWRs and MHRs and (fast reactors) GFRs and SFRs. The MHRs (VHTRs), GFRs and SFRs are similar to those described in the 2002 DOE "Roadmap for Generation IV Nuclear Energy Systems." Fuel source choices include natural Uranium, self-recycled spent fuel, Plutonium from breeder reactors and existing stockpiles of surplus HEU, military Plutonium, LWR spent fuel and depleted Uranium. Other reactors and fuel sources can be added to the model. Fidelity checks of the model's results indicate good agreement with historical Uranium use and number of reactors, and with DOE projections. The model supports conclusions that substantial use of natural Uranium will likely continue to the end of the 21st century, though legacy spent fuel and depleted uranium could easily supply all nuclear energy demand by shifting to predominant use of fast reactors.
NASA Astrophysics Data System (ADS)
Jöckel, P.; Sander, R.; Kerkweg, A.; Tost, H.; Lelieveld, J.
2005-02-01
The development of a comprehensive Earth System Model (ESM) to study the interactions between chemical, physical, and biological processes requires coupling of the different domains (land, ocean, atmosphere, ...). One strategy is to link existing domain-specific models with a universal coupler, i.e. an independent standalone program organizing the communication between other programs. In many cases, however, a much simpler approach is more feasible. We have developed the Modular Earth Submodel System (MESSy). It comprises (1) a modular interface structure to connect submodels to a base model, (2) an extendable set of such submodels for miscellaneous processes, and (3) a coding standard. MESSy is therefore not a coupler in the classical sense, but exchanges data between a base model and several submodels within one comprehensive executable. The internal complexity of the submodels is controllable in a transparent and user-friendly way. This provides remarkable new possibilities to study feedback mechanisms (by two-way coupling). Note that the MESSy and the coupler approaches can be combined. For instance, an atmospheric model implemented according to the MESSy standard could easily be coupled to an ocean model by means of an external coupler. The vision is to ultimately form a comprehensive ESM which includes a large set of submodels, and a base model which contains only a central clock and runtime control. This can be reached stepwise, since each process can be included independently. Starting from an existing model, process submodels can be reimplemented according to the MESSy standard. This procedure guarantees the availability of a state-of-the-art model for scientific applications at any time during the development. In principle, MESSy can be implemented into any kind of model, either global or regional. So far, the MESSy concept has been applied to the general circulation model ECHAM5 and a number of process box models.
Automation of Ocean Product Metrics
2008-09-30
Presented at the Ocean Sciences 2008 Conf., 5 Mar 2008, and in Shriver, J., J. D. Dykes, and J. Fabre, "Automation of Operational Ocean Product Metrics," 2008 EGU General Assembly, 14 April 2008. The system supports operational processing (multiple data cuts per day) and multiple-nested models. Routines for generating automated evaluations of model forecast statistics will be developed, and pre-existing tools will be collected to create a generalized tool set, which will include user-interface tools for the metrics data.
2017-11-22
Weather Research and Forecasting Model Simulations, by John W. Raby and Huaqing Cai, Computational and Information Sciences Directorate, ARL.
Gravitational wave, collider and dark matter signals from a scalar singlet electroweak baryogenesis
Beniwal, Ankit; Lewicki, Marek; Wells, James D.; White, Martin; Williams, Anthony G.
2017-08-23
We analyse a simple extension of the SM with just an additional scalar singlet coupled to the Higgs boson. Here, we discuss the possible probes for electroweak baryogenesis in this model including collider searches, gravitational wave and direct dark matter detection signals. We show that a large portion of the model parameter space exists where the observation of gravitational waves would allow detection while the indirect collider searches would not.
2006-07-01
To improve the precision of the determination of Rmax, we established a refined method based on the model of bubble formation described in section 3.6.1. Bubble development can be modeled by hydrodynamic codes based on tabulated equation-of-state data, as has previously been demonstrated for ps optical breakdown.
Calibration Against the Moon. I: A Disk-Resolved Lunar Model for Absolute Reflectance Calibration
2010-01-01
This report presents a model of the disk-resolved Moon at visible to near-infrared wavelengths, developed in order to use the Moon as a calibration reference.
Fu, Pengcheng; Johnson, Scott M.; Carrigan, Charles R.
2011-01-01
Hydraulic fracturing is currently the primary method for stimulating low-permeability geothermal reservoirs and creating Enhanced (or Engineered) Geothermal Systems (EGS) with improved permeability and heat production efficiency. Complex natural fracture systems usually exist in the formations to be stimulated and it is therefore critical to understand the interactions between existing fractures and newly created fractures before optimal stimulation strategies can be developed. Our study aims to improve the understanding of EGS stimulation-response relationships by developing and applying computer-based models that can effectively reflect the key mechanisms governing interactions between complex existing fracture networks and newly created hydraulic fractures. In this paper, we first briefly describe the key modules of our methodology, namely a geomechanics solver, a discrete fracture flow solver, a rock joint response model, an adaptive remeshing module, and most importantly their effective coupling. After verifying the numerical model against classical closed-form solutions, we investigate responses of reservoirs with different preexisting natural fractures to a variety of stimulation strategies. The factors investigated include: the in situ stress states (orientation of the principal stresses and the degree of stress anisotropy), pumping pressure, and stimulation sequences of multiple wells.
NASA Astrophysics Data System (ADS)
Allison, C. M.; Roggensack, K.; Clarke, A. B.
2017-12-01
Volatile solubility in magmas is dependent on several factors, including composition and pressure. Mafic (basaltic) magmas with high concentrations of alkali elements (Na and K) are capable of dissolving larger quantities of H2O and CO2 than low-alkali basalt. The exsolution of abundant gases dissolved in alkali-rich mafic magmas can contribute to large explosive eruptions. Existing volatile solubility models for alkali-rich mafic magmas are well calibrated below 200 MPa, but at greater pressures the experimental data is sparse. To allow for accurate interpretation of mafic magmatic systems at higher pressures, we conducted a set of mixed H2O-CO2 volatile solubility experiments between 400 and 600 MPa at 1200 °C in six mafic compositions with variable alkali contents. Compositions include magmas from volcanoes in Italy, Antarctica, and Arizona. Results from our experiments indicate that existing volatile solubility models for alkali-rich mafic magmas, if extrapolated beyond their calibrated range, over-predict CO2 solubility at mid-crustal pressures. Physically, these results suggest that volatile exsolution can occur at deeper levels than what can be resolved from the lower-pressure experimental data. Existing thermodynamic models used to calculate volatile solubility at different pressures require two experimentally derived parameters. These parameters represent the partial molar volume of the condensed volatile species in the melt and its equilibrium constant, both calculated at a standard temperature and pressure. We derived these parameters for each studied composition and the corresponding thermodynamic model shows good agreement with the CO2 solubility data of the experiments. A general alkali basalt solubility model was also constructed by establishing a relationship between magma composition and the thermodynamic parameters. 
We utilize cation fractions from our six compositions along with four compositions from the experimental literature in a linear regression to generate this compositional relationship. Our revised general model provides a new framework to interpret volcanic data, yielding greater depths for melt inclusion entrapment than previously calculated using other models, and it can be applied to mafic magma compositions for which no experimental data is available.
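The two-parameter thermodynamic treatment the abstract describes can be sketched in a few lines, using the standard pressure correction of the equilibrium constant, ln K(P) = ln K0 - ΔV·(P - P0)/(RT). This is a generic illustration: K0 and ΔV below are invented placeholder values, not the parameters fitted from these experiments.

```python
import math

R = 8.314  # J/(mol*K), gas constant

def equilibrium_constant(P, T, K0, dV, P0=1e5):
    """Pressure correction of the carbonate equilibrium constant:
    ln K(P) = ln K0 - dV*(P - P0)/(R*T), where dV is the partial molar
    volume of the dissolved carbonate species in the melt. K0 and dV are
    the two experimentally derived parameters such models require."""
    return K0 * math.exp(-dV * (P - P0) / (R * T))

# Illustrative (not fitted) parameters for an alkali basalt at 1200 C
K0 = 3e-6      # equilibrium constant at the reference pressure
dV = 23e-6     # m^3/mol, partial molar volume of CO3^2- in the melt
T = 1473.15    # K (1200 C)

K_400MPa = equilibrium_constant(400e6, T, K0, dV)
K_600MPa = equilibrium_constant(600e6, T, K0, dV)
```

The sketch shows the structural point of the model: how steeply K, and hence predicted carbonate solubility, falls with pressure is controlled entirely by the ΔV and K0 pair derived for each composition.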
Sensitivity, optimal scaling and minimum roundoff errors in flexible structure models
NASA Technical Reports Server (NTRS)
Skelton, Robert E.
1987-01-01
Traditional modeling notions presume the existence of a truth model that relates the input to the output, without advanced knowledge of the input. This has led to the evolution of education and research approaches (including the available control and robustness theories) that treat the modeling and control design as separate problems. The paper explores the subtleties of this presumption that the modeling and control problems are separable. A detailed study of the nature of modeling errors is useful to gain insight into the limitations of traditional control and identification points of view. Modeling errors need not be small but simply appropriate for control design. Furthermore, the modeling and control design processes are inevitably iterative in nature.
NASA Astrophysics Data System (ADS)
Castagnetti, C.; Dubbini, M.; Ricci, P. C.; Rivola, R.; Giannini, M.; Capra, A.
2017-05-01
The new era of design in architecture and civil engineering applications lies in the Building Information Modeling (BIM) approach, based on a 3D geometric model that includes a 3D database. This is easier for new constructions, whereas for existing buildings the creation of the BIM relies on accurate knowledge of the as-built construction. Such knowledge is provided by a 3D survey, often carried out with laser scanning technology or modern photogrammetry, which can guarantee an adequate point cloud in terms of resolution and completeness while balancing time consumption and costs against the required final accuracy. The BIM approach for existing buildings, and even more so for historical buildings, is not yet a well-known and thoroughly discussed process. There are still several choices to be addressed in the process from the survey to the model, and critical issues to be discussed in the modeling step, particularly when dealing with unconventional elements such as deformed geometries or historical elements. The paper describes a comprehensive workflow spanning the survey and the modeling, focusing on the critical issues and key points needed to obtain a reliable BIM of an existing monument. The case study employed to illustrate the workflow is the Basilica of St. Stefano in Bologna (Italy), a large monumental complex with great religious, historical and architectural assets.
Brown, Jeremiah R; MacKenzie, Todd A; Maddox, Thomas M; Fly, James; Tsai, Thomas T; Plomondon, Mary E; Nielson, Christopher D; Siew, Edward D; Resnic, Frederic S; Baker, Clifton R; Rumsfeld, John S; Matheny, Michael E
2015-12-11
Acute kidney injury (AKI) occurs frequently after cardiac catheterization and percutaneous coronary intervention. Although a clinical risk model exists for percutaneous coronary intervention, no models exist for both procedures, nor do existing models account for risk factors prior to the index admission. We aimed to develop such a model for use in prospective automated surveillance programs in the Veterans Health Administration. We collected data on all patients undergoing cardiac catheterization or percutaneous coronary intervention in the Veterans Health Administration from January 01, 2009 to September 30, 2013, excluding patients with chronic dialysis, end-stage renal disease, renal transplant, and missing pre- and postprocedural creatinine measurement. We used 4 AKI definitions in model development and included risk factors from up to 1 year prior to the procedure and at presentation. We developed our prediction models for postprocedural AKI using the least absolute shrinkage and selection operator (LASSO) and internally validated using bootstrapping. We developed models using 115 633 angiogram procedures and externally validated using 27 905 procedures from a New England cohort. Models had cross-validated C-statistics of 0.74 (95% CI: 0.74-0.75) for AKI, 0.83 (95% CI: 0.82-0.84) for AKIN2, 0.74 (95% CI: 0.74-0.75) for contrast-induced nephropathy, and 0.89 (95% CI: 0.87-0.90) for dialysis. We developed a robust, externally validated clinical prediction model for AKI following cardiac catheterization or percutaneous coronary intervention to automatically identify high-risk patients before and immediately after a procedure in the Veterans Health Administration. Work is ongoing to incorporate these models into routine clinical practice. © 2015 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley Blackwell.
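The model-building step can be sketched with a LASSO fit and a C-statistic computation on synthetic data. For simplicity this uses a linear LASSO solved by proximal gradient (ISTA) on a continuous surrogate outcome, rather than the penalized logistic models one would fit to binary AKI outcomes; the cohort, risk factors and tuning values are all invented.

```python
import numpy as np

rng = np.random.default_rng(0)

def lasso_ista(X, y, lam, n_iter=500):
    """LASSO via proximal gradient (ISTA):
    minimize (1/2n)*||y - X b||^2 + lam*||b||_1."""
    n, p = X.shape
    b = np.zeros(p)
    step = n / np.linalg.norm(X, 2) ** 2          # 1 / Lipschitz constant
    for _ in range(n_iter):
        grad = X.T @ (X @ b - y) / n
        z = b - step * grad
        b = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft-threshold
    return b

def c_statistic(score, y):
    """C-statistic (AUC) via the Mann-Whitney formulation."""
    pos, neg = score[y == 1], score[y == 0]
    wins = (pos[:, None] > neg[None, :]).mean()
    ties = (pos[:, None] == neg[None, :]).mean()
    return wins + 0.5 * ties

# Synthetic cohort: 10 candidate risk factors, only the first 2 carry signal
X = rng.standard_normal((500, 10))
beta = np.zeros(10); beta[0], beta[1] = 1.5, -1.0
risk = X @ beta
y_cont = risk + 0.5 * rng.standard_normal(500)          # continuous surrogate
y_bin = (risk + 0.5 * rng.standard_normal(500) > 0).astype(int)

b_hat = lasso_ista(X, y_cont, lam=0.1)
auc = c_statistic(X @ b_hat, y_bin)
```

The L1 penalty zeroes out the noise predictors while retaining the true risk factors, which is the variable-selection behavior that motivates LASSO for automated surveillance models.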
Enhancing emotional-based target prediction
NASA Astrophysics Data System (ADS)
Gosnell, Michael; Woodley, Robert
2008-04-01
This work extends existing agent-based target movement prediction to include key ideas of behavioral inertia, steady states, and catastrophic change from existing psychological, sociological, and mathematical work. Existing target prediction work inherently assumes a single steady state for target behavior, and attempts to classify behavior based on a single emotional state set. The enhanced, emotional-based target prediction maintains up to three distinct steady states, or typical behaviors, based on a target's operating conditions and observed behaviors. Each steady state has an associated behavioral inertia, similar to the standard deviation of behaviors within that state. The enhanced prediction framework also allows steady state transitions through catastrophic change and individual steady states could be used in an offline analysis with additional modeling efforts to better predict anticipated target reactions.
Serang, Oliver; Noble, William Stafford
2012-01-01
The problem of identifying the proteins in a complex mixture using tandem mass spectrometry can be framed as an inference problem on a graph that connects peptides to proteins. Several existing protein identification methods make use of statistical inference methods for graphical models, including expectation maximization, Markov chain Monte Carlo, and full marginalization coupled with approximation heuristics. We show that, for this problem, the majority of the cost of inference usually comes from a few highly connected subgraphs. Furthermore, we evaluate three different statistical inference methods using a common graphical model, and we demonstrate that junction tree inference substantially improves rates of convergence compared to existing methods. The python code used for this paper is available at http://noble.gs.washington.edu/proj/fido. PMID:22331862
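The graphical model the abstract describes can be illustrated on a toy peptide-protein graph. The noisy-OR likelihood and the parameter values (prior gamma, emission alpha, noise beta) are illustrative simplifications in the spirit of such models, not Fido's exact formulation; brute-force enumeration is exact on a graph this small and stands in for the junction-tree inference the paper advocates for large connected subgraphs.

```python
from itertools import product

def protein_posteriors(edges, peptide_obs, n_prot, gamma=0.3, alpha=0.9, beta=0.05):
    """Exact marginal protein posteriors on a tiny peptide-protein graph.

    edges: dict peptide -> list of parent protein indices
    peptide_obs: dict peptide -> True/False (observed in the spectra)
    Uses a simplified noisy-OR emission model: a peptide is observed with
    probability 1 - (1-beta)*(1-alpha)^n, n = number of present parents."""
    post = [0.0] * n_prot
    Z = 0.0
    for state in product([0, 1], repeat=n_prot):  # enumerate protein states
        prior = 1.0
        for s in state:
            prior *= gamma if s else (1.0 - gamma)
        like = 1.0
        for pep, parents in edges.items():
            n_on = sum(state[p] for p in parents)
            p_obs = 1.0 - (1.0 - beta) * (1.0 - alpha) ** n_on
            like *= p_obs if peptide_obs[pep] else (1.0 - p_obs)
        w = prior * like
        Z += w
        for i, s in enumerate(state):
            if s:
                post[i] += w
    return [p / Z for p in post]

# Peptide 'a' unique to protein 0, 'b' shared, 'c' unique to protein 1
edges = {'a': [0], 'b': [0, 1], 'c': [1]}
obs = {'a': True, 'b': True, 'c': False}
post = protein_posteriors(edges, obs, n_prot=2)
```

Protein 0 explains both observed peptides, while protein 1's unique peptide is missing, so the posterior strongly favors protein 0; the shared peptide alone adds little evidence for protein 1.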
New envelope solitons for Gerdjikov-Ivanov model in nonlinear fiber optics
NASA Astrophysics Data System (ADS)
Triki, Houria; Alqahtani, Rubayyi T.; Zhou, Qin; Biswas, Anjan
2017-11-01
Exact soliton solutions in a class of derivative nonlinear Schrödinger equations including a pure quintic nonlinearity are investigated. By means of the coupled amplitude-phase formulation, we derive a nonlinear differential equation describing the evolution of the wave amplitude in the non-Kerr quintic media. The resulting amplitude equation is then solved to get exact analytical chirped bright, kink, antikink, and singular soliton solutions for the model. It is also shown that the nonlinear chirp associated with these solitons is crucially dependent on the wave intensity and related to self-steepening and group velocity dispersion parameters. Parametric conditions on physical parameters for the existence of chirped solitons are also presented. These localized structures exist due to a balance among quintic nonlinearity, group velocity dispersion, and self-steepening effects.
A Penalized Robust Method for Identifying Gene-Environment Interactions
Shi, Xingjie; Liu, Jin; Huang, Jian; Zhou, Yong; Xie, Yang; Ma, Shuangge
2015-01-01
In high-throughput studies, an important objective is to identify gene-environment interactions associated with disease outcomes and phenotypes. Many commonly adopted methods assume specific parametric or semiparametric models, which may be subject to model mis-specification. In addition, they usually use significance level as the criterion for selecting important interactions. In this study, we adopt the rank-based estimation, which is much less sensitive to model specification than some of the existing methods and includes several commonly encountered data and models as special cases. Penalization is adopted for the identification of gene-environment interactions. It achieves simultaneous estimation and identification and does not rely on significance level. For computation feasibility, a smoothed rank estimation is further proposed. Simulation shows that under certain scenarios, for example with contaminated or heavy-tailed data, the proposed method can significantly outperform the existing alternatives with more accurate identification. We analyze a lung cancer prognosis study with gene expression measurements under the AFT (accelerated failure time) model. The proposed method identifies interactions different from those using the alternatives. Some of the identified genes have important implications. PMID:24616063
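A minimal version of the smoothed rank estimation can be written with the pairwise Wilcoxon/Jaeckel dispersion of residuals, smoothing |u| to sqrt(u² + eps) so plain gradient descent applies. The loss form, step sizes and heavy-tailed test data below are illustrative choices, not the authors' exact estimator, and the penalization step used for interaction selection is omitted.

```python
import numpy as np

rng = np.random.default_rng(1)

def fit_smoothed_rank(X, y, lr=0.1, n_iter=500, eps=1e-4):
    """Rank-based regression: minimize the pairwise dispersion
    sum_{i<j} |e_i - e_j| of residuals e = y - X b, with |u| smoothed to
    sqrt(u^2 + eps) so the objective is differentiable."""
    n, p = X.shape
    b = np.zeros(p)
    for _ in range(n_iter):
        e = y - X @ b
        d = e[:, None] - e[None, :]                 # all pairwise residual gaps
        W = d / np.sqrt(d ** 2 + eps)               # smoothed sign(e_i - e_j)
        b += lr * (X.T @ W.sum(axis=1)) / n ** 2    # gradient step over n^2 pairs
    return b

# Heavy-tailed noise, where least squares is easily thrown off
n = 200
X = rng.standard_normal((n, 2))
beta = np.array([2.0, -1.0])
y = X @ beta + rng.standard_t(df=2, size=n)

b_rank = fit_smoothed_rank(X, y)
```

Because the loss depends on residuals only through their pairwise ordering, gross outliers in y move the fit very little, which is the robustness-to-contamination property the abstract emphasizes.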
Brian Hears: Online Auditory Processing Using Vectorization Over Channels
Fontaine, Bertrand; Goodman, Dan F. M.; Benichoux, Victor; Brette, Romain
2011-01-01
The human cochlea includes about 3000 inner hair cells which filter sounds at frequencies between 20 Hz and 20 kHz. This massively parallel frequency analysis is reflected in models of auditory processing, which are often based on banks of filters. However, existing implementations do not exploit this parallelism. Here we propose algorithms to simulate these models by vectorizing computation over frequency channels, which are implemented in “Brian Hears,” a library for the spiking neural network simulator package “Brian.” This approach allows us to use high-level programming languages such as Python, because with vectorized operations, the computational cost of interpretation represents a small fraction of the total cost. This makes it possible to define and simulate complex models in a simple way, while all previous implementations were model-specific. In addition, we show that these algorithms can be naturally parallelized using graphics processing units, yielding substantial speed improvements. We demonstrate these algorithms with several state-of-the-art cochlear models, and show that they compare favorably with existing, less flexible, implementations. PMID:21811453
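The channel-vectorization idea can be sketched without Brian Hears itself: keep a scalar loop over samples, but update the filter state of every channel with single NumPy array operations. A peak-normalized two-pole resonator stands in here for a real gammatone cascade, and the centre frequencies and bandwidth are arbitrary illustrative choices.

```python
import numpy as np

def filterbank_vectorized(x, fs, centre_freqs, bw_hz=100.0):
    """Run a bank of two-pole resonators over a signal, vectorized across
    channels: the time loop is scalar, but each state update is one NumPy
    operation over all channels at once."""
    cf = np.asarray(centre_freqs, dtype=float)
    r = np.exp(-2 * np.pi * bw_hz / fs)             # shared pole radius
    theta = 2 * np.pi * cf / fs                     # pole angle per channel
    a1, a2 = 2 * r * np.cos(theta), -(r ** 2)
    # input scaling so each channel has ~unit gain at its centre frequency
    g = (1 - r) * np.abs(1 - r * np.exp(2j * theta))
    y1 = np.zeros(len(cf)); y2 = np.zeros(len(cf))
    out = np.empty((len(x), len(cf)))
    for n, xn in enumerate(x):                      # single pass over samples
        y = g * xn + a1 * y1 + a2 * y2              # all channels updated at once
        out[n] = y
        y1, y2 = y, y1
    return out

fs = 8000
t = np.arange(0.0, 0.1, 1 / fs)
x = np.sin(2 * np.pi * 500 * t)                     # a 500 Hz tone
resp = filterbank_vectorized(x, fs, [250, 500, 1000])
rms = np.sqrt((resp ** 2).mean(axis=0))
```

The tone excites the matching 500 Hz channel most strongly, and the interpreted-Python cost is paid once per sample rather than once per sample per channel, which is exactly the saving the library exploits.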
An investigation of modelling and design for software service applications
2017-01-01
Software services offer the opportunity to use a component-based approach for the design of applications. However, this needs a deeper understanding of how to develop service-based applications in a systematic manner, and of the set of properties that need to be included in the ‘design model’. We have used a realistic application to explore systematically how service-based designs can be created and described. We first identified the key properties of an SOA (service oriented architecture) and then undertook a single-case case study to explore its use in the development of a design for a large-scale application in energy engineering, modelling this with existing notations wherever possible. We evaluated the resulting design model using two walkthroughs with both domain and application experts. We were able to successfully develop a design model around the ten properties identified, and to describe it by adapting existing design notations. A component-based approach to designing such systems does appear to be feasible. However, it needs the assistance of a more integrated set of notations for describing the resulting design model. PMID:28489905
A simple, analytic 3-dimensional downburst model based on boundary layer stagnation flow
NASA Technical Reports Server (NTRS)
Oseguera, Rosa M.; Bowles, Roland L.
1988-01-01
A simple downburst model is developed for use in batch and real-time piloted simulation studies of guidance strategies for terminal area transport aircraft operations in wind shear conditions. The model represents an axisymmetric stagnation point flow, based on velocity profiles from the Terminal Area Simulation System (TASS) model developed by Proctor, and satisfies the mass continuity equation in cylindrical coordinates. Altitude dependence, including boundary layer effects near the ground, closely matches real-world measurements, as do the increase, peak, and decay of outflow and downflow with increasing distance from the downburst center. Equations for horizontal and vertical winds were derived and found to be infinitely differentiable, with no singular points in the flow field. In addition, a simple relationship exists among the ratio of maximum horizontal to vertical velocities, the downdraft radius, the depth of outflow, and the altitude of maximum outflow. In use, a microburst is modeled by specifying four characteristic parameters; velocity components in the x, y, and z directions and the corresponding nine partial derivatives are then obtained easily from the velocity equations.
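The model's key property, exact mass continuity combined with a boundary layer near the ground, can be reproduced by a toy axisymmetric flow derived from a Stokes stream function (any such flow is automatically divergence-free). The shaping function g(z) and the parameter values below are our own illustrative choices, not the TASS-based Oseguera-Bowles profiles.

```python
import numpy as np

def downburst_velocity(r, z, lam=0.01, z0=60.0):
    """Axisymmetric stagnation-type downburst sketch from the stream function
    psi = -(lam/2) r^2 g(z) with g(z) = z + z0*(exp(-z/z0) - 1), so that
    continuity holds exactly and the radial outflow vanishes at the ground
    (a crude boundary layer). lam (1/s) and z0 (m) are illustrative."""
    gp = 1.0 - np.exp(-z / z0)                 # g'(z)
    g = z + z0 * (np.exp(-z / z0) - 1.0)
    u = 0.5 * lam * r * gp                     # radial outflow
    w = -lam * g                               # downdraft
    return u, w

# Numerical continuity check: (1/r) d(r u)/dr + dw/dz should vanish
r = np.linspace(100.0, 2000.0, 50)
z = np.linspace(10.0, 1000.0, 50)
R, Z = np.meshgrid(r, z, indexing="ij")
U, W = downburst_velocity(R, Z)
dr, dz = r[1] - r[0], z[1] - z[0]
div = (np.gradient(R * U, dr, axis=0, edge_order=2) / R
       + np.gradient(W, dz, axis=1, edge_order=2))
```

Deriving the velocities from a stream function is the design choice that guarantees the continuity equation is satisfied everywhere, with no singular points, mirroring the properties claimed for the model.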
Long-Period Tidal Variations in the Length of Day
NASA Technical Reports Server (NTRS)
Ray, Richard D.; Erofeeva, Svetlana Y.
2014-01-01
A new model of long-period tidal variations in length of day is developed. The model comprises 80 spectral lines with periods between 18.6 years and 4.7 days, and it consistently includes effects of mantle anelasticity and dynamic ocean tides for all lines. The anelastic properties follow Wahr and Bergen; experimental confirmation for their results now exists at the fortnightly period, but there remains uncertainty when extrapolating to the longest periods. The ocean modeling builds on recent work with the fortnightly constituent, which suggests that oceanic tidal angular momentum can be reliably predicted at these periods without data assimilation. This is a critical property when modeling most long-period tides, for which little observational data exist. Dynamic ocean effects are quite pronounced at shortest periods as out-of-phase rotation components become nearly as large as in-phase components. The model is tested against a 20 year time series of space geodetic measurements of length of day. The current international standard model is shown to leave significant residual tidal energy, and the new model is found to mostly eliminate that energy, with especially large variance reduction for constituents Sa, Ssa, Mf, and Mt.
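The spectral-line structure of such a model can be sketched as one cos/sin pair per tidal line, fitted by least squares; the in-phase and out-of-phase parts capture the kind of lag the dynamic ocean introduces. The two periods below are the familiar Mf and Mm lines, but the amplitudes, noise level and 20-year span are invented for illustration, not values from the 80-line model.

```python
import numpy as np

rng = np.random.default_rng(2)

# Two illustrative spectral lines; amplitudes are made-up, in milliseconds
periods = np.array([13.66, 27.55])              # days (Mf, Mm)
A_true = np.array([0.35, 0.30])                 # in-phase amplitudes
B_true = np.array([0.06, 0.04])                 # out-of-phase amplitudes

t = np.arange(0.0, 20 * 365.25)                 # 20 years of daily samples
omega = 2.0 * np.pi / periods
signal = sum(A_true[k] * np.cos(omega[k] * t) + B_true[k] * np.sin(omega[k] * t)
             for k in range(len(periods)))
lod = signal + 0.05 * rng.standard_normal(t.size)   # noisy 'observed' LOD

# Least-squares fit of the line model: one cos/sin column pair per line
G = np.column_stack([f(w * t) for w in omega for f in (np.cos, np.sin)])
coef, *_ = np.linalg.lstsq(G, lod, rcond=None)
residual = lod - G @ coef
```

Subtracting the fitted lines removes nearly all the tidal variance from the series, the same variance-reduction test by which the abstract compares the new model against the international standard.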
Simulation Experiment Description Markup Language (SED-ML) Level 1 Version 2.
Bergmann, Frank T; Cooper, Jonathan; Le Novère, Nicolas; Nickerson, David; Waltemath, Dagmar
2015-09-04
The number, size and complexity of computational models of biological systems are growing at an ever increasing pace. It is imperative to build on existing studies by reusing and adapting existing models and parts thereof. The description of the structure of models is not sufficient to enable the reproduction of simulation results. One also needs to describe the procedures the models are subjected to, as recommended by the Minimum Information About a Simulation Experiment (MIASE) guidelines. This document presents Level 1 Version 2 of the Simulation Experiment Description Markup Language (SED-ML), a computer-readable format for encoding simulation and analysis experiments to apply to computational models. SED-ML files are encoded in the Extensible Markup Language (XML) and can be used in conjunction with any XML-based model encoding format, such as CellML or SBML. A SED-ML file includes details of which models to use, how to modify them prior to executing a simulation, which simulation and analysis procedures to apply, which results to extract and how to present them. Level 1 Version 2 extends the format by allowing the encoding of repeated and chained procedures.
The 2 SOPS Ephemeris Enhancement Endeavor (EEE)
1997-12-01
...deficiencies. They include: 1. Solar Pressure States. A 1995 study revealed that some deficiencies exist within the solar state model used by the
Tiezzi, F; de Los Campos, G; Parker Gaddis, K L; Maltecca, C
2017-03-01
Genotype by environment interaction (G × E) in dairy cattle productive traits has been shown to exist, but current genetic evaluation methods do not take this component into account. As several environmental descriptors (e.g., climate, farming system) are known to vary within the United States, not accounting for the G × E could lead to reranking of bulls and loss in genetic gain. Using test-day records on milk yield, somatic cell score, fat, and protein percentage from all over the United States, we computed within herd-year-season daughter yield deviations for 1,087 Holstein bulls and regressed them on genetic and environmental information to estimate variance components and to assess prediction accuracy. Genomic information was obtained from a 50k SNP marker panel. Environmental effect inputs included herd (160 levels), geographical region (7 levels), geographical location (2 variables), climate information (7 variables), and management conditions of the herds (16 total variables divided into 4 subgroups). For each set of environmental descriptors, environmental, genomic, and G × E components were sequentially fitted. Variance component estimates confirmed the presence of G × E on milk yield, with its effect being larger than the main genetic and environmental effects for some models. Conversely, G × E was moderate for somatic cell score and small for milk composition. Genotype by environment interaction, when included, partially eroded the genomic effect (as compared with the models where G × E was not included), suggesting that the genomic variance could at least in part be attributed to G × E not appropriately accounted for. Model predictive ability was assessed using 3 cross-validation schemes (new bulls, incomplete progeny test, and new environmental conditions), and performance was compared with a reference model including only the main genomic effect.
In each scenario, at least 1 of the models including G × E was able to perform better than the reference model, although it was not possible to find the overall best-performing model that included the same set of environmental descriptors. In general, the methodology used is promising in accounting for G × E in genomic predictions, but challenges exist in identifying a unique set of covariates capable of describing the entire variety of environments. Copyright © 2017 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
Modeling Array Stations in SIG-VISA
NASA Astrophysics Data System (ADS)
Ding, N.; Moore, D.; Russell, S.
2013-12-01
We add support for array stations to SIG-VISA, a system for nuclear monitoring using probabilistic inference on seismic signals. Array stations comprise a large portion of the IMS network; they can provide increased sensitivity and more accurate directional information compared to single-component stations. Our existing model assumed that signals were independent at each station, which is false when many stations are close together, as in an array. The new model removes that assumption by jointly modeling signals across array elements. This is done by extending our existing Gaussian process (GP) regression models, also known as kriging, from a 3-dimensional single-component space of events to a 6-dimensional space of station-event pairs. For each array and each event attribute (including coda decay, coda height, amplitude transfer and travel time), we model the joint distribution across array elements using a Gaussian process that learns the correlation lengthscale across the array, thereby incorporating information from array stations into the probabilistic inference framework. To evaluate the effectiveness of our model, we perform 'probabilistic beamforming' on new events using our GP model, i.e., we compute the event azimuth having highest posterior probability under the model, conditioned on the signals at array elements. We compare the results from our probabilistic inference model to the beamforming currently performed by IMS station processing.
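The kriging component can be sketched as plain GP regression with an RBF kernel, where the kernel lengthscale plays the role of the correlation scale learned across array elements. The element coordinates, travel-time residuals and hyperparameters below are invented for illustration; the real model works in a 6-dimensional station-event space over several event attributes.

```python
import numpy as np

def gp_posterior(X_tr, y_tr, X_te, lengthscale=1.0, sigma_f=1.0, sigma_n=0.1):
    """Plain GP (kriging) regression with an RBF kernel: posterior mean and
    variance at test points given noisy training observations."""
    def k(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return sigma_f ** 2 * np.exp(-0.5 * d2 / lengthscale ** 2)
    K = k(X_tr, X_tr) + sigma_n ** 2 * np.eye(len(X_tr))
    Ks = k(X_te, X_tr)
    alpha = np.linalg.solve(K, y_tr)
    mean = Ks @ alpha
    var = sigma_f ** 2 - np.einsum('ij,ij->i', Ks @ np.linalg.inv(K), Ks)
    return mean, var

# Hypothetical array-element coordinates (km) and travel-time residuals (s)
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [2.0, 2.0]])
y = np.array([0.10, 0.12, 0.09, 0.20])
Xq = np.array([[0.5, 0.5]])                 # an element with no measurement
mean, var = gp_posterior(X, y, Xq, lengthscale=1.5)
```

Conditioning on the neighboring elements pulls the prediction toward their residuals and shrinks its variance below the prior, which is how joint modeling across an array sharpens the inference relative to treating elements independently.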
Laser Induced Aluminum Surface Breakdown Model
NASA Technical Reports Server (NTRS)
Chen, Yen-Sen; Liu, Jiwen; Zhang, Sijun; Wang, Ten-See (Technical Monitor)
2002-01-01
Laser powered propulsion systems involve complex fluid dynamics, thermodynamics and radiative transfer processes. Based on an unstructured-grid, pressure-based computational aerothermodynamics platform, several sub-models describing such underlying physics as laser ray tracing and focusing, thermal non-equilibrium, plasma radiation and air spark ignition have been developed. The proposed work shall extend the numerical platform and existing sub-models to include the aluminum wall surface Inverse Bremsstrahlung (IB) effect, from which surface ablation and free-electron generation can be initiated without relying on the air spark ignition sub-model. The following tasks will be performed to accomplish the research objectives.
NASA Technical Reports Server (NTRS)
Schwan, Karsten
1994-01-01
Atmospheric modeling is a grand challenge problem for several reasons, including its inordinate computational requirements and its generation of large amounts of data concurrent with its use of very large data sets derived from measurement instruments like satellites. In addition, atmospheric models are typically run several times, on new data sets or to reprocess existing data sets, to investigate or reinvestigate specific chemical or physical processes occurring in the earth's atmosphere, to understand model fidelity with respect to observational data, or simply to experiment with specific model parameters or components.
Modelling of additive manufacturing processes: a review and classification
NASA Astrophysics Data System (ADS)
Stavropoulos, Panagiotis; Foteinopoulos, Panagis
2018-03-01
Additive manufacturing (AM) is a very promising technology; however, there are a number of open issues related to the different AM processes. The literature on modelling the existing AM processes is reviewed and classified. A categorization of the different AM processes in process groups, according to the process mechanism, has been conducted and the most important issues are stated. Suggestions are made as to which approach is more appropriate according to the key performance indicator desired to be modelled and a discussion is included as to the way that future modelling work can better contribute to improving today's AM process understanding.
Assessment of Higher-Order RANS Closures in a Decelerated Planar Wall-Bounded Turbulent Flow
NASA Technical Reports Server (NTRS)
Jeyapaul, Elbert; Coleman, Gary N.; Rumsey, Christopher L.
2014-01-01
A reference DNS database is presented, which includes third- and fourth-order moment budgets for unstrained and strained planar channel flow. Existing RANS closure models for third- and fourth-order terms are surveyed, and new model ideas are introduced. The various models are then compared with the DNS data term by term using a priori testing of the higher-order budgets of turbulence transport, velocity-pressure-gradient, and dissipation for both the unstrained and strained databases. Generally, the models for the velocity-pressure-gradient terms are most in need of improvement.
Coteaching in physical education: a strategy for inclusive practice.
Grenier, Michelle A
2011-04-01
Qualitative research methods were used to explore the factors that informed general and adapted physical education teachers' coteaching practices within an inclusive high school physical education program. Two physical education teachers and one adapted physical education teacher were observed over a 16-week period. Interviews, field notes, and documents were collected and a constant comparative approach was used in the analysis that adopted a social model framework. Primary themes included community as the cornerstone for student learning, core values of trust and respect, and creating a natural support structure. Coteaching practices existed because of the shared values of teaching, learning, and the belief that all students should be included. Recommendations include shifting orientations within professional preparation programs to account for the social model of disability.
Mejlholm, Ole; Dalgaard, Paw
2013-10-15
A new and extensive growth and growth boundary model for psychrotolerant Lactobacillus spp. was developed and validated for processed and unprocessed products of seafood and meat. The new model was developed by refitting and expanding an existing cardinal parameter model for growth and the growth boundary of lactic acid bacteria (LAB) in processed seafood (O. Mejlholm and P. Dalgaard, J. Food Prot. 70, 2485-2497, 2007). Initially, to estimate values for the maximum specific growth rate at the reference temperature of 25 °C (μref) and the theoretical minimum temperature that prevents growth of psychrotolerant LAB (T(min)), the existing LAB model was refitted to data from experiments with seafood and meat products reported not to include nitrite or any of the four organic acids evaluated in the present study. Next, dimensionless terms modelling the antimicrobial effect of nitrite, and acetic, benzoic, citric and sorbic acids on growth of Lactobacillus sakei were added to the refitted model, together with minimum inhibitory concentrations determined for the five environmental parameters. The new model including the effect of 12 environmental parameters, as well as their interactive effects, was successfully validated using 229 growth rates (μ(max) values) for psychrotolerant Lactobacillus spp. in seafood and meat products. Average bias and accuracy factor values of 1.08 and 1.27, respectively, were obtained when observed and predicted μ(max) values of psychrotolerant Lactobacillus spp. were compared. Thus, on average μ(max) values were only overestimated by 8%. The performance of the new model was equally good for seafood and meat products, and the importance of including the effect of acetic, benzoic, citric and sorbic acids and to a lesser extent nitrite in order to accurately predict growth of psychrotolerant Lactobacillus spp. was clearly demonstrated. The new model can be used to predict growth of psychrotolerant Lactobacillus spp.
in seafood and meat products; e.g., prediction of the time to reach a critical cell concentration of bacteria is useful for establishing shelf life. In addition, the high number of environmental parameters included in the new model makes it flexible and suitable for product development, as the effect of substituting one combination of preservatives with another can be predicted. In general, the performance of the new model was unacceptable for other types of LAB, including Carnobacterium spp., Leuconostoc spp. and Weissella spp. © 2013.
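The cardinal-parameter structure of such growth models can be sketched with the gamma concept: a reference rate multiplied by one dimensionless inhibition term per environmental factor, each clamped to [0, 1]. The four terms and all parameter values below are illustrative of psychrotolerant LAB modelling generally, not the validated 12-parameter coefficients of the new model.

```python
def lab_growth_rate(T, pH, aw, lac_mM=0.0,
                    mu_ref=0.60, T_min=-5.25, T_ref=25.0,
                    pH_min=4.24, aw_min=0.928, mic_lac_mM=12.0):
    """Gamma-concept sketch of a cardinal parameter growth model:
    mu_max = mu_ref * gamma(T) * gamma(pH) * gamma(aw) * gamma(lactate).
    Growth stops (mu_max = 0) when any single factor reaches its limit.
    All parameter values are illustrative placeholders."""
    gT = max((T - T_min) / (T_ref - T_min), 0.0) ** 2      # temperature term
    gpH = max(1.0 - 10.0 ** (pH_min - pH), 0.0)            # pH term
    gaw = max((aw - aw_min) / (1.0 - aw_min), 0.0)         # water activity term
    glac = max(1.0 - lac_mM / mic_lac_mM, 0.0)             # organic acid term
    return mu_ref * gT * gpH * gaw * glac                  # 1/h

mu_chill = lab_growth_rate(T=5.0, pH=6.0, aw=0.975)
mu_warm = lab_growth_rate(T=15.0, pH=6.0, aw=0.975)
mu_inhibited = lab_growth_rate(T=5.0, pH=4.0, aw=0.975)    # below pH_min
```

The multiplicative form is what makes such models easy to extend: adding a preservative means adding one more gamma term with its own MIC, which is essentially how the nitrite and organic acid terms were incorporated.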
NASA Astrophysics Data System (ADS)
Pan, M.; Wood, E. F.
2004-05-01
This study explores a method to estimate various components of the water cycle (ET, runoff, land storage, etc.) from a number of different information sources, including both observations and observation-enhanced model simulations. Unlike existing data assimilation schemes, this constrained Kalman filtering approach keeps the water budget perfectly closed while optimally updating the states of the underlying model (the VIC model) using observations. Assimilating different data sources in this way has several advantages: (1) a physical model is included, making the estimated time series smooth, gap-free, and more physically consistent; (2) uncertainties in the model and observations are properly addressed; (3) the model is constrained by observations, which reduces model biases; and (4) the water balance is preserved throughout the assimilation. Experiments are carried out in the Southern Great Plains region, where the necessary observations have been collected. The method may also be applied to other problems with physical constraints (e.g. energy cycles) and at different scales.
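The budget-closing step can be sketched as a weighted projection of the Kalman analysis onto the linear water-balance constraint; the state layout and covariances below are illustrative assumptions, not values from the study:

```python
import numpy as np

def project_onto_budget(x, P_cov, A, b):
    """Project a Kalman analysis x onto the linear constraint A @ x = b
    (e.g. precipitation - ET - runoff - storage change = 0), weighting
    by the analysis covariance so better-known components move less."""
    K = P_cov @ A.T @ np.linalg.inv(A @ P_cov @ A.T)
    return x - K @ (A @ x - b)

# toy state = [precip, ET, runoff, dS]; enforce P - ET - Q - dS = 0
A = np.array([[1.0, -1.0, -1.0, -1.0]])
b = np.array([0.0])
x = np.array([10.0, 4.0, 3.0, 2.5])    # budget residual = 0.5
P_cov = np.diag([0.1, 1.0, 0.5, 1.0])  # precip best known, moves least
x_c = project_onto_budget(x, P_cov, A, b)
```

After the projection the budget closes exactly, while the most certain component (here precipitation) is adjusted the least.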
[The methods of assessment of health risk from exposure to radon and radon daughters].
Demin, V F; Zhukovskiy, M V; Kiselev, S M
2014-01-01
A critical analysis of existing models of the dose-effect relationship (RDE) for the effect of radon exposure on human health has been performed, and it is concluded that these models can and should be improved. A new, improved version of the RDE has been developed. A technique for assessing the human health risk of exposure to radon is described, including a method for estimating radon exposure doses, the improved RDE model, and an appropriate risk-assessment methodology. The methodology is proposed for use in the territory of Russia.
Probabilistic computer model of optimal runway turnoffs
NASA Technical Reports Server (NTRS)
Schoen, M. L.; Preston, O. W.; Summers, L. G.; Nelson, B. A.; Vanderlinden, L.; Mcreynolds, M. C.
1985-01-01
Landing delays are currently a problem at major air carrier airports, and many forecasters agree that airport congestion will worsen by the end of the century. It is anticipated that some types of delays can be reduced by an efficient optimal runway exit system allowing the increased approach volumes necessary at congested airports. A computerized Probabilistic Runway Turnoff Model is defined that locates exits and defines path geometry for a selected maximum runway occupancy time appropriate for each TERPS aircraft category. The model includes an algorithm for lateral ride comfort limits.
Building an award-winning women's health ambulatory service and beyond.
Allen, Lisa W; Maxwell, Susan; Greene, John F
2003-01-01
Many barriers exist for the provision of high-quality health care to inner-city minority women. The barriers include access to care, compliance problems, financial concerns, system navigation issues, as well as language barriers. This article describes the transition of the Women's Ambulatory Health Services at Hartford Hospital from a traditional clinic model to a culturally sensitive private practice model. The road to transition was paved by valuable input from staff as well as patients. The final product was a much more efficient, inviting model that catered to the needs of the community.
Women’s Sexuality: Behaviors, Responses, and Individual Differences
Andersen, Barbara L.; Cyranowski, Jill M.
2009-01-01
Classic and contemporary approaches to the assessment of female sexuality are discussed. General approaches, assessment strategies, and models of female sexuality are organized within the conceptual domains of sexual behaviors, sexual responses (desire, excitement, orgasm, and resolution), and individual differences, including general and sex-specific personality models. Where applicable, important trends and relationships are highlighted in the literature with both existing reports and previously unpublished data. The present conceptual overview highlights areas in sexual assessment and model building that are in need of further research and theoretical clarification. PMID:8543712
The lunar crust - A product of heterogeneous accretion or differentiation of a homogeneous moon
NASA Technical Reports Server (NTRS)
Brett, R.
1973-01-01
The outer portion of the moon (including the aluminum-rich crust and the source regions of mare basalts) was either accreted heterogeneously or was the product of widespread differentiation of an originally homogeneous source. Existing evidence for and against each of these two models is reviewed. It is concluded that the accretionary model presents more problems than it solves, and the model involving differentiation of an originally homogeneous moon is considered to be more plausible. A hypothesis for the formation of mare basalts is advanced.
Jasim Mohammed, M; Ibrahim, Rabha W; Ahmad, M Z
2017-03-01
In this paper, we consider a low initial population model. Our aim is to study the periodicity of this model by using neutral differential equations, which arise in various fields including biology. We generalize the neutral Rayleigh equation to third order by exploiting fractional calculus, in particular the Riemann-Liouville differential operator. We establish the existence and uniqueness of a periodic solution. The technique depends on the continuation theorem of coincidence degree theory. An example is presented to demonstrate the finding.
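For reference, the Riemann-Liouville differential operator invoked here has the standard definition: for order $\alpha$ with $n-1 < \alpha < n$,

```latex
{}^{RL}D^{\alpha}_{0^+} f(t)
  = \frac{1}{\Gamma(n-\alpha)} \frac{d^{n}}{dt^{n}}
    \int_{0}^{t} (t-s)^{\,n-\alpha-1} f(s)\, ds ,
  \qquad n-1 < \alpha < n .
```

For $\alpha = n$ the operator reduces to the ordinary $n$-th derivative, which is how the fractional model contains the classical third-order neutral Rayleigh equation as a limit.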
Attaining minimally disruptive medicine: context, challenges and a roadmap for implementation.
Shippee, N D; Allen, S V; Leppin, A L; May, C R; Montori, V M
2015-01-01
In this second of two papers on minimally disruptive medicine, we use the language of patient workload and patient capacity from the Cumulative Complexity Model to accomplish three tasks. First, we outline the current context in healthcare, comprised of contrasting problems: some people lack access to care and others receive too much care in an overmedicalised system, both of which reflect imbalances between patients' workloads and their capacity. Second, we identify and address five tensions and challenges between minimally disruptive medicine, the existing context, and other approaches to accessible and patient-centred care such as evidence-based medicine and greater patient engagement. Third, we outline a roadmap of three strategies toward implementing minimally disruptive medicine in practice, including large-scale paradigm shifts, mid-level add-ons to existing reform efforts, and a modular strategy using an existing 'toolkit' that is more limited in scope, but can fit into existing healthcare systems.
Tools for visually exploring biological networks.
Suderman, Matthew; Hallett, Michael
2007-10-15
Many tools exist for visually exploring biological networks including well-known examples such as Cytoscape, VisANT, Pathway Studio and Patika. These systems play a key role in the development of integrative biology, systems biology and integrative bioinformatics. The trend in the development of these tools is to go beyond 'static' representations of cellular state, towards a more dynamic model of cellular processes through the incorporation of gene expression data, subcellular localization information and time-dependent behavior. We provide a comprehensive review of the relative advantages and disadvantages of existing systems with two goals in mind: to aid researchers in efficiently identifying the appropriate existing tools for data visualization; to describe the necessary and realistic goals for the next generation of visualization tools. In view of the first goal, we provide in the Supplementary Material a systematic comparison of more than 35 existing tools in terms of over 25 different features. Supplementary data are available at Bioinformatics online.
A New Framework for Cumulus Parametrization - A CPT in action
NASA Astrophysics Data System (ADS)
Jakob, C.; Peters, K.; Protat, A.; Kumar, V.
2016-12-01
The representation of convection in climate models remains a major Achilles heel in our pursuit of better predictions of global and regional climate. The basic principle underpinning the parametrisation of tropical convection in global weather and climate models is that there exist discernible interactions between the resolved model scale and the parametrised cumulus scale. Furthermore, there must be at least some predictive power in the larger scales for the statistical behaviour on small scales if we are to formally close the parametrised equations. The presentation will discuss a new framework for cumulus parametrisation based on the idea of separating the prediction of cloud area from that of velocity. This idea is put into practice by combining an existing multi-scale stochastic cloud model with observations to arrive at a prediction of the area fraction of deep precipitating convection. Using mid-tropospheric humidity and vertical motion as predictors, the model is shown to reproduce the observed behaviour of both the mean and the variability of deep convective area fraction well. The framework allows for the inclusion of convective organisation and can, in principle, be made resolution-aware or resolution-independent. When combined with simple assumptions about cloud-base vertical motion, the model can be used as a closure assumption in any existing cumulus parametrisation. Results of applying this idea in the ECHAM model indicate significant improvements in the simulation of tropical variability, including but not limited to the MJO. This presentation will highlight how the close collaboration of the observational, theoretical and model development communities, in the spirit of the climate process teams, can lead to significant progress on long-standing issues in climate modelling while preserving the freedom of individual groups to pursue their specific implementation of an agreed framework.
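The shape of such a closure can be illustrated with a toy deterministic stand-in; the logistic form, the function name, and the coefficients below are purely hypothetical illustrations and are not the stochastic cloud model of the study:

```python
import math

def deep_convective_area_fraction(rh_mid, omega_500, a=8.0, b=-0.02, c=-5.0):
    """Toy closure: predicted area fraction of deep precipitating
    convection grows with mid-tropospheric relative humidity
    (rh_mid, 0-1) and with large-scale ascent (omega_500 in hPa/h,
    negative = upward). Coefficients a, b, c are placeholders."""
    z = a * rh_mid + b * omega_500 + c
    return 1.0 / (1.0 + math.exp(-z))  # bounded in (0, 1) by construction
```

The bounded output is what lets a quantity like this stand in for the closure of a mass-flux scheme once cloud-base vertical motion is assumed.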
Risk factors for disability discharge in enlisted active duty Army soldiers.
Piccirillo, Amanda L; Packnett, Elizabeth R; Cowan, David N; Boivin, Michael R
2016-04-01
The rate of permanent disability retirement in U.S. Army soldiers and the prevalence of combat-related disabilities have increased significantly over time. Prior research on risk factors associated with disability retirement included soldiers retired prior to the conflicts in Iraq and Afghanistan. To identify risk factors for disability discharge among soldiers enlisted in the U.S. Army during military operations in Iraq and Afghanistan. In this case-control study, cases included active duty soldiers evaluated for disability discharge. Controls, randomly selected from soldiers with no history of disability evaluation, were matched to cases based on enlistment year and sex. Conditional logistic regression models calculated the odds of disability discharge. Attributable fractions estimated the burden of disability for specific pre-existing condition categories. Poisson regression models compared the risk of disability discharge related to common disability types by deployment and combat status. Characteristics at military enlistment associated with increased odds of disability discharge included a pre-existing condition, increased age or body mass index, white race, and being divorced. Musculoskeletal conditions and overweight contributed the largest proportion of disabilities. Deployment was protective against disability discharge or receiving a musculoskeletal-related disability, but significantly increased the risk of disability related to a psychiatric or neurological condition. Soldiers with a pre-existing condition at enlistment, particularly a musculoskeletal condition, had increased odds of disability discharge. Risk of disability was dependent on condition category when stratified by deployment and combat status. Additional research examining conditions during pre-disability hospitalizations could provide insight into specific conditions that commonly lead to disability discharge. Copyright © 2016 Elsevier Inc. All rights reserved.
Management of fluid mud in estuaries, bays, and lakes. II: Measurement, modeling, and management
McAnally, W.H.; Teeter, A.; Schoellhamer, David H.; Friedrichs, C.; Hamilton, D.; Hayter, E.; Shrestha, P.; Rodriguez, H.; Sheremet, A.; Kirby, R.
2007-01-01
Techniques for the measurement, modeling, and management of fluid mud are available, but research is needed to improve them. Fluid mud can be difficult to detect, measure, or sample, which has led to new instruments and new ways of using existing instruments. Multifrequency acoustic fathometers sense neither density nor viscosity and are therefore unreliable for measuring fluid mud. Nuclear density probes, towed sleds, seismic methods, and drop probes equipped with density meters offer the potential for accurate measurements. Numerical modeling of fluid mud requires solving governing equations for flow velocity, density, pressure, salinity, and water surface, plus sediment submodels. A number of such models exist in one-, two-, and three-dimensional form, but they rely on empirical relationships that require substantial site-specific validation against observations. Fluid mud management techniques can be classified as those that accomplish source control, formation control, and removal. Nautical depth, a fourth category, defines the channel bottom as a specific fluid mud density, or an alternative parameter, that is safe for navigation. Source control includes watershed management measures to keep fine sediment out of waterways and in-water measures such as structures and traps. Formation control methods include streamlined channels and structures, other measures to reduce flocculation, and structures that train currents. Removal methods include the traditional dredging and transport of dredged material, plus agitation that contributes to formation control and/or nautical depth. Conditioning of fluid mud by dredging and aerating offers the possibility of improved navigability. Two examples, the Atchafalaya Bar Channel and Savannah Harbor, illustrate the use of measurements and management of fluid mud.
SECURITY MODELING FOR MARITIME PORT DEFENSE RESOURCE ALLOCATION
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harris, S.; Dunn, D.
2010-09-07
Redeployment of existing law enforcement resources and optimal use of geographic terrain are examined for countering the threat of a maritime-based small-vessel radiological or nuclear attack. The evaluation was based on modeling conducted by the Savannah River National Laboratory that involved the development of options for defensive resource allocation to reduce the risk of a maritime-based radiological or nuclear threat. A diverse range of potential attack scenarios was assessed. As a result of identifying vulnerable pathways, effective countermeasures can be deployed using current resources. The modeling used the Automated Vulnerability Evaluation for Risks of Terrorism (AVERT®) software to conduct computer-based simulation. The models provided estimates of the probability of encountering an adversary based on allocated resources, including response boats, patrol boats, and helicopters, over various environmental conditions including day, night, rough seas, and various traffic flow rates.
Theoretical and software considerations for nonlinear dynamic analysis
NASA Technical Reports Server (NTRS)
Schmidt, R. J.; Dodds, R. H., Jr.
1983-01-01
In the finite element method for structural analysis, it is generally necessary to discretize the structural model into a very large number of elements to accurately evaluate displacements, strains, and stresses. As the complexity of the model increases, the number of degrees of freedom can easily exceed the capacity of present-day software systems. Improvements to structural analysis software, including more efficient use of existing hardware and improved structural modeling techniques, are discussed. One modeling technique used successfully in static linear and nonlinear analysis is multilevel substructuring. This research extends the use of multilevel substructure modeling to dynamic analysis and defines the requirements for a general purpose software system capable of efficient nonlinear dynamic analysis. The multilevel substructuring technique is presented, the analytical formulations and computational procedures for dynamic analysis and nonlinear mechanics are reviewed, and an approach to the design and implementation of a general purpose structural software system is presented.
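The interior-DOF elimination at the heart of substructuring is static (Guyan) condensation, which reduces each substructure to a stiffness acting only on its boundary degrees of freedom; a minimal sketch:

```python
import numpy as np

def condense(K, boundary, interior):
    """Static (Guyan) condensation: eliminate interior DOFs of a
    substructure, K_r = K_bb - K_bi K_ii^-1 K_ib, leaving a reduced
    stiffness on the boundary DOFs only."""
    Kbb = K[np.ix_(boundary, boundary)]
    Kbi = K[np.ix_(boundary, interior)]
    Kib = K[np.ix_(interior, boundary)]
    Kii = K[np.ix_(interior, interior)]
    return Kbb - Kbi @ np.linalg.solve(Kii, Kib)

# sanity check: two springs (k1, k2) in series share an interior node;
# condensing that node must recover the series stiffness k1*k2/(k1+k2).
k1, k2 = 2.0, 3.0
K = np.array([[ k1,     -k1,  0.0],
              [-k1, k1 + k2,  -k2],
              [0.0,     -k2,   k2]])
Kr = condense(K, boundary=[0, 2], interior=[1])
```

In multilevel substructuring this reduction is applied recursively: condensed substructures become the "elements" of the next level, which is what keeps the assembled problem within hardware limits.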
Recent advances in the modelling of crack growth under fatigue loading conditions
NASA Technical Reports Server (NTRS)
Dekoning, A. U.; Tenhoeve, H. J.; Henriksen, T. K.
1994-01-01
Fatigue crack growth associated with cyclic (secondary) plastic flow near a crack front is modelled using an incremental formulation. A new description of threshold behaviour under small load cycles is included. Quasi-static crack extension under high load excursions is described using an incremental formulation of the R-curve (crack growth resistance) concept. The integration of the equations is discussed. For constant amplitude load cycles the results will be compared with existing crack growth laws. It will be shown that the model also properly describes interaction effects between fatigue crack growth and quasi-static crack extension. To evaluate its more general applicability, the model has been included in the NASGRO computer code for damage tolerance analysis. For this purpose the NASGRO program was provided with the CORPUS and STRIP-YIELD models for computation of crack opening load levels. The implementation is discussed and recent results of the verification are presented.
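The flavour of an incremental crack-growth formulation with a threshold can be conveyed with a generic Paris-type law integrated cycle by cycle; the constants and geometry factor below are illustrative placeholders, not the NASGRO, CORPUS, or STRIP-YIELD models:

```python
import math

def grow_crack(a0, delta_sigma, cycles, C=1e-11, m=3.0, dK_th=2.0, Y=1.12):
    """Cycle-by-cycle integration of a generic Paris-type law
    da/dN = C * dK**m, with a threshold dK_th (MPa*sqrt(m)) below
    which no growth occurs. a0 in m, delta_sigma in MPa; all
    constants are illustrative."""
    a = a0
    for _ in range(cycles):
        dK = Y * delta_sigma * math.sqrt(math.pi * a)  # stress intensity range
        if dK > dK_th:
            a += C * dK ** m  # crack-length increment for this cycle
    return a
```

Because dK grows with crack length, an incremental scheme like this naturally captures the acceleration of growth that a closed-form life estimate has to approximate.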
Modeling in Big Data Environments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Endert, Alexander; Szymczak, Samantha; Gunning, Dave
Human-Centered Big Data Research (HCBDR) is an area of work that focuses on the methodologies and research areas concerned with understanding how humans interact with “big data”. In the context of this paper, we refer to “big data” in a holistic sense, including most (if not all) of the dimensions defining the term, such as complexity, variety, velocity, veracity, etc. Simply put, big data requires us as researchers to question and reconsider existing approaches, with the opportunity to illuminate new kinds of insights that were traditionally out of reach to humans. The purpose of this article is to summarize the discussions and ideas about the role of models in HCBDR at a recent workshop. Models, within the context of this paper, include both computational and conceptual mental models. As such, the discussions summarized in this article seek to understand the connection between these two categories of models.
A Physically Based Coupled Chemical and Physical Weathering Model for Simulating Soilscape Evolution
NASA Astrophysics Data System (ADS)
Willgoose, G. R.; Welivitiya, D.; Hancock, G. R.
2015-12-01
A critical missing link in existing landscape evolution models is a dynamic soil evolution model in which soils co-evolve with the landform. Work by the authors over the last decade has demonstrated a computationally manageable model for soil profile evolution (soilscape evolution) based on physical weathering. For chemical weathering, it is clear that full geochemistry models such as CrunchFlow and PHREEQC are too computationally intensive to be coupled to existing soilscape and landscape evolution models. This paper presents a simplification of CrunchFlow chemistry and physics that makes the task feasible, and generalises it for hillslope geomorphology applications. Results from this simplified model will be compared with field data for soil pedogenesis. Other researchers have previously proposed a number of very simple weathering functions (e.g. exponential, humped, reverse exponential) as conceptual models of the in-profile weathering process. The paper will show that all of these functions are possible for specific combinations of in-soil environmental, geochemical and geologic conditions, and the presentation will outline the key variables controlling which of these conceptual models can be realistic models of in-profile processes and under what conditions. The presentation will finish by discussing the coupling of this model with a physical weathering model, and will show sample results from our SSSPAM soilscape evolution model to illustrate the implications of including chemical weathering in soilscape evolution.
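The three conceptual depth-dependence functions mentioned (exponential, humped, reverse exponential) can be sketched as follows; the functional forms follow common usage in the soil-production literature, and the coefficients are illustrative, not fitted values:

```python
import math

def exponential(h, w0=1.0, h0=0.5):
    """Weathering rate declines monotonically with soil depth h (m)."""
    return w0 * math.exp(-h / h0)

def humped(h, w0=1.0, h0=0.5, k=0.1):
    """Rate peaks at intermediate depth: zero at bare rock (h = 0),
    rising to a maximum, then decaying at depth (k < 1)."""
    return w0 * (math.exp(-h / h0) - math.exp(-h / (k * h0)))

def reverse_exponential(h, w0=1.0, h0=0.5):
    """Rate increases with depth toward an asymptotic maximum w0."""
    return w0 * (1.0 - math.exp(-h / h0))
```

Which shape is realized in the geochemical model depends on the in-soil environmental conditions, which is exactly the question the paper addresses.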
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shahidehpour, Mohammad
Integrating 20% or more wind energy into the system and transmitting large amounts of wind energy over long distances will require a decision-making capability that can handle very large scale power systems with tens of thousands of buses and lines. There is a need to explore innovative analytical and implementation solutions for continuing reliable operations with the most economical integration of additional wind energy in power systems. A number of wind integration solution paths involve the adoption of new operating policies, dynamic scheduling of wind power across interties, pooling integration services, and adopting new transmission scheduling practices. Such practices can be examined by the decision tool developed by this project. This project developed a very efficient decision tool called Wind INtegration Simulator (WINS) and applied WINS to facilitate wind energy integration studies. WINS focused on augmenting existing power utility capabilities to support collaborative planning, analysis, and wind integration project implementations. WINS also has the capability of simulating energy storage facilities, so that feasibility studies of integrated wind energy system applications can be performed for systems with high wind energy penetration. The development of WINS represents a major expansion of a very efficient decision tool called POwer Market Simulator (POMS), which was developed by IIT and has been used extensively for power system studies for decades.
Specifically, WINS provides the following advantages: (1) An integrated framework is included in WINS for the comprehensive modeling of DC transmission configurations, including mono-pole, bi-pole, tri-pole, back-to-back, and multi-terminal connections, as well as AC/DC converter models including current source converters (CSC) and voltage source converters (VSC); (2) An existing shortcoming of traditional decision tools for wind integration is the limited availability of user interfaces, i.e., decision results are often text-based demonstrations. WINS includes a powerful visualization tool and user interface capability for transmission analyses, planning, and assessment, which will be of great interest to power market participants, power system planners and operators, and state and federal regulatory entities; and (3) WINS can handle extended transmission models for wind integration studies. WINS models include limitations on transmission flow as well as bus voltage for analyzing power system states. Existing decision tools often consider transmission flow constraints (dc power flow) alone, which can result in the over-utilization of existing resources when analyzing wind integration. WINS can be used to assist power market participants, including transmission companies, independent system operators, power system operators in vertically integrated utilities, wind energy developers, and regulatory agencies, to analyze the economics, security, and reliability of various options for wind integration, including transmission upgrades and the planning of new transmission facilities. WINS can also be used by industry for the offline training of reliability and operation personnel in analyzing wind integration uncertainties, identifying critical spots in power system operation, analyzing power system vulnerabilities, and providing credible decisions for examining operation and planning options for wind integration.
Research in this project on wind integration included (1) development of WINS; (2) transmission congestion analysis in the Eastern Interconnection; (3) analysis of 2030 large-scale wind energy integration in the Eastern Interconnection; and (4) large-scale analysis of 2018 wind energy integration in the Eastern U.S. Interconnection. The research resulted in 33 papers, 9 presentations, 9 PhD degrees, 4 MS degrees, and 7 awards. The education activities in this project on wind energy included (1) wind energy training facility development and (2) wind energy course development.
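The "dc power flow" transmission constraint mentioned above amounts to solving a susceptance-weighted network Laplacian for bus voltage angles and reading line flows off the angle differences; a minimal sketch on an illustrative 3-bus network:

```python
import numpy as np

def dc_power_flow(lines, injections, slack=0):
    """DC power flow: solve B' theta = P for bus angles (slack angle
    fixed at 0), then line flow f_ij = (theta_i - theta_j) / x_ij.
    lines = [(i, j, x_ij), ...]; injections in per unit."""
    n = len(injections)
    B = np.zeros((n, n))
    for i, j, x in lines:
        B[i, i] += 1 / x; B[j, j] += 1 / x
        B[i, j] -= 1 / x; B[j, i] -= 1 / x
    keep = [k for k in range(n) if k != slack]
    theta = np.zeros(n)
    theta[keep] = np.linalg.solve(B[np.ix_(keep, keep)],
                                  np.asarray(injections, float)[keep])
    return [(theta[i] - theta[j]) / x for i, j, x in lines]

# illustrative 3-bus triangle: 1 p.u. injected at bus 1, withdrawn at bus 2
flows = dc_power_flow([(0, 1, 0.1), (1, 2, 0.1), (0, 2, 0.1)],
                      [0.0, 1.0, -1.0])
```

Because this linearization ignores voltage magnitudes and reactive power, tools that add bus-voltage limits (as WINS does) catch over-utilization that the dc approximation alone would miss.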
Kim, Choong-Ki; Toft, Jodie E; Papenfus, Michael; Verutes, Gregory; Guerry, Anne D; Ruckelshaus, Marry H; Arkema, Katie K; Guannel, Gregory; Wood, Spencer A; Bernhardt, Joanna R; Tallis, Heather; Plummer, Mark L; Halpern, Benjamin S; Pinsky, Malin L; Beck, Michael W; Chan, Francis; Chan, Kai M A; Levin, Phil S; Polasky, Stephen
2012-01-01
Many hope that ocean waves will be a source for clean, safe, reliable and affordable energy, yet wave energy conversion facilities may affect marine ecosystems through a variety of mechanisms, including competition with other human uses. We developed a decision-support tool to assist siting wave energy facilities, which allows the user to balance the need for profitability of the facilities with the need to minimize conflicts with other ocean uses. Our wave energy model quantifies harvestable wave energy and evaluates the net present value (NPV) of a wave energy facility based on a capital investment analysis. The model has a flexible framework and can be easily applied to wave energy projects at local, regional, and global scales. We applied the model and compatibility analysis on the west coast of Vancouver Island, British Columbia, Canada to provide information for ongoing marine spatial planning, including potential wave energy projects. In particular, we conducted a spatial overlap analysis with a variety of existing uses and ecological characteristics, and a quantitative compatibility analysis with commercial fisheries data. We found that wave power and harvestable wave energy gradually increase offshore as wave conditions intensify. However, areas with high economic potential for wave energy facilities were closer to cable landing points because of the cost of bringing energy ashore and thus in nearshore areas that support a number of different human uses. We show that the maximum combined economic benefit from wave energy and other uses is likely to be realized if wave energy facilities are sited in areas that maximize wave energy NPV and minimize conflict with existing ocean uses. Our tools will help decision-makers explore alternative locations for wave energy facilities by mapping expected wave energy NPV and helping to identify sites that provide maximal returns yet avoid spatial competition with existing ocean uses.
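The capital investment analysis behind the NPV evaluation can be sketched generically; the discount rate and cost structure below are illustrative assumptions, not the parameters of the published model:

```python
def net_present_value(annual_revenue, annual_cost, capital_cost,
                      lifetime_years, discount_rate=0.07):
    """NPV of a facility as a simple capital investment: the
    discounted stream of net operating revenue minus the up-front
    capital cost. All monetary inputs in the same currency units."""
    npv = -capital_cost
    for t in range(1, lifetime_years + 1):
        npv += (annual_revenue - annual_cost) / (1 + discount_rate) ** t
    return npv
```

The siting trade-off in the study falls out of this structure: moving offshore raises harvestable energy (revenue) but lengthens the cable to a landing point (capital cost), so NPV peaks nearer shore than raw wave power does.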
Radiation Detection Computational Benchmark Scenarios
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shaver, Mark W.; Casella, Andrew M.; Wittman, Richard S.
2013-09-24
Modeling forms an important component of radiation detection development, allowing for testing of new detector designs, evaluation of existing equipment against a wide variety of potential threat sources, and assessing operation performance of radiation detection systems. This can, however, result in large and complex scenarios which are time consuming to model. A variety of approaches to radiation transport modeling exist with complementary strengths and weaknesses for different problems. This variety of approaches, and the development of promising new tools (such as ORNL’s ADVANTG) which combine benefits of multiple approaches, illustrates the need for a means of evaluating or comparing differentmore » techniques for radiation detection problems. This report presents a set of 9 benchmark problems for comparing different types of radiation transport calculations, identifying appropriate tools for classes of problems, and testing and guiding the development of new methods. The benchmarks were drawn primarily from existing or previous calculations with a preference for scenarios which include experimental data, or otherwise have results with a high level of confidence, are non-sensitive, and represent problem sets of interest to NA-22. From a technical perspective, the benchmarks were chosen to span a range of difficulty and to include gamma transport, neutron transport, or both and represent different important physical processes and a range of sensitivity to angular or energy fidelity. Following benchmark identification, existing information about geometry, measurements, and previous calculations were assembled. Monte Carlo results (MCNP decks) were reviewed or created and re-run in order to attain accurate computational times and to verify agreement with experimental data, when present. Benchmark information was then conveyed to ORNL in order to guide testing and development of hybrid calculations. 
The results of those ADVANTG calculations were then sent to PNNL for compilation. This is a report describing the details of the selected Benchmarks and results from various transport codes.« less
Man, mind, and machine: the past and future of virtual reality simulation in neurologic surgery.
Robison, R Aaron; Liu, Charles Y; Apuzzo, Michael L J
2011-11-01
To review virtual reality in neurosurgery, including the history of simulation and virtual reality and some of the current implementations; to examine some of the technical challenges involved; and to propose a potential paradigm for the development of virtual reality in neurosurgery going forward. A search was made on PubMed using key words surgical simulation, virtual reality, haptics, collision detection, and volumetric modeling to assess the current status of virtual reality in neurosurgery. Based on previous results, investigators extrapolated the possible integration of existing efforts and potential future directions. Simulation has a rich history in surgical training, and there are numerous currently existing applications and systems that involve virtual reality. All existing applications are limited to specific task-oriented functions and typically sacrifice visual realism for real-time interactivity or vice versa, owing to numerous technical challenges in rendering a virtual space in real time, including graphic and tissue modeling, collision detection, and direction of the haptic interface. With ongoing technical advancements in computer hardware and graphic and physical rendering, incremental or modular development of a fully immersive, multipurpose virtual reality neurosurgical simulator is feasible. The use of virtual reality in neurosurgery is predicted to change the nature of neurosurgical education, and to play an increased role in surgical rehearsal and the continuing education and credentialing of surgical practitioners. Copyright © 2011 Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Furukawa, S.
1975-01-01
Current applications of simulation models for clinical research include tilt-model simulation of orthostatic intolerance with hemorrhage and modeling of long-term circulatory regulation. Current capabilities include: (1) simulation of analogous pathological states and effects of abnormal environmental stressors by the manipulation of system variables and changing inputs in various sequences; (2) simulation of time courses of responses of controlled variables by the altered inputs and their relationships; (3) simulation of physiological responses to treatment such as isotonic saline transfusion; (4) simulation of the effectiveness of a treatment as well as the effects of complications superimposed on an existing pathological state; and (5) comparison of the effectiveness of various treatments/countermeasures for a given pathological state. The feasibility of applying simulation models to diagnostic and therapeutic research problems is assessed.
The dynamics of a delayed predator-prey model with state dependent feedback control
DOE Office of Scientific and Technical Information (OSTI.GOV)
Singh, Anuraj; Gakkhar, Sunita
2011-11-30
A delayed prey-predator model with state-dependent impulses is investigated. Sufficient conditions for the existence and stability of the semi-trivial solution and a positive period-one solution are obtained by using the Poincaré map and an analogue of the Poincaré criterion. The qualitative analysis shows that the positive period-one solution bifurcates from the semi-trivial solution through a fold bifurcation. Complex dynamics, including chaos, are obtained, and numerical simulations substantiate the analytical results.
Case study modeling of turbulent and mesoscale fluxes over the BOREAS region
Vidale, P.L.; Pielke, R.A.; Steyaert, L.T.; Barr, A.
1997-01-01
Results from aircraft and surface observations provided evidence for the existence of mesoscale circulations over the Boreal Ecosystem-Atmosphere Study (BOREAS) domain. Using an integrated approach that included the use of analytical modeling, numerical modeling, and data analysis, we have found that there are substantial contributions to the total budgets of heat over the BOREAS domain generated by mesoscale circulations. This effect is largest when the synoptic flow is relatively weak, yet it is present under less favorable conditions, as shown by the case study presented here. While further analysis is warranted to document this effect, the existence of mesoscale flow is not surprising, since it is related to the presence of landscape patches, including lakes, which are of a size on the order of the local Rossby radius and which have spatial differences in maximum sensible heat flux of about 300 W m-2. We have also analyzed the vertical temperature profile simulated in our case study as well as high-resolution soundings and we have found vertical profiles of temperature change above the boundary layer height, which we attribute in part to mesoscale contributions. Our conclusion is that in regions with organized landscapes, such as BOREAS, even with relatively strong synoptic winds, dynamical scaling criteria should be used to assess whether mesoscale effects should be parameterized or explicitly resolved in numerical models of the atmosphere.
Cost-Effectiveness of Evaluating the New Technologies.
ERIC Educational Resources Information Center
Kastner, Theodore A.
1997-01-01
This commentary on a study comparing use of the brand name drug Depakene with generic valproic acid to control seizures in people with mental retardation focuses on issues of cost-effectiveness. It notes existing guidelines for pharmacoeconomic evaluation and suggests a possible model to include a threshold price (per quality-adjusted life year)…
The Benefits of Frequent Positive Affect: Does Happiness Lead to Success?
ERIC Educational Resources Information Center
Lyubomirsky, Sonja; King, Laura; Diener, Ed
2005-01-01
Numerous studies show that happy individuals are successful across multiple life domains, including marriage, friendship, income, work performance, and health. The authors suggest a conceptual model to account for these findings, arguing that the happiness-success link exists not only because success makes people happy, but also because positive…
ERIC Educational Resources Information Center
Roseth, Cary; Akcaoglu, Mete; Zellner, Andrea
2013-01-01
Online education is often assumed to be synonymous with asynchronous instruction, existing apart from or supplementary to face-to-face instruction in traditional bricks-and-mortar classrooms. However, expanding access to computer-mediated communication technologies now makes new models possible, including distance learners synchronous online…
Are You Ready to Take the Plunge? Create an Amusement Park.
ERIC Educational Resources Information Center
Mueller, Andrea; Brown, Rod
2000-01-01
Describes an activity on charting 6th and 7th grade students' ideas about a potential science project. Summarizes a five week project on creating a new ride or redesigning existing rides in an amusement park, including research and sketches, final drawings, models of rides, and class presentations. (YDS)
ERIC Educational Resources Information Center
LaMagna, Michael; Hartman-Caverly, Sarah; Marchetti, Lori
2016-01-01
As academic institutions continue to renovate and remodel existing libraries to include colocated services, it is important to understand how this new environment requires the redefining of traditional library roles and responsibilities. This case study examines how Delaware County Community College redefined reference and research service by…
Collaboration as School Reform: Are There Patterns in the Chaos of Planning with Teachers?
ERIC Educational Resources Information Center
Kimmel, Sue C.
2012-01-01
Emphasis on collaboration is a significant thrust in both current school reform and school librarianship. Planning for instruction is generally included in various definitions and models of collaboration. Some research exists about individual planning done in isolation (Warren 2000), but little is known about teachers' planning with other…
Attention, Affect and Learning. Newland Papers: Number 13.
ERIC Educational Resources Information Center
Gear, Jane
A new interactive model of attention, perception, memory, and arousal is introduced; and its use in assessing characteristics of the perceptual process is demonstrated. The principal concern is not the presentation of new data; rather it is placement of existing psychological data within a new context. Topics discussed include: attention as an…
From Quick Start Teams to Home Teams: The Duke TQM Experience.
ERIC Educational Resources Information Center
Lubans, John; Gordon, Heather
This paper describes the Duke University Libraries' transition in early 1994 from its traditional hierarchical model to an organization emphasizing Total Quality Management (TQM) concepts such as self-managing teams and continuous improvement. Existing conditions at the libraries that played a role in the decision to switch included: (1) rising…
ERIC Educational Resources Information Center
He, Jinxia
2009-01-01
This study examined factors that might impact student knowledge sharing within virtual teams through online discussion boards. These factors included: trust, mutual influence, conflict, leadership, and cohesion. A path model was developed to determine whether relationships exist among knowledge sharing from asynchronous group discussion and the…
Studies are currently underway to help fill knowledge gaps that exist in the general understanding of nitrification episodes. One of these gaps includes the need for growth and inactivation kinetic parameters for nitrifiers representative of those inhabiting distribution systems ...
Heritage Education in the School Curriculum.
ERIC Educational Resources Information Center
Patrick, John J.
There is no need to create a new curriculum in heritage education. Rather, there is an imperative to use the existing curriculum more effectively, to infuse it with the best content on U.S. history and culture, including models of the built environment that embody and reflect the values, aspirations, and achievements of preceding generations.…
Perceptions of Peer Sexual Behavior: Do Adolescents Believe in a Sexual Double Standard?
ERIC Educational Resources Information Center
Young, Michael; Cardenas, Susan; Donnelly, Joseph; Kittleson, Mark J.
2016-01-01
Background: The purpose of the study was to (1) examine attitudes of adolescents toward peer models having sex or choosing abstinence, and (2) determine whether a "double standard" in perception existed concerning adolescent abstinence and sexual behavior. Methods: Adolescents (N = 173) completed questionnaires that included 1 of 6…
Effects of chemicals and pathway inhibitors on a human in vitro model of secondary palatal fusion.
The mechanisms of tissue and organ formation during embryonic development are unique, but many tissues like the iris, urethra, heart, neural tube, and palate rely upon common cellular and tissue events including tissue fusion. Few human in vitro assays exist to study human embryo...
USDA-ARS?s Scientific Manuscript database
Data sufficient for predicting thermal inactivation kinetics of Salmonella spp. do not currently exist for many types of liquid egg products, including salted liquid whole egg, for use in updating pasteurization guidelines. This is, in part, due to variations in Salmonella strains and changes in th...
Music Teacher Perceptions of a Model of Technology Training and Support in Virginia
ERIC Educational Resources Information Center
Welch, Lee Arthur
2013-01-01
A plethora of technology resources currently exists for the music classroom of the twenty-first century, including digital audio and video, music software, electronic instruments, Web 2.0 tools and more. Research shows a strong need for professional development for teachers to properly implement and integrate instructional technology resources…
2005-07-01
serum INS, IGF-I and binding proteins, triglycerides, HDL cholesterol, total and free steroids, sex hormone binding globulin, adiponectin, and leptin. Keywords: Bioinformatics, Biostatistics, Computer Science, Digital Mammography, Magnetic Resonance Imaging, Tissue Arrays, Gene Polymorphisms, Animal Models, Clinical…
Lessons Learned from the Whole Child and Coordinated School Health Approaches
ERIC Educational Resources Information Center
Rasberry, Catherine N.; Slade, Sean; Lohrmann, David K.; Valois, Robert F.
2015-01-01
Background: The new Whole School, Whole Community, Whole Child (WSCC) model, designed to depict links between health and learning, is founded on concepts of coordinated school health (CSH) and a whole child approach to education. Methods: The existing literature, including scientific articles and key publications from national agencies and…
Analysis instruments for the performance of Advanced Practice Nursing.
Sevilla-Guerra, Sonia; Zabalegui, Adelaida
2017-11-29
Advanced Practice Nursing has been a reality in the international context for several decades, and new nursing profiles following this model have recently been developed in Spain as well. The consolidation of these advanced practice roles has also led to the creation of tools that attempt to define and evaluate their functions. This study aims to identify and explore the existing instruments that enable the domains of Advanced Practice Nursing to be defined. A review of existing international questionnaires and instruments was undertaken, including an analysis of the design process, the domains/dimensions defined, the main results and an exploration of clinimetric properties. Seven studies were analysed, but not all proved to be valid, stable or reliable tools. One included tool was able to differentiate between the functions of the general nurse and the advanced practice nurse by the level of activities undertaken within the five domains described. These tools are necessary to evaluate the scope of advanced practice in new nursing roles that correspond to other international models of competencies and practice domains. Copyright © 2017 Elsevier España, S.L.U. All rights reserved.
2013-01-01
Background Mass distribution of long-lasting insecticide treated bed nets (LLINs) has led to large increases in LLIN coverage in many African countries. As LLIN ownership levels increase, planners of future mass distributions face the challenge of deciding whether to ignore the nets already owned by households or to take these into account and attempt to target individuals or households without nets. Taking existing nets into account would reduce commodity costs but require more sophisticated, and potentially more costly, distribution procedures. The decision may also have implications for the average age of nets in use and therefore for the maintenance of universal LLIN coverage over time. Methods A stochastic simulation model based on the NetCALC algorithm was used to determine the scenarios under which it would be cost saving to take existing nets into account, and the potential effects of doing so on the age profile of LLINs owned. The model accounted for variability in timing of distributions, concomitant use of continuous distribution systems, population growth, sampling error in pre-campaign coverage surveys, variable net ‘decay’ parameters and other factors including the feasibility and accuracy of identifying existing nets in the field. Results Results indicate that (i) where pre-campaign coverage is around 40% (of households owning at least 1 LLIN), accounting for existing nets in the campaign will have little effect on the mean age of the net population and (ii) even at pre-campaign coverage levels above 40%, an approach that reduces LLIN distribution requirements by taking existing nets into account may have only a small chance of being cost-saving overall, depending largely on the feasibility of identifying nets in the field. Based on the existing literature, the epidemiological implications of such a strategy are likely to vary by transmission setting, and the risks of leaving older nets in the field when accounting for existing nets must be considered.
Conclusions Where pre-campaign coverage levels established by a household survey are below 40% we recommend that planners do not take such LLINs into account and instead plan a blanket mass distribution. At pre-campaign coverage levels above 40%, campaign planners should make explicit consideration of the cost and feasibility of accounting for existing LLINs before planning blanket mass distributions. Planners should also consider restricting the coverage estimates used for this decision to only include nets under two years of age in order to ensure that old and damaged nets do not compose too large a fraction of existing net coverage. PMID:23763773
Yukich, Joshua; Bennett, Adam; Keating, Joseph; Yukich, Rudy K; Lynch, Matt; Eisele, Thomas P; Kolaczinski, Kate
2013-06-14
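The blanket-versus-targeted decision the study analyses can be sketched as a toy stochastic simulation. All parameters below (costs, coverage, net ages) are illustrative assumptions, not NetCALC values:

```python
import random

def simulate_campaign(n_households=10_000, pre_coverage=0.4,
                      mean_existing_age=1.5, net_cost=5.0,
                      targeting_cost_per_hh=0.8, seed=1):
    """Toy comparison of blanket vs. targeted LLIN distribution."""
    rng = random.Random(seed)
    owns = [rng.random() < pre_coverage for _ in range(n_households)]
    # Ages (years) of existing nets, drawn for net-owning households only.
    ages = [rng.expovariate(1 / mean_existing_age) for o in owns if o]

    # Blanket campaign: every household receives a new net (age 0).
    blanket_cost = n_households * net_cost
    blanket_mean_age = 0.0

    # Targeted campaign: survey every household, deliver nets only to gaps;
    # existing nets stay in service and keep their age.
    gaps = owns.count(False)
    targeted_cost = gaps * net_cost + n_households * targeting_cost_per_hh
    net_ages = ages + [0.0] * gaps
    targeted_mean_age = sum(net_ages) / len(net_ages)
    return blanket_cost, targeted_cost, blanket_mean_age, targeted_mean_age

b_cost, t_cost, b_age, t_age = simulate_campaign()
```

With these assumed numbers the targeted campaign is cheaper but leaves an older net population in the field, which is the trade-off the abstract's conclusions hinge on.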
Christensen, Jette; El Allaki, Farouk; Vallières, André
2014-05-01
Scenario tree models with temporal discounting have been applied in four continents to support claims of freedom from animal disease. Recently, a second (new) model was developed for the same population and disease. This is a natural development, because surveillance is a dynamic process that needs to adapt to changing circumstances; the difficulty lies in justifying, documenting, presenting and gaining acceptance of the changes. Our objective was to propose a systematic approach to presenting changes to an existing scenario tree model for freedom from disease. We used the example of how we adapted the deterministic Canadian Notifiable Avian Influenza scenario tree model published in 2011 to a stochastic scenario tree model in which the definition of sub-populations and the estimation of the probability of introduction of the pathogen were modified. We found that the standardized approach by Vanderstichel et al. (2013), with modifications, provided a systematic approach to making and presenting changes to an existing scenario tree model. We believe that the new 2013 CanNAISS scenario tree model is a better model than the 2011 model because the 2013 model included more surveillance data. In particular, new data on Notifiable Avian Influenza in Canada from the last 5 years were used to improve input parameters and model structure. Crown Copyright © 2014. Published by Elsevier B.V. All rights reserved.
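A core building block of such freedom-from-disease models can be sketched as follows: the standard population-level surveillance sensitivity and the posterior confidence of freedom after negative surveillance. The parameter values are illustrative, not from the CanNAISS model:

```python
def surveillance_sensitivity(n_sampled, design_prevalence, unit_sensitivity):
    """Probability that surveillance detects disease if it is present at the
    design prevalence: 1 minus the chance that every sampled unit misses."""
    p_detect_one = design_prevalence * unit_sensitivity
    return 1 - (1 - p_detect_one) ** n_sampled

def prob_freedom(prior_freedom, sse):
    """Posterior confidence of freedom after all-negative surveillance
    with population-level sensitivity sse (Bayes' rule)."""
    prior_infected = 1 - prior_freedom
    return prior_freedom / (prior_freedom + prior_infected * (1 - sse))

# Assumed scenario: 300 units tested, 1% design prevalence, 90% test sensitivity.
sse = surveillance_sensitivity(300, 0.01, 0.9)
posterior = prob_freedom(0.5, sse)
```

A full scenario tree multiplies such branch probabilities across risk groups and discounts older surveillance over time; this sketch shows only the terminal calculation.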
Geist; Dauble
1998-09-01
Knowledge of the three-dimensional connectivity between rivers and groundwater within the hyporheic zone can be used to improve the definition of fall chinook salmon (Oncorhynchus tshawytscha) spawning habitat. Information exists on the microhabitat characteristics that define suitable salmon spawning habitat. However, traditional spawning habitat models that use these characteristics to predict available spawning habitat are restricted because they cannot account for the heterogeneous nature of rivers. We present a conceptual spawning habitat model for fall chinook salmon that describes how geomorphic features of river channels create hydraulic processes, including hyporheic flows, that influence where salmon spawn in unconstrained reaches of large mainstem alluvial rivers. Two case studies based on empirical data from fall chinook salmon spawning areas in the Hanford Reach of the Columbia River are presented to illustrate important aspects of our conceptual model. We suggest that traditional habitat models and our conceptual model be combined to predict the limits of suitable fall chinook salmon spawning habitat. This approach can incorporate quantitative measures of river channel morphology, including general descriptors of geomorphic features at different spatial scales, in order to understand the processes influencing redd site selection and spawning habitat use. This information is needed in order to protect existing salmon spawning habitat in large rivers, as well as to recover habitat already lost. Key words: hyporheic zone; geomorphology; spawning habitat; large rivers; fall chinook salmon; habitat management
Integrated hydrologic modeling of a transboundary aquifer system —Lower Rio Grande
Hanson, Randall T.; Schmid, Wolfgang; Knight, Jacob E.; Maddock, Thomas
2013-01-01
For more than 30 years the agreements developed for the aquifer systems of the lower Rio Grande and related river compacts of the Rio Grande River have evolved into a complex setting of transboundary conjunctive use. The conjunctive use now includes many facets of water rights, water use, and emerging demands between the states of New Mexico and Texas, the United States and Mexico, and various water-supply agencies. The analysis of the complex relations between irrigation and streamflow supply-and-demand components and the effects of surface-water and groundwater use requires an integrated hydrologic model to track all of the use and movement of water. MODFLOW with the Farm Process (MF-FMP) provides the integrated approach needed to assess the stream-aquifer interactions that are dynamically affected by irrigation demands on streamflow allotments that are supplemented with groundwater pumpage. As a first step to the ongoing full implementation of MF-FMP by the USGS, the existing model (LRG_2007) was modified to include some FMP features, demonstrating the ability to simulate the existing streamflow-diversion relations known as the D2 and D3 curves, departure of downstream deliveries from these curves during low allocation years and with increasing efficiency upstream, and the dynamic relation between surface-water conveyance and estimates of pumpage and recharge. This new MF-FMP modeling framework can now internally analyze complex relations within the Lower Rio Grande Hydrologic Model (LRGHM_2011) that previous techniques had limited ability to assess.
The Untapped Potential of Patient and Family Engagement in the Organization of Critical Care.
Haines, Kimberley J; Kelly, Phillipa; Fitzgerald, Peter; Skinner, Elizabeth H; Iwashyna, Theodore J
2017-05-01
There is growing interest in patient and family participation in critical care - not just at the bedside, but as part of educational and management organization and infrastructure. This offers tremendous opportunities for change but carries risk to patients, families, and the institution. The objective is to provide a concise, definitive review of patient and family organizational participation in critical care, as a high-risk population, and in other vulnerable groups. A pragmatic, codesigned model for critical care is offered as a suggested approach for clinicians, researchers, and policy-makers. To inform this review, a systematic search of Ovid Medline, PubMed, and Embase was undertaken in April 2016 using the MeSH terms: patient participation and critical care. A second search was undertaken in PubMed using the terms: patient participation and organizational models to search for other examples of engagement in vulnerable populations. We explicitly did not seek to include discussions of bedside patient-family engagement or shared decision-making. Two reviewers screened citations independently. Included studies either actively partnered with patients and families or described a model of engagement in critical care and other vulnerable populations. Data or description of how patient and family engagement occurred and/or description of model were extracted into a standardized form. There was limited evidence of patient and family engagement in critical care, although key recommendations can be drawn from included studies. Patient and family engagement is occurring in other vulnerable populations, although there are few described models and none which address issues of risk. A model of patient and family engagement in critical care does not exist, and we propose a pragmatic, codesigned model that takes into account issues of psychologic safety in this population.
Significant opportunity exists to document processes of engagement that reflect a changing paradigm of healthcare delivery.
NASA Astrophysics Data System (ADS)
Chen, Xiaojie
2015-09-01
The puzzle of cooperation exists widely in the real world, including biological, social, and engineering systems. How to solve the cooperation puzzle has received considerable attention in recent years [1]. Evolutionary game theory provides a common mathematical framework to study the problem of cooperation. In principle, these practical biological, social, or engineering systems can be described by complex game models composed of multiple autonomous individuals with mutual interactions. Generally, there exists a dilemma for the evolution of cooperation in such game systems.
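The dilemma can be made concrete with the replicator dynamics of a two-strategy Prisoner's Dilemma, a standard textbook formulation rather than a model from the referenced work: with the usual payoff ordering T > R > P > S, defectors always out-earn cooperators, so cooperation dies out in a well-mixed population.

```python
def replicator_pd(x0=0.5, R=3.0, S=0.0, T=5.0, P=1.0, dt=0.01, steps=10_000):
    """Euler-integrated replicator dynamics for a two-strategy
    Prisoner's Dilemma; x is the fraction of cooperators."""
    x = x0
    for _ in range(steps):
        fc = R * x + S * (1 - x)  # expected payoff to a cooperator
        fd = T * x + P * (1 - x)  # expected payoff to a defector
        x += dt * x * (1 - x) * (fc - fd)  # replicator update
    return x

final_cooperators = replicator_pd()  # cooperation collapses toward 0
```

Mechanisms such as network structure, reciprocity, or punishment are what allow cooperation to survive despite this baseline result.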
A small, single stage orifice pulse tube cryocooler demonstration
NASA Technical Reports Server (NTRS)
Hendricks, John B.
1990-01-01
This final report summarizes and presents the analytical and experimental progress in the present effort. The principal objective of this effort was the demonstration of a 0.25 Watt, 80 Kelvin orifice pulse tube refrigerator. The experimental apparatus is described, and the design of a partially optimized pulse tube refrigerator is included. The refrigerator demonstrates an ultimate temperature of 77 K, has a projected cooling power of 0.18 Watts at 80 K, and has a measured cooling power of 1 Watt at 97 K, with an electrical efficiency of 250 Watts/Watt, much better than previous pulse tube refrigerators. Three models are also included: a model of the pulse tube refrigerator that estimates pressure ratio and mass flow from component physical characteristics; a model of pulse tube operation based on generalized analysis, adequate to support local optimization of existing designs; and a model of regenerator performance based on an analogy to counterflow heat exchangers.
Design and Analysis Tools for Supersonic Inlets
NASA Technical Reports Server (NTRS)
Slater, John W.; Folk, Thomas C.
2009-01-01
Computational tools are being developed for the design and analysis of supersonic inlets. The objective is to update existing tools and provide design and low-order aerodynamic analysis capability for advanced inlet concepts. The Inlet Tools effort includes aspects of creating an electronic database of inlet design information, a document describing inlet design and analysis methods, a geometry model for describing the shape of inlets, and computer tools that implement the geometry model and methods. The geometry model has a set of basic inlet shapes that include pitot, two-dimensional, axisymmetric, and stream-traced inlet shapes. The inlet model divides the inlet flow field into parts that facilitate the design and analysis methods. The inlet geometry model constructs the inlet surfaces through the generation and transformation of planar entities based on key inlet design factors. Future efforts will focus on developing the inlet geometry model, the inlet design and analysis methods, and a Fortran 95 code to implement the model and methods. Other computational platforms, such as Java, will also be explored.
Lam, Lun Tak; Sun, Yi; Davey, Neil; Adams, Rod; Prapopoulou, Maria; Brown, Marc B; Moss, Gary P
2010-06-01
The aim was to employ Gaussian processes to assess mathematically the nature of a skin permeability dataset and to employ these methods, particularly feature selection, to determine the key physicochemical descriptors which exert the most significant influence on percutaneous absorption, and to compare such models with established existing models. Gaussian processes, including automatic relevance detection (GPRARD) methods, were employed to develop models of percutaneous absorption that identified key physicochemical descriptors of percutaneous absorption. Using MatLab software, the statistical performance of these models was compared with single linear networks (SLN) and quantitative structure-permeability relationships (QSPRs). Feature selection methods were used to examine in more detail the physicochemical parameters used in this study. A range of statistical measures to determine model quality were used. The inherently nonlinear nature of the skin data set was confirmed. The Gaussian process regression (GPR) methods yielded predictive models that offered statistically significant improvements over SLN and QSPR models with regard to predictivity (where the rank order was: GPR > SLN > QSPR). Feature selection analysis determined that the best GPR models were those that contained log P, melting point and the number of hydrogen bond donor groups as significant descriptors. Further statistical analysis also found that great synergy existed between certain parameters. It suggested that a number of the descriptors employed were effectively interchangeable, thus questioning the use of models where discrete variables are output, usually in the form of an equation. The use of a nonlinear GPR method produced models with significantly improved predictivity, compared with SLN or QSPR models. Feature selection methods were able to provide important mechanistic information. 
However, it was also shown that significant synergy existed between certain parameters, and as such it was possible to interchange certain descriptors (i.e. molecular weight and melting point) without incurring a loss of model quality. Such synergy suggested that a model constructed from discrete terms in an equation may not be the most appropriate way of representing mechanistic understandings of skin absorption.
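The regression machinery described is standard Gaussian process regression. A minimal sketch follows, using a squared-exponential kernel with fixed hyperparameters on synthetic data (a full ARD treatment would fit one length-scale per physicochemical descriptor); it assumes NumPy and is not the authors' implementation:

```python
import numpy as np

def rbf(a, b, length=1.0, var=1.0):
    """Squared-exponential kernel between two 1-D input arrays."""
    d2 = (a[:, None] - b[None, :]) ** 2
    return var * np.exp(-0.5 * d2 / length ** 2)

rng = np.random.default_rng(0)
X = np.linspace(-3, 3, 30)
y = np.sin(X) + 0.1 * rng.standard_normal(30)  # noisy nonlinear target

# Posterior mean of the GP: k(X*, X) @ (K + sigma^2 I)^{-1} y
K = rbf(X, X) + 0.1 ** 2 * np.eye(30)
alpha = np.linalg.solve(K, y)
Xs = np.array([0.0, 1.5])
post_mean = rbf(Xs, X) @ alpha
```

In the ARD variant, descriptors whose fitted length-scales grow very large contribute little to the kernel, which is how feature relevance (e.g. log P, melting point) is read off.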
Squeezed States, Uncertainty Relations and the Pauli Principle in Composite and Cosmological Models
NASA Technical Reports Server (NTRS)
Terazawa, Hidezumi
1996-01-01
The importance of not only uncertainty relations but also the Pauli exclusion principle is emphasized in discussing various 'squeezed states' existing in the universe. The contents of this paper include: (1) Introduction; (2) Nuclear Physics in the Quark-Shell Model; (3) Hadron Physics in the Standard Quark-Gluon Model; (4) Quark-Lepton-Gauge-Boson Physics in Composite Models; (5) Astrophysics and Space-Time Physics in Cosmological Models; and (6) Conclusion. Also, not only the possible breakdown of (or deviation from) uncertainty relations but also the superficial violation of the Pauli principle at short distances (or high energies) in composite (and string) models is discussed in some detail.
Monte Carlo based statistical power analysis for mediation models: methods and software.
Zhang, Zhiyong
2014-12-01
The existing literature on statistical power analysis for mediation models often assumes data normality and is based on a less powerful Sobel test instead of the more powerful bootstrap test. This study proposes to estimate statistical power to detect mediation effects on the basis of the bootstrap method through Monte Carlo simulation. Nonnormal data with excessive skewness and kurtosis are allowed in the proposed method. A free R package called bmem is developed to conduct the power analysis discussed in this study. Four examples, including a simple mediation model, a multiple-mediator model with a latent mediator, a multiple-group mediation model, and a longitudinal mediation model, are provided to illustrate the proposed method.
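The proposed approach can be sketched in a toy form: simulate the mediation model X → M → Y many times, and in each replication test the indirect effect a*b with a percentile bootstrap CI. This is a simplified stdlib illustration (no covariate adjustment, no latent variables), not the bmem implementation:

```python
import random
import statistics

def slope(x, y):
    """Ordinary least-squares slope of y on x."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    return sxy / sxx

def mediation_power(a=0.5, b=0.5, n=100, n_rep=50, n_boot=150, seed=7):
    """Monte Carlo power: fraction of replications whose bootstrap CI
    for the indirect effect a*b excludes zero."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_rep):
        x = [rng.gauss(0, 1) for _ in range(n)]
        m = [a * xi + rng.gauss(0, 1) for xi in x]
        y = [b * mi + rng.gauss(0, 1) for mi in m]  # no direct X -> Y path
        ab = []
        for _ in range(n_boot):
            idx = [rng.randrange(n) for _ in range(n)]
            xb = [x[i] for i in idx]
            mb = [m[i] for i in idx]
            yb = [y[i] for i in idx]
            ab.append(slope(xb, mb) * slope(mb, yb))
        ab.sort()
        lo, hi = ab[int(0.025 * n_boot)], ab[int(0.975 * n_boot) - 1]
        if lo > 0 or hi < 0:
            hits += 1
    return hits / n_rep

power = mediation_power()
```

Because the data-generating step is explicit, nonnormal (skewed or heavy-tailed) errors can be substituted for `rng.gauss` without changing the rest of the procedure, which is the flexibility the abstract emphasizes.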
Assessing the feasibility, cost, and utility of developing models of human performance in aviation
NASA Technical Reports Server (NTRS)
Stillwell, William
1990-01-01
The purpose of the effort outlined in this briefing was to determine whether models exist or can be developed that can be used to address aviation automation issues. A multidisciplinary team has been assembled to undertake this effort, including experts in human performance, team/crew, and aviation system modeling, and aviation data used as input to such models. The project consists of two phases, a requirements assessment phase that is designed to determine the feasibility and utility of alternative modeling efforts, and a model development and evaluation phase that will seek to implement the plan (if a feasible cost effective development effort is found) that results from the first phase. Viewgraphs are given.
The basis function approach for modeling autocorrelation in ecological data
Hefley, Trevor J.; Broms, Kristin M.; Brost, Brian M.; Buderman, Frances E.; Kay, Shannon L.; Scharf, Henry; Tipton, John; Williams, Perry J.; Hooten, Mevin B.
2017-01-01
Analyzing ecological data often requires modeling the autocorrelation created by spatial and temporal processes. Many seemingly disparate statistical methods used to account for autocorrelation can be expressed as regression models that include basis functions. Basis functions also enable ecologists to modify a wide range of existing ecological models in order to account for autocorrelation, which can improve inference and predictive accuracy. Furthermore, understanding the properties of basis functions is essential for evaluating the fit of spatial or time-series models, detecting a hidden form of collinearity, and analyzing large data sets. We present important concepts and properties related to basis functions and illustrate several tools and techniques ecologists can use when modeling autocorrelation in ecological data.
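A minimal illustration of the basis-function idea, assuming Gaussian radial bases over time (the paper covers a much wider family, including splines and spatial bases); the basis count and width below are arbitrary illustrative choices:

```python
import numpy as np

def basis_fit(t, y, n_basis=10, width=None):
    """Fit y(t) with an intercept plus Gaussian radial basis functions,
    one common way to absorb smooth temporal autocorrelation in a
    regression model (a sketch, not the paper's exact formulation)."""
    t = np.asarray(t, float)
    centers = np.linspace(t.min(), t.max(), n_basis)
    if width is None:
        width = (t.max() - t.min()) / n_basis
    # design matrix: intercept column + one Gaussian bump per center
    X = np.column_stack(
        [np.ones_like(t)]
        + [np.exp(-0.5 * ((t - c) / width) ** 2) for c in centers])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return X @ coef  # fitted (smoothed) values

# noisy observations of a smooth temporal signal
t = np.linspace(0, 10, 200)
y = np.sin(t) + 0.1 * np.random.default_rng(1).normal(size=200)
fit = basis_fit(t, y)
```

The same design-matrix columns can be appended to any existing regression model, which is the modification-of-existing-models route the abstract emphasizes.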
Brightness perception of unrelated self-luminous colors.
Withouck, Martijn; Smet, Kevin A G; Ryckaert, Wouter R; Pointer, Michael R; Deconinck, Geert; Koenderink, Jan; Hanselaer, Peter
2013-06-01
The perception of brightness of unrelated self-luminous colored stimuli of the same luminance has been investigated. The Helmholtz-Kohlrausch (H-K) effect, i.e., an increase in brightness perception due to an increase in saturation, is clearly observed. This brightness perception is compared with the calculated brightness according to six existing vision models, color appearance models, and models based on the concept of equivalent luminance. Although these models included the H-K effect and half of them were developed to work with unrelated colors, none of the models seemed to be able to fully predict the perceived brightness. A tentative solution to increase the prediction accuracy of the color appearance model CAM97u, developed by Hunt, is presented.
Advanced space system analysis software. Technical, user, and programmer guide
NASA Technical Reports Server (NTRS)
Farrell, C. E.; Zimbelman, H. F.
1981-01-01
The LASS computer program provides a tool for interactive preliminary and conceptual design of LSS. Eight program modules were developed, including four automated model geometry generators, an associated mass properties module, an appendage synthesizer module, an rf analysis module, and an orbital transfer analysis module. The existing rigid body controls analysis module was modified to permit analysis of effects of solar pressure on orbital performance. A description of each module, user instructions, and programmer information are included.
Cell-type-specific modelling of intracellular calcium signalling: a urothelial cell model.
Appleby, Peter A; Shabir, Saqib; Southgate, Jennifer; Walker, Dawn
2013-09-06
Calcium signalling plays a central role in regulating a wide variety of cell processes. A number of calcium signalling models exist in the literature that are capable of reproducing a variety of experimentally observed calcium transients. These models have been used to examine in more detail the mechanisms underlying calcium transients, but very rarely has a model been directly linked to a particular cell type and experimentally verified. It is important to show that this can be achieved within the general theoretical framework adopted by these models. Here, we develop a framework designed specifically for modelling cytosolic calcium transients in urothelial cells. Where possible, we draw upon existing calcium signalling models, integrating descriptions of components known to be important in this cell type from a number of studies in the literature. We then add descriptions of several additional pathways that play a specific role in urothelial cell signalling, including an explicit ionic influx term and an active pumping mechanism that drives the cytosolic calcium concentration to a target equilibrium. The resulting one-pool model of endoplasmic reticulum (ER)-dependent calcium signalling relates the cytosolic, extracellular and ER calcium concentrations and can generate a wide range of calcium transients, including spikes, bursts, oscillations and sustained elevations in the cytosolic calcium concentration. Using single-variate robustness and multivariate sensitivity analyses, we quantify how varying each of the parameters of the model leads to changes in key features of the calcium transient, such as initial peak amplitude and the frequency of bursting or spiking, and in the transitions between bursting- and plateau-dominated modes. We also show that, novel to our urothelial cell model, the ionic and purinergic P2Y pathways make distinct contributions to the calcium transient. 
We then validate the model using human bladder epithelial cells grown in monolayer cell culture and show that the model robustly captures the key features of the experimental data in a way that is not possible using more generic calcium models from the literature.
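A generic one-pool, ER-dependent calcium model of the kind described above can be sketched with forward Euler; the rate laws and parameter values below are illustrative placeholders, not the urothelial model's calibrated terms. With these particular values the system relaxes to an elevated steady state (one of the transient classes mentioned) rather than oscillating.

```python
import numpy as np

def simulate_calcium(t_end=60.0, dt=0.001):
    """Forward-Euler sketch of a generic one-pool (ER-dependent) calcium
    model: cytosolic Ca (c) exchanges with the ER store (s) through a
    CICR-like release term, a passive leak, and a Hill-type SERCA pump.
    All parameter values are invented for illustration."""
    c, s = 0.1, 2.0                                  # initial cytosolic / ER Ca (arb. units)
    k_rel, k_leak, v_pump, k_pump = 1.0, 0.05, 1.5, 0.3
    out = []
    for _ in range(int(t_end / dt)):
        release = k_rel * (c**2 / (0.25 + c**2)) * (s - c)   # CICR-like release
        leak = k_leak * (s - c)                              # passive ER leak
        pump = v_pump * c**2 / (k_pump**2 + c**2)            # SERCA uptake
        dc = release + leak - pump
        c += dt * dc
        s -= dt * dc                                 # exchange conserves total Ca
        out.append(c)
    return np.array(out)
```

Adding the ionic-influx and active-pumping-to-target terms the paper introduces would amount to extra additive fluxes in `dc`.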
Representing agriculture in Earth System Models: Approaches and priorities for development
NASA Astrophysics Data System (ADS)
McDermid, S. S.; Mearns, L. O.; Ruane, A. C.
2017-09-01
Earth System Model (ESM) advances now enable improved representations of spatially and temporally varying anthropogenic climate forcings. One critical forcing is global agriculture, which is now extensive in land-use and intensive in management, owing to 20th century development trends. Agriculture and food systems now contribute nearly 30% of global greenhouse gas emissions and require copious inputs and resources, such as fertilizer, water, and land. Much uncertainty remains in quantifying important agriculture-climate interactions, including surface moisture and energy balances and biogeochemical cycling. Despite these externalities and uncertainties, agriculture is increasingly being leveraged to function as a net sink of anthropogenic carbon, and there is much emphasis on future sustainable intensification. Given its significance as a major environmental and climate forcing, there now exist a variety of approaches to represent agriculture in ESMs. These approaches are reviewed herein, and range from idealized representations of agricultural extent to the development of coupled climate-crop models that capture dynamic feedbacks. We highlight the robust agriculture-climate interactions and responses identified by these modeling efforts, as well as existing uncertainties and model limitations. To this end, coordinated benchmarking assessments of land-use-climate feedbacks can be leveraged for further improvements in ESMs' agricultural representations. We suggest key areas for continued model development, including incorporating irrigation and biogeochemical cycling in particular. Last, we pose several critical research questions to guide future work. Our review focuses on ESM representations of climate-surface interactions over managed agricultural lands, rather than on ESMs as an estimation tool for crop yields and productivity.
Radiative transfer in CO2-rich atmospheres: 1. Collisional line mixing implies a colder early Mars
NASA Astrophysics Data System (ADS)
Ozak, N.; Aharonson, O.; Halevy, I.
2016-06-01
Fast and accurate radiative transfer methods are essential for modeling CO2-rich atmospheres, relevant to the climate of early Earth and Mars, present-day Venus, and some exoplanets. Although such models already exist, their accuracy may be improved as better theoretical and experimental constraints become available. Here we develop a unidimensional radiative transfer code for CO2-rich atmospheres, using the correlated k approach and with a focus on modeling early Mars. Our model differs from existing models in that it includes the effects of CO2 collisional line mixing in the calculation of the line-by-line absorption coefficients. Inclusion of these effects results in model atmospheres that are more transparent to infrared radiation and, therefore, in colder surface temperatures at radiative-convective equilibrium, compared with results of previous studies. Inclusion of water vapor in the model atmosphere results in negligible warming due to the low atmospheric temperatures under a weaker early Sun, which translate into climatically unimportant concentrations of water vapor. Overall, the results imply that sustained warmth on early Mars would not have been possible with an atmosphere containing only CO2 and water vapor, suggesting that other components of the early Martian climate system are missing from current models or that warm conditions were not long lived.
Novel approach for dam break flow modeling using computational intelligence
NASA Astrophysics Data System (ADS)
Seyedashraf, Omid; Mehrabi, Mohammad; Akhtari, Ali Akbar
2018-04-01
A new methodology based on a computational intelligence (CI) system is proposed and tested for modeling the classic 1D dam-break flow problem. The motivation for seeking a new solution lies in the shortcomings of the existing analytical and numerical models, including the difficulty of applying the exact solutions and the unwanted fluctuations that arise in the numerical results. In this research, the application of radial-basis-function (RBF) and multi-layer-perceptron (MLP) systems is detailed for the solution of twenty-nine dam-break scenarios. The models are developed using seven variables, i.e. the length of the channel, the depths of the up- and downstream sections, time, and distance as the inputs. Moreover, the depths and velocities of each computational node in the flow domain are considered as the model outputs. The models are validated against the analytical solution and the Lax-Wendroff and MacCormack FDM schemes. The findings indicate that the employed CI models are able to replicate the overall shape of the shock and rarefaction waves. Furthermore, the MLP system outperforms the RBF system and the tested numerical schemes. A new monolithic equation is proposed based on the best-fitting model, which can be used as an efficient alternative to the existing piecewise analytic equations.
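One of the exact solutions such CI surrogates are benchmarked against is the classic Ritter profile for an instantaneous dam break over a dry, frictionless bed. A sketch (dam at x = 0, initial upstream depth h0; the piecewise structure is exactly the kind of analytic form the proposed monolithic equation aims to replace):

```python
import numpy as np

def ritter_profile(x, t, h0, g=9.81):
    """Ritter analytic depth profile for a 1D dam break on a dry bed:
    undisturbed reservoir upstream of the rarefaction, a parabolic
    depth profile inside it, and a dry bed beyond the wave front."""
    c0 = np.sqrt(g * h0)                      # shallow-water wave celerity
    return np.where(x <= -c0 * t, h0,         # undisturbed reservoir
           np.where(x >= 2 * c0 * t, 0.0,     # dry bed ahead of the front
                    (2 * c0 - x / t) ** 2 / (9 * g)))  # rarefaction fan
```

At the dam location (x = 0) the depth is the well-known constant 4·h0/9 for all t > 0.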
Pechak, Celia M; Black, Jill D
2014-02-01
Increasingly, physical therapist students complete part of their clinical training outside of their home country. This trend is understudied. The purposes of this study were to: (1) explore, in depth, various international clinical education (ICE) programs; and (2) determine whether the Conceptual Model of Optimal International Service-Learning (ISL) could be applied or adapted to represent ICE. Qualitative content analysis was used to analyze ICE programs and consider modification of an existing ISL conceptual model for ICE. Fifteen faculty in the United States currently involved in ICE were interviewed. The interview transcriptions were systematically analyzed by two researchers. Three models of ICE practices emerged: (1) a traditional clinical education model, where local clinical instructors (CIs) focus on the development of clinical skills; (2) a global health model, where US-based CIs provide the supervision in the international setting, and learning outcomes emphasize global health and cultural competency; and (3) an ICE/ISL hybrid, where US-based CIs supervise the students, and the foci include community service. Additionally, the data supported revising the ISL model's essential core conditions, components, and consequence for ICE. The ICE conceptual model may provide a useful framework for future ICE program development and research.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hyung Lee; Rich Johnson, Ph.D.; Kimberlyn C. Moussesau
2011-12-01
The Nuclear Energy - Knowledge base for Advanced Modeling and Simulation (NE-KAMS) is being developed at the Idaho National Laboratory in conjunction with Bettis Laboratory, Sandia National Laboratories, Argonne National Laboratory, Oak Ridge National Laboratory, Utah State University and others. The objective of this consortium is to establish a comprehensive knowledge base to provide Verification and Validation (V&V) and Uncertainty Quantification (UQ) and other resources for advanced modeling and simulation (M&S) in nuclear reactor design and analysis. NE-KAMS will become a valuable resource for the nuclear industry, the national laboratories, the U.S. NRC and the public to help ensure the safe operation of existing and future nuclear reactors. A survey and evaluation of the state-of-the-art of existing V&V and M&S databases, including the Department of Energy and commercial databases, has been performed to ensure that the NE-KAMS effort will not be duplicating existing resources and capabilities and to assess the scope of the effort required to develop and implement NE-KAMS. The survey and evaluation have indeed highlighted the unique set of value-added functionality and services that NE-KAMS will provide to its users. Additionally, the survey has helped develop a better understanding of the architecture and functionality of these data and knowledge bases that can be used to leverage the development of NE-KAMS.
A single factor underlies the metabolic syndrome: a confirmatory factor analysis.
Pladevall, Manel; Singal, Bonita; Williams, L Keoki; Brotons, Carlos; Guyer, Heidi; Sadurni, Josep; Falces, Carles; Serrano-Rios, Manuel; Gabriel, Rafael; Shaw, Jonathan E; Zimmet, Paul Z; Haffner, Steven
2006-01-01
Confirmatory factor analysis (CFA) was used to test the hypothesis that the components of the metabolic syndrome are manifestations of a single common factor. Three different datasets were used to test and validate the model. The Spanish and Mauritian studies included 207 men and 203 women and 1,411 men and 1,650 women, respectively. A third analytical dataset including 847 men was obtained from a previously published CFA of a U.S. population. The one-factor model included the metabolic syndrome core components (central obesity, insulin resistance, blood pressure, and lipid measurements). We also tested an expanded one-factor model that included uric acid and leptin levels. Finally, we used CFA to compare the goodness of fit of one-factor models with the fit of two previously published four-factor models. The simplest one-factor model showed the best goodness-of-fit indexes (comparative fit index 1, root mean-square error of approximation 0.00). Comparisons of one-factor with four-factor models in the three datasets favored the one-factor model structure. The selection of variables to represent the different metabolic syndrome components and model specification explained why previous exploratory and confirmatory factor analysis, respectively, failed to identify a single factor for the metabolic syndrome. These analyses support the current clinical definition of the metabolic syndrome, as well as the existence of a single factor that links all of the core components.
NASA Technical Reports Server (NTRS)
Chiu, Hong-Yee
1990-01-01
The theory of Lee and Pang (1987), who obtained solutions for soliton stars composed of zero-temperature fermions and bosons, is applied here to quark soliton stars. Model soliton stars based on a simple physical model of the proton are computed, and the properties of the solitons are discussed, including the important problem of the existence of a limiting mass and thus the possible formation of black holes of primordial origin. It is shown that there is a definite mass limit for ponderable soliton stars, so that during cooling a soliton star might reach a stage beyond which no equilibrium configuration exists and the soliton star probably will collapse to become a black hole. The radiation of ponderable soliton stars may alter the short-wavelength character of the cosmic background radiation, and may be observed as highly redshifted objects at z of about 100,000.
Families of Graph Algorithms: SSSP Case Study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kanewala Appuhamilage, Thejaka Amila Jay; Zalewski, Marcin J.; Lumsdaine, Andrew
2017-08-28
Single-Source Shortest Paths (SSSP) is a well-studied graph problem. Examples of SSSP algorithms include the original Dijkstra's algorithm and the parallel Δ-stepping and KLA-SSSP algorithms. In this paper, we use a novel Abstract Graph Machine (AGM) model to show that all these algorithms share a common logic and differ from one another by the order in which they perform work. We use the AGM model to thoroughly analyze the family of algorithms that arises from the common logic. We start with the basic algorithm without any ordering (Chaotic), and then we derive the existing and new algorithms by methodically exploring semantic and spatial ordering of work. Our experimental results show that the newly derived algorithms perform better than the existing distributed-memory parallel algorithms, especially at higher scales.
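The ordering distinction the AGM formalizes is easiest to see in the baseline algorithm: classic Dijkstra processes work items (vertex, tentative distance) in a strict global order by distance via a priority queue, whereas Δ-stepping relaxes this to distance buckets and Chaotic imposes no order at all. A minimal sequential sketch of the strictly ordered end of that spectrum:

```python
import heapq

def dijkstra(graph, source):
    """Classic Dijkstra SSSP. `graph` maps a vertex to a list of
    (neighbor, edge_weight) pairs; returns shortest distances from
    `source` to every reachable vertex."""
    dist = {source: 0}
    pq = [(0, source)]                       # global order on tentative distance
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue                         # stale entry; shorter path already found
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist

# tiny example graph (names are arbitrary)
g = {"a": [("b", 4), ("c", 1)], "c": [("b", 2)], "b": []}
```

Replacing the heap with unordered or bucketed worklists yields the other family members the paper derives.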
Modelling of the Thermo-Physical and Physical Properties for Solidification of Al-Alloys
NASA Astrophysics Data System (ADS)
Saunders, N.; Li, X.; Miodownik, A. P.; Schillé, J.-P.
The thermo-physical and physical properties of the liquid and solid phases are critical components in casting simulations. Such properties include the fraction solid transformed, enthalpy release, thermal conductivity, volume and density, all as a function of temperature. Due to the difficulty of experimentally determining such properties at solidification temperatures, little information exists for multi-component alloys. As part of the development of a new computer program for modelling materials properties (JMatPro), extensive work has been carried out on the development of sound, physically based models for these properties. Wide-ranging results will be presented for Al-based alloys, including more detailed information concerning the density change of the liquid that intrinsically occurs during solidification due to its change in composition.
Rajaraman, Prathish K; Manteuffel, T A; Belohlavek, M; Heys, Jeffrey J
2017-01-01
A new approach has been developed for combining and enhancing the results from an existing computational fluid dynamics model with experimental data using the weighted least-squares finite element method (WLSFEM). Development of the approach was motivated by the existence of both limited experimental blood velocity in the left ventricle and inexact numerical models of the same flow. Limitations of the experimental data include measurement noise and having data only along a two-dimensional plane. Most numerical modeling approaches do not provide the flexibility to assimilate noisy experimental data. We previously developed an approach that could assimilate experimental data into the process of numerically solving the Navier-Stokes equations, but the approach was limited because it required the use of specific finite element methods for solving all model equations and did not support alternative numerical approximation methods. The new approach presented here allows virtually any numerical method to be used for approximately solving the Navier-Stokes equations, and then the WLSFEM is used to combine the experimental data with the numerical solution of the model equations in a final step. The approach dynamically adjusts the influence of the experimental data on the numerical solution so that more accurate data are more closely matched by the final solution and less accurate data are not closely matched. The new approach is demonstrated on different test problems and provides significantly reduced computational costs compared with many previous methods for data assimilation. Copyright © 2016 John Wiley & Sons, Ltd.
Tripodi, Marina; Siano, Maria Anna; Mandato, Claudia; De Anseris, Anna Giulia Elena; Quitadamo, Paolo; Guercio Nuzio, Salvatore; Viggiano, Claudia; Fasolino, Francesco; Bellopede, Annalisa; Annunziata, Maria; Massa, Grazia; Pepe, Francesco Maria; De Chiara, Maria; Siani, Paolo; Vajro, Pietro
2017-08-30
The term "humanization" denotes the process by which people try to make something more human and civilized, more in line with what is believed to be human nature. The humanization of care is an important and not yet well-defined issue that includes a wide range of aspects related to the approach to the patient and to care modalities. In pediatrics, the humanization concept is even vaguer because of the dual involvement of both the child and his or her family and the existence of multiple proposed models. The present study aims to analyze the main existing humanization models for pediatric care and the tools for assessing its grade. The main humanization-of-care programs have been elaborated and developed in both America (Brazil, USA) and Europe. The North American and European models specifically concern pediatric care, while the model developed in Brazil is part of a broader program aimed at all age groups. The first emphasizes the importance of the family in child care; the second emphasizes the child's right to be a leader, to be heard, and to express an opinion on his or her own care program. Several tools have been created and used to evaluate humanization-of-care programs and related aspects; none, however, have been compared with one another. The major models of humanization of care and the related assessment tools reviewed here highlight the urgent need for a more unifying approach, which may help in realizing health care programs closer to the needs of the young patient and his or her family.
A novel model for estimating organic chemical bioconcentration in agricultural plants
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hung, H.; Mackay, D.; Di Guardo, A.
1995-12-31
There is increasing recognition that much human and wildlife exposure to organic contaminants can be traced through the food chain to bioconcentration in vegetation. For risk assessment, there is a need for an accurate model to predict organic chemical concentrations in plants. Existing models range from relatively simple correlations of concentrations using octanol-water or octanol-air partition coefficients, to complex models involving extensive physiological data. To satisfy the need for a relatively accurate model of intermediate complexity, a novel approach has been devised to predict organic chemical concentrations in agricultural plants as a function of soil and air concentrations, without the need for extensive plant physiological data. The plant is treated as three compartments, namely, leaves, roots and stems (including fruit and seeds). Data readily available from the literature, including chemical properties, volume, density and composition of each compartment; metabolic and growth rate of plant; and readily obtainable environmental conditions at the site are required as input. Results calculated from the model are compared with observed and experimentally-determined concentrations. It is suggested that the model, which includes a physiological database for agricultural plants, gives acceptably accurate predictions of chemical partitioning between plants, air and soil.
Development of a 3D Stream Network and Topography for Improved Large-Scale Hydraulic Modeling
NASA Astrophysics Data System (ADS)
Saksena, S.; Dey, S.; Merwade, V.
2016-12-01
Most digital elevation models (DEMs) used for hydraulic modeling do not include channel bed elevations. As a result, the DEMs are complimented with additional bathymetric data for accurate hydraulic simulations. Existing methods to acquire bathymetric information through field surveys or through conceptual models are limited to reach-scale applications. With an increasing focus on large scale hydraulic modeling of rivers, a framework to estimate and incorporate bathymetry for an entire stream network is needed. This study proposes an interpolation-based algorithm to estimate bathymetry for a stream network by modifying the reach-based empirical River Channel Morphology Model (RCMM). The effect of a 3D stream network that includes river bathymetry is then investigated by creating a 1D hydraulic model (HEC-RAS) and 2D hydrodynamic model (Integrated Channel and Pond Routing) for the Upper Wabash River Basin in Indiana, USA. Results show improved simulation of flood depths and storage in the floodplain. Similarly, the impact of river bathymetry incorporation is more significant in the 2D model as compared to the 1D model.
The Role of Inertia in Modeling Decisions from Experience with Instance-Based Learning
Dutt, Varun; Gonzalez, Cleotilde
2012-01-01
One form of inertia is the tendency to repeat the last decision irrespective of the obtained outcomes while making decisions from experience (DFE). A number of computational models based upon the Instance-Based Learning Theory, a theory of DFE, have included different inertia implementations and have shown to simultaneously account for both risk-taking and alternations between alternatives. The role that inertia plays in these models, however, is unclear as the same model without inertia is also able to account for observed risk-taking quite well. This paper demonstrates the predictive benefits of incorporating one particular implementation of inertia in an existing IBL model. We use two large datasets, estimation and competition, from the Technion Prediction Tournament involving a repeated binary-choice task to show that incorporating an inertia mechanism in an IBL model enables it to account for the observed average risk-taking and alternations. Including inertia, however, does not help the model to account for the trends in risk-taking and alternations over trials compared to the IBL model without the inertia mechanism. We generalize the two IBL models, with and without inertia, to the competition set by using the parameters determined in the estimation set. The generalization process demonstrates both the advantages and disadvantages of including inertia in an IBL model.
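A toy sketch of the inertia mechanism described above: with some probability the last choice is simply repeated; otherwise options are compared by a recency-weighted blend of stored outcome instances. This illustrates the general IBL-plus-inertia idea only; the function names, blending rule, and parameter values are invented, not the authors' calibrated model.

```python
import random

def ibl_choice_simulation(payoffs, trials=100, p_inertia=0.3, decay=0.5, seed=0):
    """Simulate repeated binary choice with instance-based valuation
    plus inertia. `payoffs` maps option -> zero-argument outcome sampler."""
    rng = random.Random(seed)
    memory = {opt: [] for opt in payoffs}          # stored (trial, outcome) instances
    last, choices = None, []
    for t in range(trials):
        if last is not None and rng.random() < p_inertia:
            choice = last                          # inertia: repeat last decision
        else:
            def blended(opt):
                if not memory[opt]:
                    return 0.0                     # unexplored options look neutral
                w = [(t - i + 1) ** (-decay) for i, _ in memory[opt]]  # recency weights
                return sum(wi * o for wi, (_, o) in zip(w, memory[opt])) / sum(w)
            choice = max(payoffs, key=blended)     # choose highest blended value
        outcome = payoffs[choice]()
        memory[choice].append((t, outcome))
        choices.append(choice)
        last = choice
    return choices
```

Setting `p_inertia=0` recovers the no-inertia variant, so the two models the paper compares differ by a single parameter in this sketch.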
Regional-scale calculation of the LS factor using parallel processing
NASA Astrophysics Data System (ADS)
Liu, Kai; Tang, Guoan; Jiang, Ling; Zhu, A.-Xing; Yang, Jianyi; Song, Xiaodong
2015-05-01
With the increase of data resolution and the increasing application of the USLE over large areas, the existing serial implementation of algorithms for computing the LS factor is becoming a bottleneck. In this paper, a parallel processing model based on the message passing interface (MPI) is presented for the calculation of the LS factor, so that massive datasets at a regional scale can be processed efficiently. The parallel model contains algorithms for calculating flow direction, flow accumulation, drainage network, slope, slope length and the LS factor. According to the existence of data dependence, the algorithms are divided into local algorithms and global algorithms. Parallel strategies are designed according to the characteristics of the algorithms, including a decomposition method that maintains the integrity of the results, an optimized workflow that reduces the time spent exporting unnecessary intermediate data, and a buffer-communication-computation strategy that improves communication efficiency. Experiments on a multi-node system show that the proposed parallel model allows efficient calculation of the LS factor at a regional scale with a massive dataset.
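The per-cell kernel that such a parallel model distributes across decomposed tiles can be sketched with one common cell-based LS formulation (a Moore-Burch style expression built from flow accumulation and slope; the paper's exact equations and exponents may differ):

```python
import numpy as np

def ls_factor(flow_acc, slope_deg, cell_size=30.0):
    """Per-cell LS factor from flow accumulation (cells) and slope
    (degrees). Uses a Moore-Burch style form with illustrative
    exponents 0.4 / 1.3; works on scalars or numpy arrays."""
    slope_rad = np.radians(slope_deg)
    # upslope contributing area per unit contour width
    specific_area = (flow_acc + 1.0) * cell_size
    return ((specific_area / 22.13) ** 0.4) * ((np.sin(slope_rad) / 0.0896) ** 1.3)
```

Because each cell's value depends only on already-computed flow accumulation and slope grids, this step is "local" in the paper's local/global classification, which is what makes tile-wise MPI decomposition straightforward.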
Omics analysis of mouse brain models of human diseases.
Paban, Véronique; Loriod, Béatrice; Villard, Claude; Buee, Luc; Blum, David; Pietropaolo, Susanna; Cho, Yoon H; Gory-Faure, Sylvie; Mansour, Elodie; Gharbi, Ali; Alescio-Lautier, Béatrice
2017-02-05
The identification of common gene/protein profiles related to brain alterations, if they exist, may indicate the convergence of the pathogenic mechanisms driving brain disorders. Six genetically engineered mouse lines modelling neurodegenerative diseases and neuropsychiatric disorders were considered. Omics approaches, including transcriptomic and proteomic methods, were used. The gene/protein lists were used for inter-disease comparisons and further functional and network investigations. When the inter-disease comparison was performed using the gene symbol identifiers, the number of genes/proteins involved in multiple diseases decreased rapidly. Thus, no genes/proteins were shared by all 6 mouse models. Only one gene/protein (Gfap) was shared among 4 disorders, providing strong evidence that a common molecular signature does not exist among brain diseases. The inter-disease comparison of functional processes showed the involvement of a few major biological processes indicating that brain diseases of diverse aetiologies might utilize common biological pathways in the nervous system, without necessarily involving similar molecules. Copyright © 2016 Elsevier B.V. All rights reserved.
Origin of asteroids and the missing planet
NASA Technical Reports Server (NTRS)
Opik, E. J.
1977-01-01
Consideration is given to Ovenden's (1972) theory concerning the existence of a planet of 90 earth masses which existed from the beginning of the solar system and then disappeared 16 million years ago, leaving only asteroids. His model for secular perturbations is reviewed along with the principle of least interaction action (1972, 1973, 1975) on which the model is based. It is suggested that the structure of the asteroid belt and the origin of meteorites are associated with the vanished planet. A figure of 0.001 earth masses is proposed as a close estimate of the mass of the asteroidal belt. The hypothesis that the planet was removed through an explosion is discussed, noting the possible origin of asteroids in such a manner. Various effects of the explosion are postulated, including the direct impact of fragments on the earth, their impact on the sun and its decreased radiation, and the direct radiation of the explosion. A model for the disappearance of the planet by ejection in a gravitational encounter with a passing mass is also described.
DebriSat: The New Hypervelocity Impact Test for Satellite Breakup Fragment Characterization
NASA Technical Reports Server (NTRS)
Cowardin, Heather
2015-01-01
DebriSat is intended to replicate a hypervelocity fragmentation event using modern-day spacecraft materials and construction techniques, in order to improve the existing DoD and NASA breakup models; it is designed to be representative of modern LEO satellites. Major design decisions were reviewed and approved by Aerospace subject matter experts from different disciplines. DebriSat includes seven major subsystems: attitude determination and control system (ADCS), command and data handling (C&DH), electrical power system (EPS), payload, propulsion, telemetry tracking and command (TT&C), and thermal management. To reduce cost, most components are emulated based on existing designs of flight hardware and fabricated with the same materials. A key laboratory-based test supporting the development of the DoD and NASA satellite breakup models, the Satellite Orbital debris Characterization Impact Test (SOCIT), was conducted at AEDC in 1992. Breakup models based on SOCIT have supported many applications and matched on-orbit events reasonably well over the years.
Linde, Niklas; Ricci, Tullio; Baron, Ludovic; Shakas, Alexis; Berrino, Giovanna
2017-08-16
Existing 3-D density models of the Somma-Vesuvius volcanic complex (SVVC), Italy, largely disagree. Despite the scientific and socioeconomic importance of Vesuvius, there is no reliable 3-D density model of the SVVC. A considerable uncertainty prevails concerning the presence (or absence) of a dense body underlying the Vesuvius crater (1944 eruption) that is implied from extensive seismic investigations. We have acquired relative gravity measurements at 297 stations, including measurements in difficult-to-access areas (e.g., the first-ever measurements in the crater). In agreement with seismic investigations, the simultaneous inversion of these and historic data resolves a high-density body that extends from the surface of the Vesuvius crater down to depths that exceed 2 km. A 1.5-km radius horseshoe-shaped dense feature (open in the southwestern sector) reinforces the existing model of groundwater circulation within the SVVC. Based on its volcano-tectonic evolution, we interpret volcanic structures that have never been imaged before.
Basal and thermal control mechanisms of the Ragnhild glaciers, East Antarctica
NASA Astrophysics Data System (ADS)
Pattyn, Frank; de Brabander, Sang; Huyghe, Ann
The Ragnhild glaciers are three enhanced-flow features situated between the Sør Rondane and Yamato Mountains in eastern Dronning Maud Land, Antarctica. We investigate the glaciological mechanisms controlling their existence and behavior, using a three-dimensional numerical thermomechanical ice-sheet model including higher-order stress gradients. This model is further extended with a steady-state model of subglacial water flow, based on the hydraulic potential gradient. Both static and dynamic simulations are capable of reproducing the enhanced ice-flow features. Although basal topography is responsible for the existence of the flow pattern, thermomechanical effects and basal sliding seem to locally soften and lubricate the ice in the main trunks. Lateral drag is a contributing factor in balancing the driving stress, as shear margins can be traced over a distance of hundreds of kilometers along west Ragnhild glacier. Different basal sliding scenarios show that central Ragnhild glacier stagnates as west Ragnhild glacier accelerates and progressively drains the whole catchment area by ice and water piracy.
Artificial neural networks in mammography interpretation and diagnostic decision making.
Ayer, Turgay; Chen, Qiushi; Burnside, Elizabeth S
2013-01-01
Screening mammography is the most effective means for early detection of breast cancer. Although general rules for discriminating malignant and benign lesions exist, radiologists cannot perfectly detect and classify all lesions as malignant or benign, for reasons that include, but are not limited to, overlap of the features that distinguish malignancy, difficulty in estimating disease risk, and variability in recommended management. When predictive variables are numerous and interact, ad hoc decision making strategies based on experience and memory may lead to systematic errors and variability in practice. The integration of computer models to help radiologists increase the accuracy of mammography examinations in diagnostic decision making has gained increasing attention in the last two decades. In this study, we provide an overview of one of the most commonly used models, artificial neural networks (ANNs), in mammography interpretation and diagnostic decision making and discuss important features in mammography interpretation. We conclude by discussing several common limitations of existing research on ANN-based detection and diagnostic models and provide possible future research directions.
Hulvershorn, Leslie A; Quinn, Patrick D; Scott, Eric L
2015-01-01
The past several decades have seen dramatic growth in empirically supported treatments for adolescent substance use disorders (SUDs), yet even the most well-established approaches struggle to produce large or long-lasting improvements. These difficulties may stem, in part, from the high rates of comorbidity between SUDs and other psychiatric disorders. We critically reviewed the treatment outcome literature for adolescents with co-occurring SUDs and internalizing disorders. Our review identified components of existing treatments that might be included in an integrated, evidence-based approach to the treatment of SUDs and internalizing disorders. An effective program may involve careful assessment, inclusion of parents or guardians, and tailoring of interventions via a modular strategy. The existing literature guides the development of a conceptual evidence-based, modular treatment model targeting adolescents with co-occurring internalizing and SUDs. With empirical study, such a model may better address treatment outcomes for both disorder types in adolescents.
Horne, Avril C; Szemis, Joanna M; Webb, J Angus; Kaur, Simranjit; Stewardson, Michael J; Bond, Nick; Nathan, Rory
2018-03-01
One important aspect of adaptive management is the clear and transparent documentation of hypotheses, together with the use of predictive models (complete with any assumptions) to test those hypotheses. Documentation of such models can improve the ability to learn from management decisions and supports dialog between stakeholders. A key challenge is how best to represent the existing scientific knowledge to support decision-making. Such challenges are currently emerging in the field of environmental water management in Australia, where managers are required to prioritize the delivery of environmental water on an annual basis, using a transparent and evidence-based decision framework. We argue that the development of models of ecological responses to environmental water use needs to support both the planning and implementation cycles of adaptive management. Here we demonstrate an approach based on the use of Conditional Probability Networks to translate existing ecological knowledge into quantitative models that include temporal dynamics to support adaptive environmental flow management. It equally extends to other applications where knowledge is incomplete, but decisions must still be made.
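A Conditional Probability Network node of the kind described above can be sketched as nested conditional-probability tables; the two-node chain (flow delivery influencing vegetation condition, which influences a breeding response) and all probabilities below are hypothetical, purely for illustration:

```python
# Hypothetical conditional probability tables; the variables and numbers
# are invented to illustrate the CPN idea, not taken from the paper.
p_veg_given_flow = {   # P(vegetation in good condition | flow delivered?)
    True: 0.8,
    False: 0.3,
}
p_breed_given_veg = {  # P(breeding success | vegetation in good condition?)
    True: 0.7,
    False: 0.2,
}

def p_breeding(flow_delivered: bool) -> float:
    """Marginalise over the intermediate vegetation state."""
    p_veg = p_veg_given_flow[flow_delivered]
    return p_veg * p_breed_given_veg[True] + (1 - p_veg) * p_breed_given_veg[False]

# Higher expected breeding success when environmental water is delivered.
print(p_breeding(True), p_breeding(False))
```

Chaining such tables through time is what gives the approach its temporal dynamics for annual prioritization decisions.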
Leveraging constraints and biotelemetry data to pinpoint repetitively used spatial features
Brost, Brian M.; Hooten, Mevin B.; Small, Robert J.
2016-01-01
Satellite telemetry devices collect valuable information concerning the sites visited by animals, including the location of central places like dens, nests, rookeries, or haul‐outs. Existing methods for estimating the location of central places from telemetry data require user‐specified thresholds and ignore common nuances like measurement error. We present a fully model‐based approach for locating central places from telemetry data that accounts for multiple sources of uncertainty and uses all of the available locational data. Our general framework consists of an observation model to account for large telemetry measurement error and animal movement, and a highly flexible mixture model specified using a Dirichlet process to identify the location of central places. We also quantify temporal patterns in central place use by incorporating ancillary behavioral data into the model; however, our framework is also suitable when no such behavioral data exist. We apply the model to a simulated data set as proof of concept. We then illustrate our framework by analyzing an Argos satellite telemetry data set on harbor seals (Phoca vitulina) in the Gulf of Alaska, a species that exhibits fidelity to terrestrial haul‐out sites.
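The Dirichlet-process mixture idea can be sketched with scikit-learn's truncated variational approximation; the telemetry fixes and site locations below are simulated, and the sketch omits the paper's observation (measurement-error) and movement models:

```python
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

# Toy stand-in for the haul-out problem: location fixes scattered around
# two hypothetical central places.
rng = np.random.default_rng(0)
fixes = np.vstack([
    rng.normal(loc=[0.0, 0.0], scale=0.1, size=(100, 2)),  # "site A"
    rng.normal(loc=[5.0, 5.0], scale=0.1, size=(100, 2)),  # "site B"
])

# A truncated Dirichlet-process mixture lets the data decide how many
# central places are in use: surplus components receive near-zero weight.
dpm = BayesianGaussianMixture(
    n_components=10,  # truncation level, an upper bound on central places
    weight_concentration_prior_type="dirichlet_process",
    random_state=0,
).fit(fixes)

# Components carrying appreciable weight approximate the central places.
centers = dpm.means_[dpm.weights_ > 0.05]
print(np.round(centers, 1))
```

The fitted component means land near the two simulated sites, with unused components pruned by the Dirichlet-process prior.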
Meta-Theoretical Contributions to the Constitution of a Model-Based Didactics of Science
NASA Astrophysics Data System (ADS)
Ariza, Yefrin; Lorenzano, Pablo; Adúriz-Bravo, Agustín
2016-10-01
There is nowadays consensus in the community of didactics of science (i.e. science education understood as an academic discipline) regarding the need to include the philosophy of science in didactical research, science teacher education, curriculum design, and the practice of science education in all educational levels. Some authors have identified an ever-increasing use of the concept of 'theoretical model', stemming from the so-called semantic view of scientific theories. However, it can be recognised that, in didactics of science, there are over-simplified transpositions of the idea of model (and of other meta-theoretical ideas). In this sense, contemporary philosophy of science is often blurred or distorted in the science education literature. In this paper, we address the discussion around some meta-theoretical concepts that are introduced into didactics of science due to their perceived educational value. We argue for the existence of a 'semantic family', and we characterise four different versions of semantic views existing within the family. In particular, we seek to contribute to establishing a model-based didactics of science mainly supported in this semantic family.
A Computational Model for Predicting Gas Breakdown
NASA Astrophysics Data System (ADS)
Gill, Zachary
2017-10-01
Pulsed-inductive discharges are a common method of producing a plasma. They provide a mechanism for quickly and efficiently generating a large volume of plasma for rapid use and are seen in applications including propulsion, fusion power, and high-power lasers. However, some common designs see a delayed response time due to the plasma forming when the magnitude of the magnetic field in the thruster is at a minimum. New designs are difficult to evaluate due to the amount of time needed to construct a new geometry and the high monetary cost of changing the power generation circuit. To more quickly evaluate new designs and better understand the shortcomings of existing designs, a computational model is developed. This model uses a modified single-electron model as the basis for a Mathematica code to determine how the energy distribution in a system changes with respect to time and location. By analyzing this energy distribution, the approximate time and location of initial plasma breakdown can be predicted. The results from this code are then compared to existing data to show its validity and shortcomings. Missouri S&T APLab.
The role of simulation in neurosurgery.
Rehder, Roberta; Abd-El-Barr, Muhammad; Hooten, Kristopher; Weinstock, Peter; Madsen, Joseph R; Cohen, Alan R
2016-01-01
In an era of residency duty-hour restrictions, there has been a recent effort to implement simulation-based training methods in neurosurgery teaching institutions. Several surgical simulators have been developed, ranging from physical models to sophisticated virtual reality systems. To date, there is a paucity of information describing the clinical benefits of existing simulators and the assessment strategies to help implement them into neurosurgical curricula. Here, we present a systematic review of the current models of simulation and discuss the state-of-the-art and future directions for simulation in neurosurgery. Retrospective literature review. Multiple simulators have been developed for neurosurgical training, including those for minimally invasive procedures, vascular, skull base, pediatric, tumor resection, functional neurosurgery, and spine surgery. The pros and cons of existing systems are reviewed. Advances in imaging and computer technology have led to the development of different simulation models to complement traditional surgical training. Sophisticated virtual reality (VR) simulators with haptic feedback and impressive imaging technology have provided novel options for training in neurosurgery. Breakthrough training simulation using 3D printing technology holds promise for future simulation practice, providing high-fidelity patient-specific models to complement residency surgical learning.
Characterisation and modelling of washover fans
Donnelly, Chantal; Sallenger, Asbury H.
2007-01-01
Pre- and post-storm topography and aerial photography, collected in regions where new washover fans were formed, were studied to determine the extent of morphologic, vegetative and anthropogenic control on washover shape and extent. When overwash is funnelled through a gap in a dune ridge and then spreads laterally on the back barrier, decelerating and depositing sediment, it forms washover fans. Fans were shown to primarily occur at pre-existing gaps in the foredune. During overwash, these gaps, or overwash throats, widened and deepened. The shape and extent of each fan was shown to depend not only on the pre-storm topography, but also on vegetation and on the existence of beach tracks, roads and other anthropogenic influences. The cross-shore overwash profile change model by Larson et al. and Donnelly et al. was modified to take pre-storm throat widths and a lateral spreading angle estimated from the pre-storm topography as inputs, and was tested using cross-shore profiles through the fan centres. These new inputs make the model more generalised, such that the calibrated model is applicable to a wider range of cross-shore profiles.
Classification and unification of the microscopic deterministic traffic models.
Yang, Bo; Monterola, Christopher
2015-10-01
We identify a universal mathematical structure in microscopic deterministic traffic models (with identical drivers), and thus we show that all such existing models in the literature, including both the two-phase and three-phase models, can be understood as special cases of a master model by expansion around a set of well-defined ground states. This allows any two traffic models to be properly compared and identified. The three-phase models are characterized by the vanishing of leading orders of expansion within a certain density range, and as an example the popular intelligent driver model is shown to be equivalent to a generalized optimal velocity (OV) model. We also explore the diverse solutions of the generalized OV model that can be important both for understanding human driving behaviors and algorithms for autonomous driverless vehicles.
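A minimal optimal velocity (OV) car-following model, one member of the model class discussed above, can be sketched as follows; the Bando-type OV function and all parameters are illustrative, not taken from the paper:

```python
import numpy as np

def ov_function(gap, v_max=30.0, d_safe=25.0, width=10.0):
    """Desired speed as a function of headway (gap to the car ahead)."""
    return 0.5 * v_max * (1.0 + np.tanh((gap - d_safe) / width))

def step(x, v, a=1.0, dt=0.1, road_length=1000.0):
    """One explicit-Euler step for N identical cars on a ring road."""
    gap = np.roll(x, -1) - x
    gap[-1] += road_length           # leader of the last car wraps around
    dv = a * (ov_function(gap) - v)  # relax toward the desired speed
    return x + v * dt, v + dv * dt

x = np.linspace(0.0, 900.0, 10)  # 10 cars, evenly spaced (gap = 100 m)
v = np.zeros(10)                 # starting from rest
for _ in range(2000):
    x, v = step(x, v)
print(round(float(v.mean()), 2))  # 30.0: the uniform-flow ground state
```

The evenly spaced, constant-speed solution reached here is exactly the kind of well-defined ground state around which the paper's master-model expansion is performed.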
Automated web service composition supporting conditional branch structures
NASA Astrophysics Data System (ADS)
Wang, Pengwei; Ding, Zhijun; Jiang, Changjun; Zhou, Mengchu
2014-01-01
The creation of value-added services by automatic composition of existing ones is gaining significant momentum as the potential silver bullet in service-oriented architecture. However, service composition faces two difficulties. First, users' needs present such characteristics as diversity, uncertainty and personalisation; second, the existing services run in a real-world environment that is highly complex and dynamically changing. These difficulties may cause the emergence of nondeterministic choices in the process of service composition, which goes beyond what existing automated service composition techniques can handle. In most existing methods, the process model of a composite service includes sequence constructs only. This article presents a method to introduce conditional branch structures into the process model of a composite service when needed, in order to satisfy users' diverse and personalised needs and adapt to the dynamic changes of the real-world environment. UML activity diagrams are used to represent dependencies in composite services. Two types of user preferences, ignored by previous work, are considered in this article, and a simple programming-language-style expression is adopted to describe them. Two different algorithms are presented to deal with different situations. A real-life case is provided to illustrate the proposed concepts and methods.
System Reliability-Based Design Optimization Under Input and Model Uncertainties
2014-02-02
Binomial test statistics using Psi functions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bowman, Kimiko O.
2007-01-01
For the negative binomial model (probability generating function (p + 1 - pt)^(-k)), a logarithmic derivative is the Psi function difference ψ(k + x) - ψ(k); this and its derivatives lead to a test statistic to decide on the validity of a specified model. The test statistic uses a data base, so a comparison is available between theory and application. Note that the test function is not dominated by outliers. Applications to (i) Fisher's tick data, (ii) accidents data, and (iii) Weldon's dice data are included.
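The Psi (digamma) function difference at the heart of the statistic can be checked numerically; for integer x it reduces to a finite harmonic-type sum:

```python
from scipy.special import psi  # the digamma function

# For integer x, psi(k + x) - psi(k) = 1/k + 1/(k+1) + ... + 1/(k+x-1),
# which is the quantity described in the abstract for the negative binomial
# model with index k. The values of k and x below are arbitrary examples.
def psi_difference(k, x):
    return psi(k + x) - psi(k)

k, x = 2.0, 3
direct = sum(1.0 / (k + i) for i in range(x))  # 1/2 + 1/3 + 1/4
print(abs(psi_difference(k, x) - direct) < 1e-12)  # True
```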
NASA Technical Reports Server (NTRS)
Wilson, R. E.
1981-01-01
Aerodynamic developments for vertical axis and horizontal axis wind turbines are given that relate to the performance and aerodynamic loading of these machines. Included are: (1) a fixed wake aerodynamic model of the Darrieus vertical axis wind turbine; (2) experimental results that suggest the existence of a laminar flow Darrieus vertical axis turbine; (3) a simple aerodynamic model for the turbulent windmill/vortex ring state of horizontal axis rotors; and (4) a yawing moment of a rigid hub horizontal axis wind turbine that is related to blade coning.
Impacts of Freshwater on the Seasonal Variations of Surface Salinity in the Caspian Sea
2010-01-01
…component of a global ocean system. It is included neither in high-resolution eddy-resolving ocean models nor in existing operational models. The work was carried out under program element 601153N as part of the NRL 6.1 Global Remote Littoral Forcing via Deep Water Pathways project and is contribution NRL/JA/7320/08/8235.
A Model-Driven Visualization Tool for Use with Model-Based Systems Engineering Projects
NASA Technical Reports Server (NTRS)
Trase, Kathryn; Fink, Eric
2014-01-01
Model-Based Systems Engineering (MBSE) promotes increased consistency between a system's design and its design documentation through the use of an object-oriented system model. The creation of this system model facilitates data presentation by providing a mechanism from which information can be extracted by automated manipulation of model content. Existing MBSE tools enable model creation, but are often too complex for the unfamiliar model viewer to easily use. These tools do not yet provide many opportunities for easing into the development and use of a system model when system design documentation already exists. This study creates a Systems Modeling Language (SysML) Document Traceability Framework (SDTF) for integrating design documentation with a system model, and develops an Interactive Visualization Engine for SysML Tools (InVEST), which exports consistent, clear, and concise views of SysML model data. These exported views are each meaningful to a variety of project stakeholders with differing subjects of concern and depth of technical involvement. InVEST allows a model user to generate multiple views and reports from an MBSE model, including wiki pages and interactive visualizations of data. System data can also be filtered to present only the information relevant to the particular stakeholder, resulting in a view that is both consistent with the larger system model and other model views. Viewing the relationships between system artifacts and documentation, and filtering through data to see specialized views, improves the value of the system as a whole, as data becomes information.
Harrison, David A; Parry, Gareth J; Carpenter, James R; Short, Alasdair; Rowan, Kathy
2007-04-01
To develop a new model to improve risk prediction for admissions to adult critical care units in the UK. Prospective cohort study. The setting was 163 adult, general critical care units in England, Wales, and Northern Ireland, December 1995 to August 2003. Patients were 216,626 critical care admissions. None. The performance of different approaches to modeling physiologic measurements was evaluated, and the best methods were selected to produce a new physiology score. This physiology score was combined with other information relating to the critical care admission-age, diagnostic category, source of admission, and cardiopulmonary resuscitation before admission-to develop a risk prediction model. Modeling interactions between diagnostic category and physiology score enabled the inclusion of groups of admissions that are frequently excluded from risk prediction models. The new model showed good discrimination (mean c index 0.870) and fit (mean Shapiro's R 0.665, mean Brier's score 0.132) in 200 repeated validation samples and performed well when compared with recalibrated versions of existing published risk prediction models in the cohort of patients eligible for all models. The hypothesis of perfect fit was rejected for all models, including the Intensive Care National Audit & Research Centre (ICNARC) model, as is to be expected in such a large cohort. The ICNARC model demonstrated better discrimination and overall fit than existing risk prediction models, even following recalibration of these models. We recommend it be used to replace previously published models for risk adjustment in the UK.
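The two performance measures reported above, the c index (equivalent to ROC AUC for a binary outcome) for discrimination and Brier's score for overall accuracy, can be illustrated on fabricated data:

```python
import numpy as np
from sklearn.metrics import roc_auc_score, brier_score_loss

# The outcomes and predicted risks below are invented for illustration;
# they are not the ICNARC cohort data.
y_true = np.array([0, 0, 0, 1, 0, 1, 1, 0, 1, 1])       # observed outcome
y_pred = np.array([0.1, 0.2, 0.15, 0.8, 0.3, 0.7,
                   0.9, 0.4, 0.6, 0.85])                 # predicted risk

c_index = roc_auc_score(y_true, y_pred)   # 1.0 here: every case is ranked
                                          # above every non-case
brier = brier_score_loss(y_true, y_pred)  # mean squared error of the risks
print(c_index, brier)
```

Higher c index (toward 1) indicates better discrimination, while lower Brier score (toward 0) indicates better overall fit, matching the direction of the figures quoted for the ICNARC model.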