CSDMS2.0: Computational Infrastructure for Community Surface Dynamics Modeling
NASA Astrophysics Data System (ADS)
Syvitski, J. P.; Hutton, E.; Peckham, S. D.; Overeem, I.; Kettner, A.
2012-12-01
The Community Surface Dynamics Modeling System (CSDMS) is an NSF-supported, international and community-driven program that seeks to transform the science and practice of earth-surface dynamics modeling. CSDMS integrates a diverse community of more than 850 geoscientists representing 360 international institutions (academic, government, industry) from 60 countries and is supported by a CSDMS Interagency Committee (22 Federal agencies) and a CSDMS Industrial Consortia (18 companies). CSDMS presently distributes more than 200 Open Source models and modeling tools, access to high performance computing clusters in support of developing and running models, and a suite of products for education and knowledge transfer. CSDMS software architecture employs frameworks and services that convert stand-alone models into flexible "plug-and-play" components to be assembled into larger applications. CSDMS2.0 will support model applications within a web browser, on a wider variety of computational platforms, and on other high performance computing clusters to ensure robustness and sustainability of the framework. Conversion of stand-alone models into "plug-and-play" components will employ automated wrapping tools. Methods for quantifying model uncertainty are being adapted as part of the modeling framework. Benchmarking data is being incorporated into the CSDMS modeling framework to support model inter-comparison. Finally, a robust mechanism for ingesting and utilizing semantic mediation databases is being developed within the Modeling Framework. Six new community initiatives are being pursued: 1) an earth-ecosystem modeling initiative to capture ecosystem dynamics and ensuing interactions with landscapes, 2) a geodynamics initiative to investigate the interplay among climate, geomorphology, and tectonic processes, 3) an Anthropocene modeling initiative, to incorporate mechanistic models of human influences, 4) a coastal vulnerability modeling initiative, with emphasis on deltas and their multiple threats and stressors, 5) a continental margin modeling initiative, to capture extreme oceanic and atmospheric events generating turbidity currents in the Gulf of Mexico, and 6) a CZO Focus Research Group, to develop compatibility between CSDMS architecture and protocols and Critical Zone Observatory-developed models and data.
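The "plug-and-play" component idea described above can be made concrete with a minimal sketch of a BMI-style control interface (initialize/update/get_value/set_value/finalize). The component class, its configuration keys, and the toy diffusion model below are all invented for illustration and are not CSDMS framework code.

```python
# Minimal sketch of a BMI-style, plug-and-play component (hypothetical names,
# not CSDMS framework code). A framework can drive any model exposing this
# small control surface: initialize, update, get_value, set_value, finalize.

class DiffusionComponent:
    """Toy 1-D hillslope-diffusion model wrapped as a component."""

    def initialize(self, config):
        self.dx = config.get("dx", 1.0)
        self.dt = config.get("dt", 0.1)
        self.kappa = config.get("kappa", 0.01)
        self.z = list(config.get("elevation", [0.0] * 10))
        self.time = 0.0

    def update(self):
        # One explicit step of dz/dt = kappa * d2z/dx2.
        z = self.z
        new_z = z[:]
        for i in range(1, len(z) - 1):
            new_z[i] = z[i] + self.kappa * self.dt * (z[i - 1] - 2 * z[i] + z[i + 1]) / self.dx ** 2
        self.z = new_z
        self.time += self.dt

    def get_value(self, name):
        return {"elevation": self.z, "time": self.time}[name]

    def set_value(self, name, value):
        if name == "elevation":
            self.z = list(value)

    def finalize(self):
        pass

comp = DiffusionComponent()
comp.initialize({"elevation": [0.0, 0.0, 5.0, 0.0, 0.0], "kappa": 0.05})
for _ in range(100):
    comp.update()
print(comp.get_value("time"), comp.get_value("elevation"))
```

A framework couples such components by calling update() on each in turn and exchanging values through get_value()/set_value() between steps.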
Characteristics code for shock initiation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Partom, Y.
1986-10-01
We developed SHIN, a characteristics code for shock initiation studies. We describe in detail the equations of state, reaction model, rate equations, and numerical difference equations that SHIN incorporates. SHIN uses the previously developed surface burning reaction model, which better represents the shock initiation process in TATB than do bulk reaction models. A large number of computed simulations show that the code is a reliable and efficient tool for shock initiation studies. A parametric study shows the effect on build-up and run distance to detonation of (1) type of boundary condition, (2) burning velocity curve, (3) shock duration, (4) rise time in ramp loading, (5) initial density (or porosity) of the explosive, (6) initial temperature, and (7) grain size. 29 refs., 65 figs.
ERIC Educational Resources Information Center
Santally, Mohammad Issack; Cooshna-Naik, Dorothy; Conruyt, Noel; Wing, Caroline Koa
2015-01-01
This paper describes a social partnership model based on the living lab concept to promote the professional development of educators through formal and informal capacity-building initiatives. The aim is to have a broader impact on society through community outreach educational initiatives. A Living Lab is an environment for user-centered…
Williams, Dustin L.; Haymond, Bryan S.; Woodbury, Kassie L.; Beck, J. Peter; Moore, David E.; Epperson, R. Tyler; Bloebaum, Roy D.
2012-01-01
Currently, the majority of animal models that are used to study biofilm-related infections utilize planktonic bacterial cells as initial inocula to produce positive signals of infection in biomaterials studies. However, the use of planktonic cells has potentially led to inconsistent results in infection outcomes. In this study, well-established biofilms of methicillin-resistant Staphylococcus aureus (MRSA) were grown and used as initial inocula in an animal model of a Type IIIB open fracture. The goal of the work was to establish, for the first time, a repeatable model of biofilm implant-related osteomyelitis wherein biofilms were used as initial inocula to test combination biomaterials. Results showed that 100% of animals that were treated with biofilms developed osteomyelitis, whereas 0% of animals not treated with biofilm developed infection. The development of this experimental model may lead to an important shift in biofilm and biomaterials research by showing that when biofilms are used as initial inocula, they may provide additional insights into how biofilm-related infections in the clinic develop and how they can be treated with combination biomaterials to eradicate and/or prevent biofilm formation. PMID:22492534
Williams, Dustin L; Haymond, Bryan S; Woodbury, Kassie L; Beck, J Peter; Moore, David E; Epperson, R Tyler; Bloebaum, Roy D
2012-07-01
Currently, the majority of animal models that are used to study biofilm-related infections use planktonic bacterial cells as initial inocula to produce positive signals of infection in biomaterials studies. However, the use of planktonic cells has potentially led to inconsistent results in infection outcomes. In this study, well-established biofilms of methicillin-resistant Staphylococcus aureus were grown and used as initial inocula in an animal model of a Type IIIB open fracture. The goal of the work was to establish, for the first time, a repeatable model of biofilm implant-related osteomyelitis, wherein biofilms were used as initial inocula to test combination biomaterials. Results showed that 100% of animals that were treated with biofilms developed osteomyelitis, whereas 0% of animals not treated with biofilm developed infection. The development of this experimental model may lead to an important shift in biofilm and biomaterials research by showing that when biofilms are used as initial inocula, they may provide additional insights into how biofilm-related infections in the clinic develop and how they can be treated with combination biomaterials to eradicate and/or prevent biofilm formation. Copyright © 2012 Wiley Periodicals, Inc.
NASA Astrophysics Data System (ADS)
Gaševic, Dragan; Djuric, Dragan; Devedžic, Vladan
A relevant initiative from the software engineering community called Model Driven Engineering (MDE) is being developed in parallel with the Semantic Web (Mellor et al. 2003a). The MDE approach to software development suggests that one should first develop a model of the system under study, which is then transformed into the real thing (i.e., an executable software entity). The most important research initiative in this area is the Model Driven Architecture (MDA), which is being developed under the umbrella of the Object Management Group (OMG). This chapter describes the basic concepts of this software engineering effort.
Mathematical modeling of forest fire initiation in three dimensional setting
Valeriy Perminov
2007-01-01
In this study, the formulation and theoretical investigation of the problems of forest fire initiation were carried out, including development of a mathematical model describing heat and mass transfer processes in the atmospheric layer above the ground during crown forest fire initiation, taking their mutual influence into account. Mathematical model of forest fire...
Devil is in the details: Using logic models to investigate program process.
Peyton, David J; Scicchitano, Michael
2017-12-01
Theory-based logic models are commonly developed as part of requirements for grant funding. As a tool to communicate complex social programs, theory-based logic models are an effective visual communication tool. However, after initial development, theory-based logic models are often abandoned and remain in their initial form despite changes in the program process. This paper examines the potential benefits of committing time and resources to revising the initial theory-driven logic model and developing detailed logic models that describe key activities to accurately reflect the program and assist in effective program management. The authors use a funded special education teacher preparation program to exemplify the utility of drill-down logic models. The paper concludes with lessons learned from the iterative revision process and suggests how the process can lead to more flexible and calibrated program management. Copyright © 2017 Elsevier Ltd. All rights reserved.
Asquith, William H.; Roussel, Meghan C.
2007-01-01
Estimation of representative hydrographs from design storms, which are known as design hydrographs, provides for cost-effective, risk-mitigated design of drainage structures such as bridges, culverts, roadways, and other infrastructure. During 2001-07, the U.S. Geological Survey (USGS), in cooperation with the Texas Department of Transportation, investigated runoff hydrographs, design storms, unit hydrographs, and watershed-loss models to enhance design hydrograph estimation in Texas. Design hydrographs ideally should mimic the general volume, peak, and shape of observed runoff hydrographs. Design hydrographs commonly are estimated in part by unit hydrographs. A unit hydrograph is defined as the runoff hydrograph that results from a unit pulse of excess rainfall uniformly distributed over the watershed at a constant rate for a specific duration. A time-distributed, watershed-loss model is required for modeling by unit hydrographs. This report develops a specific time-distributed, watershed-loss model known as an initial-abstraction, constant-loss model. For this watershed-loss model, a watershed is conceptualized to have the capacity to store or abstract an absolute depth of rainfall at and near the beginning of a storm. Depths of total rainfall less than this initial abstraction do not produce runoff. The watershed also is conceptualized to have the capacity to remove rainfall at a constant rate (loss) after the initial abstraction is satisfied. Additional rainfall inputs after the initial abstraction is satisfied contribute to runoff if the rainfall rate (intensity) is larger than the constant loss. The initial-abstraction, constant-loss model thus is a two-parameter model. The initial-abstraction, constant-loss model is investigated through detailed computational and statistical analysis of observed rainfall and runoff data for 92 USGS streamflow-gaging stations (watersheds) in Texas with contributing drainage areas from 0.26 to 166 square miles. The analysis is limited to a previously described, watershed-specific, gamma distribution model of the unit hydrograph. In particular, the initial-abstraction, constant-loss model is tuned to the gamma distribution model of the unit hydrograph. A complex computational analysis of observed rainfall and runoff for the 92 watersheds was done to determine, by storm, optimal values of initial abstraction and constant loss. Optimal parameter values for a given storm were defined as those values that produced a modeled runoff hydrograph with volume equal to the observed runoff hydrograph and also minimized the residual sum of squares of the two hydrographs. Subsequently, the means of the optimal parameters were computed on a watershed-specific basis. These means for each watershed are considered the most representative, are tabulated, and are used in further statistical analyses. Statistical analyses of watershed-specific, initial abstraction and constant loss include documentation of the distribution of each parameter using the generalized lambda distribution. The analyses show that watershed development has substantial influence on initial abstraction and limited influence on constant loss. The means and medians of the 92 watershed-specific parameters are tabulated with respect to watershed development; although they have considerable uncertainty, these parameters can be used for parameter prediction for ungaged watersheds.
The statistical analyses of watershed-specific, initial abstraction and constant loss also include development of predictive procedures for estimation of each parameter for ungaged watersheds. Both regression equations and regression trees for estimation of initial abstraction and constant loss are provided. The watershed characteristics included in the regression analyses are (1) main-channel length, (2) a binary factor representing watershed development, (3) a binary factor representing watersheds with an abundance of rocky and thin-soiled terrain, and (4) curve number.
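As an illustration of the two-parameter watershed-loss model described in this record, the sketch below computes excess rainfall from an incremental hyetograph given an initial abstraction and a constant loss rate; the function name and parameter values are hypothetical, not the USGS procedure or its fitted Texas parameters.

```python
# Sketch of the initial-abstraction, constant-loss watershed-loss model.
# Rainfall first fills the initial abstraction IA; afterwards, rainfall in excess
# of the constant loss CL (per time step) becomes runoff-producing excess.
# Parameter values are hypothetical, not the tabulated Texas values.

def excess_rainfall(rain_depths, ia, cl):
    """rain_depths: incremental rainfall depths (inches per time step);
    ia: initial abstraction (inches); cl: constant loss per time step (inches)."""
    abstracted = 0.0
    excess = []
    for p in rain_depths:
        take = min(p, ia - abstracted)           # satisfy remaining initial abstraction
        abstracted += take
        remaining = p - take
        excess.append(max(remaining - cl, 0.0))  # constant-rate loss thereafter
    return excess

# Example: 1.0 inch of initial abstraction, 0.1 inch/step constant loss.
hyetograph = [0.2, 0.5, 0.8, 0.6, 0.3, 0.1]
print(excess_rainfall(hyetograph, ia=1.0, cl=0.1))
```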
A BRIDGE partnership model for health management education in the Slovak Republic.
West, D J; Krcmery, V; Rusnakova, V; Murgas, M
1998-01-01
An innovative Health Management Education Partnership (HMEP) was initiated to develop management education initiatives through the exchange of information and ideas. Health education efforts, projects and activities exist between the University of Scranton and three strategic partners in the Slovak Republic: Trnava University, the Health Management School and the University of Matej Bel. The BRIDGE model (Building Relationships in Developing and Growing Economies) utilizes several innovative educational initiatives and strategic projects including a professional journal, faculty development, professional development, curriculum development, certification and accreditation, faculty-student exchange and development of educational materials and modules. The BRIDGE organizational structure is reviewed as well as specific workplan objectives to operationalize the HMEP encouraging mutual cooperation, collaboration and sustainability of efforts. The model stresses implementation, monitoring, and evaluation of all initiatives through a strong community effort, focus on research, deployment of educational resources, curriculum modification, development of interpartnership activities, conferences, workshops, fieldwork experiences and study tours. Applied management practices enhance market-oriented solutions to health care delivery problems emphasizing a focus on privatization and entrepreneurship through education.
Uncertainty Analysis Framework - Hanford Site-Wide Groundwater Flow and Transport Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cole, Charles R.; Bergeron, Marcel P.; Murray, Christopher J.
2001-11-09
Pacific Northwest National Laboratory (PNNL) embarked on a new initiative to strengthen the technical defensibility of the predictions being made with a site-wide groundwater flow and transport model at the U.S. Department of Energy Hanford Site in southeastern Washington State. In FY 2000, the focus of the initiative was on the characterization of major uncertainties in the current conceptual model that would affect model predictions. The long-term goals of the initiative are the development and implementation of an uncertainty estimation methodology in future assessments and analyses using the site-wide model. This report focuses on the development and implementation of an uncertainty analysis framework.
A demonstrative model of a lunar base simulation on a personal computer
NASA Technical Reports Server (NTRS)
1985-01-01
The initial demonstration model of a lunar base simulation is described. This initial model was developed at the personal computer level to demonstrate feasibility and technique before proceeding to a larger computer-based model. The demonstration model was built with Lotus Symphony Version 1.1 software on a personal computer running the MS-DOS operating system. The personal computer-based model determined the applicability of lunar base modeling techniques developed at an LSPI/NASA workshop. In addition, the personal computer-based demonstration model defined a modeling structure that could be employed on a larger, more comprehensive VAX-based lunar base simulation. Refinement of this personal computer model and the development of a VAX-based model is planned in the near future.
Calibrating reaction rates for the CREST model
NASA Astrophysics Data System (ADS)
Handley, Caroline A.; Christie, Michael A.
2017-01-01
The CREST reactive-burn model uses entropy-dependent reaction rates that, until now, have been manually tuned to fit shock-initiation and detonation data in hydrocode simulations. This paper describes the initial development of an automatic method for calibrating CREST reaction-rate coefficients, using particle swarm optimisation. The automatic method is applied to EDC32, to help develop the first CREST model for this conventional high explosive.
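The calibration step described above (tuning reaction-rate coefficients with particle swarm optimisation) can be sketched generically as below. The objective function is a stand-in quadratic misfit with made-up target coefficients, not a CREST/EDC32 hydrocode simulation, and all swarm settings are conventional defaults rather than the authors' choices.

```python
import random

# Generic particle swarm optimisation (PSO) sketch for tuning two reaction-rate
# coefficients. The misfit function is a placeholder, not a hydrocode run.

def misfit(coeffs):
    target = (0.3, 1.7)                      # made-up "best-fit" coefficients
    return sum((c - t) ** 2 for c, t in zip(coeffs, target))

def pso(objective, bounds, n_particles=20, n_iter=100, w=0.7, c1=1.5, c2=1.5):
    dim = len(bounds)
    pos = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = pbest_val.index(min(pbest_val))
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(n_iter):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # Keep each coefficient inside its physical bounds.
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]), bounds[d][1])
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest

print(pso(misfit, bounds=[(0.0, 1.0), (0.0, 5.0)]))
```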
A model for a knowledge-based system's life cycle
NASA Technical Reports Server (NTRS)
Kiss, Peter A.
1990-01-01
The American Institute of Aeronautics and Astronautics has initiated a Committee on Standards for Artificial Intelligence. Presented here are the initial efforts of one of the working groups of that committee. The purpose here is to present a candidate model for the development life cycle of Knowledge Based Systems (KBS). The intent is for the model to be used by the Aerospace Community and eventually be evolved into a standard. The model is rooted in the evolutionary model, borrows from the spiral model, and is embedded in the standard Waterfall model for software development. Its intent is to satisfy the development of both stand-alone and embedded KBSs. The phases of the life cycle are detailed, as are the review points that constitute the key milestones throughout the development process. The applicability and strengths of the model are discussed along with areas needing further development and refinement by the aerospace community.
Artificial Intelligence Software Engineering (AISE) model
NASA Technical Reports Server (NTRS)
Kiss, Peter A.
1990-01-01
The American Institute of Aeronautics and Astronautics has initiated a committee on standards for Artificial Intelligence. Presented are the initial efforts of one of the working groups of that committee. A candidate model is presented for the development life cycle of knowledge based systems (KBSs). The intent is for the model to be used by the aerospace community and eventually be evolved into a standard. The model is rooted in the evolutionary model, borrows from the spiral model, and is embedded in the standard Waterfall model for software development. Its intent is to satisfy the development of both stand-alone and embedded KBSs. The phases of the life cycle are shown and detailed as are the review points that constitute the key milestones throughout the development process. The applicability and strengths of the model are discussed along with areas needing further development and refinement by the aerospace community.
ERIC Educational Resources Information Center
Wingate, Ursula
2012-01-01
Three writing development initiatives carried out at King's College London UK are discussed in this article to illustrate the need to draw on different theoretical models to create effective methods of teaching academic writing. The sequence of initiatives resembles a journey: the destination is to develop academic writing programmes suitable for…
ERIC Educational Resources Information Center
Clemens, Elysia V.; Carey, John C.; Harrington, Karen M.
2010-01-01
This article details the initial development of the School Counseling Program Implementation Survey and psychometric results including reliability and factor structure. An exploratory factor analysis revealed a three-factor model that accounted for 54% of the variance of the intercorrelation matrix and a two-factor model that accounted for 47% of…
A new funding model for nursing education through business development initiatives.
Broome, Marion E; Bowersox, Dave; Relf, Michael
Public and private higher education funding models are shifting from traditional funding of schools and departments to a model in which schools increasingly rely on revenue other than tuition to fulfill and supplement activities related to their core missions. In this paper we discuss what nursing deans need to know about non-tuition funding in this contemporary paradigm. We focus on how the Duke University School of Nursing created a Business Development Initiative (BDI) that provides additional revenue to help meet the financial needs of its programs while nurturing the entrepreneurial spirit of faculty and staff. This BDI holds promise as a model that can be adapted by other schools seeking to support education, research and professional development initiatives without relying solely on tuition, tax dollars, endowments and/or grants. Copyright © 2017 Elsevier Inc. All rights reserved.
Modeling Initiation in Exploding Bridgewire Detonators
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hrousis, C A
2005-05-18
One- and two-dimensional models of initiation in detonators are being developed for the purpose of evaluating the performance of aged and modified detonator designs. The models focus on accurate description of the initiator, whether it be an EBW (exploding bridgewire) that directly initiates a high explosive powder or an EBF (exploding bridgefoil) that sends an inert flyer into a dense HE pellet. The explosion of the initiator is simulated using detailed MHD equations of state as opposed to specific action-based phenomenological descriptions. The HE is modeled using the best available JWL equations of state. Results to date have been promising; however, work is still in progress.
Predictive Models of Duration of Ground Delay Programs in New York Area Airports
NASA Technical Reports Server (NTRS)
Kulkarni, Deepak
2011-01-01
Initially planned GDP duration often turns out to be an underestimate or an overestimate of the actual GDP duration. This, in turn, results in avoidable airborne or ground delays in the system. Therefore, better models of actual duration have the potential of reducing delays in the system. The overall objective of this study is to develop such models based on logs of GDPs. In a previous report, we described descriptive models of Ground Delay Programs. These models were defined in terms of initial planned duration and in terms of categorical variables. These descriptive models are good at characterizing the historical errors in planned GDP durations. This paper focuses on developing predictive models of GDP duration. Traffic Management Initiatives (TMI) are logged by Air Traffic Control facilities with The National Traffic Management Log (NTML), which is a single system for automated recording, coordination, and distribution of relevant information about TMIs throughout the National Airspace System (Brickman, 2004; Yuditsky, 2007). We use 2008-2009 GDP data from the NTML database for the study reported in this paper. NTML information about a GDP includes the initial specification, possibly one or more revisions, and the cancellation. In the next section, we describe general characteristics of Ground Delay Programs. In the third section, we develop models of actual duration. In the fourth section, we compare predictive performance of these models. The final section is a conclusion.
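As a hedged illustration of the kind of predictive model discussed here, the sketch below fits a least-squares correction of planned GDP duration on synthetic data; the numbers are fabricated, and the models in the study additionally use categorical predictors drawn from the NTML logs.

```python
import numpy as np

# Illustrative predictive model: actual GDP duration regressed on the initially
# planned duration (synthetic data; the models in the study also condition on
# categorical variables recorded in the NTML logs).

planned = np.array([180.0, 240.0, 300.0, 360.0, 420.0, 480.0, 240.0, 300.0])  # minutes, made up
actual = np.array([150.0, 260.0, 250.0, 400.0, 380.0, 430.0, 200.0, 330.0])   # minutes, made up

A = np.vstack([planned, np.ones_like(planned)]).T
(a, b), *_ = np.linalg.lstsq(A, actual, rcond=None)          # actual ~ a*planned + b

def predict_duration(planned_minutes):
    return a * planned_minutes + b

print(f"predicted actual duration for a 360-minute plan: {predict_duration(360.0):.0f} minutes")
```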
Wagener, T.; Hogue, T.; Schaake, J.; Duan, Q.; Gupta, H.; Andreassian, V.; Hall, A.; Leavesley, G.
2006-01-01
The Model Parameter Estimation Experiment (MOPEX) is an international project aimed at developing enhanced techniques for the a priori estimation of parameters in hydrological models and in land surface parameterization schemes connected to atmospheric models. The MOPEX science strategy involves: database creation, a priori parameter estimation methodology development, parameter refinement or calibration, and the demonstration of parameter transferability. A comprehensive MOPEX database has been developed that contains historical hydrometeorological data and land surface characteristics data for many hydrological basins in the United States (US) and in other countries. This database is being continuously expanded to include basins from various hydroclimatic regimes throughout the world. MOPEX research has largely been driven by a series of international workshops that have brought interested hydrologists and land surface modellers together to exchange knowledge and experience in developing and applying parameter estimation techniques. With its focus on parameter estimation, MOPEX plays an important role in the international context of other initiatives such as GEWEX, HEPEX, PUB and PILPS. This paper outlines the MOPEX initiative, discusses its role in the scientific community, and briefly states future directions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, Zili; Nordhaus, William
2009-03-19
Over the duration of this project, we finished the main tasks set up in the initial proposal. These tasks include: setting up the basic platform in the GAMS language for the new RICE 2007 model; testing various model structures of RICE 2007; incorporating the PPP data set in the new RICE model; and developing a gridded data set for IA modeling.
PHYSICAL COAL-CLEANING/FLUE GAS DESULFURIZATION COMPUTER MODEL
The model consists of four programs: (1) one, initially developed by Battelle-Columbus Laboratories, obtained from Versar, Inc.; (2) one developed by TVA; and (3,4) two developed by TVA and Bechtel National, Inc. The model produces design performance criteria and estimates of capi...
Initial component control in disparity vergence: a model-based study.
Horng, J L; Semmlow, J L; Hung, G K; Ciuffreda, K J
1998-02-01
The dual-mode theory for the control of disparity-vergence eye movements states that two components control the response to a step change in disparity. The initial component uses a motor preprogram to drive the eyes to an approximate final position. This initial component is followed by activation of a late component operating under visual feedback control that reduces residual disparity to within fusional limits. A quantitative model based on a pulse-step controller, similar to that postulated for saccadic eye movements, has been developed to represent the initial component. This model, an adaptation of one developed by Zee et al. [1], provides accurate simulations of isolated initial component movements and is compatible with the known underlying neurophysiology and existing neurophysiological data. The model has been employed to investigate the difference in dynamics between convergent and divergent movements. Results indicate that the pulse-control component active in convergence is reduced or absent from the control signals of divergence movements. This suggests somewhat different control structures of convergence versus divergence, and is consistent with other directional asymmetries seen in horizontal vergence.
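A minimal numerical sketch of the pulse-step idea is given below: a brief high-amplitude pulse followed by a sustained step drives a first-order plant. The time constant, pulse amplitude, and durations are invented for illustration and are not the fitted values of the model described above.

```python
import numpy as np

# Pulse-step control of a first-order vergence plant: a brief high-amplitude
# "pulse" accelerates the response and a sustained "step" holds it near the
# target vergence angle. All parameter values are illustrative assumptions.

dt = 0.001                           # s
t = np.arange(0.0, 1.0, dt)
tau = 0.15                           # plant time constant, s (assumed)
pulse_amp, pulse_dur = 8.0, 0.06     # pulse drive (deg) and duration (s), assumed
step_amp = 4.0                       # final vergence demand, deg (assumed)

drive = np.where(t < pulse_dur, pulse_amp, step_amp)

# First-order plant: tau * dx/dt = drive - x, integrated with forward Euler.
x = np.zeros_like(t)
for i in range(1, len(t)):
    x[i] = x[i - 1] + dt * (drive[i - 1] - x[i - 1]) / tau

print(f"response at 0.3 s: {x[int(0.3 / dt)]:.2f} deg (target {step_amp} deg)")
```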
NASA Astrophysics Data System (ADS)
Gao, Chuan; Zhang, Rong-Hua; Wu, Xinrong; Sun, Jichang
2018-04-01
Large biases exist in real-time ENSO prediction, which can be attributed to uncertainties in initial conditions and model parameters. Previously, a 4D variational (4D-Var) data assimilation system was developed for an intermediate coupled model (ICM) and used to improve ENSO modeling through optimized initial conditions. In this paper, this system is further applied to optimize model parameters. In the ICM used, one important process for ENSO is related to the anomalous temperature of subsurface water entrained into the mixed layer (Te), which is empirically and explicitly related to sea level (SL) variation. The strength of the thermocline effect on SST (referred to simply as "the thermocline effect") is represented by an introduced parameter, αTe. A numerical procedure is developed to optimize this model parameter through the 4D-Var assimilation of SST data in a twin experiment context with an idealized setting. Experiments having their initial condition optimized only, and having their initial condition plus this additional model parameter optimized, are compared. It is shown that ENSO evolution can be more effectively recovered by including the additional optimization of this parameter in ENSO modeling. The demonstrated feasibility of optimizing model parameters and initial conditions together through the 4D-Var method provides a modeling platform for ENSO studies. Further applications of the 4D-Var data assimilation system implemented in the ICM are also discussed.
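The parameter-optimization idea can be illustrated with a toy twin experiment: a scalar SST equation forced through an assumed relation Te = alpha * F(SL) is run with a "true" alpha to create pseudo-observations, and a variational-style cost is then minimized to recover it. The model form, constants, and the finite-difference gradient below are all illustrative assumptions, not the ICM or its adjoint-based 4D-Var system.

```python
import numpy as np

# Toy twin experiment: recover a thermocline-effect parameter alpha in
#   dSST/dt = -gamma * SST + alpha * F(SL(t))
# by minimizing a 4D-Var-like cost (sum of squared SST misfits over a window).
# Model form, constants, and the finite-difference gradient are all assumptions.

dt, nsteps, gamma = 0.1, 200, 0.2
sl = np.sin(np.linspace(0.0, 6.0 * np.pi, nsteps))   # idealized sea level signal
F = lambda s: 0.5 * s                                 # assumed Te-SL relation

def run_model(alpha, sst0=0.0):
    sst = np.empty(nsteps)
    sst[0] = sst0
    for k in range(1, nsteps):
        sst[k] = sst[k - 1] + dt * (-gamma * sst[k - 1] + alpha * F(sl[k - 1]))
    return sst

alpha_true = 1.3
obs = run_model(alpha_true) + 0.01 * np.random.default_rng(0).normal(size=nsteps)

def cost(alpha):
    return np.sum((run_model(alpha) - obs) ** 2)

# Gradient descent with a finite-difference gradient stands in for the
# adjoint-based minimization of a real 4D-Var system.
alpha, lr, eps = 0.5, 0.01, 1e-4
for _ in range(200):
    grad = (cost(alpha + eps) - cost(alpha - eps)) / (2.0 * eps)
    alpha -= lr * grad

print(f"recovered alpha = {alpha:.3f} (true value {alpha_true})")
```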
Using a 4D-Variational Method to Optimize Model Parameters in an Intermediate Coupled Model of ENSO
NASA Astrophysics Data System (ADS)
Gao, C.; Zhang, R. H.
2017-12-01
Large biases exist in real-time ENSO prediction, which is attributed to uncertainties in initial conditions and model parameters. Previously, a four-dimensional variational (4D-Var) data assimilation system was developed for an intermediate coupled model (ICM) and used to improve ENSO modeling through optimized initial conditions. In this paper, this system is further applied to optimize model parameters. In the ICM used, one important process for ENSO is related to the anomalous temperature of subsurface water entrained into the mixed layer (Te), which is empirically and explicitly related to sea level (SL) variation, written as Te = αTe × FTe(SL). The introduced parameter, αTe, represents the strength of the thermocline effect on sea surface temperature (SST; referred to as the thermocline effect). A numerical procedure is developed to optimize this model parameter through the 4D-Var assimilation of SST data in a twin experiment context with an idealized setting. Experiments having only the initial condition optimized, and having the initial condition plus this additional model parameter optimized, are compared. It is shown that ENSO evolution can be more effectively recovered by including the additional optimization of this parameter in ENSO modeling. The demonstrated feasibility of optimizing model parameters and initial conditions together through the 4D-Var method provides a modeling platform for ENSO studies. Further applications of the 4D-Var data assimilation system implemented in the ICM are also discussed.
Detailed model for practical pulverized coal furnaces and gasifiers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, P.J.; Smoot, L.D.
1989-08-01
This study has been supported by a consortium of nine industrial and governmental sponsors. Work was initiated on May 1, 1985 and completed August 31, 1989. The central objective of this work was to develop, evaluate and apply a practical combustion model for utility boilers, industrial furnaces and gasifiers. Key accomplishments have included: development of an advanced first-generation computer model for combustion in three-dimensional furnaces; development of a new first-generation fouling and slagging submodel; detailed evaluation of an existing NO{sub x} submodel; development and evaluation of an improved radiation submodel; preparation and distribution of a three-volume final report: (a) Volume 1: General Technical Report; (b) Volume 2: PCGC-3 User's Manual; (c) Volume 3: Data Book for Evaluation of Three-Dimensional Combustion Models; and organization of a user's workshop on the three-dimensional code. The furnace computer model developed under this study requires further development before it can be applied generally to all applications; however, it can be used now by specialists for many specific applications, including non-combusting systems and combusting gaseous systems. A new combustion center was organized and work was initiated to continue the important research effort initiated by this study. 212 refs., 72 figs., 38 tabs.
Madan-Swain, Avi; Hankins, Shirley L; Gilliam, Margaux Barnes; Ross, Kelly; Reynolds, Nina; Milby, Jesse; Schwebel, David C
2012-03-01
This article considers the development of research competencies in professional psychology and how that movement might be applied to training in pediatric psychology. The field of pediatric psychology has a short but rich history, and experts have identified critical competencies. However, pediatric psychology has not yet detailed a set of research-based competencies. This article initially reviews the competency initiative in professional psychology, including the cube model as it relates to research training. Next, we review and adapt the knowledge-based/foundational and applied/functional research competencies proposed by health psychology into a cube model for pediatric psychology. We focus especially on graduate-level training but allude to its application throughout professional development. We present the cube model as it is currently being applied to the development of a systematic research competency evaluation for graduate training at our medical/clinical psychology doctoral program at the University of Alabama at Birmingham. Based on the review and synthesis of the literature on research competency in professional psychology we propose future initiatives to develop these competencies for the field of pediatric psychology. The cube model can be successfully applied to the development of research training competencies in pediatric psychology. Future research should address the development, implementation, and assessment of the research competencies for training and career development of future pediatric psychologists.
Chappell, Stacie; Pescud, Melanie; Waterworth, Pippa; Shilton, Trevor; Roche, Dee; Ledger, Melissa; Slevin, Terry; Rosenberg, Michael
2016-10-01
The aim of this study was to use Kotter's leading change model to explore the implementation of workplace health and wellbeing initiatives. Qualitative interviews were conducted with 31 workplace representatives with a healthy workplace initiative. None of the workplaces used a formal change management model when implementing their healthy workplace initiatives. Not all of the steps in Kotter's model were considered necessary, and the order of the steps was challenged. For example, interviewees perceived that communicating the vision, developing the vision, and creating a guiding coalition were integral parts of the process, although there was less emphasis on the importance of creating a sense of urgency and consolidating change. Although none of the workplaces reported using a formal organizational change model when implementing their healthy workplace initiatives, there did appear to be perceived merit in using the steps in Kotter's model.
Next generation initiation techniques
NASA Technical Reports Server (NTRS)
Warner, Tom; Derber, John; Zupanski, Milija; Cohn, Steve; Verlinde, Hans
1993-01-01
Four-dimensional data assimilation strategies can generally be classified as either current or next generation, depending upon whether they are used operationally or not. Current-generation data-assimilation techniques are those that are presently used routinely in operational-forecasting or research applications. They can be classified into the following categories: intermittent assimilation, Newtonian relaxation, and physical initialization. It should be noted that these techniques are the subject of continued research, and their improvement will parallel the development of next generation techniques described by the other speakers. Next generation assimilation techniques are those that are under development but are not yet used operationally. Most of these procedures are derived from control theory or variational methods and primarily represent continuous assimilation approaches, in which the data and model dynamics are 'fitted' to each other in an optimal way. Another 'next generation' category is the initialization of convective-scale models. Intermittent assimilation systems use an objective analysis to combine all observations within a time window that is centered on the analysis time. Continuous first-generation assimilation systems are usually based on the Newtonian-relaxation or 'nudging' techniques. Physical initialization procedures generally involve the use of standard or nonstandard data to force some physical process in the model during an assimilation period. Under the topic of next-generation assimilation techniques, variational approaches are currently being actively developed. Variational approaches seek to minimize a cost or penalty function which measures a model's fit to observations, background fields and other imposed constraints. Alternatively, the Kalman filter technique, which is also under investigation as a data assimilation procedure for numerical weather prediction, can yield acceptable initial conditions for mesoscale models. The third kind of next-generation technique involves strategies to initialize convective scale (non-hydrostatic) models.
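The Newtonian-relaxation ("nudging") technique mentioned above adds a term proportional to the observation-minus-model difference to the model tendency, dx/dt = f(x) + (x_obs - x)/tau. The sketch below applies it to a toy scalar model; the dynamics, relaxation time-scale, and "observations" are invented for illustration.

```python
import numpy as np

# Newtonian relaxation (nudging) on a toy scalar model during an assimilation
# window:  dx/dt = f(x) + (x_obs(t) - x) / tau_nudge.  Values are illustrative.

dt, nsteps = 0.1, 300
tau_nudge = 2.0                              # relaxation time-scale (assumed)
f = lambda x: -0.3 * (x - 10.0)              # toy model dynamics

time = np.arange(nsteps) * dt
truth = 10.0 + 3.0 * np.sin(0.05 * 2.0 * np.pi * time)
obs = truth + 0.2 * np.random.default_rng(1).normal(size=nsteps)

x_free = np.zeros(nsteps)
x_nudged = np.zeros(nsteps)
for k in range(1, nsteps):
    x_free[k] = x_free[k - 1] + dt * f(x_free[k - 1])
    x_nudged[k] = x_nudged[k - 1] + dt * (f(x_nudged[k - 1])
                                          + (obs[k - 1] - x_nudged[k - 1]) / tau_nudge)

print("free-run mean abs error:", np.abs(x_free - truth).mean())
print("nudged  mean abs error:", np.abs(x_nudged - truth).mean())
```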
Aerosol Modeling for the Global Model Initiative
NASA Technical Reports Server (NTRS)
Weisenstein, Debra K.; Ko, Malcolm K. W.
2001-01-01
The goal of this project is to develop an aerosol module to be used within the framework of the Global Modeling Initiative (GMI). The model development work will be performed jointly by the University of Michigan and AER, using existing aerosol models at the two institutions as starting points. The GMI aerosol model will be tested, evaluated against observations, and then applied to assessment of the effects of aircraft sulfur emissions as needed by the NASA Subsonic Assessment in 2001. The work includes the following tasks: 1. Implementation of the sulfur cycle within GMI, including sources, sinks, and aqueous conversion of sulfur. Aerosol modules will be added as they are developed and the GMI schedule permits. 2. Addition of aerosol types other than sulfate particles, including dust, soot, organic carbon, and black carbon. 3. Development of new and more efficient parameterizations for treating sulfate aerosol nucleation, condensation, and coagulation among different particle sizes and types.
Revisiting Shock Initiation Modeling of Homogeneous Explosives
NASA Astrophysics Data System (ADS)
Partom, Yehuda
2013-04-01
Shock initiation of homogeneous explosives has been a subject of research since the 1960s, with neat and sensitized nitromethane as the main materials for experiments. A shock initiation model of homogeneous explosives was established in the early 1960s. It involves a thermal explosion event at the shock entrance boundary, which develops into a superdetonation that overtakes the initial shock. In recent years, Sheffield and his group, using accurate experimental tools, were able to observe details of buildup of the superdetonation. There are many papers on modeling shock initiation of heterogeneous explosives, but there are only a few papers on modeling shock initiation of homogeneous explosives. In this article, bulk reaction reactive flow equations are used to model homogeneous shock initiation in an attempt to reproduce experimental data of Sheffield and his group. It was possible to reproduce the main features of the shock initiation process, including thermal explosion, superdetonation, input shock overtake, overdriven detonation after overtake, and the beginning of decay toward Chapman-Jouguet (CJ) detonation. The time to overtake (TTO) as a function of input pressure was also calculated and compared to the experimental TTO.
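The thermal-explosion stage of the bulk-reaction picture can be illustrated with a zero-dimensional adiabatic Arrhenius reaction, integrated until near-complete reaction. All constants below are invented placeholders and are not the calibrated reactive-flow coefficients used in the article.

```python
import math

# Zero-dimensional adiabatic thermal explosion with a bulk Arrhenius rate:
#   dlam/dt = Z * (1 - lam) * exp(-Ta / T),   dT/dt = Q * dlam/dt
# All constants are illustrative assumptions, not calibrated values.
Z = 1.0e9      # pre-exponential factor, 1/s (assumed)
Ta = 12000.0   # activation temperature, K (assumed)
Q = 2500.0     # adiabatic temperature rise for complete reaction, K (assumed)
T0 = 1100.0    # assumed post-shock temperature, K

dt = 1.0e-9    # time step, s
T, lam, t = T0, 0.0, 0.0
while lam < 0.99 and t < 1.0e-4:
    rate = Z * (1.0 - lam) * math.exp(-Ta / T)
    dlam = min(dt * rate, 1.0 - lam)   # explicit Euler step, capped at full reaction
    lam += dlam
    T += Q * dlam
    t += dt

print(f"induction time to near-complete reaction: {t * 1e6:.2f} microseconds")
```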
Human Cancer Models Initiative | Office of Cancer Genomics
The Human Cancer Models Initiative (HCMI) is an international consortium that is generating novel human tumor-derived culture models, which are annotated with genomic and clinical data. In an effort to advance cancer research and more fully understand how in vitro findings are related to clinical biology, HCMI-developed models and related data will be available as a community resource for cancer and other research.
A computer model for the recombination zone of a microwave-plasma electrothermal rocket
NASA Technical Reports Server (NTRS)
Filpus, John W.; Hawley, Martin C.
1987-01-01
As part of a study of the microwave-plasma electrothermal rocket, a computer model of the flow regime below the plasma has been developed. A second-order model, including axial dispersion of energy and material and boundary conditions at infinite length, was developed to partially reproduce the absence of mass-flow rate dependence that was seen in experimental temperature profiles. To solve the equations of the model, a search technique was developed to find the initial derivatives. On integrating with a trial set of initial derivatives, the values and their derivatives were checked to judge whether they were likely to leave the practical regime, and hence whether the boundary conditions at infinity were likely to be violated. Results are presented and directions for further development are suggested.
NASA Astrophysics Data System (ADS)
Sukmawati, Zuhairoh, Faihatuz
2017-05-01
The purpose of this research was to develop an authentic assessment model based on a showcase portfolio for learning of mathematical problem solving. This research used the research and development (R & D) method, which consists of the following stages of development: Phase I, conducting a preliminary study; Phase II, determining the purpose of the development and preparing the initial model; and Phase III, trial testing of the instrument for the initial draft model and the initial product. The respondents of this research were the students of SMAN 8 and SMAN 20 Makassar. Data were collected through observation, interviews, documentation, a student questionnaire, and tests of mathematical problem-solving ability. The data were analyzed with descriptive and inferential statistics. The results of this research are an authentic assessment model design based on a showcase portfolio, which involves: 1) steps in implementing the showcase-based authentic assessment, an assessment rubric for cognitive aspects, an assessment rubric for affective aspects, and an assessment rubric for skill aspects; and 2) the students' average problem-solving ability, scored using the showcase-portfolio-based authentic assessment, was in the high category, and the students' responses were in the good category.
NASA Astrophysics Data System (ADS)
Park, Kyungjeen
This study aims to develop an objective hurricane initialization scheme which incorporates not only forecast model constraints but also observed features such as the initial intensity and size. It is based on the four-dimensional variational (4D-Var) bogus data assimilation (BDA) scheme originally proposed by Zou and Xiao (1999). The 4D-Var BDA consists of two steps: (i) specifying a bogus sea level pressure (SLP) field based on parameters observed by the Tropical Prediction Center (TPC) and (ii) assimilating the bogus SLP field under a forecast model constraint to adjust all model variables. This research focuses on improving the specification of the bogus SLP indicated in the first step. Numerical experiments are carried out for Hurricane Bonnie (1998) and Hurricane Gordon (2000) to test the sensitivity of hurricane track and intensity forecasts to specification of the initial vortex. Major results are listed below: (1) A linear regression model is developed for determining the size of the initial vortex based on the TPC-observed radius of 34-kt winds. (2) A method is proposed to derive a radial profile of SLP from QuikSCAT surface winds. This profile is shown to be more realistic than ideal profiles derived from Fujita's and Holland's formulae. (3) It is found that it takes about 1 h for the hurricane prediction model to develop a conceptually correct hurricane structure, featuring a dominant role of hydrostatic balance at the initial time and a dynamic adjustment in less than 30 minutes. (4) Numerical experiments suggest that track prediction is less sensitive to the specification of initial vortex structure than intensity forecasts. (5) Hurricane initialization using a QuikSCAT-derived initial vortex produced a reasonably good forecast for hurricane landfall, with a position error of 25 km and a 4-h delay at landfalling. (6) Numerical experiments using the linear regression model for the size specification considerably outperform all the other formulations tested in terms of the intensity prediction for both hurricanes. For example, the maximum track error is less than 110 km during the entire three-day forecasts for both hurricanes. The simulated Hurricane Gordon using the linear regression model made a nearly perfect landfall, with no position error and only a 1-h error in landfalling time. (7) Diagnosis of model output indicates that the initial vortex specified by the linear regression model produces larger surface fluxes of sensible heat, latent heat and moisture, as well as stronger downward angular momentum transport, than all the other schemes do. These enhanced energy supplies offset the energy lost to friction and gravity wave propagation, allowing the model to maintain a strong and realistic hurricane during the entire forward model integration.
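Result (1), a linear regression relating initial-vortex size to the TPC-observed radius of 34-kt winds, can be sketched generically as below; the sample pairs and fitted coefficients are fabricated for illustration and are not those of the study.

```python
import numpy as np

# Illustrative linear regression of initial-vortex radius (km) on the observed
# radius of 34-kt winds (km). The data pairs below are fabricated for the sketch.

r34 = np.array([150.0, 190.0, 230.0, 270.0, 310.0, 350.0])
vortex_radius = np.array([95.0, 120.0, 140.0, 170.0, 185.0, 215.0])

A = np.vstack([r34, np.ones_like(r34)]).T
(slope, intercept), *_ = np.linalg.lstsq(A, vortex_radius, rcond=None)

def initial_vortex_radius(r34_km):
    """Predict the initial-vortex radius from the 34-kt wind radius (both km)."""
    return slope * r34_km + intercept

print(f"R_vortex = {slope:.2f} * R34 + {intercept:.1f} km")
print(f"prediction for R34 = 250 km: {initial_vortex_radius(250.0):.1f} km")
```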
Henderson, E; Rubin, G
2014-03-01
(i) To explore dental, school and family perspectives of an oral health promotion (OHP) initiative to improve access for pre-school children in deprived communities; (ii) to develop a model of roles and responsibilities for OHP in community settings. Semi-structured focus groups (n = 6) with dental practice staff (n = 24), and semi-structured interviews with school staff (n = 9) and parents and children (n = 4) who were involved in an OHP initiative for pre-school children. Framework analysis was applied to identify themes. Themes were used to develop a model of roles and responsibilities for OHP, based on the WHO Planning and evaluating health promotion model. Respondents subscribed to a community-based approach to improving access to dental services for pre-school children in deprived areas, with an emphasis on shared responsibility and communication. In addition to macro-level actions in directing health policy and services, commissioners were held responsible for investing in micro-level actions, such as funding OHP training and involving parents, and meso-level actions such as reducing barriers to access. The model we have developed builds on WHO recommendations on health promotion to identify the key roles and responsibilities that should be incorporated into further initiatives in OHP.
Finite Element Modeling of In-Situ Stresses near Salt Bodies
NASA Astrophysics Data System (ADS)
Sanz, P.; Gray, G.; Albertz, M.
2011-12-01
The in-situ stress field is modified around salt bodies because salt rock has no ability to sustain shear stresses. A reliable prediction of stresses near salt is important for planning safe and economic drilling programs. A better understanding of in-situ stresses before drilling can be achieved using finite element models that account for the creeping salt behavior and the elastoplastic response of the surrounding sediments. Two different geomechanical modeling techniques can be distinguished: "dynamic" modeling and "static" modeling. "Dynamic" models, also known as forward models, simulate the development of structural processes in geologic time. This technique provides the evolution of stresses and so it is used to simulate the initiation and development of structural features, such as faults, folds, fractures, and salt diapirs. The original or initial configuration and the unknown final configuration of forward models are usually significantly different; therefore, geometric non-linearities need to be considered. These models may be difficult to constrain when different tectonic, deposition, and erosion events, and the timing among them, need to be accounted for. While dynamic models provide insight into the stress evolution, in many cases it is very challenging, if not impossible, to forward model a configuration to its known present-day geometry, particularly in the case of salt layers that evolve into highly irregular and complex geometries. Alternatively, "static" models use the present-day geometry and present-day far-field stresses to estimate the present-day in-situ stress field inside a domain. In this case, it is appropriate to use a small deformation approach because initial and final configurations should be very similar and, more importantly, because the equilibrium of stresses should be stated in the present-day initial configuration. The initial stresses and the applied boundary conditions are constrained by the geologic setting and available data. This modeling technique does not predict the evolution of structural elements or stresses with time; therefore it does not provide any insight into the formation of fractures that were previously developed under a different stress condition or the development of overpressure generated by a high sedimentation rate. This work provides a validation for predicting in-situ stresses near salt using "static" models. We compare synthetic examples using both modeling techniques and show that stresses near salt predicted with "static" models are comparable to the ones generated by "dynamic" models.
Nonequilibrium radiation and chemistry models for aerocapture vehicle flowfields
NASA Technical Reports Server (NTRS)
Carlson, Leland A.
1990-01-01
The continued development and improvement of the viscous shock layer (VSL) nonequilibrium chemistry blunt body engineering code, the incorporation in a coupled manner of radiation models into the VSL code, and the initial development of appropriate precursor models are presented.
Relationship between Defect Size and Fatigue Life Distributions in Al-7 Pct Si-Mg Alloy Castings
NASA Astrophysics Data System (ADS)
Tiryakioğlu, Murat
2009-07-01
A new method for predicting the variability in fatigue life of castings was developed by combining the size distribution for the fatigue-initiating defects and a fatigue life model based on the Paris-Erdoğan law for crack propagation. Two datasets for the fatigue-initiating defects in Al-7 pct Si-Mg alloy castings, reported previously in the literature, were used to demonstrate that (1) the sizes of fatigue-initiating defects follow the Gumbel distribution; (2) the crack propagation model developed previously provides respectable fits to experimental data; and (3) the method developed in the present study expresses the variability in both datasets almost as well as the lognormal distribution and better than the Weibull distribution.
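The combination described above, an extreme-value distribution for initiating-defect size fed into a Paris-law life integral, can be sketched as a Monte Carlo calculation. All distribution parameters and material constants below are invented placeholders, not the fitted values for the Al-7 pct Si-Mg datasets.

```python
import numpy as np

# Monte Carlo sketch: sample fatigue-initiating defect sizes from a Gumbel
# distribution and convert each to a fatigue life with the Paris law,
#   da/dN = C * (dK)^m,  dK = Y * dS * sqrt(pi * a),
# integrated in closed form from the initial defect size a0 to a final size af.
# All parameter values are illustrative assumptions.

rng = np.random.default_rng(0)

mu, beta = 60e-6, 20e-6     # Gumbel location/scale for defect size, m (assumed)
C, m = 2.0e-11, 3.2         # Paris constants: da/dN in m/cycle, dK in MPa*sqrt(m) (assumed)
Y, dS = 0.65, 120.0         # geometry factor and stress range, MPa (assumed)
af = 3.0e-3                 # final (critical) crack size, m (assumed)

a0 = np.clip(rng.gumbel(mu, beta, size=10000), 5e-6, 0.5 * af)  # keep sizes physical

k = C * (Y * dS * np.sqrt(np.pi)) ** m
expo = 1.0 - m / 2.0
N = (af ** expo - a0 ** expo) / (k * expo)   # closed-form Paris integration (m != 2)

print(f"median life ~ {np.median(N):.2e} cycles; "
      f"5th-95th percentile: {np.percentile(N, 5):.2e} to {np.percentile(N, 95):.2e}")
```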
MAGI: many-component galaxy initializer
NASA Astrophysics Data System (ADS)
Miki, Yohei; Umemura, Masayuki
2018-04-01
Providing initial conditions is an essential procedure for numerical simulations of galaxies. The initial conditions for idealized individual galaxies in N-body simulations should resemble observed galaxies and be dynamically stable for time-scales much longer than their characteristic dynamical times. However, generating a galaxy model ab initio as a system in dynamical equilibrium is a difficult task, since a galaxy contains several components, including a bulge, disc, and halo. Moreover, it is desirable that the initial-condition generator be fast and easy to use. We have now developed an initial-condition generator for galactic N-body simulations that satisfies these requirements. The developed generator adopts a distribution-function-based method, and it supports various kinds of density models, including custom-tabulated inputs and the presence of more than one disc. We tested the dynamical stability of systems generated by our code, representing early- and late-type galaxies, with N = 2,097,152 and 8,388,608 particles, respectively, and we found that the model galaxies maintain their initial distributions for at least 1 Gyr. The execution times required to generate the two models were 8.5 and 221.7 seconds, respectively, which is negligible compared to typical execution times for N-body simulations. The code is provided as open-source software and is publicly and freely available at https://bitbucket.org/ymiki/magi.
Greenland Regional and Ice Sheet-wide Geometry Sensitivity to Boundary and Initial conditions
NASA Astrophysics Data System (ADS)
Logan, L. C.; Narayanan, S. H. K.; Greve, R.; Heimbach, P.
2017-12-01
Ice sheet and glacier model outputs require inputs from uncertain initial and boundary conditions, and other parameters. Conservation and constitutive equations formalize the relationship between model inputs and outputs, and the sensitivity of model-derived quantities of interest (e.g., ice sheet volume above floatation) to model variables can be obtained via the adjoint model of an ice sheet. We show how one particular ice sheet model, SICOPOLIS (SImulation COde for POLythermal Ice Sheets), depends on these inputs through comprehensive adjoint-based sensitivity analyses. SICOPOLIS discretizes the shallow-ice and shallow-shelf approximations for ice flow, and is well-suited for paleo-studies of Greenland and Antarctica, among other computational domains. The adjoint model of SICOPOLIS was developed via algorithmic differentiation, facilitated by the source transformation tool OpenAD (developed at Argonne National Lab). While model sensitivity to various inputs can be computed by costly methods involving input perturbation simulations, the time-dependent adjoint model of SICOPOLIS delivers model sensitivities to initial and boundary conditions throughout time at lower cost. Here, we explore the sensitivities of the Greenland Ice Sheet's entire and regional volumes to initial ice thickness, precipitation, basal sliding, and geothermal flux over the Holocene epoch. Sensitivity studies such as described here are now accessible to the modeling community, based on the latest version of SICOPOLIS that has been adapted for OpenAD to generate correct and efficient adjoint code.
Bioprinting technologies for disease modeling.
Memic, Adnan; Navaei, Ali; Mirani, Bahram; Cordova, Julio Alvin Vacacela; Aldhahri, Musab; Dolatshahi-Pirouz, Alireza; Akbari, Mohsen; Nikkhah, Mehdi
2017-09-01
There is a great need for the development of biomimetic human tissue models that allow elucidation of the pathophysiological conditions involved in disease initiation and progression. Conventional two-dimensional (2D) in vitro assays and animal models have been unable to fully recapitulate the critical characteristics of human physiology. Alternatively, three-dimensional (3D) tissue models are often developed in a low-throughput manner and lack crucial native-like architecture. The recent emergence of bioprinting technologies has enabled creating 3D tissue models that address the critical challenges of conventional in vitro assays through the development of custom bioinks and patient derived cells coupled with well-defined arrangements of biomaterials. Here, we provide an overview on the technological aspects of 3D bioprinting technique and discuss how the development of bioprinted tissue models have propelled our understanding of diseases' characteristics (i.e. initiation and progression). The future perspectives on the use of bioprinted 3D tissue models for drug discovery application are also highlighted.
Yu Wei; Michael Bevers; Erin Belval; Benjamin Bird
2015-01-01
This research developed a chance-constrained two-stage stochastic programming model to support wildfire initial attack resource acquisition and location on a planning unit for a fire season. Fire growth constraints account for the interaction between fire perimeter growth and construction to prevent overestimation of resource requirements. We used this model to examine...
Zhao, Yue; Liu, Zhiyong; Liu, Chenfeng; Hu, Zhipeng
2017-01-01
Microalgae are considered to be a potential major biomass feedstock for biofuel due to their high lipid content. However, no simple correlation equations expressing lipid accumulation as a function of initial nitrogen concentration have been developed to predict lipid production and optimize the lipid production process. In this study, a lipid accumulation model with simple parameters was developed based on the assumption that protein synthesis shifts to lipid synthesis as a linear function of nitrogen quota. The model predictions fitted the growth, lipid content, and nitrogen consumption of Coelastrum sp. HA-1 well under various initial nitrogen concentrations. The model was then applied successfully in Chlorella sorokiniana to predict the lipid content under different light intensities. The quantitative relationship between initial nitrogen concentrations and the final lipid content, together with a sensitivity analysis of the model, is also discussed. Based on the model results, the conversion efficiency from protein synthesis to lipid synthesis in the microalgal metabolism increases as nitrogen decreases; however, the carbohydrate content remains basically unchanged in both HA-1 and C. sorokiniana. PMID:28194424
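Because the abstract does not reproduce the model equations, the sketch below shows one generic quota-type reading of the stated assumption (synthesis shifting linearly from protein to lipid as the internal nitrogen quota declines). Every functional form and constant is an assumption for illustration, not the published HA-1 model.

```python
# Generic nitrogen-quota sketch: biomass X (g/L) grows Droop-style on an internal
# nitrogen quota q (mg N/g), and the fraction of new biomass routed to lipid
# rises linearly as q drops toward its minimum. All values are assumptions.

MU_MAX = 1.0                        # maximum specific growth rate, 1/day (assumed)
Q_MIN, Q_MAX = 20.0, 100.0          # minimum/maximum quota, mg N per g biomass (assumed)
RHO_MAX, K_N = 50.0, 5.0            # uptake: mg N/g/day and half-saturation, mg/L (assumed)
F_LIP_MIN, F_LIP_MAX = 0.10, 0.45   # lipid share of newly formed biomass (assumed)

def simulate(n0, x0=0.05, q0=80.0, days=12.0, dt=0.01):
    """n0: initial nitrogen (mg/L). Returns final biomass, lipid fraction, residual N."""
    x, q, n, lip = x0, q0, n0, F_LIP_MIN * x0
    for _ in range(int(days / dt)):
        v = RHO_MAX * n / (K_N + n)                       # specific N uptake (mg N/g/day)
        mu = MU_MAX * max(0.0, 1.0 - Q_MIN / q)           # Droop growth rate (1/day)
        w = min(max((Q_MAX - q) / (Q_MAX - Q_MIN), 0.0), 1.0)
        f_lip = F_LIP_MIN + (F_LIP_MAX - F_LIP_MIN) * w   # linear shift toward lipid
        growth = mu * x
        x += dt * growth
        lip += dt * f_lip * growth
        n = max(n - dt * v * x, 0.0)
        q += dt * (v - mu * q)                            # quota: uptake minus dilution
    return x, lip / x, n

for n0 in (5.0, 20.0, 60.0):                              # initial nitrogen levels, mg/L
    x, lipid_frac, n_left = simulate(n0)
    print(f"N0 = {n0:5.1f} mg/L  biomass = {x:.2f} g/L  lipid fraction = {lipid_frac:.2f}")
```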
Improving Teaching through Lesson Study
ERIC Educational Resources Information Center
Rock, Tracy C.; Wilson, Cathy
2005-01-01
This article presents a professional development initiative developed by a university-school partnership based on the Japanese lesson-study model described by Stigler and Hiebert (1999) in "The Teaching Gap." Lesson study ("jugyoukenkyu"), an inquiry model of teacher professional development, is used extensively throughout…
A probabilistic drought forecasting framework: A combined dynamical and statistical approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yan, Hongxiang; Moradkhani, Hamid; Zarekarizi, Mahkameh
In order to improve drought forecasting skill, this study develops a probabilistic drought forecasting framework comprised of dynamical and statistical modeling components. The novelty of this study is the use of data assimilation to quantify initial condition uncertainty with Monte Carlo ensemble members, rather than relying entirely on the hydrologic model or land surface model to generate a single deterministic initial condition, as currently implemented in operational drought forecasting systems. Next, the initial condition uncertainty quantified through data assimilation is coupled with a newly developed probabilistic drought forecasting model using a copula function. The initial conditions at each forecast start date are sampled from the data assimilation ensembles for forecast initialization. Finally, seasonal drought forecasting products are generated with the updated initial conditions. This study introduces the theory behind the proposed drought forecasting system, with an application in the Columbia River Basin, Pacific Northwest, United States. Results from both synthetic and real case studies suggest that the proposed drought forecasting system significantly improves seasonal drought forecasting skill and can facilitate state drought preparation and declaration at least three months before the official state drought declaration.
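A minimal sketch of how an ensemble of assimilated initial conditions might be pushed through a copula-based forecast step is given below; it uses a Gaussian copula and synthetic data throughout, so the variables, sample sizes, and dependence structure are assumptions for illustration rather than the authors' formulation.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Hypothetical historical pairs: drought index now and the index three months later.
hist_now = rng.normal(0.0, 1.0, 500)
hist_fut = 0.6 * hist_now + rng.normal(0.0, 0.8, 500)

def to_normal_scores(x, ref):
    """Empirical-CDF transform of x against reference sample ref, then Gaussian scores."""
    ranks = np.searchsorted(np.sort(ref), x, side="right") / (len(ref) + 1.0)
    return norm.ppf(np.clip(ranks, 1e-6, 1 - 1e-6))

z_now = to_normal_scores(hist_now, hist_now)
z_fut = to_normal_scores(hist_fut, hist_fut)
rho = np.corrcoef(z_now, z_fut)[0, 1]           # Gaussian-copula dependence parameter

# Stand-in for the data assimilation ensemble of the *current* state.
ic_ensemble = rng.normal(-0.8, 0.3, 100)        # e.g., analysis mean -0.8, spread 0.3

# Conditional sampling: z_fut | z_now ~ N(rho*z_now, 1 - rho^2), then map back to the
# historical marginal of the future index by empirical quantiles.
z_ic = to_normal_scores(ic_ensemble, hist_now)
z_draw = rho * z_ic + np.sqrt(1.0 - rho**2) * rng.standard_normal(z_ic.size)
forecast = np.quantile(hist_fut, norm.cdf(z_draw))   # probabilistic seasonal forecast

print(f"rho = {rho:.2f}, P(drought index < -1 in 3 months) = {np.mean(forecast < -1.0):.2f}")
```

Each ensemble member of the current state yields a conditional draw of the future index, so the spread of the forecast reflects both initial condition uncertainty and the strength of the statistical dependence.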
Developing a Curriculum for Initial Teacher Education Using a Situated Learning Perspective
ERIC Educational Resources Information Center
Skinner, Nigel
2010-01-01
This paper argues that the implications of the concept of situated learning are important when developing a curriculum for initial teacher education (ITE). It describes and analyses the use of a model of ITE designed to stimulate discussions promoting the development of professional craft knowledge situated mainly in schools and to connect these…
ERIC Educational Resources Information Center
Cooley, Sam J.; Burns, Victoria E.; Cumming, Jennifer
2016-01-01
This study investigates the initial development of groupwork skills through outdoor adventure education (OAE) and the factors that predict the extent of this development, using the first two levels of Kirkpatrick's model of training evaluation. University students (N = 238) completed questionnaires measuring their initial reactions to OAE (Level 1…
Initial conditions and ENSO prediction using a coupled ocean-atmosphere model
NASA Astrophysics Data System (ADS)
Larow, T. E.; Krishnamurti, T. N.
1998-01-01
A coupled ocean-atmosphere initialization scheme using Newtonian relaxation has been developed for the Florida State University coupled ocean-atmosphere global general circulation model. The initialization scheme is used to initialize the coupled model for seasonal forecasts of the boreal summers of 1987 and 1988. The atmosphere model is a modified version of the Florida State University global spectral model at resolution T-42. The ocean general circulation model is a slightly modified version of the Hamburg climate group's model described in Latif (1987) and Latif et al. (1993). The coupling is synchronous, with information exchanged every two model hours. Using ECMWF daily atmospheric analyses and observed monthly mean SSTs, two 1-year, time-dependent Newtonian relaxations were performed with the coupled model prior to conducting the seasonal forecasts. The coupled initializations were conducted from 1 June 1986 to 1 June 1987 and from 1 June 1987 to 1 June 1988. Newtonian relaxation was applied to the prognostic atmospheric vorticity, divergence, temperature, and dew point depression equations. In the ocean model the relaxation was applied to the surface temperature. Two 10-member ensemble integrations were conducted to examine the impact of the coupled initialization on the seasonal forecasts. The initial conditions used for the ensembles are the ocean's final state after the initialization, while the atmospheric initial conditions are ECMWF analyses. Examination of the SST root mean square error and anomaly correlations between observed and forecast SSTs in the Niño-3 and Niño-4 regions for the two seasonal forecasts shows closer agreement for the initialized forecasts than for two 10-member non-initialized ensemble forecasts. The main conclusion is that a single forecast with the coupled initialization outperforms, in SST anomaly prediction, each of the control forecasts (ensemble members without such an initialization), indicating the possible importance of including the atmosphere during the coupled initialization.
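Newtonian relaxation (nudging) amounts to adding a term proportional to the analysis-minus-model difference to each prognostic equation during the initialization window. The Python sketch below applies the idea to a toy two-variable system; the dynamics, relaxation time scale, and analysis trajectory are hypothetical stand-ins, not the FSU coupled-model configuration.

```python
import numpy as np

# Newtonian relaxation (nudging) sketch: during the initialization window a term
# (analysis - model)/tau is added to the tendency, after which the model runs freely.
def model_tendency(x):
    return np.array([x[1], -0.5 * x[0] - 0.05 * x[1]])   # toy damped oscillator

def integrate(x0, analysis, tau, dt, n_nudge, n_total):
    x, traj = np.array(x0, dtype=float), []
    for n in range(n_total):
        tend = model_tendency(x)
        if n < n_nudge:                                   # relaxation only while assimilating
            tend = tend + (analysis(n * dt) - x) / tau
        x = x + dt * tend                                 # forward Euler is fine for a sketch
        traj.append(x.copy())
    return np.array(traj)

analysis = lambda t: np.array([np.cos(0.7 * t), -0.7 * np.sin(0.7 * t)])  # "observed" state

dt, n_nudge, n_total = 0.05, 400, 600
free_run   = integrate([2.0, 0.0], analysis, tau=0.5, dt=dt, n_nudge=0,       n_total=n_total)
nudged_run = integrate([2.0, 0.0], analysis, tau=0.5, dt=dt, n_nudge=n_nudge, n_total=n_total)

t_init_end = n_nudge * dt
print("error at end of initialization window (free vs nudged):",
      abs(free_run[n_nudge - 1, 0] - analysis(t_init_end)[0]),
      abs(nudged_run[n_nudge - 1, 0] - analysis(t_init_end)[0]))
```

With a short relaxation time the model state tracks the analysis by the end of the initialization window, which is the balanced starting state the subsequent forecasts inherit.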
The U.S. EPA Atlantic Ecology Division (AED) has initiated a multi-year research program to develop empirical nitrogen load-response models. Our research on embayments in southern New England is part of a multi-regional effort to develop cause-effect models for the Gulf of Mexic...
A Methodology for Cybercraft Requirement Definition and Initial System Design
2008-06-01
the software development concepts of the SDLC, requirements, use cases and domain modeling. It ...collectively as Software Development Life Cycle (SDLC) models. While there are numerous models that fit under the SDLC definition, all are based on... developed that provided expanded understanding of the domain, it is necessary to either update an existing domain model or create another domain
ERIC Educational Resources Information Center
Virolainen, M.; Persson Thunqvist, D.
2017-01-01
The Nordic countries are often referred to as a group even though their education systems and training models are very different. The aim of this study is to advance understanding of those differences and compare the developments and organisation of initial vocational education and training (IVET) in Finland and Sweden since the 1990s as examples…
The Air Quality Model Evaluation International Initiative ...
This presentation provides an overview of the Air Quality Model Evaluation International Initiative (AQMEII). It contains a synopsis of the three phases of AQMEII, including objectives, logistics, and timelines. It also provides a number of examples of analyses conducted through AQMEII with a particular focus on past and future analyses of deposition. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.
NASA Astrophysics Data System (ADS)
Temnikov, A. G.; Chernensky, L. L.; Orlov, A. V.; Lysov, N. Y.; Zhuravkova, D. S.; Belova, O. S.; Gerastenok, T. K.
2017-12-01
The results of the experimental application of artificial thunderstorm cells of negative and positive polarity to the investigation of lightning initiation between a thundercloud and the ground using model hydrometeor arrays are presented. Possible options for the initiation and development of a discharge between the charged cloud and the ground in the presence of model hydrometeors are established. It is experimentally shown that groups of large hydrometeors of various shapes significantly increase the probability of channel discharge initiation between the artificial thunderstorm cell and the ground, especially in the case of positive polarity of the cloud. The authors assume that large hail arrays in the thundercloud can initiate the preliminary breakdown stage in the lower part of the thundercloud or initiate and stimulate the propagation of positive lightning from its upper part. A significant effect of the shape of the model hydrometeors, and of the way they are grouped, on the initiation and stimulation of channel discharge propagation in the gap between an artificial thunderstorm cell of negative or positive polarity and the ground is experimentally established. It is found that, in the case of negative polarity of the charged cloud, a group of conductive cylindrical hydrometeors connected by a dielectric string more effectively initiates the channel discharge between the artificial thunderstorm cell and the ground. In the case of positive polarity of the artificial thunderstorm cell, the best channel discharge initiation is achieved for model hydrometeors grouped together by a dielectric tape. The obtained results can be used in the development of a method for directed artificial lightning initiation between a thundercloud and the ground.
Seasonal simulations using a coupled ocean-atmosphere model with data assimilation
NASA Astrophysics Data System (ADS)
Larow, Timothy Edward
1997-10-01
A coupled ocean-atmosphere initialization scheme using Newtonian relaxation has been developed for the Florida State University coupled ocean-atmosphere global general circulation model. The coupled model is used for seasonal predictions of the boreal summers of 1987 and 1988. The atmosphere model is a modified version of the Florida State University global spectral model with triangular truncation at 42 waves. The ocean general circulation model is a slightly modified version of the model developed by Latif (1987). Coupling is synchronous, with exchange of information every two model hours. Using daily analyses from ECMWF and observed monthly mean SSTs from NCEP, two 1-year, time-dependent Newtonian relaxations were conducted with the coupled model prior to the seasonal forecasts. Relaxation was selectively applied to the atmospheric vorticity, divergence, temperature, and dew point depression equations, and to the ocean's surface temperature equation. The ocean's initial conditions are from a six-year ocean-only simulation that used observed wind stresses and a relaxation towards observed SSTs as forcings. Coupled initialization was conducted from 1 June 1986 to 1 June 1987 for the 1987 boreal forecast and from 1 June 1987 to 1 June 1988 for the 1988 boreal forecast. Examination of the annual means of net heat flux, freshwater flux, and wind stress obtained from the initialization shows close agreement with the Oberhuber (1988) climatology and the Florida State University pseudo wind stress analysis. Sensitivity of the initialization/assimilation scheme was tested by conducting two 10-member ensemble integrations. Each member was integrated for 90 days (June-August) of the respective year. Initial conditions for the ensembles consisted of the same ocean state as used by the initialized forecasts, while the atmospheric initial conditions were from ECMWF analyses centered on 1 June of the respective year. Root mean square errors and anomaly correlations between observed and forecast SSTs in the Niño-3 and Niño-4 regions show greater skill for the initialized forecasts than for the ensemble forecasts. It is hypothesized that differences in the specific humidity within the planetary boundary layer are responsible for the large SST errors noted with the ensembles.
UXO Burial Prediction Fidelity
2017-07-01
been developed to predict the initial penetration depth of underwater mines. SERDP would like to know if and how these existing mine models could be...designed for near-cylindrical mines; for munitions, however, projectile-specific drag, lift, and moment coefficients are needed for estimating...as inputs. Other models have been built to estimate these initial conditions for mines dropped into water. Can these mine models be useful for
Taking Workforce Initiatives to Scale: Workforce Initiatives Discussion Paper #2
ERIC Educational Resources Information Center
Academy for Educational Development, 2011
2011-01-01
The System-wide Collaborative Action for Livelihoods and Environment, or SCALE process, has become one of the Academy for Educational Development's (AED's) and the United States Agency for International Development's (USAID's) most utilized and replicated models, with applications in education, health, natural resources management, tourism,…
Liu, Xingwang; Bartholomew, Ezra; Cai, Yanling; Ren, Huazhong
2016-01-01
Trichomes are specialized epidermal cells located in the aerial parts of plants that function in plant defense against biotic and abiotic stresses. The simple unicellular trichomes of Arabidopsis serve as an excellent model to study the molecular mechanisms of cell differentiation and pattern formation in plants. Loss-of-function mutations in Arabidopsis thaliana have suggested that the core genes GL1 (which encodes a MYB transcription factor) and TTG1 (which encodes a WD40 repeat-containing protein) are important for the initiation and spacing of leaf trichomes, while the genes GL3 and EGL3 (which encode bHLH proteins) are needed for normal trichome initiation. However, the positive regulatory genes involved in multicellular trichome development in cucumber remain unclear. This review focuses on the phenotypes of mutants (csgl3, tril, tbh, mict, and csgl1) with disturbed trichomes in cucumber and then infers which genes play key roles in trichome initiation and development in those mutants. Evidence indicates that MICT, TBH, and CsGL1 are allelic with alternative splicing. CsGL3 and TRIL are allelic, override the effect of TBH, MICT, and CsGL1 on the regulation of multicellular trichome development, and affect trichome initiation. CsGL3, TRIL, MICT, TBH, and CsGL1 encode HD-Zip proteins of different subfamilies. Genetic and molecular analyses have revealed that CsGL3, TRIL, MICT, TBH, and CsGL1 are responsible for the differentiation of epidermal cells and the development of trichomes. Based on current knowledge, a positive regulator pathway model for trichome development in cucumber is proposed and compared to the model in Arabidopsis. These data suggest that trichome development in cucumber may differ from that in Arabidopsis. PMID:27559338
Numerical and analytical simulation of the production process of ZrO2 hollow particles
NASA Astrophysics Data System (ADS)
Safaei, Hadi; Emami, Mohsen Davazdah
2017-12-01
In this paper, the production process of hollow particles from agglomerated particles is addressed analytically and numerically. The important parameters affecting this process, in particular the initial porosity level of the particles and the plasma gun type, are investigated. The analytical model adopts a combination of quasi-steady thermal equilibrium and mechanical balance. In the analytical model, the possibility of a solid core existing in agglomerated particles is examined. In this model, a range of particle diameters (50 μm ≤ D_p0 ≤ 160 μm) and various initial porosities (0.2 ≤ p ≤ 0.7) are considered. The numerical model employs the VOF technique for two-phase compressible flows. The production process of hollow particles from the agglomerated particles is simulated, considering an initial diameter of D_p0 = 60 μm and initial porosities of p = 0.3, p = 0.5, and p = 0.7. Simulation results of the analytical model indicate that the solid core diameter is independent of the initial porosity, whereas the thickness of the particle shell strongly depends on the initial porosity. In both models, a hollow particle can hardly develop at small initial porosity values (p < 0.3), while the particle disintegrates at high initial porosity values (p > 0.6).
Comparison of different stomatal conductance algorithms for ozone flux modelling [Proceedings
P. Buker; L. D. Emberson; M. R. Ashmore; G. Gerosa; C. Jacobs; W. J. Massman; J. Muller; N. Nikolov; K. Novak; E. Oksanen; D. De La Torre; J. -P. Tuovinen
2006-01-01
The ozone deposition model (DO3SE) that has been developed and applied within the EMEP photooxidant model (Emberson et al., 2000; Simpson et al., 2003) currently estimates stomatal ozone flux using a stomatal conductance (gs) model based on the multiplicative algorithm initially developed by Jarvis (1976). This model links gs to environmental and phenological parameters...
ERIC Educational Resources Information Center
Martin, Ian; Carey, John
2014-01-01
A logic model was developed based on an analysis of the 2012 American School Counselor Association (ASCA) National Model in order to provide direction for program evaluation initiatives. The logic model identified three outcomes (increased student achievement/gap reduction, increased school counseling program resources, and systemic change and…
Walking the Talk in Initial Teacher Education: Making Teacher Educator Modeling Effective
ERIC Educational Resources Information Center
Hogg, Linda; Yates, Anne
2013-01-01
This self-study investigated student teachers' perceptions of teacher educators modeling practices within a large lecture class in an initial teacher education program. It also studied factors that affected student teachers' developing ideas and practice. Phase 1 collected data from student teachers through focus group interviews and…
Assessing Space Weather Applications and Understanding: IMF Bz at L1
NASA Astrophysics Data System (ADS)
Riley, P.; Savani, N.; Mays, M. L.; Austin, H. J.
2017-12-01
The CCMC - International (CCMC-I) is designed as a self-organizing informal forum for facilitating novel global initiatives on space weather research, development, forecasting, and education. Here we capitalize on CCMC's experience in providing highly utilized web-based services, leadership, and trusted relationships with space weather model developers. One of the CCMC-I initiatives is the International Forum for Space Weather Capabilities Assessment. As part of this initiative, within the solar and heliosphere domain, we focus our community discussion on forecasting the magnetic structure of interplanetary CMEs and the ambient solar wind. During the International CCMC-LWS Working Meeting in April 2017, the group initiated open communication to agree upon a standardized process by which all current and future models can be compared under an unbiased test. In this poster, we present our initial findings on how we expect different models to move forward with validating and forecasting the magnetic vectors of the solar wind at L1. We also present a new IMF Bz Scoreboard, which will be used to assist in the transitioning of research models into more operational settings.
Inam, Azhar; Adamowski, Jan; Halbe, Johannes; Prasher, Shiv
2015-04-01
Over the course of the last twenty years, participatory modeling has increasingly been advocated as an integral component of integrated, adaptive, and collaborative water resources management. However, issues of high cost, time, and expertise are significant hurdles to the widespread adoption of participatory modeling in many developing countries. In this study, a step-wise method to initialize the involvement of key stakeholders in the development of qualitative system dynamics models (i.e. causal loop diagrams) is presented. The proposed approach is designed to overcome the challenges of low expertise, time and financial resources that have hampered previous participatory modeling efforts in developing countries. The methodological framework was applied in a case study of soil salinity management in the Rechna Doab region of Pakistan, with a focus on the application of qualitative modeling through stakeholder-built causal loop diagrams to address soil salinity problems in the basin. Individual causal loop diagrams were developed by key stakeholder groups, following which an overall group causal loop diagram of the entire system was built based on the individual causal loop diagrams to form a holistic qualitative model of the whole system. The case study demonstrates the usefulness of the proposed approach, based on using causal loop diagrams in initiating stakeholder involvement in the participatory model building process. In addition, the results point to social-economic aspects of soil salinity that have not been considered by other modeling studies to date.
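Once stakeholder-drawn causal loop diagrams are digitized, they are naturally represented as signed directed graphs whose cycles are the feedback loops. The Python sketch below encodes a small, hypothetical soil-salinity fragment (the variable names and link signs are illustrative, not the Rechna Doab stakeholders' diagram) and classifies each loop as reinforcing or balancing from the product of its link signs.

```python
import networkx as nx

# Hypothetical fragment of a soil-salinity causal loop diagram as a signed digraph.
G = nx.DiGraph()
links = [
    ("canal water supply", "irrigation applied", +1),
    ("irrigation applied", "water table depth", -1),   # more irrigation -> shallower table
    ("water table depth", "capillary rise", -1),
    ("capillary rise", "root-zone salinity", +1),
    ("root-zone salinity", "crop yield", -1),
    ("crop yield", "farm income", +1),
    ("farm income", "tubewell pumping", +1),
    ("tubewell pumping", "water table depth", +1),      # pumping deepens the water table
]
for src, dst, sign in links:
    G.add_edge(src, dst, sign=sign)

# A feedback loop is a directed cycle; its polarity is the product of its link signs
# (positive = reinforcing, negative = balancing).
for cycle in nx.simple_cycles(G):
    polarity = 1
    for a, b in zip(cycle, cycle[1:] + cycle[:1]):
        polarity *= G[a][b]["sign"]
    kind = "reinforcing" if polarity > 0 else "balancing"
    print(f"{kind} loop: {' -> '.join(cycle)}")
```

An automated loop listing of this kind can help merge individual diagrams into the overall group diagram without losing feedback structure.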
Hamidi, Ahd; Kreeftenberg, Hans; V D Pol, Leo; Ghimire, Saroj; V D Wielen, Luuk A M; Ottens, Marcel
2016-05-01
Vaccination is one of the most successful public health interventions, being a cost-effective tool in preventing deaths among young children. The earliest vaccines were developed following empirical methods, creating vaccines by trial and error. New process development tools, for example mathematical modeling, as well as new regulatory initiatives requiring better understanding of both the product and the process, are being applied to well-characterized biopharmaceuticals (for example recombinant proteins). The vaccine industry still lags behind in this respect. A production process for a new Haemophilus influenzae type b (Hib) conjugate vaccine, including related quality control (QC) tests, was developed and transferred to a number of emerging vaccine manufacturers. This contributed to a sustainable global supply of affordable Hib conjugate vaccines, as illustrated by the market launch of the first Hib vaccine based on this technology in 2007 and the concomitant price reduction of Hib vaccines. This paper describes the development approach followed for this Hib conjugate vaccine as well as the mathematical modeling tool applied recently in order to indicate options for further improvements of the initial Hib process. The strategy followed during the process development of this Hib conjugate vaccine was a targeted and integrated approach based on prior knowledge and experience with similar products, using multi-disciplinary expertise. Mathematical modeling was used to develop a predictive model for the initial Hib process (the 'baseline' model) as well as an 'optimized' model, by proposing a number of process changes which could lead to further reduction in price.
Faculty Integration of Technology in Teacher Preparation: Outcomes of a Development Model
ERIC Educational Resources Information Center
Judge, Sharon; O'Bannon, Blanche
2008-01-01
This article reports on a faculty development model that uses a variety of approaches and strategies to help faculty restructure their curricula and effectively model technology integration for their students. A multifaceted model, funded in part by the "Preparing Tomorrow's Teachers to Use Technology" (PT3) initiative, was implemented…
Comparison of simulation modeling and satellite techniques for monitoring ecological processes
NASA Technical Reports Server (NTRS)
Box, Elgene O.
1988-01-01
In 1985 improvements were made in the world climatic data base for modeling and predictive mapping; in individual process models and the overall carbon-balance models; and in the interface software for mapping the simulation results. Statistical analysis of the data base was begun. In 1986 mapping was shifted to NASA-Goddard. The initial approach involving pattern comparisons was modified to a more statistical approach. A major accomplishment was the expansion and improvement of a global data base of measurements of biomass and primary production, to complement the simulation data. The main accomplishments during 1987 included: production of a master tape with all environmental and satellite data and model results for the 1600 sites; development of a complete mapping system used for the initial color maps comparing annual and monthly patterns of Normalized Difference Vegetation Index (NDVI), actual evapotranspiration, net primary productivity, gross primary productivity, and net ecosystem production; collection of more biosphere measurements for eventual improvement of the biological models; and development of some initial monthly models for primary productivity, based on satellite data.
Weiss, Bahr; Ngo, Victoria Khanh; Dang, Hoang-Minh; Pollack, Amie; Trung, Lam T; Tran, Cong V; Tran, Nam T; Sang, David; Do, Khanh N
2012-01-01
Children and adolescents are among the highest-need populations with regard to mental health support, especially in low and middle income countries (LMIC). Yet resources in LMIC for prevention and treatment of mental health problems are limited, in particular for children and adolescents. In this paper, we discuss a model for the development of child and adolescent mental health (CAMH) resources in LMIC that has guided a ten-year initiative focused on development of CAMH treatment and research infrastructure in Vietnam. We first review the need for development of mental health resources for children and adolescents in general, and then in Vietnam. We next present the model that guided our program as it developed, focused on the twin Capacity Development Goals of efficacy and sustainability and the Capacity Development Targets used to move towards these goals. Finally, we discuss our CAMH development initiative in Vietnam, the center of which has been the development of a graduate program in clinical psychology at Vietnam National University, linking program activities to this model.
NASA Astrophysics Data System (ADS)
Arrigo, J. S.; Famiglietti, J. S.; Murdoch, L. C.; Lakshmi, V.; Hooper, R. P.
2012-12-01
The Consortium of Universities for the Advancement of Hydrologic Science, Inc. (CUAHSI) continues a major effort towards supporting Community Hydrologic Modeling. From 2009 - 2011, the Community Hydrologic Modeling Platform (CHyMP) initiative held three workshops, the ultimate goal of which was to produce recommendations and an implementation plan to establish a community modeling program that enables comprehensive simulation of water anywhere on the North American continent. Such an effort would include connections to and advances in global climate models, biogeochemistry, and efforts of other disciplines that require an understanding of water patterns and processes in the environment. To achieve such a vision will require substantial investment in human and cyber-infrastructure and significant advances in the science of hydrologic modeling and spatial scaling. CHyMP concluded with a final workshop, held March 2011, and produced several recommendations. CUAHSI and the university community continue to advance community modeling and implement these recommendations through several related and follow on efforts. Key results from the final 2011 workshop included agreement among participants that the community is ready to move forward with implementation. It is recognized that initial implementation of this larger effort can begin with simulation capabilities that currently exist, or that can be easily developed. CHyMP identified four key activities in support of community modeling: benchmarking, dataset evaluation and development, platform evaluation, and developing a national water model framework. Key findings included: 1) The community supported the idea of a National Water Model framework; a community effort is needed to explore what the ultimate implementation of a National Water Model is. A true community modeling effort would support the modeling of "water anywhere" and would include all relevant scales and processes. 2) Implementation of a community modeling program could initially focus on continental scale modeling of water quantity (rather than quality). The goal of this initial model is the comprehensive description of water stores and fluxes in such a way to permit linkage to GCM's, biogeochemical, ecological, and geomorphic models. This continental scale focus allows systematic evaluation of our current state of knowledge and data, leverages existing efforts done by large scale modelers, contributes to scientific discovery that informs globally and societally relevant questions, and provides an initial framework to evaluate hydrologic information relevant to other disciplines and a structure into which to incorporate other classes of hydrologic models. 3) Dataset development will be a key aspect of any successful national water model implementation. Our current knowledge of the subsurface is limiting our ability to truly integrate soil and groundwater into large scale models, and to answer critical science questions with societal relevance (i.e. groundwater's influence on climate). 4) The CHyMP workshops and efforts to date have achieved collaboration between university scientists, government agencies and the private sector that must be maintained. Follow on efforts in community modeling should aim at leveraging and maintaining this collaboration for maximum scientific and societal benefit.
Numerical simulation of film-cooled ablative rocket nozzles
NASA Technical Reports Server (NTRS)
Landrum, D. B.; Beard, R. M.
1996-01-01
The objective of this research effort was to evaluate the impact of incorporating an additional cooling port downstream between the injector and nozzle throat in the NASA Fast Track chamber. A numerical model of the chamber was developed for the analysis. The analysis did not model ablation but instead correlated the initial ablation rate with the initial nozzle wall temperature distribution. The results of this study provide guidance in the development of a potentially lighter, second generation ablative rocket nozzle which maintains desired performance levels.
The United States Environmental Protection Agency's (EPA) National Exposure Research Laboratory (NERL) has initiated a project to improve the methodology for modeling human exposure to motor vehicle emission. The overall project goal is to develop improved methods for modeling...
An Object-Based Requirements Modeling Method.
ERIC Educational Resources Information Center
Cordes, David W.; Carver, Doris L.
1992-01-01
Discusses system modeling and specification as it relates to object-based information systems development and software development. An automated system model based on the objects in the initial requirements document is described, the requirements document translator is explained, and a sample application of the technique is provided. (12…
Crack initiation modeling of a directionally-solidified nickel-base superalloy
NASA Astrophysics Data System (ADS)
Gordon, Ali Page
Combustion gas turbine components designed for application in electric power generation equipment are subject to periodic replacement as a result of cracking, damage, and mechanical property degeneration that render them unsafe for continued operation. In view of the significant costs associated with inspecting, servicing, and replacing damaged components, there has been much interest in developing models that not only predict service life but also estimate the evolved microstructural state of the material. This thesis explains the manifestations of the microstructural damage mechanisms that facilitate fatigue crack nucleation in components made of a newly developed directionally-solidified (DS) Ni-base superalloy exposed to elevated temperatures and high stresses. In this study, models were developed and validated for damage and life prediction using DS GTD-111 as the subject material. This material, proprietary to General Electric Energy, has a chemical composition and grain structure designed to withstand the creep damage occurring in the first and second stage blades of gas-powered turbines. The service conditions in these components, at temperatures generally exceeding 600°C, facilitate the onset of one or more damage mechanisms related to fatigue, creep, or environment. The study was divided into an empirical phase, which consisted of experimentally simulating service conditions in fatigue specimens, and a modeling phase, which entailed numerically simulating the stress-strain response of the material. Experiments were carried out to simulate a variety of thermal, mechanical, and environmental operating conditions endured by longitudinally (L) and transversely (T) oriented DS GTD-111. Both in-phase and out-of-phase thermo-mechanical fatigue tests were conducted. In some cases, tests in extreme environments/temperatures were needed to isolate one, or at most two, of the mechanisms causing damage. Microstructural examinations were carried out via SEM and optical microscopy. A continuum crystal plasticity model was used to simulate the material behavior in the L and T orientations. The constitutive model was implemented in ABAQUS, and a parameter estimation scheme was developed to obtain the material constants. A physically based model was developed for correlating crack initiation life based on the experimental life data, and predictions are made using the crack initiation model. Assuming a unique relationship between the damage fraction and cycle fraction with respect to cycles to crack initiation for each damage mode, the total crack initiation life has been represented in terms of the individual damage components (fatigue, creep-fatigue, creep, and oxidation-fatigue) observed at the end state of crack initiation.
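A common way to combine per-mode damage of this kind, shown purely as an illustration and not necessarily the specific formulation of the thesis, is a linear summation in which each cycle consumes a fraction 1/N_i of life for each active mode i and initiation occurs when the summed damage reaches one. A minimal Python sketch with hypothetical single-mode lives:

```python
# Illustrative linear damage-summation estimate of crack-initiation life, assuming
# per-cycle damage fractions from each active mode add linearly and initiation
# occurs when the summed damage reaches 1 (a common simplification).
def initiation_life(lives_by_mode):
    """lives_by_mode: cycles to initiation if each mode acted alone."""
    damage_per_cycle = sum(1.0 / n for n in lives_by_mode.values())
    return 1.0 / damage_per_cycle

# Hypothetical single-mode lives for one loading condition (cycles):
modes = {"fatigue": 20000.0, "creep-fatigue": 60000.0, "oxidation-fatigue": 90000.0}
print(f"estimated crack-initiation life: {initiation_life(modes):,.0f} cycles")
```

Under that assumption the combined initiation life is 1 / Σ(1/N_i), which is always shorter than the life of the most damaging mode acting alone.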
Combustion Device Failures During Space Shuttle Main Engine Development
NASA Technical Reports Server (NTRS)
Goetz, Otto K.; Monk, Jan C.
2005-01-01
Major causes: limited initial materials properties; limited structural models, especially fatigue; limited thermal models; limited aerodynamic models; human errors; limited component testing; high pressure; complicated control.
Effects of video modeling on social initiations by children with autism.
Nikopoulos, Christos K; Keenan, Michael
2004-01-01
We examined the effects of a video modeling intervention on social initiation and play behaviors with 3 children with autism using a multiple baseline across subjects design. Each child watched a videotape showing a typically developing peer, and the experimenter engaged in a simple social interactive play using one toy. For all children, social initiation and reciprocal play skills were enhanced, and these effects were maintained at 1- and 3-month follow-up periods.
Qualitative simulation for process modeling and control
NASA Technical Reports Server (NTRS)
Dalle Molle, D. T.; Edgar, T. F.
1989-01-01
A qualitative model is developed for a first-order system with a proportional-integral controller without precise knowledge of the process or controller parameters. Simulation of the qualitative model yields all of the solutions to the system equations. In developing the qualitative model, a necessary condition for the occurrence of oscillatory behavior is identified. Initializations that cannot exhibit oscillatory behavior produce a finite set of behaviors. When the phase-space behavior of the oscillatory solutions is properly constrained, these initializations produce an infinite but comprehensible set of asymptotically stable behaviors. While the predictions include all possible behaviors of the real system, a class of spurious behaviors has been identified. When limited numerical information is included in the model, the number of predictions is significantly reduced.
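For reference, the quantitative system underlying the qualitative analysis is just a first-order process in a PI loop; the short Python sketch below integrates it numerically for two hypothetical parameter sets, one well damped and one oscillatory, mirroring the two behavior classes the qualitative model distinguishes. The process gain, time constant, and controller settings are placeholders.

```python
import numpy as np

# Quantitative counterpart of the system analyzed qualitatively in the abstract:
# a first-order process dx/dt = (-x + K*u)/tau_p under PI control toward a setpoint.
def simulate(Kc, tau_i, tau_p=5.0, K=2.0, setpoint=1.0, dt=0.01, t_end=120.0):
    x, integ, traj = 0.0, 0.0, []
    for _ in range(int(t_end / dt)):
        err = setpoint - x
        integ += err * dt
        u = Kc * (err + integ / tau_i)          # PI control law
        x += dt * (-x + K * u) / tau_p          # explicit Euler step of the process
        traj.append(x)
    return np.array(traj)

# Two tunings: one well damped, one oscillatory, echoing the qualitative model's
# distinction between behavior classes.
for Kc, tau_i in [(0.5, 10.0), (4.0, 0.5)]:
    y = simulate(Kc, tau_i)
    print(f"Kc={Kc}, tau_i={tau_i}: overshoot={y.max() - 1.0:+.3f}, final={y[-1]:.3f}")
```

The qualitative model reaches the same distinction without knowing these numbers, which is exactly its appeal when parameters are imprecise.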
Stochastic Modeling of Laminar-Turbulent Transition
NASA Technical Reports Server (NTRS)
Rubinstein, Robert; Choudhari, Meelan
2002-01-01
Stochastic versions of stability equations are developed in order to build integrated models of transition and turbulence and to understand the effects of uncertain initial conditions on disturbance growth. Stochastic forms of the resonant triad equations, a high Reynolds number asymptotic theory, and the parabolized stability equations are developed.
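To make the idea of uncertain initial conditions concrete, the sketch below integrates a generic three-wave resonant-triad system (standard form dA_i/dt = s_i A_j* A_k*; the coupling coefficients and ensemble statistics are illustrative, not those of the cited theory) over an ensemble of random initial amplitudes and phases and reports the spread of the final amplitudes.

```python
import numpy as np

# Generic resonant-triad system integrated over an ensemble of uncertain initial
# amplitudes and phases; coefficients and statistics are illustrative only.
S = np.array([1.0, -0.5, -0.5])                      # triad interaction coefficients

def rhs(a):
    # dA_i/dt = S_i * conj(A_j) * conj(A_k) for the three cyclic index pairs
    return S * np.conj(np.roll(a, -1)) * np.conj(np.roll(a, -2))

def rk4(a, dt, nsteps):
    for _ in range(nsteps):
        k1 = rhs(a); k2 = rhs(a + 0.5 * dt * k1)
        k3 = rhs(a + 0.5 * dt * k2); k4 = rhs(a + dt * k3)
        a = a + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0
    return a

rng = np.random.default_rng(1)
n_ens, dt, nsteps = 200, 0.01, 500
final_amp = np.empty((n_ens, 3))
for m in range(n_ens):
    amp0 = np.array([1.0, 0.05, 0.05]) * (1 + 0.2 * rng.standard_normal(3))
    phase0 = rng.uniform(0, 2 * np.pi, 3)            # uncertain initial phases
    final_amp[m] = np.abs(rk4(amp0 * np.exp(1j * phase0), dt, nsteps))

print("ensemble mean |A| at t=5:", final_amp.mean(axis=0).round(3))
print("ensemble std  |A| at t=5:", final_amp.std(axis=0).round(3))
```

The ensemble statistics are the kind of output a stochastic stability formulation seeks to predict directly, without simulating every member.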
Composite load spectra for select space propulsion structural components
NASA Technical Reports Server (NTRS)
Newell, J. F.; Kurth, R. E.; Ho, H.
1986-01-01
A multiyear program is being performed with the objective of developing generic load models, with multiple levels of progressive sophistication, to simulate the composite (combined) load spectra that are induced in space propulsion system components representative of Space Shuttle Main Engines (SSME), such as transfer ducts, turbine blades, and liquid oxygen (LOX) posts. Progress of the first year's effort includes completion of a sufficient portion of each task: probabilistic models, code development, validation, and an initial operational code. From its inception, the code has had an expert system philosophy that can be added to throughout the program and in the future. The initial operational code is only applicable to turbine blade type loadings. The probabilistic model included in the operational code has fitting routines for loads that utilize a modified discrete probabilistic distribution termed RASCAL, a barrier crossing method, and a Monte Carlo method. An initial load model was developed by Battelle and is currently used for the slowly varying duty cycle type loading. The intent is to use the model and related codes essentially in their current form for all loads that are based on measured or calculated data that follow a slowly varying profile.
High Level Information Fusion (HLIF) with nested fusion loops
NASA Astrophysics Data System (ADS)
Woodley, Robert; Gosnell, Michael; Fischer, Amber
2013-05-01
Situation modeling and threat prediction require higher levels of data fusion in order to provide actionable information. Beyond the sensor data and sources the analyst has access to, the use of out-sourced and re-sourced data is becoming common. Through the years, some common frameworks have emerged for dealing with information fusion—perhaps the most ubiquitous being the JDL Data Fusion Group and their initial 4-level data fusion model. Since these initial developments, numerous models of information fusion have emerged, hoping to better capture the human-centric process of data analyses within a machine-centric framework. 21st Century Systems, Inc. has developed Fusion with Uncertainty Reasoning using Nested Assessment Characterizer Elements (FURNACE) to address challenges of high level information fusion and handle bias, ambiguity, and uncertainty (BAU) for Situation Modeling, Threat Modeling, and Threat Prediction. It combines JDL fusion levels with nested fusion loops and state-of-the-art data reasoning. Initial research has shown that FURNACE is able to reduce BAU and improve the fusion process by allowing high level information fusion (HLIF) to affect lower levels without the double counting of information or other biasing issues. The initial FURNACE project was focused on the underlying algorithms to produce a fusion system able to handle BAU and repurposed data in a cohesive manner. FURNACE supports analyst's efforts to develop situation models, threat models, and threat predictions to increase situational awareness of the battlespace. FURNACE will not only revolutionize the military intelligence realm, but also benefit the larger homeland defense, law enforcement, and business intelligence markets.
Distributive Education Competency-Based Curriculum Models by Occupational Clusters. Final Report.
ERIC Educational Resources Information Center
Davis, Rodney E.; Husted, Stewart W.
To meet the needs of distributive education teachers and students, a project was initiated to develop competency-based curriculum models for marketing and distributive education clusters. The models which were developed incorporate competencies, materials and resources, teaching methodologies/learning activities, and evaluative criteria for the…
The United States Environmental Protection Agency's National Exposure Research Laboratory has initiated a project to improve the methodology for modeling human exposure to motor vehicle emissions. The overall project goal is to develop improved methods for modeling the source t...
ADVANCED UTILITY SIMULATION MODEL DOCUMENTATION OF SYSTEM DESIGN STATE LEVEL MODEL (VERSION 1.0)
The report is one of 11 in a series describing the initial development of the Advanced Utility Simulation Model (AUSM) by the Universities Research Group on Energy (URGE) and its continued development by the Science Applications International Corporation (SAIC) research team. The...
Suborbital Research and Development Opportunities
NASA Technical Reports Server (NTRS)
Davis, Jeffrey R.
2011-01-01
This slide presentation reviews new strategies for problem solving in the life sciences in the suborbital realm. Topics covered are: an overview of the space life sciences, the strategic initiatives that the Space Life Sciences organization engaged in, and the new business model under which these initiatives were developed. Several opportunities for research are also reviewed.
Designing an Automated Assessment of Public Speaking Skills Using Multimodal Cues
ERIC Educational Resources Information Center
Chen, Lei; Feng, Gary; Leong, Chee Wee; Joe, Jilliam; Kitchen, Christopher; Lee, Chong Min
2016-01-01
Traditional assessments of public speaking skills rely on human scoring. We report an initial study on the development of an automated scoring model for public speaking performances using multimodal technologies. Task design, rubric development, and human rating were conducted according to standards in educational assessment. An initial corpus of…
NASA Astrophysics Data System (ADS)
Naboka, V. Yu.; Akkelin, S. V.; Karpenko, Iu. A.; Sinyukov, Yu. M.
2015-01-01
A key ingredient of hydrodynamical modeling of relativistic heavy ion collisions is the thermal initial conditions, an input that is the consequence of a prethermal dynamics that is not yet completely understood. In this paper we employ a recently developed energy-momentum transport model of the prethermal stage to study the influence of alternative initial states in nucleus-nucleus collisions on the flow and energy density distributions of the matter at the starting time of hydrodynamics. In particular, the dependence of the results on isotropic and anisotropic initial states is analyzed. It is found that at the thermalization time the transverse flow is larger and the maximal energy density is higher for the longitudinally squeezed initial momentum distributions. The results are also sensitive to the relaxation time parameter, the equation of state at the thermalization time, and the transverse profile of the initial energy density distribution: Gaussian approximation, Glauber Monte Carlo profiles, etc. Test results also confirm that the numerical code based on the energy-momentum transport model is capable of providing both averaged and fluctuating initial conditions for the hydrodynamic simulations of relativistic nuclear collisions.
A Tale of Two Trails: Exploring Different Paths to Success
Walker, Jennifer G.; Evenson, Kelly R.; Davis, William J.; Bors, Philip; Rodríguez, Daniel A.
2016-01-01
Background: This comparative case study investigates 2 successful community trail initiatives, using the Active Living By Design (ALBD) Community Action Model as an analytical framework. The model includes 5 strategies: preparation, promotion, programs, policy, and physical projects. Methods: Key stakeholders at 2 sites participated in in-depth interviews (N = 14). Data were analyzed for content using Atlas Ti and grouped according to the 5 strategies. Results: Preparation: Securing trail resources was challenging, but shared responsibilities facilitated trail development. Promotions: The initiatives demonstrated minimal physical activity encouragement strategies. Programs: Community stakeholders did not coordinate programmatic opportunities for routine physical activity. Policy: Trails' inclusion in regional greenway master plans contributed to trail funding and development. Policies that were formally institutionalized and enforced led to more consistent trail construction and safer conditions for users. Physical Projects: Consistent standards for way-finding signage and design safety features enhanced trail usability and safety. Conclusions: Communities with different levels of government support contributed unique lessons to inform best practices of trail initiatives. This study revealed a disparity between trail development and use-encouragement strategies, which may limit trails' impact on physical activity. The ALBD Community Action Model provided a viable framework to structure cross-disciplinary community trail initiatives. PMID:21597125
NASA Astrophysics Data System (ADS)
Pattanayak, Sujata; Mohanty, U. C.
2018-06-01
This paper presents the development of the extended weather research and forecasting data assimilation (WRFDA) system in the framework of the non-hydrostatic mesoscale model core of the weather research and forecasting system (WRF-NMM), an imperative aspect of numerical modeling studies. Though WRFDA originally provides improved initial conditions for the Advanced Research WRF, we have successfully developed a unified WRFDA utility that can be used by the WRF-NMM core as well. After critical evaluation, a code was developed to merge the WRFDA framework with WRF-NMM output. In this paper, we provide a few selected implementations and initial results through a single-observation test and background error statistics such as eigenvalues, eigenvectors, and length scales, which showcase the successful development of the extended WRFDA code for the WRF-NMM model. Furthermore, the extended WRFDA system is applied to the forecast of three severe cyclonic storms formed over the Bay of Bengal: Nargis (27 April-3 May 2008), Aila (23-26 May 2009) and Jal (4-8 November 2010). Model results are compared and contrasted with the analysis fields and later with high-resolution model forecasts. The mean initial position error is reduced by 33% with WRFDA as compared to the GFS analysis. The vector displacement errors in the track forecasts are reduced by 33, 31, 30 and 20% for the 24, 48, 72 and 96 hr forecasts, respectively, in the data assimilation experiments as compared to the control run. The model diagnostics indicate successful implementation of WRFDA within the WRF-NMM system.
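The vector displacement error quoted above is simply the great-circle distance between forecast and observed storm centers at each verification time; a minimal Python sketch with made-up positions is shown below.

```python
import numpy as np

# Great-circle ("vector") displacement error between forecast and best-track
# cyclone positions; the positions below are made-up placeholders.
R_EARTH_KM = 6371.0

def track_error_km(lat_f, lon_f, lat_o, lon_o):
    phi1, phi2 = np.radians(lat_f), np.radians(lat_o)
    dphi = np.radians(lat_o - lat_f)
    dlmb = np.radians(lon_o - lon_f)
    a = np.sin(dphi / 2) ** 2 + np.cos(phi1) * np.cos(phi2) * np.sin(dlmb / 2) ** 2
    return 2 * R_EARTH_KM * np.arcsin(np.sqrt(a))     # haversine formula

# Hypothetical 24/48/72-h forecast vs observed positions (deg N, deg E):
forecast = np.array([[16.2, 86.5], [17.8, 84.9], [19.5, 83.0]])
observed = np.array([[16.0, 86.9], [17.2, 85.6], [18.6, 83.9]])
errors = track_error_km(forecast[:, 0], forecast[:, 1], observed[:, 0], observed[:, 1])
print("displacement errors (km):", errors.round(1))
```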
Initial development of prototype performance model for highway design
DOT National Transportation Integrated Search
1997-12-01
The Federal Highway Administration (FHWA) has undertaken a multiyear project to develop the Interactive Highway Safety Design Model (IHSDM), which is a CADD-based integrated set of software tools to analyze a highway design to identify safety issues ...
Synthesizing SoTL Institutional Initiatives toward National Impact
ERIC Educational Resources Information Center
Simmons, Nicola
2016-01-01
This chapter draws on other authors' ideas in this issue, describing parallels and outlining distinctions toward a synthesized model for the development of SoTL initiatives at the institutional level and beyond.
The CAST Initiative in Guam: A Model of Effective Teachers Teaching Teachers
ERIC Educational Resources Information Center
Zuercher, Deborah K.; Kessler, Cristy; Yoshioka, Jon
2011-01-01
The CAST (content area specialized training) model of professional development enables sustainable teacher leadership and is responsive to the need for culturally relevant educational practices. The purpose of this paper is to share the background, methods, findings and recommendations of a case study on the CAST initiative in Guam. The case study…
Health Assessment and Readiness To Learn: An Interagency Collaborative Model.
ERIC Educational Resources Information Center
Hatfield, Maryellen Brown
Consistent with the first of the National Educational Goals 1990 (to ensure that all children in America will start school ready to learn by the year 2000), South Carolina initiated the Pre-School Health Appraisal Project (PSHAP), a collaborative practice model based on the experiences of 4 years of project development and findings. Initiated in…
Allocation model for air tanker initial attack in firefighting
Francis E. Greulich; William G. O' Regan
1975-01-01
Timely and appropriate use of air tankers in firefighting can bring high returns, but their misuse can be expensive when measured in operating and other costs. An allocation model has been developed for identifying superior strategies for air tanker initial attack and for choosing an optimum set of allocations among airbases. Data are presented for a representative...
The Lake Michigan Mass Balance Project (LMMBP) was initiated to support the development of a Lake Wide Management Plan (LaMP) for Lake Michigan. As one of the models in the LMMBP modeling framework, the Level 2 Lake Michigan contaminant transport and fate (LM2) model has been dev...
Development and initial test of the University of Wisconsin global isentropic-sigma model
NASA Technical Reports Server (NTRS)
Zapotocny, Tom H.; Johnson, Donald R.; Reames, Fred M.
1994-01-01
The description of a global version of the University of Wisconsin (UW) hybrid isentropic-sigma (theta-sigma) model and the results from an initial numerical weather prediction experiment are presented in this paper. The main objectives of this initial test are to (1) discuss theta-sigma model development and computer requirements, (2) demonstrate the ability of the UW theta-sigma model for global numerical weather prediction using realistic orography and parameterized physical processes, and (3) compare the transport of an inert trace constituent against a nominally 'identical' sigma coordinate model. Initial and verifying data for the 5-day simulations presented in this work were supplied by the Goddard Earth Observing System (GEOS-1) data assimilation system. The time period studied is 1-6 February 1985. This validation experiment demonstrates that the global UW theta-sigma model produces a realistic 5-day simulation of the mass and momentum distributions when compared to both the identical sigma model and GEOS-1 verification. Root-mean-square errors demonstrate that the theta-sigma model is slightly more accurate than the nominally identical sigma model with respect to standard synoptic variables. Of particular importance, the UW theta-sigma model displays a distinct advantage over the conventional sigma model with respect to the prognostic simulation of inert trace constituent transport in amplifying baroclinic waves of the extratropics. This is especially true in the upper troposphere and stratosphere where the spatial integrity and conservation of an inert trace constituent is severely compromised in the sigma model compared to the theta-sigma model.
Que, Jianwen
2016-01-01
The esophagus and trachea are tubular organs that initially share a single common lumen in the anterior foregut. Several models have been proposed to explain how this single-lumen developmental intermediate generates two tubular organs. However, new evidence suggests that these models are not comprehensive. I will first briefly review these models and then propose a novel ‘splitting and extension’ model based on our in vitro modeling of the foregut separation process. Signaling molecules (e.g., SHHs, WNTs, BMPs) and transcription factors (e.g., NKX2.1 and SOX2) are critical for the separation of the foregut. Intriguingly, some of these molecules continue to play essential roles during the transition of simple columnar into stratified squamous epithelium in the developing esophagus, and they are also closely involved in epithelial maintenance in the adults. Alterations in the levels of these molecules have been associated with the initiation and progression of several esophageal diseases and cancer in adults. PMID:25727889
The AGU Data Management Maturity Model Initiative
NASA Astrophysics Data System (ADS)
Bates, J. J.
2015-12-01
In September 2014, the AGU Board of Directors approved two initiatives to help the Earth and space sciences community address the growing challenges accompanying the increasing size and complexity of data. These initiatives are: 1) Data Science Credentialing: development of a continuing education and professional certification program to help scientists in their careers and to meet growing responsibilities and requirements around data science; and 2) Data Management Maturity (DMM) Model: development and implementation of a data management maturity model to assess process maturity against best practices and to identify opportunities in organizational data management processes. Each of these has been organized within AGU as an Editorial Board, and both Boards have held kick-off meetings. The DMM Editorial Board will recommend strategies for adapting and deploying a DMM model to the Earth and space sciences, create guidance documents to assist in its implementation, and provide input on a pilot appraisal process. This presentation will provide an overview of progress to date by the DMM Editorial Board and plans for work to be done over the upcoming year.
NCI-funded Cancer Model Development Centers (CMDCs) and the Human Cancer Models Initiative (HCMI) consortium members are creating as many as 1000 new cancer models. The models are being derived from many tumor subtypes, including rare and understudied cancers.
Integrated hydrologic modeling: Effects of spatial scale, discretization and initialization
NASA Astrophysics Data System (ADS)
Seck, A.; Welty, C.; Maxwell, R. M.
2011-12-01
Groundwater discharge contributes significantly to the annual flows of Chesapeake Bay tributaries and is presumed to contribute to the observed lag time between the implementation of management actions and the environmental response in the Chesapeake Bay. To investigate groundwater fluxes and flow paths and interaction with surface flow, we have developed a fully distributed integrated hydrologic model of the Chesapeake Bay Watershed using ParFlow. Here we present a comparison of model spatial resolution and initialization methods. We have studied the effect of horizontal discretization on overland flow processes at a range of scales. Three nested model domains have been considered: the Monocacy watershed (5600 sq. km), the Potomac watershed (92000 sq. km) and the Chesapeake Bay watershed (400,000 sq. km). Models with homogeneous subsurface and topographically-derived slopes were evaluated at 500-m, 1000-m, 2000-m, and 4000-m grid resolutions. Land surface slopes were derived from resampled DEMs and corrected using stream networks. Simulation results show that the overland flow processes are reasonably well represented with a resolution up to 2000 m. We observe that the effects of horizontal resolution dissipate with larger scale models. Using a homogeneous model that includes subsurface and surface terrain characteristics, we have evaluated various initialization methods for the integrated Monocacy watershed model. This model used several options for water table depths and two rainfall forcing methods including (1) a synthetic rainfall-recession cycle corresponding to the region's average annual rainfall rate, and (2) an initial shut-off of rainfall forcing followed by a rainfall-recession cycling. Results show the dominance of groundwater generated runoff during a first phase of the simulation followed by a convergence towards more balanced runoff generation mechanisms. We observe that the influence of groundwater runoff increases in dissected relief areas characterized by high slope magnitudes. This is due to the increase in initial water table gradients in these regions. As a result, in the domain conditions for this study, an initial shut-off of rainfall forcing proved to be the more efficient initialization method. The initialized model is then coupled with a Land Surface Model (CLM). Ongoing work includes coupling a heterogeneous subsurface field with spatially variable meteorological forcing using the National Land Data Assimilation System (NLDAS) data products. Seasonal trends of groundwater levels for current and pre-development conditions of the basin will be compared.
Jobs and Economic Development Impacts (Postcard)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2011-08-01
The U.S. Department of Energy's Wind Powering America initiative provides information on the Jobs and Economic Development Benefits model. This postcard is a marketing piece that stakeholders can provide to interested parties; it will guide them to the Jobs and Economic Development Benefits model section on the Wind Powering America website.
Examples of data assimilation in mesoscale models
NASA Technical Reports Server (NTRS)
Carr, Fred; Zack, John; Schmidt, Jerry; Snook, John; Benjamin, Stan; Stauffer, David
1993-01-01
The keynote address concerned the problem of physical initialization of mesoscale models. The classic purpose of physical or diabatic initialization is to reduce or eliminate the spin-up error caused by the lack, at the initial time, of the fully developed vertical circulations required to support regions of large rainfall rates. However, even if a model has no spin-up problem, imposition of observed moisture and heating rate information during assimilation can improve quantitative precipitation forecasts, especially early in the forecast. The two key issues in physical initialization are the choice of assimilation technique and the sources of hydrologic/hydrometeor data. Another example of data assimilation in mesoscale models was presented in a series of meso-beta scale model experiments with an 11 km version of the MASS model, designed to investigate the sensitivity of convective initiation, forced by thermally direct circulations resulting from differential surface heating, to four-dimensional assimilation of surface and radar data. The results of these simulations underscore the need to accurately initialize and simulate grid and sub-grid scale clouds in meso-beta scale models. The status of the application of the CSU-RAMS mesoscale model by the NOAA Forecast Systems Lab for producing real-time forecasts with 10-60 km mesh resolutions over (4000 km)^2 domains for use by the aviation community was reported. Either MAPS or LAPS model data are used to initialize the RAMS model on a 12-h cycle. The use of the MAPS (Mesoscale Analysis and Prediction System) model was discussed. Also discussed was the mesobeta-scale data assimilation using a triply-nested nonhydrostatic version of the MM5 model.
2007-05-01
The goal of the current project was to unpack and develop the concept of sensemaking, principally by developing and testing a cognitive model of the processes ... themselves. In Year 2, new Cognitive Task Analysis data collection methods were developed and used to further test the model. Cognitive Task Analysis is a ... (2004) to examine the phenomenon of "sensemaking," a concept initially formulated by Weick (1995), but not developed from a cognitive perspective
Simulation of water-table aquifers using specified saturated thickness
Sheets, Rodney A.; Hill, Mary C.; Haitjema, Henk M.; Provost, Alden M.; Masterson, John P.
2014-01-01
Simulating groundwater flow in a water-table (unconfined) aquifer can be difficult because the saturated thickness available for flow depends on model-calculated hydraulic heads. It is often possible to realize substantial time savings and still obtain accurate head and flow solutions by specifying an approximate saturated thickness a priori, thus linearizing this aspect of the model. This specified-thickness approximation often relies on the use of the “confined” option in numerical models, which has led to confusion and criticism of the method. This article reviews the theoretical basis for the specified-thickness approximation, derives an error analysis for relatively ideal problems, and illustrates the utility of the approximation with a complex test problem. In the transient version of our complex test problem, the specified-thickness approximation produced maximum errors in computed drawdown of about 4% of initial aquifer saturated thickness even when maximum drawdowns were nearly 20% of initial saturated thickness. In the final steady-state version, the approximation produced maximum errors in computed drawdown of about 20% of initial aquifer saturated thickness (mean errors of about 5%) when maximum drawdowns were about 35% of initial saturated thickness. In early phases of model development, such as during initial model calibration efforts, the specified-thickness approximation can be a very effective tool to facilitate convergence. The reduced execution time and increased stability obtained through the approximation can be especially useful when many model runs are required, such as during inverse model calibration, sensitivity and uncertainty analyses, multimodel analysis, and development of optimal resource management scenarios.
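A minimal numerical sketch of the idea behind the approximation, assuming one-dimensional steady Dupuit flow with invented conductivity, heads, and length (none of this is from the article): the unconfined calculation lets transmissivity depend on the computed heads, while the specified-thickness ("confined") option fixes the saturated thickness a priori and thereby linearizes the flow problem.

```python
# Sketch: 1-D steady flow per unit width, unconfined (Dupuit) vs. the
# specified-thickness ("confined") approximation. Illustrative values only.
K = 10.0              # hydraulic conductivity [m/d] (assumed)
L = 1000.0            # flow-path length [m]
h1, h2 = 20.0, 18.0   # boundary heads above the aquifer base [m]

# Unconfined (Dupuit): discharge depends on the squared heads (nonlinear in h)
q_unconfined = K * (h1**2 - h2**2) / (2.0 * L)

# Specified thickness: fix b at the initial saturated thickness, so the
# problem becomes linear in head, exactly as for a confined aquifer
b = h1
q_specified = K * b * (h1 - h2) / L

print(f"Dupuit q = {q_unconfined:.3f} m^2/d, specified-thickness q = {q_specified:.3f} m^2/d")
print(f"relative difference: {abs(q_specified - q_unconfined) / q_unconfined:.1%}")
```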
A Gompertz population model with Allee effect and fuzzy initial values
NASA Astrophysics Data System (ADS)
Amarti, Zenia; Nurkholipah, Nenden Siti; Anggriani, Nursanti; Supriatna, Asep K.
2018-03-01
Growth and population dynamics models are important tools used in preparing good management strategies for society and in predicting the future of a population or species. This has been done by various known methods, one of which is developing a mathematical model that describes population growth. Models are usually formed as differential equations or systems of differential equations, depending on the complexity of the underlying properties of the population. One example of biological complexity is the Allee effect. It is a phenomenon showing a high correlation between very small population size and the mean individual fitness of the population. In this paper the population growth model used is the Gompertz equation model, considering the Allee effect on the population. We explore the properties of the solution to the model numerically using the Runge-Kutta method. Further exploration is done via a fuzzy theoretical approach to accommodate uncertainty in the initial values of the model. It is known that an initial value greater than the Allee threshold causes the solution to rise towards the carrying capacity asymptotically, whereas an initial value smaller than the Allee threshold causes the solution to decrease towards zero asymptotically, meaning the population eventually goes extinct. Numerical solutions show that modeling an uncertain value of the critical point A (the Allee threshold) together with a crisp initial value could cause extinction of the population to a certain possibilistic degree, depending on the predetermined membership function of the initial value.
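The paper's exact Gompertz-Allee formulation is not reproduced here; the sketch below assumes one common form, dN/dt = r N ln(K/N) (N/A - 1), and integrates it with classical fourth-order Runge-Kutta (the method named in the abstract), without the fuzzy initial-value machinery. All parameter values are illustrative.

```python
# Gompertz growth with an Allee threshold (assumed form), integrated by RK4.
import math

r, K, A = 0.5, 100.0, 20.0   # growth rate, carrying capacity, Allee threshold (illustrative)

def dNdt(N):
    if N <= 0.0:
        return 0.0
    return r * N * math.log(K / N) * (N / A - 1.0)

def rk4(N0, dt=0.01, t_end=40.0):
    N = N0
    for _ in range(int(t_end / dt)):
        k1 = dNdt(N)
        k2 = dNdt(N + 0.5 * dt * k1)
        k3 = dNdt(N + 0.5 * dt * k2)
        k4 = dNdt(N + dt * k3)
        N += dt * (k1 + 2.0 * k2 + 2.0 * k3 + k4) / 6.0
    return N

print("N0 = 25 (above threshold):", round(rk4(25.0), 2))   # approaches K
print("N0 = 15 (below threshold):", round(rk4(15.0), 2))   # decays toward extinction
```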
NASA Astrophysics Data System (ADS)
Bradley, J. A.; Anesio, A. M.; Singarayer, J. S.; Heath, M. R.; Arndt, S.
2015-08-01
SHIMMER (Soil biogeocHemIcal Model for Microbial Ecosystem Response) is a new numerical modelling framework developed as part of an interdisciplinary, iterative, model-data based approach fully integrating fieldwork and laboratory experiments with model development, testing, and application. SHIMMER is designed to simulate the establishment of microbial biomass and associated biogeochemical cycling during the initial stages of ecosystem development in glacier forefield soils. However, it is also transferable to other extreme ecosystem types (such as desert soils or the surface of glaciers). The model mechanistically describes and predicts transformations in carbon, nitrogen and phosphorus through aggregated components of the microbial community as a set of coupled ordinary differential equations. The rationale for development of the model arises from decades of empirical observation on the initial stages of soil development in glacier forefields. SHIMMER enables a quantitative and process-focussed approach to synthesising the existing empirical data and advancing understanding of microbial and biogeochemical dynamics. Here, we provide a detailed description of SHIMMER. The performance of SHIMMER is then tested in two case studies using published data from the Damma Glacier forefield in Switzerland and the Athabasca Glacier in Canada. In addition, a sensitivity analysis helps identify the most sensitive and unconstrained model parameters. Results show that the accumulation of microbial biomass is highly dependent on variation in microbial growth and death rate constants, Q10 values, the active fraction of microbial biomass, and the reactivity of organic matter. The model correctly predicts the rapid accumulation of microbial biomass observed during the initial stages of succession in the forefields of both case study systems. Simulation results indicate that primary production is responsible for the initial build-up of substrate that subsequently supports heterotrophic growth. However, allochthonous contributions of organic matter are identified as important in sustaining this productivity. Microbial production in young soils is supported by labile organic matter, whereas carbon stocks in older soils are more refractory. Nitrogen fixing bacteria are responsible for the initial accumulation of available nitrates in the soil. Biogeochemical rates are highly seasonal, as observed in experimental data. The development and application of SHIMMER not only provide important new insights into forefield dynamics, but also highlight aspects of these systems that require further field and laboratory research. The most pressing advances need to come in quantifying nutrient budgets and biogeochemical rates, in exploring seasonality and the fate of allochthonous deposition in relation to autochthonous production, and in empirical studies of microbial growth and cell death, to increase understanding of how glacier forefield development contributes to global biogeochemical cycles and climate in the future.
Stability issues of nonlocal gravity during primordial inflation
NASA Astrophysics Data System (ADS)
Belgacem, Enis; Cusin, Giulia; Foffa, Stefano; Maggiore, Michele; Mancarella, Michele
2018-01-01
We study the cosmological evolution of some nonlocal gravity models, when the initial conditions are set during a phase of primordial inflation. We examine in particular three models, the so-called RT, RR and Δ4 models, previously introduced by our group. We find that, during inflation, the RT model has a viable background evolution, but at the level of cosmological perturbations develops instabilities that make it nonviable. In contrast, the RR and Δ4 models have a viable evolution even when their initial conditions are set during a phase of primordial inflation.
The Global Links Program: Building Pedagogy in Social Entrepreneurship for Positive Impact in Iraq
ERIC Educational Resources Information Center
Dato-on, Mary Conway; Al-Charaakh, Amel
2013-01-01
In this paper we offer a model that seeks to develop an entrepreneurial ecosystem, taking a portfolio approach to economic development through ongoing partnerships rather than one-off initiatives, and that may serve as a prototype for economic development in transitional economies. The model, developed by Tupperware Brands, Rollins College, and the U.S. Department…
ERIC Educational Resources Information Center
Warner, Greta J.; Fay, Doris; Spörer, Nadine
2017-01-01
Reading comprehension is a self-regulated activity that depends on the proactive effort of the reader. Therefore, the authors studied the effects of personal initiative (PI) on the development of reading comprehension, mediated by reading strategy knowledge. Structural equation modelling was applied to a longitudinal study with two data waves…
School Innovation in Science: Improving Science Teaching and Learning in Australian Schools
ERIC Educational Resources Information Center
Tytler, Russell
2009-01-01
School Innovation in Science is a major Victorian Government initiative that developed and validated a model whereby schools can improve their science teaching and learning. The initiative was developed and rolled out to more than 400 schools over the period 2000-2004. A research team worked with 200+ primary and secondary schools over three…
ERIC Educational Resources Information Center
Khoo, Yishin
2015-01-01
This paper examines the educational implications of two curriculum initiatives in China that have produced curricular materials promoting education for sustainable development (ESD) in minority-populated ethnic autonomous areas in China. The two curriculum projects present distinctive discourses, conceptions, models, frameworks and scopes of ESD…
We Think You Need a Vacation...: The Discipline Model at Fresh Youth Initiatives
ERIC Educational Resources Information Center
Afterschool Matters, 2003
2003-01-01
Fresh Youth Initiative (FYI) is a youth development organization based in the Washington Heights-Inwood section of Manhattan. The group's mission is to support and encourage the efforts of neighborhood young people and their families to design and carry out community service and social action projects, develop leadership skills, fulfill their…
On the Initiation Mechanism in Exploding Bridgewire and Laser Detonators
NASA Astrophysics Data System (ADS)
Stewart, D. Scott; Thomas, Keith A.; Clarke, S.; Mallett, H.; Martin, E.; Martinez, M.; Munger, A.; Saenz, Juan
2006-07-01
Since its invention by Los Alamos during the Manhattan Project era, the exploding bridgewire detonator (EBW) has seen tremendous use and study. Recent development of a laser-powered device with detonation properties similar to an EBW is reviving interest in the basic physics of the deflagration-to-detonation transition (DDT) process in both of these devices. Cutback experiments using both laser interferometry and streak camera observations are providing new insight into the initiation mechanism in EBWs. These measurements are being correlated to a DDT model of compaction-to-detonation and shock-to-detonation developed previously by Xu and Stewart. The DDT model is incorporated into a high-resolution, multi-material model code for simulating the complete process. Model formulation and the modeling issues required to describe the test data will be discussed.
Rogers, Mary E; Glendon, A Ian
2018-01-01
This research reports on the 4-phase development of the 25-item Five-Factor Model Adolescent Personality Questionnaire (FFM-APQ). The purpose was to develop and determine initial evidence for validity of a brief adolescent personality inventory using a vocabulary that could be understood by adolescents up to 18 years old. Phase 1 (N = 48) consisted of item generation and expert (N = 5) review of items; Phase 2 (N = 179) involved item analyses; in Phase 3 (N = 496) exploratory factor analysis assessed the underlying structure; in Phase 4 (N = 405) confirmatory factor analyses resulted in a 25-item inventory with 5 subscales.
A participatory evaluation model for Healthier Communities: developing indicators for New Mexico.
Wallerstein, N
2000-01-01
Participatory evaluation models that invite community coalitions to take an active role in developing evaluations of their programs are a natural fit with Healthy Communities initiatives. The author describes the development of a participatory evaluation model for New Mexico's Healthier Communities program. She describes evaluation principles, research questions, and baseline findings. The evaluation model shows the links between process, community-level system impacts, and population health changes. PMID:10968754
NASA Astrophysics Data System (ADS)
Riyadi, Eko H.
2014-09-01
An initiating event is defined as any event, either internal or external to a nuclear power plant (NPP), that perturbs the steady-state operation of the plant, if operating, thereby initiating an abnormal event such as a transient or a loss of coolant accident (LOCA) within the NPP. These initiating events trigger sequences of events that challenge plant control and safety systems, whose failure could potentially lead to core damage or a large early release. Selection of initiating events consists of two steps: first, definition of possible events, for example by performing a comprehensive engineering evaluation and by constructing a top-level logic model; second, grouping of the identified initiating events by the safety function to be performed or by combinations of system responses. Therefore, the purpose of this paper is to discuss initiating event identification in the event tree development process and to review other probabilistic safety assessments (PSAs). The identification of initiating events also draws on past operating experience, review of other PSAs, failure mode and effect analysis (FMEA), feedback from system modeling, and master logic diagrams (a special type of fault tree). By studying the traditional US PSA categorization in detail, the important initiating events can be obtained and categorized into LOCAs, transients and external events.
Computational Modeling and Analysis of Insulin Induced Eukaryotic Translation Initiation
Lequieu, Joshua; Chakrabarti, Anirikh; Nayak, Satyaprakash; Varner, Jeffrey D.
2011-01-01
Insulin, the primary hormone regulating the level of glucose in the bloodstream, modulates a variety of cellular and enzymatic processes in normal and diseased cells. Insulin signals are processed by a complex network of biochemical interactions which ultimately induce gene expression programs or other processes such as translation initiation. Surprisingly, despite the wealth of literature on insulin signaling, the relative importance of the components linking insulin with translation initiation remains unclear. We addressed this question by developing and interrogating a family of mathematical models of insulin induced translation initiation. The insulin network was modeled using mass-action kinetics within an ordinary differential equation (ODE) framework. A family of model parameters was estimated, starting from an initial best-fit parameter set, using 24 experimental data sets taken from the literature. The residuals between model simulations and each of the experimental constraints were simultaneously minimized using multiobjective optimization. Interrogation of the model population, using sensitivity and robustness analysis, identified an insulin-dependent switch that controlled translation initiation. Our analysis suggested that without insulin, a balance between the pro-initiation activity of the GTP-binding protein Rheb and the anti-initiation activity of PTEN controlled basal initiation. On the other hand, in the presence of insulin a combination of PI3K and Rheb activity controlled inducible initiation, where PI3K was only critical in the presence of insulin. Other well known regulatory mechanisms governing insulin action, for example IRS-1 negative feedback, modulated the relative importance of PI3K and Rheb but did not fundamentally change the signal flow. PMID:22102801
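The full insulin network and its 24-dataset calibration are far beyond an abstract-sized sketch; the toy cascade below only illustrates the modeling style the abstract describes (mass-action kinetics inside an ODE framework). The species, rate constants, and the cascade itself are invented assumptions, not the paper's network.

```python
# Toy mass-action ODE cascade: insulin -> active receptor -> active PI3K ->
# initiation complex. Illustrative only; not the published model.
import numpy as np
from scipy.integrate import solve_ivp

k = dict(bind=0.1, act=0.5, init=0.3, deg=0.05)   # assumed rate constants [1/time]

def rhs(t, y):
    insulin, receptor_act, pi3k_act, init_complex = y
    v_bind = k["bind"] * insulin          # insulin activates its receptor
    v_act  = k["act"]  * receptor_act     # receptor activates PI3K
    v_init = k["init"] * pi3k_act         # PI3K drives initiation-complex assembly
    v_deg  = k["deg"]  * init_complex     # turnover of the initiation complex
    return [-v_bind, v_bind - v_act, v_act - v_init, v_init - v_deg]

sol = solve_ivp(rhs, (0.0, 100.0), [1.0, 0.0, 0.0, 0.0],
                t_eval=np.linspace(0.0, 100.0, 11))
print(sol.y[3])   # build-up of the initiation complex after an insulin pulse
```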
ERIC Educational Resources Information Center
Chaparro, Erin A.; Smolkowski, Keith; Baker, Scott K.; Hanson, Natalie; Ryan-Jackson, Kathleen
2012-01-01
In the face of dwindling financial resources, educational leaders are looking to refine resource allocation while maintaining a focus on improved student outcomes. This article presents initial findings from a professional development state initiative called Effective Behavioral and Instructional Support Systems (EBISS). The EBISS initiative aims…
Development of full regeneration establishment models for the forest vegetation simulator
John D. Shaw
2015-01-01
For most simulation modeling efforts, the goal of model developers is to produce simulations that represent reality as faithfully as possible. Achieving this goal commonly requires a considerable amount of data to set the initial parameters, followed by validation and model improvement, both of which require even more data. The Forest Vegetation Simulator (FVS...
Exploration of warm-up period in conceptual hydrological modelling
NASA Astrophysics Data System (ADS)
Kim, Kue Bum; Kwon, Hyun-Han; Han, Dawei
2018-01-01
One of the important issues in hydrological modelling is to specify the initial conditions of the catchment, since they have a major impact on the response of the model. Although this issue should be a high priority among modelers, it has remained unaddressed by the community. The typical suggested warm-up period for hydrological models has ranged from one to several years, which may lead to an underuse of data. The model warm-up is an adjustment process during which internal stores (e.g., soil moisture) move from the estimated initial condition towards an 'optimal' state. This study explores the warm-up period of two conceptual hydrological models, HYMOD and IHACRES, in a southwestern England catchment. A series of hydrologic simulations were performed for different initial soil moisture conditions and different rainfall amounts to evaluate the sensitivity of the warm-up period. Evaluation of the results indicates that both initial wetness and rainfall amount affect the time required for model warm-up, although it depends on the structure of the hydrological model. Approximately one and a half months are required for the model to warm up in HYMOD for our study catchment and climatic conditions. In addition, it requires less time to warm up under wetter initial conditions (i.e., saturated initial conditions). On the other hand, approximately six months are required for warm-up in IHACRES, and wet or dry initial conditions have little effect on the warm-up period. Instead, initial values that are close to the optimal value result in less warm-up time. These findings have implications for hydrologic model development, specifically in determining soil moisture initial conditions and warm-up periods to make full use of the available data, which is very important for catchments with short hydrological records.
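A sketch of how a warm-up period can be estimated in the spirit of the study, assuming a toy single-store bucket model rather than HYMOD or IHACRES: run the model from several initial soil-moisture states under identical forcing and record when the trajectories converge. The forcing, store capacity, and convergence tolerance are all invented.

```python
# Estimate a warm-up period by running a toy bucket model from several
# initial states and finding when the spread between runs becomes small.
import numpy as np

rng = np.random.default_rng(0)
n_days = 730
rain = rng.gamma(shape=0.3, scale=10.0, size=n_days)   # synthetic daily rainfall [mm]
pet = 2.0                                              # constant potential ET [mm/day]
smax, k_runoff = 150.0, 0.05                           # store capacity [mm], drainage coefficient

def simulate(s0):
    s, states = s0, []
    for p in rain:
        s = min(s + p, smax)     # fill the store, spill above capacity
        s -= min(pet, s)         # evapotranspiration
        s -= k_runoff * s        # slow drainage
        states.append(s)
    return np.array(states)

runs = np.array([simulate(s0) for s0 in (0.0, 50.0, 100.0, 150.0)])
spread = runs.max(axis=0) - runs.min(axis=0)       # divergence between initializations
warm_up_days = int(np.argmax(spread < 1.0))        # first day the spread drops below 1 mm
print("estimated warm-up period:", warm_up_days, "days")
```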
Nonequilibrium radiation and chemistry models for aerocapture vehicle flowfields
NASA Technical Reports Server (NTRS)
Carlson, Leland A.
1990-01-01
The primary tasks from January 1990 to June 1990 have been the development and evaluation of various electron and electron-electronic energy equation models, the continued development of improved nonequilibrium radiation models for molecules and atoms, and the continued development and investigation of precursor models and their effects. In addition, work was initiated to develop a vibrational model for the viscous shock layer (VSL) nonequilibrium chemistry blunt body engineering code. Also, an effort was started to examine the effects of including carbon species, say from an ablator, in the flowfield.
Facilitators of community participation in an Aboriginal sexual health promotion initiative.
Hulme Chambers, Alana; Tomnay, Jane; Stephens, Kylie; Crouch, Alan; Whiteside, Mary; Love, Pettina; McIntosh, Leonie; Waples Crowe, Peter
2018-04-01
Community participation is a collaborative process aimed at achieving community-identified outcomes. However, approaches to community participation within Aboriginal health promotion initiatives have been inconsistent and not well documented. Smart and Deadly was a community-led initiative to develop sexual health promotion resources with young Aboriginal people in regional Victoria, Australia. The principles of community-centred practice, authentic participatory processes and respect for the local cultural context guided the initiative. The aim of this article is to report the factors that facilitated community participation in the Smart and Deadly initiative, to inform future projects and provide further evidence of the value of such approaches. A summative evaluation of the Smart and Deadly initiative was undertaken approximately 2 years after the initiative ended. Five focus groups and 13 interviews were conducted with a purposive sample of 32 participants who were involved with Smart and Deadly in one of the following ways: project participant, stakeholder or project partner, or project developer or designer. A deductive content analysis was undertaken and themes were compared to the YARN model, which was specifically created for planning and evaluating community participation strategies relating to Aboriginal sexual health promotion. A number of factors that facilitated the community participation approaches used in Smart and Deadly were identified. The overarching theme was that trust was the foundation upon which the facilitators of community participation were built. These facilitators were cultural safety and cultural literacy, community control, and legacy and sustainability. Whilst the YARN model was highly productive in identifying these facilitators of community participation, the model did not have provision for the element of trust between workers and community. Given the importance of trust between the project team and the Aboriginal community in the Smart and Deadly initiative, a suggested revision to the YARN model is that trust be included as the basis upon which the YARN model factors are predicated. Adding trust to the YARN model in this way would help future Aboriginal health promotion projects ensure that community participation approaches are acceptable to the Aboriginal community.
A nonlinear model for analysis of slug-test data
McElwee, C.D.; Zenner, M.A.
1998-01-01
While doing slug tests in high-permeability aquifers, we have consistently seen deviations from the expected response of linear theoretical models. Normalized curves do not coincide for various initial heads, as would be predicted by linear theories, and are shifted to larger times for higher initial heads. We have developed a general nonlinear model based on the Navier-Stokes equation, nonlinear frictional loss, non-Darcian flow, acceleration effects, radius changes in the well bore, and a Hvorslev model for the aquifer, which explains these data features. The model produces a very good fit for both oscillatory and nonoscillatory field data, using a single set of physical parameters to predict the field data for various initial displacements at a given well. This is in contrast to linear models which have a systematic lack of fit and indicate that hydraulic conductivity varies with the initial displacement. We recommend multiple slug tests with a considerable variation in initial head displacement to evaluate the possible presence of nonlinear effects. Our conclusion is that the nonlinear model presented here is an excellent tool to analyze slug tests, covering the range from the underdamped region to the overdamped region.
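A toy illustration of why normalized slug-test curves separate when friction is nonlinear, assuming a simple second-order ODE with linear plus velocity-squared damping rather than the authors' full Navier-Stokes-based formulation; all coefficients are invented.

```python
# Toy oscillatory slug test: water-level displacement w(t) with a linear and a
# velocity-squared friction term. With the nonlinear term present, the
# normalized response depends on the initial displacement w0, mimicking the
# deviation from linear theory reported for field data.
import numpy as np
from scipy.integrate import solve_ivp

g, L_e = 9.81, 10.0       # gravity [m/s^2], effective water-column length [m]
c_lin, c_nl = 0.05, 0.8   # linear and nonlinear friction coefficients (illustrative)

def rhs(t, y):
    w, v = y
    return [v, -(g / L_e) * w - c_lin * v - c_nl * v * abs(v)]

t_eval = np.linspace(0.0, 30.0, 301)
for w0 in (0.1, 0.5, 1.0):                         # initial displacements [m]
    sol = solve_ivp(rhs, (0.0, 30.0), [w0, 0.0], t_eval=t_eval, max_step=0.05)
    normalized = sol.y[0] / w0
    print(f"w0 = {w0:4.1f} m  ->  normalized head at t = 10 s: {normalized[100]:+.3f}")
```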
Evidence-Based Leadership Development: The 4L Framework
ERIC Educational Resources Information Center
Scott, Shelleyann; Webber, Charles F.
2008-01-01
Purpose: This paper aims to use the results of three research initiatives to present the life-long learning leader 4L framework, a model for leadership development intended for use by designers and providers of leadership development programming. Design/methodology/approach: The 4L model is a conceptual framework that emerged from the analysis of…
The U.S. EPA Atlantic Ecology Division (AED) has initiated a multi-year research program to develop empirical nitrogen load-response models for embayments in southern New England. This is part of a multi-regional effort to develop nutrient load-response models for the Gulf of Mex...
Multiscale Mathematics for Biomass Conversion to Renewable Hydrogen
DOE Office of Scientific and Technical Information (OSTI.GOV)
Plechac, Petr; Vlachos, Dionisios; Katsoulakis, Markos
2013-09-05
The overall objective of this project is to develop multiscale models for understanding and eventually designing complex processes for renewables. To the best of our knowledge, our work is the first attempt at modeling complex reacting systems, whose performance relies on underlying multiscale mathematics. Our specific application lies at the heart of biofuels initiatives of DOE and entails modeling of catalytic systems, to enable economic, environmentally benign, and efficient conversion of biomass into either hydrogen or valuable chemicals. Specific goals include: (i) Development of rigorous spatio-temporal coarse-grained kinetic Monte Carlo (KMC) mathematics and simulation for microscopic processes encountered in biomass transformation. (ii) Development of hybrid multiscale simulation that links stochastic simulation to a deterministic partial differential equation (PDE) model for an entire reactor. (iii) Development of hybrid multiscale simulation that links KMC simulation with quantum density functional theory (DFT) calculations. (iv) Development of parallelization of models of (i)-(iii) to take advantage of Petaflop computing and enable real world applications of complex, multiscale models. In this NCE period, we continued addressing these objectives and completed the proposed work. Main initiatives, key results, and activities are outlined.
NASA Technical Reports Server (NTRS)
Manobianco, John; Uccellini, Louis W.; Brill, Keith F.; Kuo, Ying-Hwa
1992-01-01
A mesoscale numerical model is combined with dynamic data assimilation via Newtonian relaxation, or 'nudging', to provide initial conditions for subsequent simulations of the QE II cyclone. Both the nudging technique and the inclusion of supplementary data are shown to have a large positive impact on the simulation of the QE II cyclone during the initial phase of rapid cyclone development. Within the initial development period (from 1200 to 1800 UTC 9 September 1978), the dynamic assimilation of operational and bogus data yields a coherent two-layer divergence pattern that is not well defined in the model run using only the operational data and static initialization. Diagnostic analyses based on the simulations show that the initial development of the QE II storm between 0000 UTC 9 September and 0000 UTC 10 September was embedded within an indirect circulation of an intense 300-hPa jet streak, was related to baroclinic processes extending throughout a deep portion of the troposphere, and was associated with a classic two-layer mass-divergence profile expected for an extratropical cyclone.
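A minimal sketch of Newtonian relaxation ("nudging") as an assimilation step, using the toy Lorenz-63 system in place of the mesoscale model; the relaxation coefficient, window length, and forward-Euler integration are all simplifying assumptions, not the paper's configuration.

```python
# Nudging: during the assimilation window an extra term G * (obs - state)
# pulls the model toward observations before the free forecast begins.
import numpy as np

sigma, rho, beta = 10.0, 28.0, 8.0 / 3.0
G, dt = 2.0, 0.01                      # relaxation coefficient and time step (assumed)

def lorenz(x):
    return np.array([sigma * (x[1] - x[0]),
                     x[0] * (rho - x[2]) - x[1],
                     x[0] * x[1] - beta * x[2]])

def step(x, obs=None):
    tend = lorenz(x)
    if obs is not None:
        tend = tend + G * (obs - x)    # Newtonian relaxation toward the observation
    return x + dt * tend               # forward Euler for brevity

truth = np.array([1.0, 1.0, 1.0])
model = np.array([5.0, -5.0, 20.0])    # poor first guess
for _ in range(500):                   # assimilation (pre-forecast) window
    truth = step(truth)
    model = step(model, obs=truth)     # nudge toward "observations" of the truth
print("analysis error after nudging:", np.linalg.norm(model - truth))
```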
Model for assessment of the velocity and force at the start of sprint race.
Janjić, Nataša J; Kapor, Darko V; Doder, Dragan V; Petrović, Aleksandar; Jarić, Slobodan
2017-02-01
A mathematical model was developed for the assessment of the starting velocity, initial velocity and force of a 100-m sprint, based on a non-homogeneous differential equation with the air resistance proportional to the velocity, and the initial conditions for [Formula: see text], [Formula: see text]. The use of this model requires the measurement of reaction time and segmental velocities over the course of the race. The model was validated by comparison with data obtained from 100-m sprints of men: Carl Lewis (1988), Maurice Greene (2001) and Usain Bolt (2009), and women: Florence Griffith-Joyner, Evelyn Ashford and Heike Drechsler (1988), showing a high level of agreement. Combined with the previous work of the authors, the present model allows for the assessment of important physical abilities, such as the exertion of a high starting force, the development of a high starting velocity and, later on, maximisation of the peak running velocity. These data could be of importance for practitioners to identify possible weaknesses and refine training methods for sprinters and other athletes whose performance depends on rapid movement initiation.
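A sketch consistent with the kind of model described, assuming constant propulsive force and air resistance proportional to velocity: the closed-form velocity and distance below follow from m dv/dt = F - c v with the sprint started after the reaction time. The parameter values are illustrative, not fitted race data.

```python
# Closed-form sprint kinematics for m*dv/dt = F - c*v, v(0) = 0 after reaction.
import math

m, F, c, t_react = 80.0, 700.0, 60.0, 0.15   # mass [kg], force [N], drag coeff [kg/s], reaction [s]

def velocity(t):
    return (F / c) * (1.0 - math.exp(-c * t / m))

def distance(t):
    return (F / c) * (t + (m / c) * (math.exp(-c * t / m) - 1.0))

# Time to cover 100 m: bisection on distance(t) = 100
lo, hi = 0.0, 30.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    lo, hi = (mid, hi) if distance(mid) < 100.0 else (lo, mid)
race_time = t_react + 0.5 * (lo + hi)
print(f"terminal velocity {F / c:.2f} m/s, modeled 100 m time {race_time:.2f} s")
```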
NASA Astrophysics Data System (ADS)
Maurer, Thomas; Caviedes-Voullième, Daniel; Hinz, Christoph; Gerke, Horst H.
2017-04-01
Landscapes that are heavily disturbed or newly formed by either natural processes or human activity are in a state of disequilibrium. Their initial development is thus characterized by highly dynamic processes under all climatic conditions. The primary distribution and structure of the solid phase (i.e. mineral particles forming the pore space) is one of the decisive factors for the development of the hydrological behavior of the eco-hydrological system and therefore (co-)determines its more or less stable final state. The artificially constructed 'Hühnerwasser' catchment (a 6 ha area located in the open-cast lignite mine Welzow-Süd, southern Brandenburg, Germany) is a landscape laboratory where the initial eco-hydrological development has been observed since 2005. The specific formation (or construction) processes generated characteristic sediment structures and distributions, resulting in a spatially heterogeneous initial state of the catchment. We developed a structure generator that simulates the characteristic distribution of the solid phase for such constructed landscapes. The program is able to generate quasi-realistic structures and sediment compositions on multiple spatial levels (1 cm up to 100 m scale). The generated structures can be i) conditioned to actual measurement values (e.g., soil texture and bulk distribution); ii) stochastically generated; and iii) calculated deterministically according to the geology and technical processes at the excavation site. Results are visualized using the GOCAD software package and the free software Paraview. Based on the 3D spatial sediment distributions, effective van Genuchten hydraulic parameters are calculated using pedotransfer functions. The hydraulic behavior of different sediment distributions (i.e., versions or variations of the catchment's porous body) is calculated using a numerical model developed by one of us (Caviedes-Voullième). Observation data from catchment monitoring are available for i) determining the boundary conditions (e.g., precipitation) and ii) calibrating and validating the model (catchment discharge, groundwater). The analysis of multiple sediment distribution scenarios should allow the influence of starting conditions on the initial development of hydrological behavior to be approximately determined. We present first flow modeling results for a reference (conditioned) catchment model and variations thereof. We also give an outlook on further methodological development of our approach.
Hong, Ying; Liao, Hui; Raub, Steffen; Han, Joo Hun
2016-05-01
Building upon and extending Parker, Bindl, and Strauss's (2010) theory of proactive motivation, we develop an integrated, multilevel model to examine how contextual factors shape employees' proactive motivational states and, through these proactive motivational states, influence their personal initiative behavior. Using data from a sample of hotels collected from 3 sources and over 2 time periods, we show that establishment-level initiative-enhancing human resource management (HRM) systems were positively related to departmental initiative climate, which was positively related to employee personal initiative through employee role-breadth self-efficacy. Further, department-level empowering leadership was positively related to initiative climate only when initiative-enhancing HRM systems were low. These findings offer interesting implications for research on personal initiative and for the management of employee proactivity in organizations. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
NASA Astrophysics Data System (ADS)
Lee, Hyun-Chul; Kumar, Arun; Wang, Wanqiu
2018-03-01
Coupled prediction systems for seasonal and inter-annual variability in the tropical Pacific are initialized from ocean analyses. In ocean initial states, small scale perturbations are inevitably smoothed or distorted by the observational limits and data assimilation procedures, which tends to induce potential ocean initial errors for the El Nino-Southern Oscillation (ENSO) prediction. Here, the evolution and effects of ocean initial errors from the small scale perturbation on the developing phase of ENSO are investigated by an ensemble of coupled model predictions. Results show that the ocean initial errors at the thermocline in the western tropical Pacific grow rapidly to project on the first mode of equatorial Kelvin wave and propagate to the east along the thermocline. In boreal spring when the surface buoyancy flux weakens in the eastern tropical Pacific, the subsurface errors influence sea surface temperature variability and would account for the seasonal dependence of prediction skill in the NINO3 region. It is concluded that the ENSO prediction in the eastern tropical Pacific after boreal spring can be improved by increasing the observational accuracy of subsurface ocean initial states in the western tropical Pacific.
Slavinskaya, N. A.; Abbasi, M.; Starcke, J. H.; ...
2017-01-24
An automated data-centric infrastructure, Process Informatics Model (PrIMe), was applied to validation and optimization of a syngas combustion model. The Bound-to-Bound Data Collaboration (B2BDC) module of PrIMe was employed to discover the limits of parameter modifications based on uncertainty quantification (UQ) and consistency analysis of the model–data system and experimental data, including shock-tube ignition delay times and laminar flame speeds. Existing syngas reaction models are reviewed, and the selected kinetic data are described in detail. Empirical rules were developed and applied to evaluate the uncertainty bounds of the literature experimental data. Here, the initial H2/CO reaction model, assembled from 73 reactions and 17 species, was subjected to a B2BDC analysis. For this purpose, a dataset was constructed that included a total of 167 experimental targets and 55 active model parameters. Consistency analysis of the composed dataset revealed disagreement between models and data. Further analysis suggested that removing 45 experimental targets, 8 of which were self-inconsistent, would lead to a consistent dataset. This dataset was subjected to a correlation analysis, which highlights possible directions for parameter modification and model improvement. Additionally, several methods of parameter optimization were applied, some of them unique to the B2BDC framework. The optimized models demonstrated improved agreement with experiments compared to the initially assembled model, and their predictions for experiments not included in the initial dataset (i.e., a blind prediction) were investigated. The results demonstrate benefits of applying the B2BDC methodology for developing predictive kinetic models.
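A minimal sketch of the feasibility question at the core of Bound-to-Bound Data Collaboration, assuming linear surrogate models and invented bounds (the actual B2BDC workflow typically uses quadratic surrogates over shock-tube and flame-speed targets): is there any parameter vector within its prior bounds that reproduces every experimental target within its uncertainty bounds?

```python
# Consistency as a feasibility problem: prior parameter box + target bounds.
import numpy as np
from scipy.optimize import linprog

# Two active parameters; three "experimental targets" y_i = a_i . x with
# bounds L_i <= y_i <= U_i. All numbers are invented for illustration.
A = np.array([[1.0, 2.0],
              [3.0, -1.0],
              [0.5, 0.5]])
lower = np.array([0.5, -1.0, 0.2])
upper = np.array([2.5,  2.0, 1.0])
param_bounds = [(-1.0, 1.0), (-1.0, 1.0)]   # prior bounds on the parameters

# Feasibility LP: minimize 0 subject to L <= A x <= U within the prior box.
A_ub = np.vstack([A, -A])
b_ub = np.concatenate([upper, -lower])
res = linprog(c=np.zeros(2), A_ub=A_ub, b_ub=b_ub, bounds=param_bounds)

print("dataset consistent:", res.success)   # False would point to conflicting targets
```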
NASA Astrophysics Data System (ADS)
Tang, Tingting
In this dissertation, we develop structured population models to examine how changes in the environment affect population processes. In Chapter 2, we develop a general continuous time size-structured model describing a susceptible-infected (SI) population coupled with the environment. This model applies to problems arising in ecology, epidemiology, and cell biology. The model consists of a system of quasilinear hyperbolic partial differential equations coupled with a system of nonlinear ordinary differential equations that represent the environment. We develop a second-order high resolution finite difference scheme to numerically solve the model. Convergence of this scheme to a weak solution with bounded total variation is proved. We numerically compare the second-order high resolution scheme with a first-order finite difference scheme. A higher order of convergence and the high resolution property are observed in the second-order finite difference scheme. In addition, we apply our model to a multi-host wildlife disease problem, and questions regarding the impact of the initial population structure and transition rates within each host are numerically explored. In Chapter 3, we use a stage-structured matrix model for a wildlife population to study the recovery process of the population given an environmental disturbance. We focus on the time it takes for the population to recover to its pre-event level and develop general formulas to calculate the sensitivity or elasticity of the recovery time to changes in the initial population distribution, vital rates and event severity. Our results suggest that the recovery time is independent of the initial population size, but is sensitive to the initial population structure. Moreover, it is more sensitive to the proportional reduction in the population's vital rates caused by the catastrophic event than to the duration of the event's impact. We present the potential application of our model to amphibian population dynamics and the recovery of a certain plant population. In addition, we explore in detail the application of the model to the sperm whale population in the Gulf of Mexico after the Deepwater Horizon oil spill. In Chapter 4, we summarize the results from Chapter 2 and Chapter 3 and explore some further avenues of our research.
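A sketch of the Chapter 3 recovery-time idea, assuming an invented three-stage matrix, a 50% reduction at the disturbance, and a simple perturbation of adult survival; the dissertation derives sensitivities with general formulas, whereas this sketch just recomputes the recovery time numerically.

```python
# Stage-structured matrix projection: time to regain the pre-event abundance,
# and how that time responds to a change in one vital rate.
import numpy as np

def recovery_time(A, n0, reduction=0.5, horizon=500):
    """Years until total abundance regains its pre-event level."""
    target = n0.sum()
    n = (1.0 - reduction) * n0          # catastrophe removes a fraction of every stage
    for t in range(1, horizon + 1):
        n = A @ n
        if n.sum() >= target:
            return t
    return np.inf

A = np.array([[0.0, 1.2, 2.0],          # fecundities
              [0.5, 0.0, 0.0],          # juvenile survival/transition
              [0.0, 0.7, 0.9]])         # maturation and adult survival
n0 = np.array([100.0, 40.0, 60.0])      # pre-event stage distribution

A_pert = A.copy()
A_pert[2, 2] *= 1.05                    # 5% increase in adult survival
print("baseline recovery:", recovery_time(A, n0), "years;",
      "with higher adult survival:", recovery_time(A_pert, n0), "years")
```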
Life prediction and constitutive models for engine hot section anisotropic materials
NASA Technical Reports Server (NTRS)
Swanson, G. A.; Linask, I.; Nissley, D. M.; Norris, P. P.; Meyer, T. G.; Walker, K. P.
1987-01-01
The results are presented of a program designed to develop life prediction and constitutive models for two coated single crystal alloys used in gas turbine airfoils. The two alloys are PWA 1480 and Alloy 185. The two oxidation resistant coatings are PWA 273, an aluminide coating, and PWA 286, an overlay NiCoCrAlY coating. To obtain constitutive and fatigue data, tests were conducted on uncoated and coated specimens loaded in the <100>, <110>, <111> and <123> crystallographic directions. Two constitutive models are being developed and evaluated for the single crystal materials: a micromechanic model based on crystallographic slip systems, and a macroscopic model which employs anisotropic tensors to model inelastic deformation anisotropy. Based on tests conducted on the overlay coating material, constitutive models for coatings also appear feasible and two initial models were selected. A life prediction approach was proposed for coated single crystal materials, including crack initiation either in the coating or in the substrate. The coating-initiated failures dominated in the tests at load levels typical of gas turbine operation. Coating life was related to coating stress/strain history, which was determined from specimen data using the constitutive models.
Dynamic Modeling of Solar Dynamic Components and Systems
NASA Technical Reports Server (NTRS)
Hochstein, John I.; Korakianitis, T.
1992-01-01
The purpose of this grant was to support NASA in modeling efforts to predict the transient dynamic and thermodynamic response of the space station solar dynamic power generation system. In order to meet the initial schedule requirement of providing results in time to support installation of the system as part of the initial phase of space station, early efforts were executed with alacrity and often in parallel. Initially, methods to predict the transient response of a Rankine as well as a Brayton cycle were developed. Review of preliminary design concepts led NASA to select a regenerative gas-turbine cycle using a helium-xenon mixture as the working fluid and, from that point forward, the modeling effort focused exclusively on that system. Although initial project planning called for a three-year period of performance, revised NASA schedules moved system installation to later and later phases of station deployment. Eventually, NASA elected to halt development of the solar dynamic power generation system for space station and to reduce support for this project to two-thirds of the original level.
The Development of a New Model of Solar EUV Irradiance Variability
NASA Technical Reports Server (NTRS)
Warren, Harry; Wagner, William J. (Technical Monitor)
2002-01-01
The goal of this research project is the development of a new model of solar EUV (Extreme Ultraviolet) irradiance variability. The model is based on combining differential emission measure distributions derived from spatially and spectrally resolved observations of active regions, coronal holes, and the quiet Sun with full-disk solar images. An initial version of this model was developed with earlier funding from NASA. The new version of the model developed with this research grant will incorporate observations from SoHO as well as updated compilations of atomic data. These improvements will make the model calculations much more accurate.
Posterior quantum dynamics for a continuous diffusion observation of a coherent channel
NASA Astrophysics Data System (ADS)
Dąbrowska, Anita; Staszewski, Przemysław
2012-11-01
We present the Belavkin filtering equation for the intense balanced heterodyne detection in a unitary model of an indirect observation. The measuring apparatus modelled by a Bose field is initially prepared in a coherent state and the observed process is a diffusion one. We prove that this filtering equation is relaxing: any initial square-integrable function tends asymptotically to a coherent state with an amplitude depending on the coupling constant and the initial state of the apparatus. The time-development of a squeezed coherent state is studied and compared with the previous results obtained for the measuring apparatus prepared initially in the vacuum state.
Improved atmosphere-ocean coupled modeling in the tropics for climate prediction
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Minghua
2015-01-01
We investigated the initial development of the double ITCZ in the Community Climate System Model (CCSM Version 3) in the central Pacific. Starting from a resting initial condition of the ocean in January, the model developed a warm bias of sea-surface temperature (SST) in the central Pacific from 5°S to 10°S in the first three months. We found this initial bias to be caused by excessive surface shortwave radiation that is also present in the standalone atmospheric model. The initial bias is further amplified by biases in both surface latent heat flux and horizontal heat transport in the upper ocean. These biases are caused by the responses of surface winds to the SST bias and of the thermocline structure to surface wind curls. We also showed that the warming biases in surface solar radiation and latent heat fluxes are seasonally offset by cooling biases from reduced solar radiation after the austral summer due to cloud responses and in the austral fall due to enhanced evaporation when the maximum SST is closest to the equator. The warming biases from the dynamic heat transport by ocean currents, however, stay throughout all seasons once they are developed, and are eventually balanced by enhanced energy exchange and penetration of solar radiation below the mixed layer. Our results also showed that the equatorial cold tongue develops after the warm biases in the south central Pacific, and the overestimation of surface shortwave radiation recurs in the austral summer in each year.
Nashimoto, M; Mishima, Y
1988-01-01
Based on recent experimental data about transcription initiation and termination, a model for regulation of mammalian ribosomal DNA transcription is developed using a simple kinetic scheme. In this model, the existence of the transition pathway from the terminator to the promoter increases the rate of ribosomal RNA precursor synthesis. In addition to this 'non-transcribed spacer' traverse of RNA polymerase I, the co-ordination of initiation and termination allows a rapid on/off switch transition from the minimum to the maximum rate of ribosomal RNA precursor synthesis. Furthermore, taking account of the participation of two factors in the termination event, we propose a plausible molecular mechanism for the co-ordination of initiation and termination. This co-ordination is emphasized by repetition of the terminator unit. PMID:3223915
Initial Attempts at Developing Appropriate Human Relations Experiences for Potential Teachers.
ERIC Educational Resources Information Center
Calliotte, James A.
This paper traces the development of a human relations program as part of the teacher education curriculum at the University of Maryland Baltimore County. Four approaches are presented--a basic encounter model, a cognitive model, a programed unit, and a final integrated model that is now being employed in the teacher education program. Each model…
ERIC Educational Resources Information Center
Galisson, Kirsten; Brady, Kristin
2006-01-01
In May 2001, U.S. Secretary of State Colin Powell announced the establishment of the Global Development Alliance (GDA) as a key part of a new business model for the United States Agency for International Development (USAID). The GDA initiative aims to launch best practices in public-private partnerships around the world. The model is designed to…
A Supervisor-Targeted Implementation Approach to Promote System Change: The R3 Model.
Saldana, Lisa; Chamberlain, Patricia; Chapman, Jason
2016-11-01
Opportunities to evaluate strategies to create system-wide change in the child welfare system (CWS) and the resulting public health impact are rare. Leveraging a real-world, system-initiated effort to infuse the use of evidence-based principles throughout a CWS workforce, a pilot of the R3 model and supervisor-targeted implementation approach is described. The development of R3 and its associated fidelity monitoring was a collaboration between the CWS and model developers. Outcomes demonstrate implementation feasibility, strong fidelity scale measurement properties, improved supervisor fidelity over time, and the acceptability and perception of positive change by agency leadership. The value of system-initiated collaborations is discussed.
Towards the Prediction of Decadal to Centennial Climate Processes in the Coupled Earth System Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Zhengyu; Kutzbach, J.; Jacob, R.
2011-12-05
In this proposal, we have made major advances in the understanding of decadal and long-term climate variability. (a) We performed a systematic study of multidecadal climate variability in FOAM-LPJ and CCSM-T31, and are starting to explore decadal variability in the IPCC AR4 models. (b) We developed several novel methods for the assessment of climate feedbacks in observations. (c) We also developed a new initialization scheme, DAI (Dynamical Analogue Initialization), for ensemble decadal prediction. (d) We also studied climate-vegetation feedback in observations and models. (e) Finally, we started a pilot program using an Ensemble Kalman Filter in a CGCM for decadal climate prediction.
Initiating Formal Requirements Specifications with Object-Oriented Models
NASA Technical Reports Server (NTRS)
Ampo, Yoko; Lutz, Robyn R.
1994-01-01
This paper reports results of an investigation into the suitability of object-oriented models as an initial step in developing formal specifications. The requirements for two critical system-level software modules were used as target applications. It was found that creating object-oriented diagrams prior to formally specifying the requirements enhanced the accuracy of the initial formal specifications and reduced the effort required to produce them. However, the formal specifications incorporated some information not found in the object-oriented diagrams, such as higher-level strategy or goals of the software.
Creating an In-School Pastoral System for Student Teachers in School-Based Initial Teacher Education
ERIC Educational Resources Information Center
Philpott, Carey
2015-01-01
Recent developments in initial teacher education (ITE) have produced a number of school-centred models. These mean that student teachers may now spend more of their time in schools than has historically been the case. In some of these models, student teachers are more clearly part of the school as an institution than might be the case in more…
Stability of general-relativistic accretion disks
NASA Astrophysics Data System (ADS)
Korobkin, Oleg; Abdikamalov, Ernazar B.; Schnetter, Erik; Stergioulas, Nikolaos; Zink, Burkhard
2011-02-01
Self-gravitating relativistic disks around black holes can form as transient structures in a number of astrophysical scenarios such as binary neutron star and black hole-neutron star coalescences, as well as the core collapse of massive stars. We explore the stability of such disks against runaway and nonaxisymmetric instabilities using three-dimensional hydrodynamics simulations in full general relativity using the Thor code. We model the disk matter using the ideal fluid approximation with a Γ-law equation of state with Γ=4/3. We explore three disk models around nonrotating black holes with disk-to-black hole mass ratios of 0.24, 0.17, and 0.11. Because of metric blending in our initial data, all of our initial models contain an initial axisymmetric perturbation which induces radial disk oscillations. Despite these oscillations, our models do not develop the runaway instability during the first several orbital periods. Instead, all of the models develop unstable nonaxisymmetric modes on a dynamical time scale. We observe two distinct types of instabilities: the Papaloizou-Pringle and the so-called intermediate type instabilities. The development of the nonaxisymmetric mode with azimuthal number m=1 is accompanied by an outspiraling motion of the black hole, which significantly amplifies the growth rate of the m=1 mode in some cases. Overall, our simulations show that the properties of the unstable nonaxisymmetric modes in our disk models are qualitatively similar to those in the Newtonian theory.
Interactive graphic editing tools in bioluminescent imaging simulation
NASA Astrophysics Data System (ADS)
Li, Hui; Tian, Jie; Luo, Jie; Wang, Ge; Cong, Wenxiang
2005-04-01
It is a challenging task to accurately describe complicated biological tissues and bioluminescent sources in bioluminescent imaging simulation. Several graphic editing tools have been developed to efficiently model each part of the bioluminescent simulation environment and to interactively correct or improve the initial models of anatomical structures or bioluminescent sources. There are two major types of graphic editing tools: non-interactive tools and interactive tools. Geometric building blocks (i.e. regular geometric graphics and superquadrics) are applied as non-interactive tools. To a certain extent, complicated anatomical structures and bioluminescent sources can be approximately modeled by combining a sufficient large number of geometric building blocks with Boolean operators. However, those models are too simple to describe the local features and fine changes in 2D/3D irregular contours. Therefore, interactive graphic editing tools have been developed to facilitate the local modifications of any initial surface model. With initial models composed of geometric building blocks, interactive spline mode is applied to conveniently perform dragging and compressing operations on 2D/3D local surface of biological tissues and bioluminescent sources inside the region/volume of interest. Several applications of the interactive graphic editing tools will be presented in this article.
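A sketch of why superquadrics work as non-interactive building blocks, using the standard superquadric inside-outside function and a Boolean union test; the shape sizes, exponents, and the two-block "phantom" are illustrative assumptions rather than any model from the article.

```python
# Superquadric inside-outside test and a Boolean union of building blocks.
import numpy as np

def superquadric_f(p, a=(1.0, 1.0, 1.0), e1=1.0, e2=1.0):
    """Returns F(p): F < 1 inside the shape, F > 1 outside."""
    x, y, z = np.abs(np.asarray(p, dtype=float)) / np.asarray(a, dtype=float)
    return (x ** (2.0 / e2) + y ** (2.0 / e2)) ** (e2 / e1) + z ** (2.0 / e1)

def in_union(p, shapes):
    """Boolean union of building blocks: inside if inside any component."""
    return any(superquadric_f(p, **s) < 1.0 for s in shapes)

phantom = [dict(a=(2.0, 1.0, 1.0), e1=1.0, e2=1.0),   # ellipsoid-like block
           dict(a=(0.5, 0.5, 2.0), e1=0.2, e2=0.2)]   # box-like block

print(in_union((1.5, 0.0, 0.0), phantom))   # True: inside the first block
print(in_union((0.0, 0.0, 2.5), phantom))   # False: outside both blocks
```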
Process Model of A Fusion Fuel Recovery System for a Direct Drive IFE Power Reactor
NASA Astrophysics Data System (ADS)
Natta, Saswathi; Aristova, Maria; Gentile, Charles
2008-11-01
A task has been initiated to develop a detailed representative model for the fuel recovery system (FRS) in the prospective direct drive inertial fusion energy (IFE) reactor. As part of the conceptual design phase of the project, a chemical process model is developed in order to observe the interaction of system components. This process model is developed using FEMLAB Multiphysics software with the corresponding chemical engineering module (CEM). Initially, the reactants, system structure, and processes are defined using known chemical species of the target chamber exhaust. Each step within the fuel recovery system is modeled compartmentally and then merged to form the closed-loop fuel recovery system. The output, which includes physical properties and chemical content of the products, is analyzed after each step of the system to determine the most efficient and productive system parameters. This will serve to mitigate possible bottlenecks in the system. This modeling evaluation is instrumental in optimizing and closing the fusion fuel cycle in a direct drive IFE power reactor. The results of the modeling are presented in this paper.
A simplified model for the assessment of the impact probability of fragments.
Gubinelli, Gianfilippo; Zanelli, Severino; Cozzani, Valerio
2004-12-31
A model was developed for the assessment of fragment impact probability on a target vessel, following the collapse and fragmentation of a primary vessel due to internal pressure. The model provides the probability of impact of a fragment with defined shape, mass and initial velocity on a target of a known shape and at a given position with respect to the source point. The model is based on the ballistic analysis of the fragment trajectory and on the determination of impact probabilities by the analysis of initial direction of fragment flight. The model was validated using available literature data.
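A hedged Monte Carlo sketch of the core idea of estimating impact probability from the distribution of initial flight directions, using a drag-free ballistic trajectory and a circular target footprint; the sampling distribution, geometry, and parameter values are illustrative simplifications of the paper's ballistic analysis.

```python
import numpy as np

def impact_probability(v0, target_center=(100.0, 0.0), target_radius=5.0,
                       g=9.81, n_samples=200_000, seed=0):
    """Monte Carlo estimate of the probability that a drag-free fragment,
    launched from the origin with speed v0 and a randomly drawn initial
    direction, lands inside a circular target footprint on the ground.
    Elevation and azimuth are drawn uniformly in angle -- a simple
    illustrative choice for the initial-direction distribution."""
    rng = np.random.default_rng(seed)
    elevation = rng.uniform(0.0, np.pi / 2.0, n_samples)   # launch angle above ground
    azimuth = rng.uniform(0.0, 2.0 * np.pi, n_samples)
    flight_range = v0**2 * np.sin(2.0 * elevation) / g      # drag-free ballistic range
    x = flight_range * np.cos(azimuth)
    y = flight_range * np.sin(azimuth)
    cx, cy = target_center
    hits = (x - cx) ** 2 + (y - cy) ** 2 <= target_radius ** 2
    return hits.mean()

print(impact_probability(v0=50.0))  # small probability, depends on geometry
```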
Application of remote sensing for prediction and detection of thermal pollution, phase 2
NASA Technical Reports Server (NTRS)
Veziroglu, T. N.; Lee, S. S.
1975-01-01
The development of a predictive mathematical model for thermal pollution in connection with remote sensing measurements was continued. A rigid-lid model has been developed and its application to far-field study has been completed. The velocity and temperature fields have been computed for different atmospheric conditions and for different boundary currents produced by tidal effects. In connection with the theoretical work, six experimental studies of the two sites in question (Biscayne Bay site and Hutchinson Island site) have been carried out. The temperature fields obtained during the tests at the Biscayne Bay site have been compared with the predictions of the rigid-lid model and these results are encouraging. The rigid-lid model is also being applied to near-field study. Preliminary results for a simple case have been obtained and execution of more realistic cases has been initiated. The development of a free-surface model has also been initiated. The governing equations have been formulated and the computer programs have been written.
ADVANCED UTILITY SIMULATION MODEL, DESCRIPTION OF THE NATIONAL LOOP (VERSION 3.0)
The report is one of 11 in a series describing the initial development of the Advanced Utility Simulation Model (AUSM) by the Universities Research Group on Energy (URGE) and its continued development by the Science Applications International Corporation (SAIC) research team. The...
Making Progress Toward Graduation: Evidence from the Talent Development High School Model
ERIC Educational Resources Information Center
Kemple, James J.; Herlihy, Corinne M.; Smith, Thomas J.
2005-01-01
In low-performing public high schools in U.S. cities, high proportions of students drop out, students who stay in school typically do not succeed academically, and efforts to make substantial reforms often meet with little success. The Talent Development High School model is a comprehensive school reform initiative that has been developed to…
Long-Boyle, Janel R; Savic, Rada; Yan, Shirley; Bartelink, Imke; Musick, Lisa; French, Deborah; Law, Jason; Horn, Biljana; Cowan, Morton J; Dvorak, Christopher C
2015-04-01
Population pharmacokinetic (PK) studies of busulfan in children have shown that individualized model-based algorithms provide improved targeted busulfan therapy when compared with conventional dose guidelines. The adoption of population PK models into routine clinical practice has been hampered by the tendency of pharmacologists to develop complex models too impractical for clinicians to use. The authors aimed to develop a population PK model for busulfan in children that can reliably achieve therapeutic exposure (concentration at steady state) and implement a simple model-based tool for the initial dosing of busulfan in children undergoing hematopoietic cell transplantation. Model development was conducted using retrospective data available in 90 pediatric and young adult patients who had undergone hematopoietic cell transplantation with busulfan conditioning. Busulfan drug levels and potential covariates influencing drug exposure were analyzed using the nonlinear mixed-effects modeling software NONMEM. The final population PK model was implemented into a clinician-friendly Microsoft Excel-based tool and used to recommend initial doses of busulfan in a group of 21 pediatric patients prospectively dosed based on the population PK model. Modeling of busulfan time-concentration data indicates that busulfan clearance displays nonlinearity in children, decreasing by up to approximately 20% between concentrations of 250 and 2000 ng/mL. Important patient-specific covariates found to significantly impact busulfan clearance were actual body weight and age. The percentage of individuals achieving a therapeutic concentration at steady state was significantly higher in subjects receiving initial doses based on the population PK model (81%) than in historical controls dosed on conventional guidelines (52%) (P = 0.02). When compared with the conventional dosing guidelines, the model-based algorithm demonstrates significant improvement for providing targeted busulfan therapy in children and young adults.
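A hedged sketch of how a model-based tool can turn an estimated clearance into an initial dose via the standard relationship dose = CL × Css,target × τ; the covariate model and its coefficients below are placeholders, not the published population PK model or the authors' Excel tool.

```python
def busulfan_initial_dose(weight_kg, age_yr, css_target_ng_ml, tau_hr=6.0):
    """Illustrative model-based initial dose (mg) for an intermittent IV schedule.

    Clearance is computed from an allometric weight scaling with a mild age
    effect -- the coefficients here are placeholders, NOT the published
    population PK model -- and the dose per interval follows the standard
    relationship dose = CL * Css_target * tau (targeting the average
    steady-state concentration).
    """
    cl_l_per_hr = 3.0 * (weight_kg / 20.0) ** 0.75 * (1.0 - 0.005 * age_yr)  # placeholder covariate model
    css_mg_per_l = css_target_ng_ml / 1000.0          # ng/mL -> mg/L
    return cl_l_per_hr * css_mg_per_l * tau_hr

# Example: 20 kg, 5-year-old child targeting Css = 900 ng/mL on a q6h schedule
print(round(busulfan_initial_dose(20.0, 5.0, 900.0), 1))
```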
Traffic & safety statewide model and GIS modeling.
DOT National Transportation Integrated Search
2012-07-01
Several steps have been taken over the past two years to advance the Utah Department of Transportation (UDOT) safety initiative. Previous research projects began the development of a hierarchical Bayesian model to analyze crashes on Utah roadways. De...
Power Management and Distribution (PMAD) Model Development: Final Report
NASA Technical Reports Server (NTRS)
Metcalf, Kenneth J.
2011-01-01
Power management and distribution (PMAD) models were developed in the early 1990's to model candidate architectures for various Space Exploration Initiative (SEI) missions. They were used to generate "ballpark" component mass estimates to support conceptual PMAD system design studies. The initial set of models was provided to NASA Lewis Research Center (since renamed Glenn Research Center) in 1992. They were developed to estimate the characteristics of power conditioning components predicted to be available in the 2005 timeframe. Early 90's component and device designs and material technologies were projected forward to the 2005 timeframe, and algorithms reflecting those design and material improvements were incorporated into the models to generate mass, volume, and efficiency estimates for circa 2005 components. The models are about ten years old now and NASA GRC requested a review of them to determine if they should be updated to bring them into agreement with current performance projections or to incorporate unforeseen design or technology advances. This report documents the results of this review and the updated power conditioning models and new transmission line models generated to estimate post 2005 PMAD system masses and sizes. This effort continues the expansion and enhancement of a library of PMAD models developed to allow system designers to assess future power system architectures and distribution techniques quickly and consistently.
Optimal flight initiation distance.
Cooper, William E; Frederick, William G
2007-01-07
Decisions regarding flight initiation distance have received scant theoretical attention. A graphical model by Ydenberg and Dill (1986. The economics of fleeing from predators. Adv. Stud. Behav. 16, 229-249) that has guided research for the past 20 years specifies when escape begins. In the model, a prey detects a predator, monitors its approach until costs of escape and of remaining are equal, and then flees. The distance between predator and prey when escape is initiated (approach distance = flight initiation distance) occurs where decreasing cost of remaining and increasing cost of fleeing intersect. We argue that prey fleeing as predicted cannot maximize fitness because the best prey can do is break even during an encounter. We develop two optimality models, one applying when all expected future contribution to fitness (residual reproductive value) is lost if the prey dies, the other when any fitness gained (increase in expected RRV) during the encounter is retained after death. Both models predict optimal flight initiation distance from initial expected fitness, benefits obtainable during encounters, costs of escaping, and probability of being killed. Predictions match extensively verified predictions of Ydenberg and Dill's (1986) model. Our main conclusion is that optimality models are preferable to break-even models because they permit fitness maximization, offer many new testable predictions, and allow assessment of prey decisions in many naturally occurring situations through modification of benefit, escape cost, and risk functions.
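A small numerical sketch of the optimality argument: choose the flight initiation distance that maximizes expected fitness, computed as survival probability times (initial fitness plus benefits gained minus escape cost); the functional forms and parameter values are illustrative, not the authors'.

```python
import numpy as np

def expected_fitness(d, F0=1.0, benefit_rate=0.02, escape_cost=0.05, risk_scale=5.0):
    """Expected fitness if the prey flees when the predator is at distance d.

    Illustrative functional forms (not the paper's):
      - benefit gained grows the longer the prey delays fleeing (smaller d),
      - escape cost is a fixed energetic cost,
      - probability of being killed rises sharply as d shrinks.
    """
    benefit = benefit_rate * (50.0 - d)          # foraging gain from delaying escape
    p_survive = 1.0 - np.exp(-d / risk_scale)    # survival probability at distance d
    return p_survive * (F0 + benefit - escape_cost)

d_grid = np.linspace(0.1, 50.0, 500)
d_opt = d_grid[np.argmax(expected_fitness(d_grid))]
print(f"optimal flight initiation distance ~ {d_opt:.1f} (illustrative units)")
```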
My name is Caitlyn Barrett and I am the Scientific Program Manager for the Human Cancer Model Initiative (HCMI) in the Office of Cancer Genomics (OCG). In my role within the HCMI, I am helping to establish communication pathways and build the foundation for collaboration that will enable the completion of the Initiative’s aim to develop as many as 1000 next-generation cancer models, established from patient tumors and accompanied by clinical and molecular data.
Performance and Weight Estimates for an Advanced Open Rotor Engine
NASA Technical Reports Server (NTRS)
Hendricks, Eric S.; Tong, Michael T.
2012-01-01
NASA's Environmentally Responsible Aviation Project and Subsonic Fixed Wing Project are focused on developing concepts and technologies which may enable dramatic reductions to the environmental impact of future generation subsonic aircraft. The open rotor concept (also historically referred to as an unducted fan or advanced turboprop) may allow for the achievement of this objective by reducing engine fuel consumption. To evaluate the potential impact of open rotor engines, cycle modeling and engine weight estimation capabilities have been developed. The initial development of the cycle modeling capabilities in the Numerical Propulsion System Simulation (NPSS) tool was presented in a previous paper. Following that initial development, further advancements have been made to the cycle modeling and weight estimation capabilities for open rotor engines and are presented in this paper. The developed modeling capabilities are used to predict the performance of an advanced open rotor concept using modern counter-rotating propeller designs. Finally, performance and weight estimates for this engine are presented and compared to results from a previous NASA study of advanced geared and direct-drive turbofans.
The Global Tsunami Model (GTM)
NASA Astrophysics Data System (ADS)
Thio, H. K.; Løvholt, F.; Harbitz, C. B.; Polet, J.; Lorito, S.; Basili, R.; Volpe, M.; Romano, F.; Selva, J.; Piatanesi, A.; Davies, G.; Griffin, J.; Baptista, M. A.; Omira, R.; Babeyko, A. Y.; Power, W. L.; Salgado Gálvez, M.; Behrens, J.; Yalciner, A. C.; Kanoglu, U.; Pekcan, O.; Ross, S.; Parsons, T.; LeVeque, R. J.; Gonzalez, F. I.; Paris, R.; Shäfer, A.; Canals, M.; Fraser, S. A.; Wei, Y.; Weiss, R.; Zaniboni, F.; Papadopoulos, G. A.; Didenkulova, I.; Necmioglu, O.; Suppasri, A.; Lynett, P. J.; Mokhtari, M.; Sørensen, M.; von Hillebrandt-Andrade, C.; Aguirre Ayerbe, I.; Aniel-Quiroga, Í.; Guillas, S.; Macias, J.
2016-12-01
The large tsunami disasters of the last two decades have highlighted the need for a thorough understanding of the risk posed by relatively infrequent but disastrous tsunamis and the importance of a comprehensive and consistent methodology for quantifying the hazard. In the last few years, several methods for probabilistic tsunami hazard analysis have been developed and applied to different parts of the world. In an effort to coordinate and streamline these activities and make progress towards implementing the Sendai Framework of Disaster Risk Reduction (SFDRR) we have initiated a Global Tsunami Model (GTM) working group with the aim of i) enhancing our understanding of tsunami hazard and risk on a global scale and developing standards and guidelines for it, ii) providing a portfolio of validated tools for probabilistic tsunami hazard and risk assessment at a range of scales, and iii) developing a global tsunami hazard reference model. This GTM initiative has grown out of the tsunami component of the Global Assessment of Risk (GAR15), which has resulted in an initial global model of probabilistic tsunami hazard and risk. Started as an informal gathering of scientists interested in advancing tsunami hazard analysis, the GTM is currently in the process of being formalized through letters of interest from participating institutions. The initiative has now been endorsed by the United Nations International Strategy for Disaster Reduction (UNISDR) and the World Bank's Global Facility for Disaster Reduction and Recovery (GFDRR). We will provide an update on the state of the project and the overall technical framework, and discuss the technical issues that are currently being addressed, including earthquake source recurrence models, the use of aleatory variability and epistemic uncertainty, and preliminary results for a probabilistic global hazard assessment, which is an update of the model included in UNISDR GAR15.
The Global Tsunami Model (GTM)
NASA Astrophysics Data System (ADS)
Lorito, S.; Basili, R.; Harbitz, C. B.; Løvholt, F.; Polet, J.; Thio, H. K.
2017-12-01
The tsunamis that have occurred worldwide in the last two decades have highlighted the need for a thorough understanding of the risk posed by relatively infrequent but often disastrous tsunamis and the importance of a comprehensive and consistent methodology for quantifying the hazard. In the last few years, several methods for probabilistic tsunami hazard analysis have been developed and applied to different parts of the world. In an effort to coordinate and streamline these activities and make progress towards implementing the Sendai Framework of Disaster Risk Reduction (SFDRR) we have initiated a Global Tsunami Model (GTM) working group with the aim of i) enhancing our understanding of tsunami hazard and risk on a global scale and developing standards and guidelines for it, ii) providing a portfolio of validated tools for probabilistic tsunami hazard and risk assessment at a range of scales, and iii) developing a global tsunami hazard reference model. This GTM initiative has grown out of the tsunami component of the Global Assessment of Risk (GAR15), which has resulted in an initial global model of probabilistic tsunami hazard and risk. Started as an informal gathering of scientists interested in advancing tsunami hazard analysis, the GTM is currently in the process of being formalized through letters of interest from participating institutions. The initiative has now been endorsed by the United Nations International Strategy for Disaster Reduction (UNISDR) and the World Bank's Global Facility for Disaster Reduction and Recovery (GFDRR). We will provide an update on the state of the project and the overall technical framework, and discuss the technical issues that are currently being addressed, including earthquake source recurrence models, the use of aleatory variability and epistemic uncertainty, and preliminary results for a probabilistic global hazard assessment, which is an update of the model included in UNISDR GAR15.
The Global Tsunami Model (GTM)
NASA Astrophysics Data System (ADS)
Løvholt, Finn
2017-04-01
The large tsunami disasters of the last two decades have highlighted the need for a thorough understanding of the risk posed by relatively infrequent but disastrous tsunamis and the importance of a comprehensive and consistent methodology for quantifying the hazard. In the last few years, several methods for probabilistic tsunami hazard analysis have been developed and applied to different parts of the world. In an effort to coordinate and streamline these activities and make progress towards implementing the Sendai Framework of Disaster Risk Reduction (SFDRR) we have initiated a Global Tsunami Model (GTM) working group with the aim of i) enhancing our understanding of tsunami hazard and risk on a global scale and developing standards and guidelines for it, ii) providing a portfolio of validated tools for probabilistic tsunami hazard and risk assessment at a range of scales, and iii) developing a global tsunami hazard reference model. This GTM initiative has grown out of the tsunami component of the Global Assessment of Risk (GAR15), which has resulted in an initial global model of probabilistic tsunami hazard and risk. Started as an informal gathering of scientists interested in advancing tsunami hazard analysis, the GTM is currently in the process of being formalized through letters of interest from participating institutions. The initiative has now been endorsed by the United Nations International Strategy for Disaster Reduction (UNISDR) and the World Bank's Global Facility for Disaster Reduction and Recovery (GFDRR). We will provide an update on the state of the project and the overall technical framework, and discuss the technical issues that are currently being addressed, including earthquake source recurrence models, the use of aleatory variability and epistemic uncertainty, and preliminary results for a probabilistic global hazard assessment, which is an update of the model included in UNISDR GAR15.
NASA Technical Reports Server (NTRS)
Schonberg, William P.; Mohamed, Essam
1997-01-01
This report presents the results of a study whose objective was to develop first-principles-based models of hole size and maximum tip-to-tip crack length for a spacecraft module pressure wall that has been perforated in an orbital debris particle impact. The hole size and crack length models are developed by sequentially characterizing the phenomena comprising the orbital debris impact event, including the initial impact, the creation and motion of a debris cloud within the dual-wall system, the impact of the debris cloud on the pressure wall, the deformation of the pressure wall due to debris cloud impact loading prior to crack formation, pressure wall crack initiation, propagation, and arrest, and finally pressure wall deformation following crack initiation and growth. The model development has been accomplished through the application of elementary shock physics and thermodynamic theory, as well as the principles of mass, momentum, and energy conservation. The predictions of the model developed herein are compared against the predictions of empirically-based equations for hole diameters and maximum tip-to-tip crack length for three International Space Station wall configurations. The ISS wall systems considered are the baseline U.S. Lab Cylinder, the enhanced U.S. Lab Cylinder, and the U.S. Lab Endcone. The empirical predictor equations were derived from experimentally obtained hole diameters and crack length data. The original model predictions did not compare favorably with the experimental data, especially for cases in which pressure wall petalling did not occur. Several modifications were made to the original model to bring its predictions closer in line with the experimental results. Following the adjustment of several empirical constants, the predictions of the modified analytical model were in much closer agreement with the experimental results.
Modelling of the hole-initiated impact ionization current in the framework of hydrodynamic equations
NASA Astrophysics Data System (ADS)
Lorenzini, Martino; Van Houdt, Jan
2002-02-01
Several research papers have shown the feasibility of the hydrodynamic transport model to investigate impact ionization in semiconductor devices by means of mean-energy-dependent generation rates. However, the analysis has been usually carried out for the case of the electron-initiated impact ionization process and less attention has been paid to the modelling of the generation rate due to impact ionization events initiated by holes. This paper therefore presents an original model for the hole-initiated impact ionization in silicon and validates it by comparing simulation results with substrate currents taken from p-channel transistors manufactured in a 0.35 μm CMOS technology having three different channel lengths. The experimental data are successfully reproduced over a wide range of applied voltages using only one fitting parameter. Since the impact ionization of holes triggers the mechanism responsible for the back-bias enhanced gate current in deep submicron nMOS devices, the model can be exploited in the development of non-volatile memories programmed by secondary electron injection.
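A hedged sketch of a mean-energy-dependent generation rate of the generic exponential form often used in hydrodynamic simulators, with a single fitting parameter as in the paper; the functional form, threshold energy, and units below are illustrative and are not the paper's calibrated expression.

```python
import numpy as np

def hole_ii_generation_rate(p, w_mean_eV, A_fit=1.0e11, E_th_eV=1.7):
    """Illustrative mean-energy-dependent generation rate [1/(cm^3 s)] for
    hole-initiated impact ionization, of the generic hydrodynamic form
        G = p * A * exp(-E_th / w_mean),
    where p is the hole density [1/cm^3], w_mean the mean hole energy [eV],
    A a single fitting parameter [1/s] and E_th an effective threshold energy.
    This is a generic exponential form, not the paper's calibrated model.
    """
    return p * A_fit * np.exp(-E_th_eV / np.asarray(w_mean_eV))

# Generation rate for a hole density of 1e15 cm^-3 at mean energies of 0.5-2 eV
print(hole_ii_generation_rate(1e15, [0.5, 1.0, 2.0]))
```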
NASA Technical Reports Server (NTRS)
Halford, Gary R.
1993-01-01
The evolution of high-temperature, creep-fatigue, life-prediction methods used for cyclic crack initiation is traced from inception in the late 1940's. The methods reviewed are material models as opposed to structural life prediction models. Material life models are used by both structural durability analysts and by material scientists. The latter use micromechanistic models as guidance to improve a material's crack initiation resistance. Nearly one hundred approaches and their variations have been proposed to date. This proliferation poses a problem in deciding which method is most appropriate for a given application. Approaches were identified as being combinations of thirteen different classifications. This review is intended to aid both developers and users of high-temperature fatigue life prediction methods by providing a background from which choices can be made. The need for high-temperature, fatigue-life prediction methods followed immediately on the heels of the development of large, costly, high-technology industrial and aerospace equipment immediately following the second world war. Major advances were made in the design and manufacture of high-temperature, high-pressure boilers and steam turbines, nuclear reactors, high-temperature forming dies, high-performance poppet valves, aeronautical gas turbine engines, reusable rocket engines, etc. These advances could no longer be accomplished simply by trial and error using the 'build-em and bust-em' approach. Development lead times were too great and costs too prohibitive to retain such an approach. Analytic assessments of anticipated performance, cost, and durability were introduced to cut costs and shorten lead times. The analytic tools were quite primitive at first and out of necessity evolved in parallel with hardware development. After forty years more descriptive, more accurate, and more efficient analytic tools are being developed. These include thermal-structural finite element and boundary element analyses, advanced constitutive stress-strain-temperature-time relations, and creep-fatigue-environmental models for crack initiation and propagation. The high-temperature durability methods that have evolved for calculating high-temperature fatigue crack initiation lives of structural engineering materials are addressed. Only a few of the methods were refined to the point of being directly useable in design. Recently, two of the methods were transcribed into computer software for use with personal computers.
NASA Astrophysics Data System (ADS)
Halford, Gary R.
1993-10-01
The evolution of high-temperature, creep-fatigue, life-prediction methods used for cyclic crack initiation is traced from inception in the late 1940's. The methods reviewed are material models as opposed to structural life prediction models. Material life models are used by both structural durability analysts and by material scientists. The latter use micromechanistic models as guidance to improve a material's crack initiation resistance. Nearly one hundred approaches and their variations have been proposed to date. This proliferation poses a problem in deciding which method is most appropriate for a given application. Approaches were identified as being combinations of thirteen different classifications. This review is intended to aid both developers and users of high-temperature fatigue life prediction methods by providing a background from which choices can be made. The need for high-temperature, fatigue-life prediction methods followed immediately on the heels of the development of large, costly, high-technology industrial and aerospace equipment immediately following the second world war. Major advances were made in the design and manufacture of high-temperature, high-pressure boilers and steam turbines, nuclear reactors, high-temperature forming dies, high-performance poppet valves, aeronautical gas turbine engines, reusable rocket engines, etc. These advances could no longer be accomplished simply by trial and error using the 'build-em and bust-em' approach. Development lead times were too great and costs too prohibitive to retain such an approach. Analytic assessments of anticipated performance, cost, and durability were introduced to cut costs and shorten lead times. The analytic tools were quite primitive at first and out of necessity evolved in parallel with hardware development. After forty years more descriptive, more accurate, and more efficient analytic tools are being developed. These include thermal-structural finite element and boundary element analyses, advanced constitutive stress-strain-temperature-time relations, and creep-fatigue-environmental models for crack initiation and propagation. The high-temperature durability methods that have evolved for calculating high-temperature fatigue crack initiation lives of structural engineering materials are addressed. Only a few of the methods were refined to the point of being directly useable in design.
Life modeling of thermal barrier coatings for aircraft gas turbine engines
NASA Technical Reports Server (NTRS)
Miller, Robert A.
1988-01-01
Thermal barrier coating life models developed under the NASA Lewis Research Center's Hot Section Technology (HOST) program are summarized. An initial laboratory model and three design-capable models are discussed. Current understanding of coating failure mechanisms are also summarized.
RF models for plasma-surface interactions
NASA Astrophysics Data System (ADS)
Jenkins, Thomas; Smithe, David; Lin, Ming-Chieh; Kruger, Scott; Stoltz, Peter
2013-09-01
Computational models for DC and oscillatory (RF-driven) sheath potentials, arising at metal or dielectric-coated surfaces in contact with plasma, are developed within the VSim code and applied in parameter regimes characteristic of fusion plasma experiments and plasma processing scenarios. Results from initial studies quantifying the effects of various dielectric wall coating materials and thicknesses on these sheath potentials, as well as on the ensuing flux of plasma particles to the wall, are presented. In addition, the developed models are used to model plasma-facing ICRF antenna structures in the ITER device; we present initial assessments of the efficacy of dielectric-coated antenna surfaces in reducing sputtering-induced high-Z impurity contamination of the fusion reaction. Funded by U.S. DoE via a Phase I SBIR grant, award DE-SC0009501.
Initiation-promotion model of tumor prevalence in mice from space radiation exposures
NASA Technical Reports Server (NTRS)
Cucinotta, F. A.; Wilson, J. W.
1995-01-01
Exposures in space consist of low-level background components from galactic cosmic rays (GCR), occasional intense-energetic solar-particle events, periodic passes through geomagnetic-trapped radiation, and exposure from possible onboard nuclear-propulsion engines. Risk models for astronaut exposure from such diverse components and modalities must be developed to assure adequate protection in future NASA missions. The low-level background exposures (GCR), including relativistic heavy ions (HZE), will be the ultimate limiting factor for astronaut career exposure. We consider herein a two-mutation, initiation-promotion, radiation-carcinogenesis model in mice in which the initiation stage is represented by a linear kinetics model of cellular repair/misrepair, including the track-structure model for heavy ion action cross-sections. The model is validated by comparison with the harderian gland tumor experiments of Alpen et al. for various ion beams. We apply the initiation-promotion model to exposures from galactic cosmic rays, using models of the cosmic-ray environment and heavy ion transport, and consider the effects of the age of the mice prior to and after the exposure and of the length of time in space on predictions of relative risk. Our results indicate that biophysical models of age-dependent radiation hazard will provide a better understanding of GCR risk than models that rely strictly on estimates of the initial slopes of these radiations.
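A hedged sketch of a two-mutation (initiation-promotion) kinetics model integrated with simple Euler steps: radiation-driven initiation, clonal promotion, and a second radiation-driven transformation; the rate constants and dose-rate function are placeholders, not values fitted to the harderian gland data.

```python
def two_mutation_model(dose_rate, t_end=100.0, dt=0.01,
                       mu1=1e-7, mu2=1e-7, growth=0.05, N0=1e7):
    """Illustrative two-mutation (initiation-promotion) kinetics:
      N -> I at rate mu1 * dose_rate(t) * N   (radiation-driven initiation)
      I grows clonally at net rate 'growth'   (promotion)
      I -> M at rate mu2 * dose_rate(t) * I   (second, malignant transformation)
    Returns the expected number of malignant cells M at t_end.
    All rate constants are placeholders, not fitted to the harderian gland data.
    """
    N, I, M = N0, 0.0, 0.0
    for step in range(int(t_end / dt)):
        D = dose_rate(step * dt)
        dN = -mu1 * D * N
        dI = mu1 * D * N + growth * I - mu2 * D * I
        dM = mu2 * D * I
        N, I, M = N + dN * dt, I + dI * dt, M + dM * dt
    return M

# Constant low dose-rate exposure (e.g., GCR-like background), arbitrary units
print(two_mutation_model(dose_rate=lambda t: 1.0))
```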
Laser diode initiated detonators for space applications
NASA Technical Reports Server (NTRS)
Ewick, David W.; Graham, J. A.; Hawley, J. D.
1993-01-01
Ensign Bickford Aerospace Company (EBAC) has over ten years of experience in the design and development of laser ordnance systems. Recent efforts have focused on the development of laser diode ordnance systems for space applications. Because the laser initiated detonators contain only insensitive secondary explosives, a high degree of system safety is achieved. Typical performance characteristics of a laser diode initiated detonator are described in this paper, including all-fire level, function time, and output. A finite difference model used at EBAC to predict detonator performance is described, and calculated results are compared to experimental data. Finally, the use of statistically designed experiments to evaluate performance of laser initiated detonators is discussed.
NASA Astrophysics Data System (ADS)
Pei, Jin-Song; Mai, Eric C.
2007-04-01
This paper introduces a continuing effort towards the development of a heuristic initialization methodology for constructing multilayer feedforward neural networks to model nonlinear functions. In this and previous studies that this work is built upon, including the one presented at SPIE 2006, the authors do not presume to provide a universal method to approximate arbitrary functions; rather, the focus is on the development of a rational and unambiguous initialization procedure that applies to the approximation of nonlinear functions in the specific domain of engineering mechanics. The applications of this exploratory work can be numerous, including those associated with potential correlation and interpretation of the inner workings of neural networks, such as damage detection. The goal of this study is fulfilled by utilizing the governing physics and mathematics of nonlinear functions and the strength of the sigmoidal basis function. A step-by-step graphical procedure utilizing a few neural network prototypes as "templates" to approximate commonly seen memoryless nonlinear functions of one or two variables is further developed in this study. Decomposition of complex nonlinear functions into a summation of some simpler nonlinear functions is utilized to exploit this prototype-based initialization methodology. Training examples are presented to demonstrate the rationality and efficiency of the proposed methodology when compared with the popular Nguyen-Widrow initialization algorithm. Future work is also identified.
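For reference, a sketch of the Nguyen-Widrow initialization that the paper uses as its comparison baseline, in its commonly cited single-hidden-layer form (inputs assumed scaled to [-1, 1]); the prototype-based procedure proposed by the authors is not reproduced here.

```python
import numpy as np

def nguyen_widrow_init(n_inputs, n_hidden, rng=None):
    """Standard Nguyen-Widrow initialization for one hidden layer of sigmoidal
    units, assuming inputs scaled to [-1, 1]. Returns (weights, biases) with
    weights of shape (n_hidden, n_inputs) and biases of shape (n_hidden,)."""
    rng = np.random.default_rng(rng)
    beta = 0.7 * n_hidden ** (1.0 / n_inputs)              # scale factor
    W = rng.uniform(-1.0, 1.0, size=(n_hidden, n_inputs))
    W *= beta / np.linalg.norm(W, axis=1, keepdims=True)    # rescale each unit's weight vector
    b = rng.uniform(-beta, beta, size=n_hidden)
    return W, b

W, b = nguyen_widrow_init(n_inputs=2, n_hidden=8, rng=0)
print(W.shape, b.shape)
```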
DOE Office of Scientific and Technical Information (OSTI.GOV)
Riyadi, Eko H., E-mail: e.riyadi@bapeten.go.id
2014-09-30
An initiating event is defined as any event, either internal or external to the nuclear power plant (NPP), that perturbs the steady-state operation of the plant, if operating, thereby initiating an abnormal event such as a transient or a loss of coolant accident (LOCA) within the NPP. These initiating events trigger sequences of events that challenge plant control and safety systems, whose failure could potentially lead to core damage or large early release. Selection of initiating events consists of two steps: first, definition of possible events, for example by performing a comprehensive engineering evaluation and by constructing a top-level logic model; second, grouping of the identified initiating events by the safety function to be performed or by combinations of system responses. The purpose of this paper is therefore to discuss the identification of initiating events in the event tree development process and to review other probabilistic safety assessments (PSAs). The identification of initiating events also involves past operating experience, review of other PSAs, failure mode and effect analysis (FMEA), feedback from system modeling, and master logic diagrams (a special type of fault tree). By studying the traditional US PSA categorization in detail, the important initiating events can be identified and categorized into LOCAs, transients, and external events.
Kim, Ji-Hoon; Kang, Wee-Soo; Yun, Sung-Chul
2014-06-01
A population model of bacterial spot caused by Xanthomonas campestris pv. vesicatoria on hot pepper was developed to predict the primary disease infection date. The model estimated the pathogen population on the surface and within the leaf of the host based on the wetness period and temperature. For successful infection, at least 5,000 cells/ml of the bacterial population were required. Also, wind and rain were necessary according to regression analyses of the monitored data. In the model, bacterial spot is initiated when the pathogen population exceeds 10^15 cells/g within the leaf. The developed model was validated using 94 assessed samples from 2000 to 2007 obtained from monitored fields. Based on the validation study, the predicted initial infection dates varied based on the year rather than the location. Differences in initial infection dates between the model predictions and the monitored data in the field were minimal. For example, predicted infection dates for 7 locations were within the same month as the actual infection dates, 11 locations were within 1 month of the actual infection, and only 3 locations were more than 2 months apart from the actual infection. The predicted infection dates were mapped from 2009 to 2012; 2011 was the most severe year. Although the model was not sensitive enough to predict disease severity of less than 0.1% in the field, our model predicted bacterial spot severity of 1% or more. Therefore, this model can be applied in the field to determine when bacterial spot control is required.
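A hedged sketch of the threshold logic described above: the within-leaf population grows only on days with sufficient wetness, suitable temperature, and a wind-plus-rain event, starting from the minimum surface inoculum, and the primary infection date is the first day the population exceeds 10^15 cells/g; the growth-rate form and constants are illustrative, not the fitted model.

```python
def predict_initial_infection(weather, surface_inoculum=5e3,
                              threshold=1e15, r_max=0.8, t_opt=27.0):
    """Illustrative threshold model for the primary infection date.

    'weather' is a list of daily records: (day, temp_C, wet_hours, wind_rain).
    The within-leaf population grows only on days with sufficient wetness and
    a wind-plus-rain event, at a temperature-dependent rate (parabolic response
    around t_opt). Returns the first day the population exceeds 'threshold',
    or None. Growth-rate form and constants are placeholders, not the fitted model.
    """
    pop = surface_inoculum          # cells/g, seeded from the surface inoculum
    for day, temp, wet_hours, wind_rain in weather:
        if wind_rain and wet_hours >= 6:
            rate = max(0.0, r_max * (1.0 - ((temp - t_opt) / 10.0) ** 2))
            pop *= 10 ** (rate * wet_hours / 24.0)   # log-scale daily growth
        if pop >= threshold:
            return day
    return None

# A synthetic 120-day season with a wind-and-rain event every third day
season = [(d, 26.0, 10, d % 3 == 0) for d in range(1, 121)]
print(predict_initial_infection(season))
```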
Kim, Ji-Hoon; Kang, Wee-Soo; Yun, Sung-Chul
2014-01-01
A population model of bacterial spot caused by Xanthomonas campestris pv. vesicatoria on hot pepper was developed to predict the primary disease infection date. The model estimated the pathogen population on the surface and within the leaf of the host based on the wetness period and temperature. For successful infection, at least 5,000 cells/ml of the bacterial population were required. Also, wind and rain were necessary according to regression analyses of the monitored data. In the model, bacterial spot is initiated when the pathogen population exceeds 10^15 cells/g within the leaf. The developed model was validated using 94 assessed samples from 2000 to 2007 obtained from monitored fields. Based on the validation study, the predicted initial infection dates varied based on the year rather than the location. Differences in initial infection dates between the model predictions and the monitored data in the field were minimal. For example, predicted infection dates for 7 locations were within the same month as the actual infection dates, 11 locations were within 1 month of the actual infection, and only 3 locations were more than 2 months apart from the actual infection. The predicted infection dates were mapped from 2009 to 2012; 2011 was the most severe year. Although the model was not sensitive enough to predict disease severity of less than 0.1% in the field, our model predicted bacterial spot severity of 1% or more. Therefore, this model can be applied in the field to determine when bacterial spot control is required. PMID:25288995
Merritt, E. C.; Doss, F. W.; Loomis, E. N.; ...
2015-06-24
Counter-propagating shear experiments conducted at the OMEGA Laser Facility have been evaluating the effect of target initial conditions, specifically the characteristics of a tracer foil located at the shear boundary, on Kelvin-Helmholtz instability evolution and experiment transition toward nonlinearity and turbulence in the high-energy-density (HED) regime. Experiments are focused on both identifying and uncoupling the dependence of the model initial turbulent length scale in variable-density turbulence models of k-ϵ type on competing physical instability seed lengths as well as developing a path toward fully developed turbulent HED experiments. We present results from a series of experiments controllably and independently varying two initial types of scale lengths in the experiment: the thickness and surface roughness (surface perturbation scale spectrum) of a tracer layer at the shear interface. We show that decreasing the layer thickness and increasing the surface roughness both have the ability to increase the relative mixing in the system, and thus theoretically decrease the time required to begin transitioning to turbulence in the system. In addition, we also show that we can connect a change in observed mix width growth due to increased foil surface roughness to an analytically predicted change in model initial turbulent scale lengths.
Bruner-Tran, Kaylon L.; Mokshagundam, Shilpa; Herington, Jennifer L.; Ding, Tianbing; Osteen, Kevin G.
2018-01-01
Background: Although it has been more than a century since endometriosis was initially described in the literature, understanding the etiology and natural history of the disease has been challenging. However, the broad utility of murine and rat models of experimental endometriosis has enabled the elucidation of a number of potentially targetable processes which may otherwise promote this disease. Objective: To review a variety of studies utilizing rodent models of endometriosis to illustrate their utility in examining mechanisms associated with development and progression of this disease. Results: Use of rodent models of endometriosis has provided a much broader understanding of the risk factors for the initial development of endometriosis, the cellular pathology of the disease and the identification of potential therapeutic targets. Conclusion: Although there are limitations with any animal model, the variety of experimental endometriosis models that have been developed has enabled investigation into numerous aspects of this disease. Thanks to these models, our under-standing of the early processes of disease development, the role of steroid responsiveness, inflammatory processes and the peritoneal environment has been advanced. More recent models have begun to shed light on how epigenetic alterations con-tribute to the molecular basis of this disease as well as the multiple comorbidities which plague many patients. Continued de-velopments of animal models which aid in unraveling the mechanisms of endometriosis development provide the best oppor-tunity to identify therapeutic strategies to prevent or regress this enigmatic disease.
24 CFR 3285.3 - Alterations during initial installation.
Code of Federal Regulations, 2010 CFR
2010-04-01
... HOUSING AND URBAN DEVELOPMENT MODEL MANUFACTURED HOME INSTALLATION STANDARDS General § 3285.3 Alterations... Model Installation Standards, the MHCSS (24 CFR part 3280) and the Manufactured Home Procedural and...
24 CFR 3285.3 - Alterations during initial installation.
Code of Federal Regulations, 2011 CFR
2011-04-01
... HOUSING AND URBAN DEVELOPMENT MODEL MANUFACTURED HOME INSTALLATION STANDARDS General § 3285.3 Alterations... Model Installation Standards, the MHCSS (24 CFR part 3280) and the Manufactured Home Procedural and...
24 CFR 3285.3 - Alterations during initial installation.
Code of Federal Regulations, 2012 CFR
2012-04-01
... HOUSING AND URBAN DEVELOPMENT MODEL MANUFACTURED HOME INSTALLATION STANDARDS General § 3285.3 Alterations... Model Installation Standards, the MHCSS (24 CFR part 3280) and the Manufactured Home Procedural and...
Model development and applications at the USDA-ARS National Soil Erosion Research Laboratory
USDA-ARS?s Scientific Manuscript database
The United States Department of Agriculture (USDA) has a long history of development of soil erosion prediction technology, initially with empirical equations like the Universal Soil Loss Equation (USLE), and more recently with process-based models such as the Water Erosion Prediction Project (WEPP)...
Net present value approaches for drug discovery.
Svennebring, Andreas M; Wikberg, Jarl Es
2013-12-01
Three dedicated approaches to the calculation of the risk-adjusted net present value (rNPV) in drug discovery projects under different assumptions are suggested. The probability of finding a candidate drug suitable for clinical development and the time to the initiation of clinical development are assumed to be flexible, in contrast to previously used models. The rNPV of the post-discovery cash flows is calculated as the probability-weighted average of the rNPV at each potential time of initiation of clinical development. Practical considerations on how to set probability rates, in particular at the initiation and termination of a project, are discussed.
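A minimal sketch of the probability-weighted-average idea: the rNPV of the post-discovery cash flows is the sum over possible clinical-development start times of the start-time probability times the discounted program value, scaled by the probability that discovery succeeds at all; the numbers and structure below are illustrative, not the paper's parameterization.

```python
def rnpv_flexible_start(npv_at_start, start_time_probs, p_success_discovery,
                        discount_rate=0.12):
    """Risk-adjusted NPV (at time 0) of post-discovery cash flows when the time
    of clinical-development initiation is uncertain.

    npv_at_start        -- risk-adjusted NPV of the clinical program valued at
                           its start date (already weighted for downstream attrition)
    start_time_probs    -- {start_year: probability of initiating in that year},
                           conditional on discovery succeeding
    p_success_discovery -- probability that discovery yields a candidate drug at all
    Values and structure are illustrative, not the paper's parameterization.
    """
    rnpv = sum(p_t * npv_at_start / (1.0 + discount_rate) ** t
               for t, p_t in start_time_probs.items())
    return p_success_discovery * rnpv

# Candidate may reach the clinic in year 3, 4 or 5 with the given probabilities
print(round(rnpv_flexible_start(50.0, {3: 0.2, 4: 0.5, 5: 0.3}, 0.35), 2))
```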
Cost Model Comparison: A Study of Internally and Commercially Developed Cost Models in Use by NASA
NASA Technical Reports Server (NTRS)
Gupta, Garima
2011-01-01
NASA makes use of numerous cost models to accurately estimate the cost of various components of a mission - hardware, software, mission/ground operations - during the different stages of a mission's lifecycle. The purpose of this project was to survey these models and determine in which respects they are similar and in which they are different. The initial survey included a study of the cost drivers for each model; the form of each model (linear, exponential, or other CER; range or point output; capability for risk/sensitivity analysis); and the types of missions and phases of a mission lifecycle for which each model is capable of estimating cost. The models taken into consideration consisted of both those that were developed by NASA and those that were commercially developed: GSECT, NAFCOM, SCAT, QuickCost, PRICE, and SEER. Once the initial survey was completed, the next step in the project was to compare the cost models' capabilities in terms of Work Breakdown Structure (WBS) elements. This final comparison was then portrayed in a visual manner with Venn diagrams. All of the materials produced in the process of this study were then posted on the Ground Segment Team (GST) Wiki.
Bundle Payment Program Initiative: Roles of a Nurse Navigator and Home Health Professionals.
Peiritsch, Heather
2017-06-01
With the passage of the Affordable Care Act, the Centers for Medicare and Medicaid Services (CMS) introduced a new value-based payment model, the Bundle Payment Care Initiative. The CMS Innovation Center authorized hospitals to participate in a pilot to test innovative payment and service delivery models that have the potential to reduce Medicare expenditures while maintaining or improving the quality of care for beneficiaries. A hospital-based home care agency, the Abington Jefferson Health Home Care Department, led the initiative for the development and implementation of the Bundled Payment Program. This was a creative and innovative method to improve care along the continuum while testing a value-based care model.
NASA Technical Reports Server (NTRS)
Loyselle, Patricia; Prokopius, Kevin
2011-01-01
Proton Exchange Membrane (PEM) fuel cell technology is the leading candidate to replace the alkaline fuel cell technology, currently used on the Shuttle, for future space missions. During a 5-yr development program, a PEM fuel cell powerplant was developed. This report details the initial performance evaluation test results of the powerplant.
ERIC Educational Resources Information Center
Herlihy, Corinne M.; Kemple, James J.
2004-01-01
The Talent Development Middle School model was created to make a difference in struggling urban middle schools. The model is part of a trend in school improvement strategies whereby whole-school reform projects aim to improve performance and attendance outcomes for students through the use of major changes in both the organizational structure and…
ERIC Educational Resources Information Center
Perla, Rocco J.; Carifio, James
2011-01-01
Background: Extending Merton's (1936) work on the consequences of purposive social action, the model, theory and taxonomy outlined here incorporates and formalizes both anticipated and unanticipated research findings in a unified theoretical framework. The model of anticipated research findings was developed initially by Carifio (1975, 1977) and…
24 CFR 3285.1 - Administration.
Code of Federal Regulations, 2012 CFR
2012-04-01
... DEVELOPMENT MODEL MANUFACTURED HOME INSTALLATION STANDARDS General § 3285.1 Administration. (a) Scope. These Model Installation Standards provide minimum requirements for the initial installation of new... performing a specific operation or assembly, will be deemed to comply with these Model Installation Standards...
24 CFR 3285.1 - Administration.
Code of Federal Regulations, 2011 CFR
2011-04-01
... DEVELOPMENT MODEL MANUFACTURED HOME INSTALLATION STANDARDS General § 3285.1 Administration. (a) Scope. These Model Installation Standards provide minimum requirements for the initial installation of new... performing a specific operation or assembly, will be deemed to comply with these Model Installation Standards...
Predicting Ice Sheet and Climate Evolution at Extreme Scales
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heimbach, Patrick
2016-02-06
A main research objective of PISCEES is the development of formal methods for quantifying uncertainties in ice sheet modeling. Uncertainties in simulating and projecting mass loss from the polar ice sheets arise primarily from initial conditions, surface and basal boundary conditions, and model parameters. In general terms, two main chains of uncertainty propagation may be identified: 1. inverse propagation of observation and/or prior uncertainties onto posterior control variable uncertainties; 2. forward propagation of prior or posterior control variable uncertainties onto those of target output quantities of interest (e.g., climate indices or ice sheet mass loss). A related goal is the development of computationally efficient methods for producing initial conditions for an ice sheet that are close to available present-day observations and essentially free of artificial model drift, which is required in order to be useful for model projections (the “initialization problem”). To be of maximum value, such optimal initial states should be accompanied by “useful” uncertainty estimates that account for the different sources of uncertainty, as well as the degree to which the optimum state is constrained by available observations. The PISCEES proposal outlined two approaches for quantifying uncertainties. The first targets the full exploration of the uncertainty in model projections with sampling-based methods and a workflow managed by DAKOTA (the main delivery vehicle for software developed under QUEST). This is feasible for low-dimensional problems, e.g., those with a handful of global parameters to be inferred. This approach can benefit from derivative/adjoint information, but it is not necessary, which is why it is often referred to as “non-intrusive”. The second approach makes heavy use of derivative information from model adjoints to address quantifying uncertainty in high dimensions (e.g., basal boundary conditions in ice sheet models). The use of local gradient or Hessian information (i.e., second derivatives of the cost function) requires additional code development and implementation, and is thus often referred to as an “intrusive” approach. Within PISCEES, MIT has been tasked to develop methods for derivative-based UQ, the ”intrusive” approach discussed above. These methods rely on the availability of first (adjoint) and second (Hessian) derivative code, developed through intrusive methods such as algorithmic differentiation (AD). While representing a significant burden in terms of code development, derivative-based UQ is able to cope with very high-dimensional uncertainty spaces. That is, unlike sampling methods (all variations of Monte Carlo), the computational burden is independent of the dimension of the uncertainty space. This is a significant advantage for spatially distributed uncertainty fields, such as three-dimensional initial conditions, three-dimensional parameter fields, or two-dimensional surface and basal boundary conditions. Importantly, uncertainty fields for ice sheet models generally fall into this category.
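A small sketch of the derivative-based ("intrusive") step described above: approximate the posterior covariance of the controls by the inverse Hessian of the cost function at the optimum and propagate it onto a scalar quantity of interest through its gradient; the toy 3-control example stands in for the high-dimensional fields handled by adjoint/Hessian code.

```python
import numpy as np

def posterior_qoi_variance(cost_hessian, qoi_gradient):
    """Low-order approximation used in derivative-based UQ: posterior covariance
    of the controls ~ inverse Hessian of the cost function at the optimum, and
    variance of a scalar QoI q(m) ~ g^T H^{-1} g with g = dq/dm."""
    H_inv = np.linalg.inv(cost_hessian)
    g = np.asarray(qoi_gradient)
    return float(g @ H_inv @ g)

# Toy 3-control example; in practice H and g come from adjoint/Hessian code
# acting on very high-dimensional basal or initial-condition fields.
H = np.array([[4.0, 0.5, 0.0],
              [0.5, 3.0, 0.2],
              [0.0, 0.2, 2.0]])
g = np.array([1.0, -0.5, 0.25])
print(posterior_qoi_variance(H, g))   # variance of, e.g., projected mass loss
```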
Modeling the effect of initial and free-stream conditions on circular wakes
NASA Astrophysics Data System (ADS)
Lewalle, Jacques
A cascade-transport model is applied to study the effect of initial and free-stream conditions on circular wakes. The role of the very-large-eddies (VLEs) is shown and used to derive a new understanding of wakes and their lack of universality. Computational results are reported which show that the VLEs are a determining factor in the development of self-preserving solutions for the axisymmetric wake.
Predicting the future prevalence of cigarette smoking in Italy over the next three decades.
Carreras, Giulia; Gorini, Giuseppe; Gallus, Silvano; Iannucci, Laura; Levy, David T
2012-10-01
Smoking prevalence in Italy decreased by 37% from 1980 to now. This is due to changes in smoking initiation and cessation rates and is in part attributable to the development of tobacco control policies. This work aims to estimate the age- and sex-specific smoking initiation and cessation probabilities for different time periods and to predict the future smoking prevalence in Italy, assuming different scenarios. A dynamic model describing the evolution of current, former and never smokers was developed. Cessation and relapse rates were estimated by fitting the model with smoking prevalence in Italy, 1986-2009. The estimated parameters were used to predict prevalence, according to scenarios: (1) 2000-09 initiation/cessation; (2) half initiation; (3) double cessation; (4) Scenarios 2+3; (5) triple cessation; and (6) Scenarios 2+5. Maintaining the 2000-09 initiation/cessation, the 10% goal will not be achieved within next three decades: prevalence will stabilize at 12.1% for women and 20.3% for men. The goal could be rapidly achieved for women by halving initiation and tripling cessation (9.9%, 2016), or tripling cessation only (10.4%, 2017); for men halving initiation and tripling cessation (10.8%, 2024), or doubling cessation and halving initiation (10.5%, 2033), or tripling cessation only (10.8%, 2033). The 10% goal will be achieved within the next few decades, mainly by increasing smoking cessation. Policies to reach this goal would include increasing cigarette taxes, introducing total reimbursement of smoking cessation treatment, with a further development of quitlines and smoking cessation services. These measures are not yet fully implemented in Italy.
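A hedged sketch of a never/current/former smoker compartment model stepped forward annually under scenario-specific initiation, cessation, and relapse rates; the rates, starting populations, and the omission of births, deaths, and age structure are illustrative simplifications of the paper's dynamic model.

```python
def project_prevalence(never, current, former, years=30,
                       init_rate=0.02, cess_rate=0.04, relapse_rate=0.01):
    """Step a never/current/former smoker compartment model forward in time.

    Each year a fraction of never smokers initiates, a fraction of current
    smokers quits, and a fraction of former smokers relapses. Births, deaths
    and age structure are ignored; all rates are illustrative, not the fitted
    Italian values. Returns current-smoker prevalence after 'years' years.
    """
    for _ in range(years):
        new_smokers = init_rate * never
        quitters = cess_rate * current
        relapsers = relapse_rate * former
        never -= new_smokers
        current += new_smokers - quitters + relapsers
        former += quitters - relapsers
    return current / (never + current + former)

baseline = project_prevalence(55.0, 23.0, 22.0)                     # status-quo scenario
triple_cess = project_prevalence(55.0, 23.0, 22.0, cess_rate=0.12)  # tripled cessation
print(f"baseline: {baseline:.1%}, tripled cessation: {triple_cess:.1%}")
```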
Adam, Jennifer C.; Stephens, Jennie C.; Chung, Serena H.; ...
2014-04-24
Uncertainties in global change impacts and the complexities associated with the interconnected cycling of nitrogen, carbon, and water present daunting management challenges. Existing models provide detailed information on specific sub-systems (e.g., land, air, water, and economics). An increasing awareness of the unintended consequences of management decisions resulting from the interconnectedness of these sub-systems, however, necessitates coupled regional earth system models (EaSMs). Decision makers’ needs and priorities can be integrated into the model design and development processes to enhance decision-making relevance and “usability” of EaSMs. BioEarth is a research initiative currently under development with a focus on the U.S. Pacific Northwest region that explores the coupling of multiple stand-alone EaSMs to generate usable information for resource decision-making. Direct engagement between model developers and non-academic stakeholders involved in resource and environmental management decisions throughout the model development process is a critical component of this effort. BioEarth utilizes a bottom-up approach for its land surface model that preserves fine spatial-scale sensitivities and lateral hydrologic connectivity, which makes it unique among many regional EaSMs. Here, we describe the BioEarth initiative and highlight opportunities and challenges associated with coupling multiple stand-alone models to generate usable information for agricultural and natural resource decision-making.
Parsec-Scale Obscuring Accretion Disk with Large-Scale Magnetic Field in AGNs
NASA Technical Reports Server (NTRS)
Dorodnitsyn, A.; Kallman, T.
2017-01-01
A magnetic field dragged from the galactic disk, along with inflowing gas, can provide vertical support to the geometrically and optically thick parsec (pc)-scale torus in AGNs (Active Galactic Nuclei). Using the Soloviev solution initially developed for Tokamaks, we derive an analytical model for a rotating torus that is supported and confined by a magnetic field. We further perform three-dimensional magneto-hydrodynamic simulations of X-ray irradiated, pc-scale, magnetized tori. We follow the time evolution and compare models that adopt initial conditions derived from our analytic model with simulations in which the initial magnetic flux is entirely contained within the gas torus. Numerical simulations demonstrate that the initial conditions based on the analytic solution produce a longer-lived torus that produces obscuration that is generally consistent with observed constraints.
Parsec-scale Obscuring Accretion Disk with Large-scale Magnetic Field in AGNs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dorodnitsyn, A.; Kallman, T.
A magnetic field dragged from the galactic disk, along with inflowing gas, can provide vertical support to the geometrically and optically thick pc-scale torus in AGNs. Using the Soloviev solution initially developed for Tokamaks, we derive an analytical model for a rotating torus that is supported and confined by a magnetic field. We further perform three-dimensional magneto-hydrodynamic simulations of X-ray irradiated, pc-scale, magnetized tori. We follow the time evolution and compare models that adopt initial conditions derived from our analytic model with simulations in which the initial magnetic flux is entirely contained within the gas torus. Numerical simulations demonstrate that the initial conditions based on the analytic solution produce a longer-lived torus that produces obscuration that is generally consistent with observed constraints.
NASA Technical Reports Server (NTRS)
Chang, Chia-Bo
1994-01-01
This study is intended to examine the impact of synthetic relative humidity on the model simulation of mesoscale convective storm environments. The synthetic relative humidity is derived from National Weather Service surface observations and from non-conventional sources including aircraft, radar, and satellite observations. The latter sources provide mesoscale data of very high spatial and temporal resolution. The synthetic humidity data are used to complement the National Weather Service rawinsonde observations. It is believed that a realistic representation of the initial moisture field in a mesoscale model is critical for the model simulation of thunderstorm development and the formation of non-convective clouds, as well as their effects on the surface energy budget. The impact will be investigated based on a real-data case study using the mesoscale atmospheric simulation system developed by Mesoscale Environmental Simulations Operations, Inc. The mesoscale atmospheric simulation system consists of objective analysis and initialization codes, and coarse-mesh and fine-mesh dynamic prediction models. Both are three-dimensional, primitive-equation models containing the essential moist physics for simulating and forecasting mesoscale convective processes in the atmosphere. The modeling system is currently implemented at the Applied Meteorology Unit, Kennedy Space Center. Two procedures involving the synthetic relative humidity to define the model initial moisture fields are considered. It is proposed to perform several short-range (approximately 6 hours) comparative coarse-mesh simulation experiments with and without the synthetic data. They are aimed at revealing the model sensitivities, which should allow us both to refine the specification of the observational requirements and to develop more accurate and efficient objective analysis schemes. The goal is to advance the MASS (Mesoscale Atmospheric Simulation System) modeling expertise so that the model output can provide reliable guidance for thunderstorm forecasting.
Franco-Trigo, L; Tudball, J; Fam, D; Benrimoj, S I; Sabater-Hernández, D
2018-02-21
Collaboration between relevant stakeholders in health service planning enables service contextualization and facilitates its success and integration into practice. Although community pharmacy services (CPSs) aim to improve patients' health and quality of life, their integration in primary care is far from ideal. Key stakeholders for the development of a CPS aimed at preventing cardiovascular disease were identified in a previous stakeholder analysis. Engaging these stakeholders to create a shared vision is the subsequent step to focus planning directions and lay sound foundations for future work. This study aims to develop a stakeholder-shared vision of a cardiovascular care model which integrates community pharmacists and to identify initiatives to achieve this vision. A participatory visioning exercise involving 13 stakeholders across the healthcare system was performed. A facilitated workshop, structured in three parts (i.e., introduction; developing the vision; defining the initiatives towards the vision), was designed. The Chronic Care Model inspired the questions that guided the development of the vision. Workshop transcripts, researchers' notes and materials produced by participants were analyzed using qualitative content analysis. Stakeholders broadened the objective of the vision to focus on the management of chronic diseases. Their vision yielded seven principles for advanced chronic care: patient-centered care; multidisciplinary team approach; shared goals; long-term care relationships; evidence-based practice; ease of access to healthcare settings and services by patients; and good communication and coordination. Stakeholders also delineated six environmental factors that can influence their implementation. Twenty-four initiatives to achieve the developed vision were defined. The principles and factors identified as part of the stakeholder-shared vision were combined in a preliminary model for chronic care. This model and initiatives can guide policy makers as well as healthcare planners and researchers to develop and integrate chronic disease services, namely CPSs, in real-world settings. Copyright © 2018 Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, Judith C.
The purpose of this grant is to develop multi-scale theoretical methods to describe the nanoscale oxidation of metal thin films, as the PI (Yang) has extensive previous experience in the experimental elucidation of the initial stages of Cu oxidation, primarily by in situ transmission electron microscopy methods. Through the use and development of computational tools at varying length (and time) scales, from atomistic quantum mechanical calculations and force-field mesoscale simulations to large-scale kinetic Monte Carlo (KMC) modeling, the fundamental underpinnings of the initial stages of Cu oxidation have been elucidated. The development of computational modeling tools allows for accelerated materials discovery. The theoretical tools developed from this program impact a wide range of technologies that depend on surface reactions, including corrosion, catalysis, and nanomaterials fabrication.
Intra-Engine Trace Species Chemistry
NASA Technical Reports Server (NTRS)
Waitz, Ian A.; Lukachko, S. P.; Chobot, A.; Miake-Lye, R. C.; Brown, R.
2002-01-01
Prompted by the needs of downstream plume-wake models, the Massachusetts Institute of Technology (MIT) and Aerodyne Research Incorporated (ARI) initiated a collaborative effort, with funding from the NASA AEAP, to develop tools that would assist in understanding the fundamental drivers of chemical change within the intra-engine exhaust flow path. Efforts have been focused on the development of a modeling methodology that can adequately investigate the complex intra-engine environment. Over the history of this project, our research has increasingly pointed to the intra-engine environment as a possible site for important trace chemical activity. Modeling studies we initiated for the turbine and exhaust nozzle have contributed several important capabilities to the atmospheric effects of aviation assessment. These include a more complete understanding of aerosol precursor production, improved initial conditions for plume-wake modeling studies, and a more comprehensive analysis of ground-based test cell and in-flight exhaust measurement data. In addition, establishing a physical understanding of important flow and chemical processes through computational investigations may eventually assist in the design of engines to reduce undesirable species.
Life cycle cost modeling of conceptual space vehicles
NASA Technical Reports Server (NTRS)
Ebeling, Charles
1993-01-01
This paper documents progress to date by the University of Dayton on the development of a life cycle cost model for use during the conceptual design of new launch vehicles and spacecraft. This research is being conducted under NASA Research Grant NAG-1-1327. The effort shifts the focus from the first two years of the grant, in which a reliability and maintainability (R&M) model was developed, to the initial development of a life cycle cost model. Cost categories are initially patterned after NASA's three-axis work breakdown structure consisting of a configuration axis (vehicle), a function axis, and a cost axis. The focus will be on operations and maintenance costs and other recurring costs. Secondary tasks performed concurrently with the development of the life cycle costing model include continual support and upgrade of the R&M model. The primary result of the completed research will be a methodology, and a computer implementation of that methodology, to provide for timely cost analysis in support of conceptual design activities. The major objectives of this research are: to obtain and develop improved methods for estimating manpower, spares, software and hardware costs, facilities costs, and other cost categories as identified by NASA personnel; to construct a life cycle cost model of a space transportation system for budget exercises and performance-cost trade-off analysis during the conceptual and development stages; to continue to support modifications and enhancements to the R&M model; and to continue to assist in the development of a simulation model to provide an integrated view of the operations and support of the proposed system.
Han, Seunghoon; Jeon, Sangil; Hong, Taegon; Lee, Jongtae; Bae, Soo Hyeon; Park, Wan-su; Park, Gab-jin; Youn, Sunil; Jang, Doo Yeon; Kim, Kyung-Soo; Yim, Dong-Seok
2015-01-01
No wholly successful weight-control drugs have been developed to date, despite the tremendous demand. We present an exposure–response model of sibutramine mesylate that can be applied during clinical development of other weight-control drugs. Additionally, we provide a model-based evaluation of sibutramine efficacy. Data from a double-blind, randomized, placebo-controlled, multicenter study were used (N=120). Subjects in the treatment arm were initially given 8.37 mg sibutramine base daily, and those who lost <2 kg after 4 weeks’ treatment were escalated to 12.55 mg. The duration of treatment was 24 weeks. Drug concentration and body weight were measured predose and at 4 weeks, 8 weeks, and 24 weeks after treatment initiation. Exposure and response to sibutramine, including the placebo effect, were modeled using NONMEM 7.2. An asymptotic model approaching the final body weight was chosen to describe the time course of weight loss. Extent of weight loss was described successfully using a sigmoidal exposure–response relationship of the drug with a constant placebo effect in each individual. The placebo effect was influenced by subjects’ sex and baseline body mass index. Maximal weight loss was predicted to occur around 1 year after treatment initiation. The difference in mean weight loss between the sibutramine (daily 12.55 mg) and placebo groups was predicted to be 4.5% in a simulation of 1 year of treatment, with considerable overlap of prediction intervals. Our exposure–response model, which included the placebo effect, is the first example of a quantitative model that can be used to predict the efficacy of weight-control drugs. Similar approaches can help decision-making during clinical development of novel weight-loss drugs. PMID:26392753
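As a rough illustration of the model structure described above (an asymptotic time course of weight loss whose extent is set by a sigmoidal exposure-response term plus a constant placebo effect), the sketch below simulates one hypothetical subject. The parameterization and every numerical value are placeholders, not the estimates reported in the study.

```python
import numpy as np

def fraction_lost(conc, emax, ec50, hill, placebo):
    """Sigmoidal exposure-response for the drug effect plus a constant placebo effect,
    expressed as the fraction of baseline weight eventually lost."""
    drug = emax * conc**hill / (ec50**hill + conc**hill)
    return placebo + drug

def weight(t_weeks, baseline_kg, loss_frac, t_half_weeks):
    """Asymptotic approach of body weight toward its final value."""
    k = np.log(2) / t_half_weeks
    return baseline_kg * (1.0 - loss_frac * (1.0 - np.exp(-k * t_weeks)))

# All numbers below are illustrative placeholders, not the study's estimates.
frac = fraction_lost(conc=4.0, emax=0.08, ec50=3.0, hill=2.0, placebo=0.02)
print(weight(np.arange(0, 53, 4), baseline_kg=90.0, loss_frac=frac, t_half_weeks=16.0))
```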
Effect of initiator concentration on low-density polyethylene production in a tubular reactor
NASA Astrophysics Data System (ADS)
Azmi, A.; Aziz, N.
2016-11-01
Low-density polyethylene (LDPE) is one of the most widely used polymers in the world and is produced in high-capacity tubular and autoclave reactors. As the LDPE industry becomes more competitive and its market profit margins become tighter, manufacturers have to develop solutions to debottleneck reactor output while abiding by stringent product specifications. A single polyolefin plant may produce ten to forty grades of LDPE with various melt flow indices (MFI); therefore, understanding the reaction mechanism, the operating conditions, and the dynamic behavior of the tubular reactor is essential before any improvement can take place. In the present work, a steady state mathematical model representing a tubular reactor for the production of LDPE is simulated using MATLAB R2015a®. The model developed is a function of the feed inlet, reactor jacket, a single initiator injector, and the outlet stream. Analysis of the effect of initiator concentration (CI) shows a sharply declining initiator concentration profile, indicating that all of the initiator is exhausted after the polymerization reaction and no further reaction occurs from this point onwards. Furthermore, the results demonstrate that the initiator concentration has a significant impact on the reactor temperature profile and the monomer conversion rate, since a higher initiator concentration promotes a greater polymerization rate and therefore leads to a higher monomer conversion throughput.
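The sharp decline of initiator concentration described above is consistent with the usual first-order thermal decomposition of a peroxide initiator. A minimal sketch under that assumption is given below; it is not the authors' full tubular-reactor model, which also tracks monomer, temperature and jacket heat transfer, and all values are hypothetical.

```python
import numpy as np

def initiator_profile(ci0, kd, residence_time_s, n=200):
    """First-order initiator decomposition along the reactor, assuming plug flow so
    that axial position maps to residence time: dCI/dt = -kd * CI."""
    t = np.linspace(0.0, residence_time_s, n)
    return t, ci0 * np.exp(-kd * t)

# Hypothetical values purely for illustration (mol/L, 1/s, s).
t, ci = initiator_profile(ci0=1e-3, kd=0.5, residence_time_s=30.0)
print(f"Initiator is ~{100 * (1 - ci[-1] / ci[0]):.1f}% consumed at the outlet.")
```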
Introducing WISDEM: An Integrated System Modeling for Wind Turbines and Plant (Presentation)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dykes, K.; Graf, P.; Scott, G.
2015-01-01
The National Wind Technology Center wind energy systems engineering initiative has developed an analysis platform to leverage its research capabilities toward integrating wind energy engineering and cost models across wind plants. This Wind-Plant Integrated System Design & Engineering Model (WISDEM) platform captures the important interactions between various subsystems to achieve a better understanding of how to improve system-level performance and achieve system-level cost reductions. This work illustrates a few case studies with WISDEM that focus on the design and analysis of wind turbines and plants at different system levels.
OPM: The Open Porous Media Initiative
NASA Astrophysics Data System (ADS)
Flemisch, B.; Flornes, K. M.; Lie, K.; Rasmussen, A.
2011-12-01
The principal objective of the Open Porous Media (OPM) initiative is to develop a simulation suite that is capable of modeling industrially and scientifically relevant flow and transport processes in porous media and bridge the gap between the different application areas of porous media modeling, including reservoir mechanics, CO2 sequestration, biological systems, and product development of engineered media. The OPM initiative will provide a long-lasting, efficient, and well-maintained open-source software for flow and transport in porous media built on modern software principles. The suite is released under the GNU General Public License (GPL). Our motivation is to provide a means to unite industry and public research on simulation of flow and transport in porous media. For academic users, we seek to provide a software infrastructure that facilitates testing of new ideas on models with industry-standard complexity, while at the same time giving the researcher control over discretization and solvers. Similarly, we aim to accelerate the technology transfer from academic institutions to professional companies by making new research results available as free software of professional standard. The OPM initiative is currently supported by six research groups in Norway and Germany and funded by existing grants from public research agencies as well as from Statoil Petroleum and Total E&P Norge. However, a full-scale development of the OPM initiative requires substantially more funding and involvement of more research groups and potential end users. In this talk, we will provide an overview of the current activities in the OPM initiative. Special emphasis will be given to the demonstration of the synergies achieved by combining the strengths of individual open-source software components. In particular, a new fully implicit solver developed within the DUNE-based simulator DuMux could be enhanced by the ability to read industry-standard Eclipse input files and to run on grids given in corner-point format. Examples taken from the SPE comparative solution projects and CO2 sequestration benchmarks illustrate the current capabilities of the simulation suite.
Application of the cognitive therapy model to initial crisis assessment.
Calvert, Patricia; Palmer, Christine
2003-03-01
This article provides a background to the development of cognitive therapy and cognitive therapeutic skills with a specific focus on the treatment of a depressive episode. It discusses the utility of cognitive therapeutic strategies to the model of crisis theory and initial crisis assessment currently used by the Community Assessment & Treatment Team of Waitemata District Health Board on the North Shore of Auckland, New Zealand. A brief background to cognitive therapy is provided, followed by a comprehensive example of the use of the Socratic questioning method in guiding collaborative assessment and treatment of suicidality by nurses during the initial crisis assessment.
Martin, Susan Christie; Greenhouse, Pamela K; Merryman, Tamra; Shovel, Judith; Liberi, Cindy A; Konzier, Jeannine
2007-10-01
Institute of Medicine reports provide evidence of the failings of the healthcare system in the United States and a vision of the required transformation. The Institute for Healthcare Improvement and the Robert Wood Johnson Foundation created the Transforming Care at the Bedside initiative in 2003 to develop and validate a process for transforming care in hospital medical-surgical units. The authors describe Transforming Care at the Bedside as implemented by one of the Institute for Healthcare Improvement/Robert Wood Johnson Foundation's initial pilot hospitals, including promising outcomes and a model for spreading the initiative.
Modeling emissions of volatile organic compounds from silage storages and feed lanes
USDA-ARS?s Scientific Manuscript database
An initial volatile organic compound (VOC) emission model for silage sources, developed using experimental data from previous studies, was incorporated into the Integrated Farm System Model (IFSM), a whole-farm simulation model used to assess the performance, environmental impacts, and economics of ...
Modeling Type IIn Supernovae: Understanding How Shock Development Affects Light Curve Properties
NASA Astrophysics Data System (ADS)
De La Rosa, Janie
2016-06-01
Type IIn supernovae are produced when massive stars experience dramatic mass loss phases caused by opacity edges or violent explosions. Violent mass ejections occur quite often just prior to the collapse of the star. If the final episode happens just before collapse, the outward ejecta is sufficiently dense to alter the supernova light curve, both by absorbing the initial supernova light and by producing emission when the supernova shock hits the ejecta. Initially, the ejecta is driven by a shock propagating through the interior of the star, and eventually expands through the circumstellar medium, forming a cold dense shell. As the shock wave approaches the shell, there is an increase in UV and optical radiation at the location of the shock breakout. We have developed a suite of simple semi-analytical models in order to understand the relationship between our observations and the properties of the expanding SN ejecta. When we compare Type IIn observations to a set of modeled SNe, we begin to see the influence of initial explosion conditions on early UV light curve properties such as peak luminosities and decay rates. The fast rise and decay correspond to models representing a photosphere moving through the envelope, while the modeled light curves with a slower rise and decay rate are powered by 56Ni decay. However, in both of these cases, models that matched the luminosity were unable to match the low radii from the blackbody models. The effect of shock heating as the supernova material blasts through the circumstellar material can drastically alter the temperature and position of the photosphere. The new set of models redefines the initial modeling conditions to incorporate an outer shell-like structure, and includes late-time shock heating from shocks produced as the supernova ejecta travels through the inhomogeneous circumstellar medium.
An initial investigation into methods of computing transonic aerodynamic sensitivity coefficients
NASA Technical Reports Server (NTRS)
Carlson, Leland A.
1988-01-01
The initial effort was concentrated on developing the quasi-analytical approach for two-dimensional transonic flow. To keep the problem computationally efficient and straightforward, only the two-dimensional flow was considered and the problem was modeled using the transonic small perturbation equation.
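For reference, a commonly used form of the two-dimensional transonic small perturbation (TSP) equation for the perturbation velocity potential \(\phi\) is

\[
\bigl[\,1 - M_\infty^{2} - (\gamma + 1)\,M_\infty^{2}\,\phi_x\,\bigr]\phi_{xx} + \phi_{yy} = 0,
\]

where \(M_\infty\) is the freestream Mach number and \(\gamma\) the ratio of specific heats. The abstract does not state which variant of the equation was modeled, so this is only the standard textbook form.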
NASA Astrophysics Data System (ADS)
Wolfsberg, A.; Hagood, M.; Pasqualini, D.; Wood, T.; Wilson, C.; Witkowski, M.; Levitt, D.; Pawar, R.; Keating, G.; Ziock, H.
2008-12-01
The United States is increasingly dependent on imported oil and gas; commodities for which other nations are competing and for which future supply may be inadequate to support our transportation fuel needs. Therefore, a renewed interest in 'harder-to-get' unconventional fuels has emerged in both industry and government with directed focus on world class hydrocarbon resources within a corridor extending from Canada southward through the Rocky Mountain States. Within this Western Energy Corridor, co-located with significant conventional hydrocarbon and renewable energy resources, lie some of the world's richest unconventional hydrocarbon resources in oil shales, oil sands and coal for coal-to-liquid conversion. However, development of these resources poses substantial environmental concerns as well as increasing competition for limited resources of water and habitat. With large-scale energy development in the predominantly rural region, local communities, infrastructures, and economies will face increasing demands for roads, electricity, law enforcement, labor, and other support services. The Western Energy Corridor Initiative (WECI) seeks to develop an integrated assessment of the impacts of unconventional fuel development, the interrelationships of planned energy developments in different basins, and the resultant demands placed on the region. This initial WECI study focuses on two of the most important current issues for industry, regulators, and stakeholders -- the assessment of carbon and water resources issues, impacts, and management strategies. Through scenario analyses using coupled systems and process level models, this study investigates the viability of integrated development of multiple energy resources in a carbon neutral and environmentally acceptable manner, and the interrelationships of various energy resource development plans. The modeling framework is designed to extend to include infrastructure, employment, training, fiscal and economic demands placed on the region as a result of various development and climate change scenarios. The multi-scale modeling approach involves a systems dynamics (SD) modeling framework linked with more detailed models such as one for basin-scale hydrology investigating the spatial relationships of water rights and requirements, reservoir locations, and climate change impacts (the details of the SD model and the hydrologic model are presented in other contributions by Pasqualini et al. and Wilson et al.). A link to a CO2 sequestration performance assessment model is also being built to enable analysis of alternative carbon management options. With these evolving capabilities, our analyses consider interdependent demands and impacts placed on the region for various development scenarios.
NASA Astrophysics Data System (ADS)
Barik, M. G.; Hogue, T. S.; Franz, K. J.; He, M.
2011-12-01
The National Oceanic and Atmospheric Administration's (NOAA's) River Forecast Centers (RFCs) issue hydrologic forecasts related to flood events, reservoir operations for water supply, streamflow regulation, and recreation on the nation's streams and rivers. The RFCs use the National Weather Service River Forecast System (NWSRFS) for streamflow forecasting, which relies on a coupled snow model (i.e. SNOW17) and rainfall-runoff model (i.e. SAC-SMA) in snow-dominated regions of the US. Errors arise in various steps of the forecasting system from input data, model structure, model parameters, and initial states. The goal of the current study is to undertake verification of potential improvements in the SNOW17-SAC-SMA modeling framework developed for operational streamflow forecasts. We undertake verification for a range of parameter sets (i.e. RFC, DREAM (Differential Evolution Adaptive Metropolis)) as well as a data assimilation (DA) framework developed for the coupled models. Verification is also undertaken for various initial conditions to observe the influence of variability in initial conditions on the forecast. The study basin is the North Fork American River Basin (NFARB) located on the western side of the Sierra Nevada Mountains in northern California. Hindcasts are verified using both deterministic (i.e. Nash-Sutcliffe efficiency, root mean square error, and joint distribution) and probabilistic (i.e. reliability diagram, discrimination diagram, containing ratio, and quantile plots) statistics. Our presentation includes comparison of the performance of different optimized parameters and the DA framework as well as assessment of the impact associated with the initial conditions used for streamflow forecasts for the NFARB.
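Two of the deterministic verification scores named above have exact conventional definitions; a short sketch of the Nash-Sutcliffe efficiency and the root mean square error follows (the probabilistic diagnostics require forecast ensembles and are omitted here). The flow values are made up for illustration only.

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit; 0 means no better than the observed mean."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def rmse(obs, sim):
    """Root mean square error of simulated versus observed streamflow."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return float(np.sqrt(np.mean((obs - sim) ** 2)))

# Toy daily flows (m^3/s), made up for illustration.
obs = [12.0, 15.5, 30.2, 22.1, 18.0]
sim = [11.0, 16.0, 27.5, 24.0, 17.2]
print(nash_sutcliffe(obs, sim), rmse(obs, sim))
```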
Talip, Tajidah; Murang, Zaidah; Kifli, Nurolaini; Naing, Lin
2016-01-01
A recent WHO data report on mortality attributable to tobacco use, including cigarette smoking, indicated a very high burden of deaths in Asia and that people often initiate smoking as early as in young adolescence. The objectives of this study were to systematically review peer-reviewed articles on cigarette smoking initiation among Asian adolescents and to develop a conceptual model of factors influencing smoking initiation by integrating all relevant factors based on existing data. Following PRISMA guidelines, a systematic review of articles published between 2005 and June 2015 was conducted using 5 databases on cigarette smoking initiation among adolescents (aged 10-19 years) living in Asia. We summarized the main findings of each study according to our research questions and data that emerged during the data extraction process. Analysis and categorization were based on the TTI and TPB models, and the factors extracted from the studies were classified as follows: personal factors, social factors, broader environmental factors, mediators, and intention to initiate smoking and smoking behavior. Of 1,227 identified studies, only 20 were included in this review. We found that the mean age of cigarette smoking initiation ranged from 10 to 14 years and that those more likely to initiate smoking are male, older adolescents, adolescents with low parental SES, individuals with low parental monitoring, low parental education level and no discussion of smoking at home, those living in public housing, and those exhibiting health-risk behavior. Our study also revealed that the risk of smoking initiation increased when adolescents are exposed to smokers, influenced by peers, exposed to tobacco advertisements, receive pocket money, lack knowledge about smoking, have poor school performance, experience family conflict, or have psychological problems. The conceptual model developed demonstrates the complex networks of factors influencing initiation. This systematic review presents various factors influencing smoking initiation among Asian adolescents and provides a conceptual framework for further analysis of these factors. Future studies should use a standard measure of smoking initiation and should analyze interactions and the intensity of relationships between different factors or variables in the conceptual model. This will in turn consolidate the understanding of the different factors affecting smoking initiation and will help to improve interventions in this area.
1977-10-01
These modules make up a multi-task, priority-based real-time operating system in which each of the functions of the Supervisor is performed by one or more tasks. The Initialization module performs the initialization of the Supervisor software and hardware, including the Input Buffer, the FIFO, and the Track Correlator. This module is used both at initial program load time and upon receipt of a SC Initialization Command.
Ripple effects of reform on capital financing.
Arduino, Kelly
2014-05-01
Healthcare leaders should inventory and quantify the capital initiatives deemed critical for success under changing business models. Key considerations in planning such initiatives are opportunity costs and potential impact on productivity. Senior leaders also should create rolling five-year estimates of expenditures in addition to a one-year budget. Approaches to paying for such initiatives include borrowing from cash reserves, partnering to share cash and other resources, and developing new revenue sources derived from the initiatives themselves.
NASA Astrophysics Data System (ADS)
Liu, Ruiwen; Jiao, Binbin; Kong, Yanmei; Li, Zhigang; Shang, Haiping; Lu, Dike; Gao, Chaoqun; Chen, Dapeng
2013-09-01
Micro-devices with a bi-material cantilever (BMC) commonly suffer from initial curvature due to residual stress mismatch. Traditional corrective methods for reducing the residual stress mismatch generally involve the development of different material deposition recipes. In this paper, a new method for reducing residual stress mismatch in a BMC is proposed based on various previously developed deposition recipes. An initial material film is deposited using two or more developed deposition recipes. This first film is designed to introduce a stepped stress gradient, which is then balanced by overlapping a second material film on the first, using appropriate deposition recipes, to form a nearly stress-balanced structure. A theoretical model is proposed based on both the moment balance principle and equal total strain at the interface of two adjacent layers. Experimental results and analytical models suggest that the proposed method is effective in producing multi-layer micro cantilevers that display balanced residual stresses. The method provides a generic solution to the problem of mismatched initial stresses that universally exists in micro-electro-mechanical systems (MEMS) devices based on a BMC. Moreover, the method can be incorporated into a MEMS design automation package for efficient design of various multiple-material-layer devices from a MEMS material library and developed deposition recipes.
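The "moment balance principle" and "equal total strain at the interface" invoked above can be stated generically as follows; this is only the textbook form of the conditions, not the authors' specific multi-layer model. For a free-standing stack of layers with in-plane stress \(\sigma_i(z)\) in layer \(i\) occupying \(h_{i-1} \le z \le h_i\), a curvature-free (stress-balanced) structure must satisfy force and moment balance,

\[
\sum_i \int_{h_{i-1}}^{h_i} \sigma_i(z)\,dz = 0,
\qquad
\sum_i \int_{h_{i-1}}^{h_i} \sigma_i(z)\,z\,dz = 0,
\]

together with continuity of the total strain at each interface, \(\varepsilon_i(h_i) = \varepsilon_{i+1}(h_i)\).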
2016-07-27
... and reliability, is a common requirement for aircraft, rockets, and hypersonic vehicles. The Aerospace Fuels Quality Test and Model Development (AFQTMoDev) project was initiated to mature fuel quality assurance practices for rocket grade kerosene, thereby ensuring operational readiness of conventional and ...
Intentional Development: A Model to Guide Lifelong Physical Activity
ERIC Educational Resources Information Center
Cherubini, Jeffrey M.
2009-01-01
Framed in the context of researching influences on physical activity and actually working with individuals and groups seeking to initiate, increase or maintain physical activity, the purpose of this review is to present the model of Intentional Development as a multi-theoretical approach to guide research and applied work in physical activity.…
The Lake Michigan Mass Balance Project (LMMBP) was initiated to directly support the development of a lakewide management plan (LaMP) for Lake Michigan. A mass balance modeling approach is proposed for the project to address the relationship between sources of toxic chemicals and thei...
Clinical Preparation of Teachers in the Context of a University-Wide Community Engagement Emphasis
ERIC Educational Resources Information Center
Evans-Andris, Melissa; Kyle, Diane W.; Larson, Ann E.; Buecker, Harrie; Haselton, W. Blake; Howell, Penny; Sheffield, Caroline; Sherretz, Christine; Weiland, Ingrid
2014-01-01
In this article, we describe development of a clinical model of teacher education connected to a community engagement commitment of the university known as the Signature Partnership Initiative. The current clinical model builds upon previously established collaborations of the College of Education and Human Development with district and school…
DOT National Transportation Integrated Search
2010-12-01
A number of initiatives were undertaken to support education, training, and technology transfer objectives related to UAB UTC Domain 2 Project: Development of a Dynamic Traffic Assignment and Simulation Model for Incident and Emergency Management App...
This research makes use of in vitro and in vivo approaches to understand and discriminate the compensatory and toxicological responses of the highly regulated HPT system. Development of an initial systems model will be based on the current understanding of the HPT axis and the co...
A Transient Initialization Routine of the Community Ice Sheet Model for the Greenland Ice Sheet
NASA Astrophysics Data System (ADS)
van der Laan, Larissa; van den Broeke, Michiel; Noël, Brice; van de Wal, Roderik
2017-04-01
The Community Ice Sheet Model (CISM) is to be applied in future simulations of the Greenland Ice Sheet under a range of climate change scenarios, determining the sensitivity of the ice sheet to individual climatic forcings. In order to achieve reliable results regarding ice sheet stability and assess the probability of future occurrence of tipping points, a realistic initial ice sheet geometry is essential. The current work describes and evaluates the development of a transient initialization routine, using NGRIP 18O isotope data to create a temperature anomaly field. Based on the latter, surface mass balance components runoff and precipitation are perturbed for the past 125k years. The precipitation and runoff fields originate from a downscaled 1 km resolution version of the regional climate model RACMO2.3 for the period 1961-1990. The result of the initialization routine is a present-day ice sheet with a transient memory of the last glacial-interglacial cycle, which will serve as the future runs' initial condition.
Gehrs, Margaret; Strudwick, Gillian; Ling, Sara; Reisdorfer, Emilene; Cleverley, Kristin
2017-01-01
Mental health and addictions services are integral to Canada's healthcare system, and yet it is difficult to recruit experienced nurse leaders with advanced practice, management or clinical informatics expertise in this field. Master's-level graduates, aspiring to be mental health nurse leaders, often lack the confidence and experience required to lead quality improvement, advancements in clinical care, service design and technology innovations for improved patient care. This paper describes an initiative that develops nursing leaders through a unique scholarship, internship and mentorship model, which aims to foster confidence, critical thinking and leadership competency development in the mental health and addictions context. The "Mutual Benefits Model" framework was applied in the design and evaluation of the initiative. It outlines how mentee, mentor and organizational needs can drive strategic planning of resource investment, mentorship networks and relevant leadership competency-based learning plans to optimize outcomes. Five-year individual and organizational outcomes are described. © 2017 Longwoods Publishing.
COMPUTATIONAL TOXICOLOGY: FRAMEWORK, PARTNERSHIPS, AND PROGRAM DEVELOPMENT
Computational toxicology is a new research initiative being developed within the Office of Research and Development (ORD) of the US Environmental Protection Agency (EPA). Operationally, it is defined as the application of mathematical and computer models together with molecular c...
On spatial mutation-selection models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kondratiev, Yuri, E-mail: kondrat@math.uni-bielefeld.de; Kutoviy, Oleksandr, E-mail: kutoviy@math.uni-bielefeld.de, E-mail: kutovyi@mit.edu; Department of Mathematics, Massachusetts Institute of Technology, 77 Massachusetts Avenue, Cambridge, Massachusetts 02139
2013-11-15
We discuss the selection procedure in the framework of mutation models. We study the regulation of stochastically developing systems based on a transformation of the initial Markov process which includes a cost functional. The transformation of the initial Markov process by the cost functional has an analytic realization in terms of a Kimura-Maruyama type equation for the time evolution of states or in terms of the corresponding Feynman-Kac formula on the path space. The state evolution of the system, including the limiting behavior, is studied for two types of mutation-selection models.
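For concreteness, a generic Feynman-Kac representation of the kind referred to above is shown below; the specific cost functional and the Kimura-Maruyama equation used by the authors are not reproduced here.

\[
u(t,x) \;=\; \mathbb{E}_x\!\left[\exp\!\left(-\int_0^{t} V(X_s)\,ds\right) f(X_t)\right],
\]

where \((X_s)\) is the initial Markov process, \(V\) plays the role of the cost (potential) functional, and \(f\) is the initial observable; \(u\) then solves the corresponding evolution equation with the additional potential term \(-Vu\).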
NASA Technical Reports Server (NTRS)
Mateev, L. N.; Nenovski, P. I.; Vellinov, P. I.
1989-01-01
In connection with the recently detected quasiperiodical magnetic disturbances in the ionospheric cusp, the penetration of compressional surface magnetohydrodynamic (MHD) waves through the middle atmosphere is modelled numerically. For the COSPAR International Reference Atmosphere (CIRA) 72 model the respective energy density flux of the disturbances in the middle atmosphere is determined. On the basis of the developed model certain conclusions are reached about the height distribution of the structures (energy losses, currents, etc.) initiated by intensive magnetic cusp disturbances.
Spertzel, R O
1989-12-01
The search for a model of HIV infection continues. While much of the initial work focussed on animal models of AIDS, more recent efforts have sought animal models of HIV infection in which one or more signs of AIDS may be reproduced. Most initial small animal modelling efforts were negative and many such efforts remain unpublished. In 1988, the Public Health Service (PHS) AIDS Animal Model Committee conducted a survey among PHS agencies to identify published and unpublished data on animal models of HIV. To date, the chimpanzee is the only animal to be reliably infected with HIV albeit without development of signs and symptoms normally associated with human AIDS. One recent study has shown the gibbon to be similarly susceptible to infection with HIV. Mice carrying a chimera of elements of the human immune system have been shown to support the growth of HIV and F1 progeny of transgenic mice containing intact copies of HIV proviral DNA, have developed a disease that resembles some aspects of human AIDS. Rabbits, baboons and rhesus monkeys have also been shown to be infected under certain conditions and/or with selected strains of HIV but again without the development of AIDS symptomatology. This report briefly summarizes published and available unpublished data on these efforts to develop an animal model of HIV infection.
Model-Based Reasoning in the Physics Laboratory: Framework and Initial Results
ERIC Educational Resources Information Center
Zwickl, Benjamin M.; Hu, Dehui; Finkelstein, Noah; Lewandowski, H. J.
2015-01-01
We review and extend existing frameworks on modeling to develop a new framework that describes model-based reasoning in introductory and upper-division physics laboratories. Constructing and using models are core scientific practices that have gained significant attention within K-12 and higher education. Although modeling is a broadly applicable…
Porting Initiation and Failure to Linked CHEETAH
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vitello, P; Souers, P C
2007-07-18
Linked CHEETAH is a thermo-chemical code coupled to a 2-D hydrocode. Initially, a quadratic pressure-dependent kinetic rate was used, which worked well in modeling prompt detonation of explosives of large size, but does not work for other aspects of explosive behavior. The variable-pressure Tarantula reactive flow rate model was developed with JWL++ in order to also describe failure and initiation, and we have moved this model into Linked CHEETAH. The model works by turning on only above a pressure threshold, where a slow turn-on creates initiation. At a higher pressure, the rate suddenly leaps to a large value over a small pressure range. A slowly failing cylinder will see a rapidly declining rate, which pushes it quickly into failure. At a high pressure, the detonation rate is constant. A sequential validation procedure is used, which includes metal-confined cylinders, rate-sticks, corner-turning, initiation and threshold, gap tests and air gaps. The size (diameter) effect is central to the calibration.
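A minimal sketch of a pressure-threshold reactive-flow rate of the general kind described above is given below; the functional form and all constants are illustrative placeholders and do not reproduce the calibrated Tarantula model.

```python
def reaction_rate(p_gpa, burn_fraction, p_on=5.0, p_fast=15.0,
                  g_slow=0.05, g_fast=50.0, n=2.0):
    """Generic pressure-threshold reactive-flow rate d(lambda)/dt: zero below a
    turn-on pressure, slow growth above it (initiation), and a jump to a large,
    roughly constant rate above a second pressure (prompt detonation).
    All constants are illustrative placeholders, not a calibration."""
    if p_gpa < p_on:
        return 0.0
    if p_gpa < p_fast:
        return g_slow * (p_gpa - p_on) ** n * (1.0 - burn_fraction)
    return g_fast * (1.0 - burn_fraction)

for p in (2.0, 8.0, 20.0):
    print(p, reaction_rate(p, burn_fraction=0.1))
```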
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shah, Pooja Nitin; Shin, Yung C.; Sun, Tao
2017-10-03
Synchrotron X-rays are integrated with a modified Kolsky tension bar to conduct in situ tracking of the grain refinement mechanism operating during the dynamic deformation of metals. Copper with an initial average grain size of 36 μm is refined to 6.3 μm when loaded at a constant high strain rate of 1200 s⁻¹. The synchrotron measurements revealed the temporal evolution of the grain refinement mechanism in terms of the initiation and rate of refinement throughout the loading test. A multiscale coupled probabilistic cellular automata based recrystallization model has been developed to predict the microstructural evolution occurring during dynamic deformation processes. The model accurately predicts the initiation of the grain refinement mechanism with a predicted final average grain size of 2.4 μm. As a result, the model also accurately predicts the temporal evolution in terms of the initiation and extent of refinement when compared with the experimental results.
Specifying and Refining a Complex Measurement Model.
ERIC Educational Resources Information Center
Levy, Roy; Mislevy, Robert J.
This paper aims to describe a Bayesian approach to modeling and estimating cognitive models both in terms of statistical machinery and actual instrument development. Such a method taps the knowledge of experts to provide initial estimates for the probabilistic relationships among the variables in a multivariate latent variable model and refines…
The Incorporation and Initialization of Cloud Water/Ice in an Operational Forecast Model
NASA Astrophysics Data System (ADS)
Zhao, Qingyun
Quantitative precipitation forecasts have been one of the weakest aspects of numerical weather prediction models. Theoretical studies show that the errors in precipitation calculation can arise from three sources: errors in the large-scale forecasts of primary variables, errors in the crude treatment of condensation/evaporation and precipitation processes, and errors in the model initial conditions. A new precipitation parameterization scheme has been developed to investigate the forecast value of improved precipitation physics via the introduction of cloud water and cloud ice into a numerical prediction model. The main feature of this scheme is the explicit calculation of cloud water and cloud ice in both the convective and stratiform precipitation parameterization. This scheme has been applied to the eta model at the National Meteorological Center. Four extensive tests have been performed. The statistical results showed a significant improvement in the model precipitation forecasts. Diagnostic studies suggest that the inclusion of cloud ice is important in transferring water vapor to precipitation and in the enhancement of latent heat release; the latter subsequently affects the vertical motion field significantly. Since three-dimensional cloud data is absent from the analysis/assimilation system for most numerical models, a method has been proposed to incorporate observed precipitation and nephanalysis data into the data assimilation system to obtain the initial cloud field for the eta model. In this scheme, the initial moisture and vertical motion fields are also improved at the same time as cloud initialization. The physical initialization is performed in a dynamical initialization framework that uses the Newtonian dynamical relaxation method to nudge the model's wind and mass fields toward analyses during a 12-hour data assimilation period. Results from a case study showed that a realistic cloud field was produced by this method at the end of the data assimilation period. Precipitation forecasts have been significantly improved as a result of the improved initial cloud, moisture and vertical motion fields.
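The Newtonian dynamical relaxation (nudging) used during the 12-hour assimilation period can be summarized by the standard relaxation form below; the nudging coefficients and the exact set of nudged variables in the eta-model implementation are not given in the abstract, so this is only the generic textbook statement.

\[
\frac{\partial \varphi}{\partial t} \;=\; F(\varphi) \;+\; G_{\varphi}\,\bigl(\varphi_{\mathrm{analysis}} - \varphi\bigr),
\]

where \(\varphi\) is a nudged model variable (e.g., a wind or mass field), \(F(\varphi)\) denotes the model's dynamical and physical tendencies, and \(G_{\varphi}\) is an inverse relaxation time scale.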
Linking models and data on vegetation structure
NASA Astrophysics Data System (ADS)
Hurtt, G. C.; Fisk, J.; Thomas, R. Q.; Dubayah, R.; Moorcroft, P. R.; Shugart, H. H.
2010-06-01
For more than a century, scientists have recognized the importance of vegetation structure in understanding forest dynamics. Now future satellite missions such as Deformation, Ecosystem Structure, and Dynamics of Ice (DESDynI) hold the potential to provide unprecedented global data on vegetation structure needed to reduce uncertainties in terrestrial carbon dynamics. Here, we briefly review the uses of data on vegetation structure in ecosystem models, develop and analyze theoretical models to quantify model-data requirements, and describe recent progress using a mechanistic modeling approach utilizing a formal scaling method and data on vegetation structure to improve model predictions. Generally, both limited sampling and coarse resolution averaging lead to model initialization error, which in turn is propagated in subsequent model prediction uncertainty and error. In cases with representative sampling, sufficient resolution, and linear dynamics, errors in initialization tend to compensate at larger spatial scales. However, with inadequate sampling, overly coarse resolution data or models, and nonlinear dynamics, errors in initialization lead to prediction error. A robust model-data framework will require both models and data on vegetation structure sufficient to resolve important environmental gradients and tree-level heterogeneity in forest structure globally.
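The claim that initialization errors from representative sampling tend to compensate at larger spatial scales (for linear dynamics) can be illustrated with a toy Monte Carlo experiment; the field and numbers below are arbitrary and only meant to show the scaling, not to emulate DESDynI sampling.

```python
import numpy as np

rng = np.random.default_rng(0)
true_biomass = rng.lognormal(mean=4.0, sigma=0.8, size=100_000)  # hypothetical per-pixel values

def aggregation_error(sample_fraction, block_size):
    """Initialize each block from a random subsample of its pixels and report the mean
    relative error of the block means, i.e. the initialization error at that scale."""
    blocks = true_biomass.reshape(-1, block_size)
    n = max(1, int(sample_fraction * block_size))
    sampled = np.array([rng.choice(b, size=n, replace=False).mean() for b in blocks])
    return float(np.mean(np.abs(sampled - blocks.mean(axis=1)) / blocks.mean(axis=1)))

for block_size in (10, 100, 1000):
    print(block_size, aggregation_error(sample_fraction=0.1, block_size=block_size))
```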
Predictability of windstorm Klaus; sensitivity to PV perturbations
NASA Astrophysics Data System (ADS)
Arbogast, P.; Maynard, K.
2010-09-01
It appears that some short-range weather forecast failures may be attributed to initial-condition errors. In some cases it is possible to anticipate the behavior of the model by comparison between observations and model analyses. In the case of extratropical cyclone development, one may assess the representation of the upper-level precursors, described in terms of PV in the initial conditions, by comparison with either satellite ozone or water-vapor observations. A step forward has been made in developing a tool based upon manual modifications of the dynamical tropopause (i.e. the height of the 1.5 PV unit surface) and PV inversion. After five years of experimentation, it turns out that forecasters can eventually succeed in improving the forecast of some strong cyclone developments. However, the present approach is subjective per se. To measure the subjectivity of the procedure, a set of 15 experiments was performed by 7 different people (senior forecasters and scientists involved in dynamical meteorology), each attempting to improve an initial state of the global model ARPEGE that led to a poor forecast of the windstorm Klaus (24 January 2009). This experiment reveals that the manually defined corrections present common features but also a large spread.
Big Data Initiatives for Agroecosystems
USDA-ARS?s Scientific Manuscript database
NAL has developed a workspace for research groups associated with the i5k initiative, which aims to sequence the genomes of all insect species known to be important to worldwide agriculture, food safety, medicine, and energy production; all those used as models in biology; the most abundant in worl...
Predictors of Sexual Intercourse among Korean Adolescents
ERIC Educational Resources Information Center
Ryu, Eunjung; Kim, Kyunghee; Kwon, Hyejin
2007-01-01
Background: The proportion of adolescents experiencing unwanted pregnancy and abortion caused by the premature initiation of sexual intercourse is increasing at an alarming rate in Korea. This study aimed at developing a theoretical model for identifying individual and environmental risk factors affecting the initiation of sexual intercourse by…
Neurocognitive predictors of financial capacity in traumatic brain injury.
Martin, Roy C; Triebel, Kristen; Dreer, Laura E; Novack, Thomas A; Turner, Crystal; Marson, Daniel C
2012-01-01
To develop cognitive models of financial capacity (FC) in patients with traumatic brain injury (TBI). Longitudinal design. Inpatient brain injury rehabilitation unit. Twenty healthy controls and 24 adults with moderate-to-severe TBI were assessed at baseline (30 days postinjury) and 6 months postinjury. The FC instrument (FCI) and a neuropsychological test battery. Univariate correlation and multiple regression procedures were employed to develop cognitive models of FCI performance in the TBI group at baseline and 6-month follow-up. Three cognitive predictor models of FC were developed. At baseline, measures of mental arithmetic/working memory and immediate verbal memory predicted baseline FCI performance (R = 0.72). At 6-month follow-up, measures of executive function and mental arithmetic/working memory predicted 6-month FCI performance (R = 0.79), and a third model found that these 2 measures at baseline predicted 6-month FCI performance (R = 0.71). Multiple cognitive functions are associated with initial impairment and partial recovery of FC in moderate-to-severe TBI patients. In particular, arithmetic, working memory, and executive function skills appear critical to recovery of FC in TBI. The study results represent an initial step toward developing a neurocognitive model of FC in patients with TBI.
Technology Investments in the NASA Entry Systems Modeling Project
NASA Technical Reports Server (NTRS)
Barnhardt, Michael; Wright, Michael; Hughes, Monica
2017-01-01
The Entry Systems Modeling (ESM) technology development project, initiated in 2012 under NASA's Game Changing Development (GCD) Program, is engaged in the maturation of fundamental research developing aerosciences, materials, and integrated systems products for entry, descent, and landing (EDL) technologies [1]. To date, the ESM project has published over 200 papers in these areas, comprising the bulk of NASA's research program for EDL modeling. This presentation will provide an overview of the project's successes and challenges, and an assessment of future investments in EDL modeling and simulation relevant to NASA's mission.
Weldon, Christine B; Friedewald, Sarah M; Kulkarni, Swati A; Simon, Melissa A; Carlos, Ruth C; Strauss, Jonathan B; Bunce, Mikele M; Small, Art; Trosman, Julia R
2016-12-01
Radiologists aspire to improve patient experience and engagement, as part of the Triple Aim of health reform. Patient engagement requires active partnerships among health providers and patients, and rigorous teamwork provides a mechanism for this. Patient and care team engagement are crucial at the time of cancer diagnosis and care initiation but are complicated by the necessity to orchestrate many interdependent consultations and care events in a short time. Radiology often serves as the patient entry point into the cancer care system, especially for breast cancer. It is uniquely positioned to play the value-adding role of facilitating patient and team engagement during cancer care initiation. The 4R approach (Right Information and Right Care to the Right Patient at the Right Time), previously proposed for optimizing teamwork and care delivery during cancer treatment, could be applied at the time of diagnosis. The 4R approach considers care for every patient with cancer as a project, using project management to plan and manage care interdependencies, assign clear responsibilities, and designate a quarterback function. The authors propose that radiology assume the quarterback function during breast cancer care initiation, developing the care initiation sequence, as a project care plan for newly diagnosed patients, and engaging patients and their care teams in timely, coordinated activities. After initial consultations and treatment plan development, the quarterback function is transitioned to surgery or medical oncology. This model provides radiologists with opportunities to offer value-added services and solidifies radiology's relevance in the evolving health care environment. To implement 4R at cancer care initiation, it will be necessary to change the radiology practice model to incorporate patient interaction and teamwork, develop 4R content and local adaption approaches, and enrich radiology training with relevant clinical knowledge, patient interaction competence, and teamwork skill set. Copyright © 2016 American College of Radiology. Published by Elsevier Inc. All rights reserved.
Welch, Lisa C; Trudeau, Jeremiah J; Silverstein, Steven M; Sand, Michael; Henderson, David C; Rosen, Raymond C
2017-01-01
Cognitive impairment is a serious, often distressing aspect of schizophrenia that affects patients' day-to-day lives. Although several interview-based instruments exist to assess cognitive functioning, a reliable measure developed based on the experiences of patients facing cognitive difficulties is needed to complement the objective performance-based assessments. The present article describes the initial development of a patient-reported outcome (PRO) measure to assess the subjective experience of cognitive impairment among patients with schizophrenia, the Patient-Reported Experience of Cognitive Impairment in Schizophrenia (PRECIS). The phases of development included the construction of a conceptual model based on the existing knowledge and two sets of qualitative interviews with patients: 1) concept elicitation interviews to ensure face and content validity from the perspective of people with schizophrenia and 2) cognitive debriefing of the initial item pool. Input from experts was elicited throughout the process. The initial conceptual model included seven domains. The results from concept elicitation interviews (n=80) supported these domains but yielded substantive changes to concepts within domains and to terminology. Based on these results, an initial pool of 53 items was developed to reflect the most common descriptions and languages used by the study participants. Cognitive debriefing interviews (n=22) resulted in the removal of 18 items and modification of 22 other items. The remaining 35 items represented 23 concepts within six domains plus two items assessing bother. The draft PRO measure is currently undergoing psychometric testing as a precursor to broad-based clinical and research use.
NASA Technical Reports Server (NTRS)
Groth, Clinton P. T.; Roe, Philip L.
1998-01-01
Six months of funding was received for the proposed three-year research program (funding for the period from March 1, 1997 to August 31, 1997). Although the official starting date for the project was March 1, 1997, no funding for the project was received until July 1997. In the funded research period, considerable progress was made on Phase I of the proposed research program. The initial research efforts concentrated on applying the 10-, 20-, and 35-moment Gaussian-based closures to a series of standard two-dimensional non-reacting, single-species test flow problems, such as flat-plate, Couette, channel, and rearward-facing step flows, and to some other two-dimensional flows having geometries similar to those encountered in chemical-vapor deposition (CVD) reactors. Eigensystem analyses for these systems in two spatial dimensions were carried out, and efficient formulations of approximate Riemann solvers were developed using these eigenstructures. Formulations to include rotational non-equilibrium effects in the moment closure models for the treatment of polyatomic gases were explored, as the original formulations of the closure models were developed strictly for gases composed of monatomic molecules. The development of a software library and computer code for solving relaxing hyperbolic systems in two spatial dimensions of the type arising from the closure models was also initiated. The software makes use of high-resolution upwind finite-volume schemes, multi-stage point-implicit time stepping, and automatic adaptive mesh refinement (AMR) to solve the governing conservation equations for the moment closures. The initial phase of the code development was completed and a numerical investigation of the solutions of the 10-moment closure model for the simple two-dimensional test cases mentioned above was initiated. Predictions of the 10-moment model were compared to available theoretical solutions and the results of direct-simulation Monte Carlo (DSMC) calculations. The first results of this study were presented at a meeting last year.
Kremers, S P J; Mudde, A N; De Vries, H
2004-05-01
Two lines of psychological research have attempted to spell out the stages of adolescent smoking initiation. The first has focused on behavioral stages of smoking initiation, while the second has emphasized motivational stages. A large international sample of European adolescents (N = 10,170, mean age = 13.3 years) was followed longitudinally. Self-reported motivational and behavioral stages of smoking initiation were integrated, leading to the development of the Model of Unplanned Smoking Initiation of Children and Adolescents (MUSICA). The MUSICA postulates that youngsters experiment with smoking while they are in an unmotivated state regarding their plans to smoke regularly in the future. More than 95% of the total population resided in one of the seven stages distinguished by MUSICA. The probability of starting to smoke regularly during the 12-month follow-up period increased with advanced stage assignment at baseline. Unique social cognitive predictors of stage progression from the various stages were identified, but effect sizes of predictors of transitions were small. The integration of motivational and behavioral dimensions improves our understanding of the process of smoking initiation. In contrast to current theories of smoking initiation, adolescent uptake of smoking behavior was found to be an unplanned action.
Characterization of Initial Parameter Information for Lifetime Prediction of Electronic Devices.
Li, Zhigang; Liu, Boying; Yuan, Mengxiong; Zhang, Feifei; Guo, Jiaqiang
2016-01-01
Newly manufactured electronic devices are subject to different levels of potential defects existing among the initial parameter information of the devices. In this study, a characterization of electromagnetic relays that were operated at their optimal performance with appropriate and steady parameter values was performed to estimate the levels of their potential defects and to develop a lifetime prediction model. First, the initial parameter information value and stability were quantified to measure the performance of the electronics. In particular, the values of the initial parameter information were estimated using the probability-weighted average method, whereas the stability of the parameter information was determined by using the difference between the extrema and end points of the fitting curves for the initial parameter information. Second, a lifetime prediction model for small-sized samples was proposed on the basis of both measures. Finally, a model for the relationship of the initial contact resistance and stability over the lifetime of the sampled electromagnetic relays was proposed and verified. A comparison of the actual and predicted lifetimes of the relays revealed a 15.4% relative error, indicating that the lifetime of electronic devices can be predicted based on their initial parameter information.
Wannige, C T; Kulasiri, D; Samarasinghe, S
2014-01-21
Nutrients from the living environment are vital for the survival and growth of any organism. Budding yeast diploid cells decide to grow by mitosis-type cell division or to create unique, stress-resistant spores by meiosis-type cell division depending on the available nutrient conditions. To gain a molecular systems-level understanding of the nutrient-dependent switching between meiosis and mitosis initiation in diploid cells of budding yeast, we develop a theoretical model based on ordinary differential equations (ODEs), including the mitosis initiator and its relations to the budding yeast meiosis initiation network. Our model accurately and qualitatively predicts the experimentally revealed temporal variations of related proteins under different nutrient conditions, as well as the diverse mutant studies related to meiosis and mitosis initiation. Using this model, we show how the meiosis and mitosis initiators form an all-or-none type bistable switch in response to the available nutrient level (mainly nitrogen). The transitions to and from the meiosis or mitosis initiation states occur via saddle-node bifurcation. This bidirectional switch helps the optimal usage of available nutrients and explains the mutually exclusive existence of the meiosis and mitosis pathways. © 2013 Elsevier Ltd. All rights reserved.
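The all-or-none switching behaviour described above can be illustrated with a toy mutual-inhibition ODE pair. The variable names (M for a meiosis initiator, C for a mitosis initiator), the Hill-type repression terms, and all parameter values below are hypothetical and are not taken from the published model; running it at a low and a high nutrient level shows the two mutually exclusive states.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Toy mutual-inhibition switch between a meiosis initiator M and a mitosis
# initiator C, with the nutrient level N as the controlling parameter.
# All variable names and parameter values are illustrative, not those of the paper.

def switch(t, y, N):
    M, C = y
    prod_M = 1.0 / (1.0 + (N / 0.3) ** 2) / (1.0 + (C / 0.5) ** 4)  # repressed by nutrients and by C
    prod_C = (N / (N + 0.3)) / (1.0 + (M / 0.5) ** 4)               # driven by nutrients, repressed by M
    return [prod_M - M, prod_C - C]

for N in (0.05, 1.0):                                   # nutrient-poor vs. nutrient-rich
    sol = solve_ivp(switch, (0.0, 50.0), [0.1, 0.1], args=(N,), rtol=1e-8)
    M_end, C_end = sol.y[:, -1]
    print(f"N={N:4.2f}: M={M_end:.2f}, C={C_end:.2f} ->",
          "meiosis-initiation state" if M_end > C_end else "mitosis-initiation state")
```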
Systems Analysis Initiated for All-Electric Aircraft Propulsion
NASA Technical Reports Server (NTRS)
Kohout, Lisa L.
2003-01-01
A multidisciplinary effort is underway at the NASA Glenn Research Center to develop concepts for revolutionary, nontraditional fuel cell power and propulsion systems for aircraft applications. There is a growing interest in the use of fuel cells as a power source for electric propulsion as well as an auxiliary power unit to substantially reduce or eliminate environmentally harmful emissions. A systems analysis effort was initiated to assess potential concepts in an effort to identify those configurations with the highest payoff potential. Among the technologies under consideration are advanced proton exchange membrane (PEM) and solid oxide fuel cells, alternative fuels and fuel processing, and fuel storage. Prior to this effort, the majority of fuel cell analysis done at Glenn was done for space applications. Because of this, a new suite of models was developed. These models include the hydrogen-air PEM fuel cell; internal reforming solid oxide fuel cell; balance-of-plant components (compressor, humidifier, separator, and heat exchangers); compressed gas, cryogenic, and liquid fuel storage tanks; and gas turbine/generator models for hybrid system applications. Initial mass, volume, and performance estimates of a variety of PEM systems operating on hydrogen and reformate have been completed for a baseline general aviation aircraft. Solid oxide/turbine hybrid systems are being analyzed. In conjunction with the analysis efforts, a joint effort has been initiated with Glenn's Computer Services Division to integrate fuel cell stack and component models with the visualization environment that supports the GRUVE lab, Glenn's virtual reality facility. The objective of this work is to provide an environment to assist engineers in the integration of fuel cell propulsion systems into aircraft and provide a better understanding of the interaction between system components and the resulting effect on the overall design and performance of the aircraft. Initially, three-dimensional computer-aided design (CAD) models of representative PEM fuel cell stacks and components were developed and integrated into the virtual reality environment along with an Excel-based model used to calculate fuel cell electrical performance on the basis of cell dimensions. CAD models of a representative general aviation aircraft were also developed and added to the environment. With the use of special headgear, users will be able to virtually manipulate the fuel cell's physical characteristics and its placement within the aircraft while receiving information on the resultant fuel cell output power and performance. As the systems analysis effort progresses, we will add more component models to the GRUVE environment to help us more fully understand the effect of various system configurations on the aircraft.
Ghosh, J; Wilson, R W; Kudoh, T
2009-12-01
The normal embryonic development of the tomato clownfish Amphiprion frenatus was analysed using live imaging and by in situ hybridization for detection of mesodermal and neurectodermal development. Both morphology of live embryos and tissue-specific staining revealed significant differences in the gross developmental programme of A. frenatus compared with better-known teleost fish models, in particular, initiation of somitogenesis before complete epiboly, initiation of narrowing of the neurectoderm (neurulation) before somitogenesis, relatively early pigmentation of melanophores at the 10-15 somite stage and a distinctive pattern of melanophore distribution. These results suggest evolutionary adaptability of the teleost developmental programme. The ease of obtaining eggs, in vitro culture of the embryo, in situ staining analyses and these reported characteristics make A. frenatus a potentially important model marine fish species for studying embryonic development, physiology, ecology and evolution.
van der Wegen, M.; Dastgheib, A.; Jaffe, B.E.; Roelvink, D.
2011-01-01
Applications of process-based morphodynamic models are often constrained by limited availability of data on bed composition, which may have a considerable impact on the modeled morphodynamic development. One may even distinguish a period of "morphodynamic spin-up" in which the model generates the bed level according to some ill-defined initial bed composition rather than describing the realistic behavior of the system. The present paper proposes a methodology to generate bed composition of multiple sand and/or mud fractions that can act as the initial condition for the process-based numerical model Delft3D. The bed composition generation (BCG) run does not include bed level changes, but does permit the redistribution of multiple sediment fractions over the modeled domain. The model applies the concept of an active layer that may differ in sediment composition above an underlayer with fixed composition. In the case of a BCG run, the bed level is kept constant, whereas the bed composition can change. The approach is applied to San Pablo Bay in California, USA. Model results show that the BCG run reallocates sand and mud fractions over the model domain. Initially, a major sediment reallocation takes place, but development rates decrease in the longer term. Runs that take the outcome of a BCG run as a starting point lead to more gradual morphodynamic development. Sensitivity analysis shows the impact of variations in the morphological factor, the active layer thickness, and wind waves. An important but difficult to characterize criterion for a successful application of a BCG run is that it should not lead to a bed composition that fixes the bed so that it dominates the "natural" morphodynamic development of the system. Future research will focus on a decadal morphodynamic hindcast and comparison with measured bathymetries in San Pablo Bay so that the proposed methodology can be tested and optimized. © 2010 The Author(s).
Wang, Han-I; Aas, Eline; Howell, Debra; Roman, Eve; Patmore, Russell; Jack, Andrew; Smith, Alexandra
2014-03-01
Acute myeloid leukemia (AML) can be diagnosed at any age and treatment, which can be given with supportive and/or curative intent, is considered expensive compared with that for other cancers. Despite this, no long-term predictive models have been developed for AML, mainly because of the complexities associated with this disease. The objective of the current study was to develop a model (based on a UK cohort) to predict cost and life expectancy at a population level. The model developed in this study combined a decision tree with several Markov models to reflect the complexity of the prognostic factors and treatments of AML. The model was simulated with a cycle length of 1 month for a time period of 5 years and further simulated until age 100 years or death. Results were compared for two age groups and five different initial treatment intents and responses. Transition probabilities, life expectancies, and costs were derived from a UK population-based specialist registry-the Haematological Malignancy Research Network (www.hmrn.org). Overall, expected 5-year medical costs and life expectancy ranged from £8,170 to £81,636 and 3.03 to 34.74 months, respectively. The economic and health outcomes varied with initial treatment intent, age at diagnosis, trial participation, and study time horizon. The model was validated by using face, internal, and external validation methods. The results show that the model captured more than 90% of the empirical costs, and it demonstrated good fit with the empirical overall survival. Costs and life expectancy of AML varied with patient characteristics and initial treatment intent. The robust AML model developed in this study could be used to evaluate new diagnostic tools/treatments, as well as enable policy makers to make informed decisions. Copyright © 2014 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
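A monthly-cycle Markov cohort calculation of the kind described above can be sketched in a few lines. The three states, transition probabilities, and monthly costs below are placeholders chosen for illustration, not the published UK estimates.

```python
import numpy as np

# Minimal monthly-cycle Markov cohort sketch of the kind of model described
# above (decision tree feeding Markov states). States, transition
# probabilities, and monthly costs are placeholders, not the published UK values.

states = ["remission", "relapsed/refractory", "dead"]
P = np.array([            # monthly transition matrix, rows sum to 1
    [0.96, 0.03, 0.01],
    [0.00, 0.90, 0.10],
    [0.00, 0.00, 1.00],
])
monthly_cost = np.array([1500.0, 4000.0, 0.0])   # GBP per state-month (illustrative)

cohort = np.array([1.0, 0.0, 0.0])               # everyone starts in remission
total_cost = 0.0
life_months = 0.0
for month in range(60):                          # 5-year horizon, 1-month cycles
    total_cost += cohort @ monthly_cost
    life_months += cohort[:2].sum()              # person-months spent alive
    cohort = cohort @ P

print(f"expected 5-year cost per patient: £{total_cost:,.0f}")
print(f"expected life expectancy within 5 years: {life_months:.1f} months")
```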
DOT National Transportation Integrated Search
2010-02-26
In anticipation of developing pavement performance models as part of a proposed pavement management system, the Pennsylvania Department of Transportation (PennDOT) initiated a study in 2009 to investigate performance modeling activities and condi...
A MODEL FOR FINE PARTICLE AGGLOMERATION IN CIRCULATING FLUIDIZED BED ABSORBERS
A model for fine particle agglomeration in circulating fluidized bed absorbers (CFBAS) has been developed. It can model the influence of different factors on agglomeration, such as the geometry of CFBAs, superficial gas velocity, initial particle size distribution, and type of ag...
PESTICIDE ORCHARD ECOSYSTEM MODEL (POEM): A USER'S GUIDE
A mathematical model was developed to predict the transport and effects of a pesticide in an orchard ecosystem. The environmental behavior of azinphosmethyl was studied over a two-year period in a Michigan apple orchard. Data were gathered for the model on initial distribution wi...
Ma, Jun; Liu, Lei; Ge, Sai; Xue, Qiang; Li, Jiangshan; Wan, Yong; Hui, Xinminnan
2018-03-01
A quantitative description of aerobic waste degradation is important in evaluating landfill waste stability and economic management. This research aimed to develop a coupling model to predict the degree of aerobic waste degradation. On the basis of the first-order kinetic equation and the law of conservation of mass, we first developed the coupling model of aerobic waste degradation that considered temperature, initial moisture content and air injection volume to simulate and predict the chemical oxygen demand in the leachate. Three different laboratory experiments on aerobic waste degradation were simulated to test the model applicability. Parameter sensitivity analyses were conducted to evaluate the reliability of parameters. The coupling model can simulate aerobic waste degradation, and the obtained simulation agreed with the corresponding results of the experiment. Comparison of the experiment and simulation demonstrated that the coupling model is a new approach to predict aerobic waste degradation and can be considered as the basis for selecting the economic air injection volume and appropriate management in the future.
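The first-order kinetic core of such a coupling model can be sketched as follows; the base rate constant and the multiplicative corrections for temperature, initial moisture content, and air injection volume are illustrative assumptions rather than the calibrated relationships of the paper.

```python
import numpy as np

# Sketch of a first-order degradation model for leachate COD with simple
# multiplicative corrections for temperature, initial moisture content and
# air injection volume. The base rate and the correction functions are
# illustrative assumptions, not the calibrated coupling model of the paper.

def cod_timecourse(cod0, k_base, temp_c, moisture, air_m3_per_kg, days):
    f_T = np.exp(0.06 * (temp_c - 35.0))          # faster near an assumed thermal optimum
    f_w = moisture / 0.6                          # relative to an assumed 60% optimum
    f_air = min(1.0, air_m3_per_kg / 0.05)        # saturates once aeration is sufficient
    k = k_base * f_T * f_w * f_air                # effective first-order rate (1/day)
    t = np.arange(days + 1)
    return t, cod0 * np.exp(-k * t)               # COD(t) = COD0 * exp(-k t)

t, cod = cod_timecourse(cod0=20000.0, k_base=0.05, temp_c=45.0,
                        moisture=0.55, air_m3_per_kg=0.04, days=60)
print(f"COD after 60 days: {cod[-1]:.0f} mg/L (from {cod[0]:.0f} mg/L)")
```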
NASA Astrophysics Data System (ADS)
Estima, Jacinto Paulo Simoes
Traditional geographic information has been produced by mapping agencies and corporations, using highly skilled people as well as expensive precision equipment and procedures, in a very costly approach. The production of land use and land cover databases is just one example of such a traditional approach. On the other hand, the amount of geographic information created and shared by citizens through the Web has been increasing exponentially during the last decade, resulting from the emergence and popularization of technologies such as Web 2.0, cloud computing, GPS, and smart phones. Such a comprehensive amount of free geographic data may contain valuable information to extract, opening great possibilities to significantly improve the production of land use and land cover databases. In this thesis we explored the feasibility of using geographic data from different user-generated spatial content initiatives in the process of land use and land cover database production. Data from Panoramio, Flickr and OpenStreetMap were explored in terms of their spatial and temporal distribution, and their distribution over the different land use and land cover classes. We then proposed a conceptual model to integrate data from suitable user-generated spatial content initiatives based on identified dissimilarities among a comprehensive list of initiatives. Finally, we developed a prototype implementing the proposed integration model, which was then validated by using the prototype to solve four identified use cases. We concluded that data from user-generated spatial content initiatives have great value but should be integrated to increase their potential. The possibility of integrating data from such initiatives into an integration model was demonstrated, and the prototype further showed the relevance of the model for different use cases.
Goethe and the ABC model of flower development.
Coen, E
2001-06-01
About 10 years ago, the ABC model for the genetic control of flower development was proposed. This model was initially based on the analysis of mutant flowers but has subsequently been confirmed by molecular analysis. This paper describes the 200-year history behind this model, from the late 18th century when Goethe arrived at his idea of plant metamorphosis, to the genetic studies on flower mutants carried out on Arabidopsis and Antirrhinum in the late 20th century.
An adverse outcome pathway (AOP) conceptually links a molecular initiating event with measureable key events at higher levels of biological organization that ultimately result in an adverse outcome. Development of an AOP requires experimental data and scientific expertise to ide...
A Statewide System To Track Medical Students' Careers: The Pennsylvania Model.
ERIC Educational Resources Information Center
Rabinowitz, Howard K.; Veloski, J. Jon; Aber, Robert C.; Adler, Sheldon; Ferretti, Sylvia M.; Kelliher, Gerald J.; Mochen, Eugene; Morrison, Gail; Rattner, Susan L.; Sterling, Gerald; Robeson, Mary R.; Hojat, Mohammadreza; Xu, Gang
1999-01-01
Pennsylvania developed a generalist physician initiative, inspired by that of the Robert Wood Johnson Foundation, and established a longitudinal tracking system at six allopathic and two osteopathic medical schools to follow students from matriculation into their professional careers. The statewide database includes information on over 18,000 students,…
EXPERIMENTAL AND MODELING SUPPORT TO OPPT AND ORIA UNDER THE "BUY CLEAN" INITIATIVE
Under the Buy Clean program, EPA is responsible for developing and disseminating guidance that will assist users in procuring lower-emitting products and devices aimed at reducing indoor occupants' exposure. The initial focus has been to assist school personnel minimi...
How Important Is Child-Initiated Activity?
ERIC Educational Resources Information Center
Schweinhart, Lawrence J.
1988-01-01
Recent research (including a longitudinal study comparing direct instruction, nursery school, and High/Scope models) has shown that the child-initiated approach outstrips teacher-directed activity in fostering some aspects of social development. Adolescents from the High/Scope and nursery school groups had half as many delinquencies as the direct…
The Association of Family Influence and Initial Interest in Science
ERIC Educational Resources Information Center
Dabney, Katherine P.; Chakraverty, Devasmita; Tai, Robert H.
2013-01-01
With recent attention to improving scientific workforce development and student achievement, there has been a rise in effort to understand and encourage student engagement in physical science. This study examines the association of family influence and initial interest in science through multiple and logistic regression models. Research questions…
Using Geographic Information Systems to Evaluate Energy Initiatives in Austere Environments
2013-03-01
...conducting economic analysis of energy reduction initiatives. This research examined the energy savings potential of improving the thermal properties of... shelter improvements in any climate and location in the world. Specifically, solar flies developed through the Solar Integrated Power Shelter System...
Yang, P C; Zhang, S X; Sun, P P; Cai, Y L; Lin, Y; Zou, Y H
2017-07-10
Objective: To construct Markov models that reflect the reality of prevention and treatment interventions against hepatitis B virus (HBV) infection, simulate the natural history of HBV infection in different age groups, and provide evidence for economic evaluations of hepatitis B vaccination and population-based antiviral treatment in China. Methods: According to the theory and techniques of Markov chains, Markov models of the Chinese HBV epidemic were developed based on national data and related literature both at home and abroad, including the settings of Markov model states, allowable transitions, and initial and transition probabilities. The model construction, operation and verification were conducted using the software TreeAge Pro 2015. Results: Several types of Markov models were constructed to describe the disease progression of HBV infection acquired in the neonatal period, perinatal period or adulthood, the progression of chronic hepatitis B after antiviral therapy, hepatitis B prevention and control in adults, chronic hepatitis B antiviral treatment, and the natural progression of chronic hepatitis B in the general population. The model for newborns was fundamental and included ten states, i.e., susceptibility to HBV, HBsAg clearance, immune tolerance, immune clearance, low replication, HBeAg-negative CHB, compensated cirrhosis, decompensated cirrhosis, hepatocellular carcinoma (HCC) and death. The susceptible state to HBV was excluded in the perinatal-period model, and the immune tolerance state was excluded in the adulthood model. The model for the general population included only two states, survival and death. Among the 5 types of models, there were 9 initial states assigned with initial probabilities, and 27 states with transition probabilities. The results of model verification showed that the probability curves were basically consistent with the HBV epidemic situation in China. Conclusion: The Markov models developed can be used in economic evaluations of hepatitis B vaccination and treatment for the elimination of HBV infection in China, although the model structures and parameters carry uncertainty and are dynamic in nature.
Review of Initiatives for Increasing Enlisted Reenlistment in the U.S. Army
2009-11-01
The U.S. Army Research Institute for the Behavioral and Social Sciences (ARI) initiated a three-year research program titled "STAY...Strategies to Enhance Retention." The goals of this effort were to develop and test a conceptual model of the career continuance process, and to...
Latifi, Rifat; Merrell, Ronald C; Doarn, Charles R; Hadeed, George J; Bekteshi, Flamur; Lecaj, Ismet; Boucha, Kathe; Hajdari, Fatmir; Hoxha, Astrit; Koshi, Dashurije; de Leonni Stanonik, Mateja; Berisha, Blerim; Novoberdaliu, Kadri; Imeri, Arben; Weinstein, Ronald S
2009-12-01
Establishing sustainable telemedicine has become a goal of many developing countries around the world. Yet, despite initiatives from a select few individuals and, on occasion, from various governments, these initiatives often never mature into sustainable programs. The introduction of telemedicine and e-learning in Kosova has been a pivotal step in advancing the quality and availability of medical services in a region whose infrastructure and resources have been decimated by wars, neglect, lack of funding, and poor management. The concept and establishment of the International Virtual e-Hospital (IVeH) have significantly impacted telemedicine and e-health services in the Balkans. The success of the IVeH in Kosova has led to the development of similar programs in other Balkan countries and other developing countries in the hope of modernizing and improving their healthcare infrastructure. A comprehensive, four-pronged strategy, "Initiate-Build-Operate-Transfer" (IBOT), may be a useful approach in establishing telemedicine and e-health educational services in developing countries. The development strategy, IBOT, used by the IVeH to establish and develop telemedicine programs, is discussed. IBOT includes assessment of the healthcare needs of each country, the development of a curriculum and education program, the establishment of a nationwide telemedicine network, and the integration of the telemedicine program into the healthcare infrastructure. The endpoint is the transfer of a sustainable telehealth program to the nation involved. By applying IBOT, a sustainable telemedicine program for Kosova has been established as an effective prototype for telemedicine in the Balkans. Once fully matured, the program will be transitioned to the national Ministry of Health, which ensures the sustainability and ownership of the program. Similar programs are being established in Albania, Macedonia, and other countries around the world. The IBOT model has been effective in creating sustainable telemedicine and e-health integrated programs in the Balkans and may be a good model for establishing such programs in developing countries.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rosenbaum, Ralph K.; Bachmann, Till M.; Swirsky Gold, Lois
2008-02-03
Background, Aim and Scope. In 2005 a comprehensive comparison of LCIA toxicity characterisation models was initiated by the UNEP-SETAC Life Cycle Initiative, directly involving the model developers of CalTOX, IMPACT 2002, USES-LCA, BETR, EDIP, WATSON, and EcoSense. In this paper we describe this model-comparison process and its results--in particular the scientific consensus model developed by the model developers. The main objectives of this effort were (i) to identify specific sources of differences between the models' results and structure, (ii) to detect the indispensable model components, and (iii) to build a scientific consensus model from them, representing recommended practice. Methods. A chemical test set of 45 organics covering a wide range of property combinations was selected for this purpose. All models used this set. In three workshops, the model comparison participants identified key fate, exposure and effect issues via comparison of the final characterisation factors and selected intermediate outputs for fate, human exposure and toxic effects for the test set applied to all models. Results. Through this process, we were able to reduce inter-model variation from an initial range of up to 13 orders of magnitude down to no more than 2 orders of magnitude for any substance. This led to the development of USEtox, a scientific consensus model that contains only the most influential model elements. These were, for example, process formulations accounting for intermittent rain, defining a closed or open system environment, or nesting an urban box in a continental box. Discussion. The precision of the new characterisation factors (CFs) is within a factor of 100-1000 for human health and 10-100 for freshwater ecotoxicity of all other models compared to 12 orders of magnitude variation between the CFs of each model respectively. The achieved reduction of inter-model variability by up to 11 orders of magnitude is a significant improvement. Conclusions. USEtox provides a parsimonious and transparent tool for human health and ecosystem CF estimates. Based on a referenced database, it has now been used to calculate CFs for several thousand substances and forms the basis of the recommendations from UNEP-SETAC's Life Cycle Initiative regarding characterization of toxic impacts in Life Cycle Assessment. Recommendations and Perspectives. We provide both recommended and interim (not recommended and to be used with caution) characterisation factors for human health and freshwater ecotoxicity impacts. After a process of consensus building among stakeholders on a broad scale as well as several improvements regarding a wider and easier applicability of the model, USEtox will become available to practitioners for the calculation of further CFs.
Holz, Frank G; Korobelnik, Jean-François; Lanzetta, Paolo; Mitchell, Paul; Schmidt-Erfurth, Ursula; Wolf, Sebastian; Markabi, Sabri; Schmidli, Heinz; Weichselberger, Andreas
2010-01-01
Differences in treatment responses to ranibizumab injections observed within trials involving monthly (MARINA and ANCHOR studies) and quarterly (PIER study) treatment suggest that an individualized treatment regimen may be effective in neovascular age-related macular degeneration. In the present study, a drug and disease model was used to evaluate the impact of an individualized, flexible treatment regimen on disease progression. For visual acuity (VA), a model was developed on the 12-month data from ANCHOR, MARINA, and PIER. Data from untreated patients were used to model patient-specific disease progression in terms of VA loss. Data from treated patients from the period after the three initial injections were used to model the effect of predicted ranibizumab vitreous concentration on VA loss. The model was checked by comparing simulations of VA outcomes after monthly and quarterly injections during this period with trial data. A flexible VA-guided regimen (after the three initial injections) in which treatment is initiated by loss of >5 letters from best previously observed VA scores was simulated. Simulated monthly and quarterly VA-guided regimens showed good agreement with trial data. Simulation of VA-driven individualized treatment suggests that this regimen, on average, sustains the initial gains in VA seen in clinical trials at month 3. The model predicted that, on average, to maintain initial VA gains, an estimated 5.1 ranibizumab injections are needed during the 9 months after the three initial monthly injections, which amounts to a total of 8.1 injections during the first year. A flexible, individualized VA-guided regimen after the three initial injections may sustain vision improvement with ranibizumab and could improve cost-effectiveness and convenience and reduce drug administration-associated risks.
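The retreatment rule itself (after three initial monthly injections, re-inject whenever VA falls more than 5 letters below the best previously observed score) can be sketched as a simple simulation loop. The monthly VA dynamics used here, a steady untreated decline and a temporary benefit after each injection, are crude placeholders and not the published drug and disease model.

```python
import numpy as np

# Sketch of the flexible VA-guided retreatment rule: three loading doses, then
# re-inject whenever VA falls more than 5 letters below the best previously
# observed score. The monthly VA dynamics (fixed untreated decline, ~2 months
# of benefit per injection) are crude placeholders, not the published model.

rng = np.random.default_rng(0)
va = 55.0                       # baseline letters (illustrative)
best = va
injections = 0
effect = 0.0                    # remaining months of drug benefit

for month in range(12):
    if month < 3 or (best - va) > 5.0:   # loading phase, then PRN on >5-letter loss
        injections += 1
        effect = 2.0                     # assumed ~2 months of benefit per injection
    gain = 3.0 if effect > 0 else -2.0   # letters gained/lost this month (illustrative)
    va = min(85.0, va + gain + rng.normal(0.0, 1.0))
    effect = max(0.0, effect - 1.0)
    best = max(best, va)

print(f"injections in year 1: {injections}, final VA: {va:.0f} letters")
```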
Seng, Bunrith; Kaneko, Hidehiro; Hirayama, Kimiaki; Katayama-Hirayama, Keiko
2012-01-01
This paper presents a mathematical model of vertical water movement and a performance evaluation of the model in static pile composting operated with neither air supply nor turning. The vertical moisture content (MC) model was developed with consideration of evaporation (internal and external evaporation), diffusion (liquid and vapour diffusion) and percolation, whereas additional water from substrate decomposition and irrigation was not taken into account. The evaporation term in the model was established on the basis of reference evaporation of the materials at known temperature, MC and relative humidity of the air. Diffusion of water vapour was estimated as a function of relative humidity and temperature, whereas diffusion of liquid water was obtained empirically from experiment by adopting Fick's law. Percolation was estimated by following Darcy's law. The model was applied to a column of composting wood chips with an initial MC of 60%. The simulation program was run for four weeks with a calculation time step of 1 s. The simulated results were in reasonably good agreement with the experimental results. Only a top layer (less than 20 cm) had a considerable MC reduction; the deeper layers were comparable to the initial MC, and the bottom layer was higher than the initial MC. This model is a useful tool to estimate the MC profile throughout the composting period, and could be incorporated into biodegradation kinetic simulations of composting.
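A minimal one-dimensional sketch of the model's ingredients, Fickian liquid diffusion, Darcy-like downward percolation, and evaporation from the top layer, is shown below. The 60% initial MC follows the study setup, while the column depth and all transport coefficients are assumed values.

```python
import numpy as np

# 1-D explicit finite-difference sketch of vertical moisture movement in a
# static compost pile: liquid diffusion (Fick's law), downward percolation
# (Darcy-like flux) and evaporation from the top layer. Coefficients and the
# column depth are illustrative assumptions; only the 60% initial MC follows the study.

nz, depth = 50, 1.0                      # 1 m column split into 50 layers (top = index 0)
dz = depth / nz
D = 1e-7                                 # effective liquid diffusivity, m^2/s (assumed)
v = 2e-8                                 # percolation velocity, m/s (assumed)
E = 1e-7                                 # surface evaporation rate, 1/s (assumed)
dt = 0.4 * dz**2 / D                     # stable explicit step for the diffusion term

mc = np.full(nz, 0.60)                   # initial moisture content of 60%
for _ in range(int(4 * 7 * 86400 / dt)): # four weeks of simulated composting
    # Fickian diffusion with zero-flux boundaries
    lap = np.empty(nz)
    lap[1:-1] = (mc[2:] - 2 * mc[1:-1] + mc[:-2]) / dz**2
    lap[0] = (mc[1] - mc[0]) / dz**2
    lap[-1] = (mc[-2] - mc[-1]) / dz**2
    # Darcy-like percolation: downward flux at internal faces, sealed top and base
    face = np.zeros(nz + 1)
    face[1:nz] = v * mc[:-1]
    mc += dt * (D * lap + (face[:-1] - face[1:]) / dz)
    mc[0] -= dt * E * mc[0]              # evaporative loss from the top layer
    mc = np.clip(mc, 0.0, 0.8)

print("MC after 4 weeks  top: %.3f  middle: %.3f  bottom: %.3f"
      % (mc[0], mc[nz // 2], mc[-1]))
```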
Competency Development and Career Success: The Mediating Role of Employability
ERIC Educational Resources Information Center
De Vos, Ans; De Hauw, Sara; Van der Heijden, Beatrice I. J. M.
2011-01-01
The present study aims to unravel the relationship between competency development, employability and career success. To do so, we tested a model wherein associations between employee participation in competency development initiatives, perceived support for competency development, self-perceived employability, and two indicators of subjective…
A risk assessment method for multi-site damage
NASA Astrophysics Data System (ADS)
Millwater, Harry Russell, Jr.
This research focused on developing probabilistic methods suitable for computing small probabilities of failure, e.g., 10^{-6}, of structures subject to multi-site damage (MSD). MSD is defined as the simultaneous development of fatigue cracks at multiple sites in the same structural element such that the fatigue cracks may coalesce to form one large crack. MSD is modeled as an array of collinear cracks with random initial crack lengths with the centers of the initial cracks spaced uniformly apart. The data used was chosen to be representative of aluminum structures. The structure is considered failed whenever any two adjacent cracks link up. A fatigue computer model is developed that can accurately and efficiently grow a collinear array of arbitrary length cracks from initial size until failure. An algorithm is developed to compute the stress intensity factors of all cracks considering all interaction effects. The probability of failure of two to 100 cracks is studied. Lower bounds on the probability of failure are developed based upon the probability of the largest crack exceeding a critical crack size. The critical crack size is based on the initial crack size that will grow across the ligament when the neighboring crack has zero length. The probability is evaluated using extreme value theory. An upper bound is based on the probability of the maximum sum of initial cracks being greater than a critical crack size. A weakest link sampling approach is developed that can accurately and efficiently compute small probabilities of failure. This methodology is based on predicting the weakest link, i.e., the two cracks to link up first, for a realization of initial crack sizes, and computing the cycles-to-failure using these two cracks. Criteria to determine the weakest link are discussed. Probability results using the weakest link sampling method are compared to Monte Carlo-based benchmark results. The results indicate that very small probabilities can be computed accurately in a few minutes using a Hewlett-Packard workstation.
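The upper-bound idea described above, failure when the maximum sum of two adjacent initial crack lengths exceeds a critical size, lends itself to a direct Monte Carlo estimate. The lognormal crack-length statistics, the number of crack sites, and the critical size in this sketch are illustrative assumptions, not the aluminum-structure data of the study.

```python
import numpy as np

# Monte Carlo sketch of the upper-bound criterion described above: the structure
# is counted as failed if the maximum sum of two adjacent initial crack lengths
# exceeds a critical size. The lognormal crack-size statistics and the critical
# size are illustrative assumptions, not the study's data.

rng = np.random.default_rng(1)
n_cracks = 20                 # collinear crack sites, centres uniformly spaced
a_crit = 3.0                  # critical combined crack length (mm), assumed
mu, sigma = np.log(0.5), 0.5  # lognormal initial crack-length parameters (mm), assumed
n_trials = 200_000

a = rng.lognormal(mu, sigma, size=(n_trials, n_cracks))
adjacent_sum = a[:, :-1] + a[:, 1:]               # sums of neighbouring initial cracks
p_upper = np.mean(adjacent_sum.max(axis=1) > a_crit)
print(f"upper-bound probability of failure: {p_upper:.2e}")
```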
National facilities study. Volume 3: Mission and requirements model report
NASA Technical Reports Server (NTRS)
1994-01-01
The National Facility Study (NFS) was initiated in 1992 by Daniel S. Goldin, Administrator of NASA as an initiative to develop a comprehensive and integrated long-term plan for future facilities. The resulting, multi-agency NFS consisted of three Task Groups: Aeronautics, Space Operations, and Space Research and Development (R&D) Task Groups. A fourth group, the Engineering and Cost Analysis Task Group, was subsequently added to provide cross-cutting functions, such as assuring consistency in developing an inventory of space facilities. Space facilities decisions require an assessment of current and future needs. Therefore, the two task groups dealing with space developed a consistent model of future space mission programs, operations and R&D. The model is a middle ground baseline constructed for NFS analytical purposes with excursions to cover potential space program strategies. The model includes three major sectors: DOD, civilian government, and commercial space. The model spans the next 30 years because of the long lead times associated with facilities development and usage. This document, Volume 3 of the final NFS report, is organized along the following lines: Executive Summary -- provides a summary view of the 30-year mission forecast and requirements baseline, an overview of excursions from that baseline that were studied, and organization of the report; Introduction -- provides discussions of the methodology used in this analysis; Baseline Model -- provides the mission and requirements model baseline developed for Space Operations and Space R&D analyses; Excursions from the baseline -- reviews the details of variations or 'excursions' that were developed to test the future program projections captured in the baseline; and a Glossary of Acronyms.
Jordanian Pre-Service Teachers' and Technology Integration: A Human Resource Development Approach
ERIC Educational Resources Information Center
Al-Ruz, Jamal Abu; Khasawneh, Samer
2011-01-01
The purpose of this study was to test a model in which technology integration of pre-service teachers was predicted by a number of university-based and school-based factors. Initially, factors affecting technology integration were identified, and a research-based path model was developed to explain causal relationships between these factors. The…
Leading Change: A Case Study of Alamo Academies--An Industry-Driven Workforce Partnership Program
ERIC Educational Resources Information Center
Hu, Xiaodan; Bowman, Gene
2016-01-01
In this study, the authors focus on the initiation and development of the Alamo Academies, aiming to illustrate an exemplary industry-driven model that addresses workforce development in local community. After a brief introduction of the context, the authors summarized major factors that contribute to the success of the collaboration model,…
Leadership Education Is Not Enough: Advancing an Integrated Model of Student-Athlete Development
ERIC Educational Resources Information Center
DiPaolo, Donald G.
2017-01-01
This article advocates a new approach to how we work with the millions of student-athletes in schools by examining a more holistic model of player development. Rather than assisting students in separate silos and initiatives, the argument is made for integrating the areas of leadership education, performance psychology, and personal development…
Staff Study on Cost and Training Effectiveness of Proposed Training Systems. TAEG Report 1.
ERIC Educational Resources Information Center
Naval Training Equipment Center, Orlando, FL. Training Analysis and Evaluation Group.
A study began the development and initial testing of a method for predicting cost and training effectiveness of proposed training programs. A prototype Training Effectiveness and Cost Effectiveness Prediction (TECEP) model was developed and tested. The model was a method for optimization of training media allocation on the basis of fixed training…
ERIC Educational Resources Information Center
Deemer, Eric D.; Martens, Matthew P.; Buboltz, Walter C.
2010-01-01
An instrument designed to measure a 3-factor model of research motivation was developed and psychometrically examined in the present research. Participants were 437 graduate students in biology, chemistry/biochemistry, physics/astronomy, and psychology. A principal components analysis supported the retention of 20 items representing the 3-factor…
ERIC Educational Resources Information Center
Hammond, Cathy; Drew, Sam F.; Withington, Cairen; Griffith, Cathy; Swiger, Caroline M.; Mobley, Catherine; Sharp, Julia L.; Stringfield, Samuel C.; Stipanovic, Natalie; Daugherty, Lindsay
2013-01-01
This executive summary outlines key findings from the final technical report of a five-year study of South Carolina's Personal Pathways to Success Initiative, which was authorized by the state's Education and Economic Development Act (EEDA) in 2005. The Personal Pathways initiative is a K-16, career-focused school reform model intended to improve…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hu, R.
This report documents the initial progress on the reduced-order flow model developments in SAM for thermal stratification and mixing modeling. Two different modeling approaches are pursued. The first one is based on one-dimensional fluid equations with additional terms accounting for the thermal mixing from both flow circulations and turbulent mixing. The second approach is based on three-dimensional coarse-grid CFD approach, in which the full three-dimensional fluid conservation equations are modeled with closure models to account for the effects of turbulence.
ERIC Educational Resources Information Center
Powell, Justin J. W.; Bernhard, Nadine; Graf, Lukas
2012-01-01
Proposing an alternative to the American model, intergovernmental reform initiatives in Europe have developed and promote a comprehensive European model of skill formation. What ideals, standards, and governance are proposed in this new pan-European model? This model responds to heightened global competition among "knowledge societies"…
Teaching Mathematical Modelling: Demonstrating Enrichment and Elaboration
ERIC Educational Resources Information Center
Warwick, Jon
2015-01-01
This paper uses a series of models to illustrate one of the fundamental processes of model building--that of enrichment and elaboration. The paper describes how a problem context is given which allows a series of models to be developed from a simple initial model using a queuing theory framework. The process encourages students to think about the…
Colon Cancer Tumorigenesis Initiated by the H1047R Mutant PI3K.
Yueh, Alexander E; Payne, Susan N; Leystra, Alyssa A; Van De Hey, Dana R; Foley, Tyler M; Pasch, Cheri A; Clipson, Linda; Matkowskyj, Kristina A; Deming, Dustin A
2016-01-01
The phosphoinositide 3-kinase (PI3K) signaling pathway is critical for multiple important cellular functions, and is one of the most commonly altered pathways in human cancers. We previously developed a mouse model in which colon cancers were initiated by a dominant active PI3K p110-p85 fusion protein. In that model, well-differentiated mucinous adenocarcinomas developed within the colon and initiated through a non-canonical mechanism that is not dependent on WNT signaling. To assess the potential relevance of PI3K mutations in human cancers, we sought to determine if one of the common mutations in the human disease could also initiate similar colon cancers. Mice were generated expressing the Pik3caH1047R mutation, the analog of one of three human hotspot mutations in this gene. Mice expressing a constitutively active PI3K, as a result of this mutation, develop invasive adenocarcinomas strikingly similar to invasive adenocarcinomas found in human colon cancers. These tumors form without a polypoid intermediary and also lack nuclear CTNNB1 (β-catenin), indicating a non-canonical mechanism of tumor initiation mediated by the PI3K pathway. These cancers are sensitive to dual PI3K/mTOR inhibition indicating dependence on the PI3K pathway. The tumor tissue remaining after treatment demonstrated reduction in cellular proliferation and inhibition of PI3K signaling.
Long-Boyle, Janel; Savic, Rada; Yan, Shirley; Bartelink, Imke; Musick, Lisa; French, Deborah; Law, Jason; Horn, Biljana; Cowan, Morton J.; Dvorak, Christopher C.
2014-01-01
Background: Population pharmacokinetic (PK) studies of busulfan in children have shown that individualized model-based algorithms provide improved targeted busulfan therapy when compared to conventional dosing. The adoption of population PK models into routine clinical practice has been hampered by the tendency of pharmacologists to develop complex models too impractical for clinicians to use. The authors aimed to develop a population PK model for busulfan in children that can reliably achieve therapeutic exposure (concentration-at-steady-state, Css) and implement a simple, model-based tool for the initial dosing of busulfan in children undergoing HCT. Patients and Methods: Model development was conducted using retrospective data available in 90 pediatric and young adult patients who had undergone HCT with busulfan conditioning. Busulfan drug levels and potential covariates influencing drug exposure were analyzed using the non-linear mixed effects modeling software, NONMEM. The final population PK model was implemented into a clinician-friendly, Microsoft Excel-based tool and used to recommend initial doses of busulfan in a group of 21 pediatric patients prospectively dosed based on the population PK model. Results: Modeling of busulfan time-concentration data indicates busulfan CL displays non-linearity in children, decreasing up to approximately 20% between the concentrations of 250–2000 ng/mL. Important patient-specific covariates found to significantly impact busulfan CL were actual body weight and age. The percentage of individuals achieving a therapeutic Css was significantly higher in subjects receiving initial doses based on the population PK model (81%) versus historical controls dosed on conventional guidelines (52%) (p = 0.02). Conclusion: When compared to the conventional dosing guidelines, the model-based algorithm demonstrates significant improvement for providing targeted busulfan therapy in children and young adults. PMID:25162216
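The dosing logic of such a clinician-facing tool can be sketched as: predict clearance from body weight and age, then set dose = target Css × CL × dosing interval. The allometric coefficients below are placeholders, the concentration dependence (non-linearity) of busulfan clearance reported above is deliberately ignored, and the printed output is illustrative only, not dosing guidance.

```python
# Sketch of model-based initial dosing: predict clearance from body weight and
# age, then choose the dose that targets a desired steady-state concentration
# (dose = Css_target * CL * dosing interval). All coefficients are placeholders,
# the reported concentration dependence of clearance is ignored, and the output
# is illustrative only, not a clinical recommendation.

def predicted_clearance(weight_kg: float, age_yr: float) -> float:
    """Illustrative allometric clearance in L/h (not the published covariate model)."""
    cl_typical = 12.0                                 # assumed typical value for a 70-kg adult
    maturation = age_yr / (age_yr + 1.0)              # crude age effect, assumed
    return cl_typical * (weight_kg / 70.0) ** 0.75 * maturation

def initial_dose_mg(weight_kg: float, age_yr: float,
                    css_target_ng_ml: float = 900.0, tau_h: float = 6.0) -> float:
    cl_l_h = predicted_clearance(weight_kg, age_yr)
    # Css (ng/mL == ug/L) * CL (L/h) * tau (h) gives ug per dose; convert to mg
    return css_target_ng_ml * cl_l_h * tau_h / 1000.0

print(f"illustrative first dose: {initial_dose_mg(weight_kg=20.0, age_yr=5.0):.1f} mg every 6 h")
```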
DOT National Transportation Integrated Search
2006-06-01
The New Hampshire Department of Transportation initiated this research to develop a geographical information system (GIS) that visualizes subsurface conditions three-dimensionally by pulling together geotechnical data containing spatial references....
The WRF-CMAQ Integrated On-Line Modeling System: Development, Testing, and Initial Applications
Traditionally, atmospheric chemistry-transport and meteorology models have been applied in an off-line paradigm, in which archived output on the dynamical state of the atmosphere simulated using the meteorology model is used to drive transport and chemistry calculations of atmos...
Mathematical modeling of a Ti:sapphire solid-state laser
NASA Technical Reports Server (NTRS)
Swetits, John J.
1987-01-01
The project initiated a study of a mathematical model of a tunable Ti:sapphire solid-state laser. A general mathematical model was developed for the purpose of identifying design parameters which will optimize the system, and serve as a useful predictor of the system's behavior.
More than two decades of Apc modeling in rodents
Zeineldin, Maged; Neufeld, Kristi L.
2013-01-01
Mutation of tumor suppressor gene Adenomatous polyposis coli (APC) is an initiating step in most colon cancers. This review summarizes Apc models in mice and rats, with particular concentration on those most recently developed, phenotypic variation among different models, and genotype/ phenotype correlations. PMID:23333833
An important challenge for an integrative approach to developmental systems toxicology is associating putative molecular initiating events (MIEs), cell signaling pathways, cell function and modeled fetal exposure kinetics. We have developed a chemical classification model based o...
Global and regional ecosystem modeling: comparison of model outputs and field measurements
NASA Astrophysics Data System (ADS)
Olson, R. J.; Hibbard, K.
2003-04-01
The Ecosystem Model-Data Intercomparison (EMDI) Workshops provide a venue for global ecosystem modeling groups to compare model outputs against measurements of net primary productivity (NPP). The objective of EMDI Workshops is to evaluate model performance relative to observations in order to improve confidence in global model projections of terrestrial carbon cycling. The questions addressed by EMDI include: How does the simulated NPP compare with the field data across biome and environmental gradients? How sensitive are models to site-specific climate? Does additional mechanistic detail in models result in a better match with field measurements? How useful are the measures of NPP for evaluating model predictions? How well do models represent regional patterns of NPP? Initial EMDI results showed general agreement between model predictions and field measurements but with obvious differences that indicated areas for potential data and model improvement. The effort was built on the development and compilation of complete and consistent databases for model initialization and comparison. Database development improves the data as well as the models; however, there is a need to incorporate additional observations and model outputs (LAI, hydrology, etc.) for comprehensive analyses of biogeochemical processes and their relationships to ecosystem structure and function. EMDI initialization and NPP data sets are available from the Oak Ridge National Laboratory Distributed Active Archive Center http://www.daac.ornl.gov/. Acknowledgements: This work was partially supported by the International Geosphere-Biosphere Programme - Data and Information System (IGBP-DIS); the IGBP-Global Analysis, Interpretation and Modelling Task Force (GAIM); the National Center for Ecological Analysis and Synthesis (NCEAS); and the National Aeronautics and Space Administration (NASA) Terrestrial Ecosystem Program. Oak Ridge National Laboratory is managed by UT-Battelle LLC for the U.S. Department of Energy under contract DE-AC05-00OR22725.
Ganguly, Mohit; Miller, Stephanie; Mitra, Kunal
2015-11-01
Short pulse lasers with pulse durations in the range of nanoseconds and shorter are effective in the targeted delivery of heat energy for precise tissue heating and ablation. This photothermal therapy is useful where the removal of cancerous tissue sections is required. The objective of this paper is to use finite element modeling to demonstrate the differences in the thermal response of skin tissue to short-pulse and continuous wave laser irradiation in the initial stages of the irradiation. Models have been developed to validate the temperature distribution and heat affected zone during laser irradiation of excised rat skin samples and live anesthetized mouse tissue. Excised rat skin samples and live anesthetized mice were subjected to Nd:YAG pulsed laser (1,064 nm, 500 ns) irradiation of varying powers. A thermal camera was used to measure the rise in surface temperature as a result of the laser irradiation. Histological analyses of the heat affected zone created in the tissue samples due to the temperature rise were performed. The thermal interaction of the laser with the tissue was quantified by measuring the thermal dose delivered by the laser. Finite element geometries of three-dimensional tissue sections for continuum and vascular models were developed using COMSOL Multiphysics. Blood flow was incorporated into the vascular model to mimic the presence of discrete blood vessels and contrasted with the continuum model without blood perfusion. The temperature rises predicted by the continuum and the vascular models agreed with the temperature rises observed at the surface of the excised rat tissue samples and live anesthetized mice due to laser irradiation respectively. The vascular model developed was able to predict the cooling produced by the blood vessels in the region where the vessels were present. The temperature rise in the continuum model due to pulsed laser irradiation was higher than that due to continuous wave (CW) laser irradiation in the initial stages of the irradiation. The temperature rise due to pulsed and CW laser irradiation converged as the time of irradiation increased. A similar trend was observed when comparing the thermal dose for pulsed and CW laser irradiation in the vascular model. Finite element models (continuum and vascular) were developed that can be used to predict temperature rise and quantify the thermal dose resulting from laser irradiation of excised rat skin samples and live anesthetized mouse tissue. The vascular model incorporating blood perfusion effects predicted temperature rise better in the live animal tissue. The models developed demonstrated that pulsed lasers caused greater temperature rise and delivered a greater thermal dose than CW lasers of equal average power, especially during the initial transients of irradiation. This analysis will be beneficial for thermal therapy applications where maximum delivery of thermal dose over a short period of time is important. © 2015 Wiley Periodicals, Inc.
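A one-dimensional Pennes-type bioheat sketch captures the qualitative contrast reported above between a continuum model without perfusion and a perfused model. The tissue properties, perfusion rate, and surface heat flux are rough assumed values, and the lumped perfusion sink stands in for the discrete vessels of the published three-dimensional COMSOL models.

```python
import numpy as np

# 1-D explicit finite-difference sketch of Pennes-type bioheat transfer under a
# surface laser heat flux, run with and without a blood-perfusion sink to mimic
# the continuum vs. perfused comparison discussed above. Property values are
# rough assumptions; the published work used 3-D COMSOL models with discrete
# vessels rather than this lumped perfusion term.

rho, c, k = 1050.0, 3600.0, 0.5          # tissue density, heat capacity, conductivity (assumed)
w_b, rho_b, c_b = 0.01, 1060.0, 3800.0   # perfusion rate (1/s, locally elevated, assumed), blood properties
T_art, T0 = 37.0, 37.0                   # arterial and initial tissue temperature (C)
q_laser = 5.0e3                          # absorbed surface heat flux, W/m^2 (assumed average power)

nz, depth = 100, 0.01                    # 1 cm of tissue
dz = depth / nz
alpha = k / (rho * c)
dt = 0.4 * dz**2 / alpha                 # stable explicit time step

def simulate(perfused: bool, t_end: float = 20.0) -> float:
    T = np.full(nz, T0)
    for _ in range(int(t_end / dt)):
        lap = np.empty(nz)
        lap[0] = (T[1] - T[0] + q_laser * dz / k) / dz**2   # laser flux enters at the surface
        lap[1:-1] = (T[2:] - 2 * T[1:-1] + T[:-2]) / dz**2
        lap[-1] = (T[-2] - T[-1]) / dz**2                   # insulated deep boundary
        sink = w_b * rho_b * c_b * (T_art - T) / (rho * c) if perfused else 0.0
        T += dt * (alpha * lap + sink)
    return T[0]                                             # surface temperature

print(f"surface T after 20 s, no perfusion:   {simulate(False):.1f} C")
print(f"surface T after 20 s, with perfusion: {simulate(True):.1f} C")
```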
Testing and modeling of PBX-9501 shock initiation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lam, Kim; Foley, Timothy; Novak, Alan
2010-01-01
This paper describes an ongoing effort to develop a detonation sensitivity test for PBX-9501 that is suitable for studying pristine and damaged HE. The approach involves testing and comparing the sensitivities of HE pressed to various densities and those of pre-damaged samples with similar porosities. The ultimate objectives are to understand the response of pre-damaged HE to shock impacts and to develop practical computational models for use in system analysis codes for HE safety studies. Computer simulation with the CTH shock physics code is used to aid the experimental design and analyze the test results. In the calculations, initiation and growth or failure of detonation are modeled with the empirical HVRB model. The historical LANL SSGT and LSGT were reviewed and it was determined that a new, modified gap test should be developed to satisfy the current requirements. In the new test, the donor/spacer/acceptor assembly is placed in a holder that is designed to work with fixtures for pre-damaging the acceptor sample. CTH simulations were made of the gap test with PBX-9501 samples pressed to three different densities. The calculated sensitivities were validated by test observations. The agreement between the computed and experimental critical gap thicknesses, ranging from 9 to 21 mm under various test conditions, is well within 1 mm. These results show that the numerical modeling is a valuable complement to the experimental efforts in studying and understanding shock initiation of PBX-9501.
Large area sheet task: Advanced dendritic web growth development
NASA Technical Reports Server (NTRS)
Duncan, C. S.; Seidensticker, R. G.; Mchugh, J. P.; Hopkins, R. H.; Meier, D.; Schruben, J.
1981-01-01
The growth of silicon dendritic web for photovoltaic applications was investigated. The application of a thermal model for calculating buckling stresses as a function of temperature profile in the web is discussed. Lid and shield concepts were evaluated to provide the data base for enhancing growth velocity. An experimental web growth machine which embodies in one unit the mechanical and electronic features developed in previous work was developed. In addition, evaluation of a melt level control system was begun, along with preliminary tests of an elongated crucible design. The economic analysis was also updated to incorporate some minor cost changes. The initial applications of the thermal model to a specific configuration gave results consistent with experimental observation in terms of the initiation of buckling vs. width for a given crystal thickness.
No Future in the Past? The role of initial topography on landform evolution model predictions
NASA Astrophysics Data System (ADS)
Hancock, G. R.; Coulthard, T. J.; Lowry, J.
2014-12-01
Our understanding of earth surface processes is based on long-term empirical understandings, short-term field measurements as well as numerical models. In particular, numerical landscape evolution models (LEMs) have been developed which have the capability to capture a range of surface (erosion and deposition), tectonic, as well as near surface or critical zone processes (i.e. pedogenesis). These models have a range of applications for understanding both surface and whole-of-landscape dynamics through to more applied situations such as degraded site rehabilitation. LEMs are now at a stage of development where, if calibrated, they can provide some level of reliability. However, these models are largely calibrated with parameters determined from present surface conditions, which are the product of much longer-term geology-soil-climate-vegetation interactions. Here, we assess the effect of the initial landscape dimensions and associated error, as well as parameterisation, for a potential post-mining landform design. The results demonstrate that subtle surface changes in the initial DEM as well as parameterisation can have a large impact on landscape behaviour, erosion depth and sediment discharge. For example, the predicted sediment output from LEMs is shown to be highly variable even with very subtle changes in initial surface conditions. This has two important implications, in that decadal time scale field data are needed to (a) better parameterise models and (b) evaluate their predictions. We first question how a LEM using parameters derived from field plots can be employed to examine long-term landscape evolution. Second, the potential range of outcomes is examined based on estimated temporal parameter change, and third, the need for more detailed and rigorous field data for calibration and validation of these models is discussed.
Porting Initiation and Failure into Linked CHEETAH
NASA Astrophysics Data System (ADS)
Souers, Clark; Vitello, Peter
2007-06-01
Linked CHEETAH is a thermo-chemical code coupled to a 2-D hydrocode. Initially, a quadratic-pressure dependent kinetic rate was used, which worked well in modeling prompt detonation of explosives of large size, but does not work on other aspects of explosive behavior. The variable-pressure Tarantula reactive flow rate model was developed with JWL++ in order to also describe failure and initiation, and we have moved this model into Linked CHEETAH. The model works by turning on only above a pressure threshold, where a slow turn-on creates initiation. At a higher pressure, the rate suddenly leaps to a large value over a small pressure range. A slowly failing cylinder will see a rapidly declining rate, which pushes it quickly into failure. At a high pressure, the detonation rate is constant. A sequential validation procedure is used, which includes metal-confined cylinders, rate-sticks, corner-turning, initiation and threshold, gap tests and air gaps. The size (diameter) effect is central to the calibration. This work was performed under the auspices of the U.S. Department of Energy by the University of California Lawrence Livermore National Laboratory under contract No. W-7405-Eng-48.
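The abstract does not give the Tarantula functional form; the sketch below is only a generic threshold rate law with the qualitative shape described (no reaction below a turn-on pressure, a slow initiation ramp, a steep jump over a narrow pressure band, and a constant rate at detonation pressures). All thresholds and rate constants are invented placeholders, not the published calibration.

    # Schematic pressure-dependent reaction-rate shape of the kind described
    # for the Tarantula model: zero below a threshold, a slow turn-on
    # (initiation), a steep jump over a narrow pressure band, and a constant
    # rate at detonation pressures.  Every coefficient here is an assumption
    # for illustration only.

    def burn_rate(P_GPa,
                  P_on=2.0,      # assumed turn-on threshold (GPa)
                  P_jump=9.0,    # assumed start of the steep ramp (GPa)
                  P_det=11.0,    # assumed pressure where the rate saturates (GPa)
                  r_slow=0.05,   # slow initiation-scale rate (1/us)
                  r_det=3.0):    # constant high-pressure detonation rate (1/us)
        if P_GPa < P_on:
            return 0.0                                   # no reaction below threshold
        if P_GPa < P_jump:                               # gentle turn-on region
            return r_slow * (P_GPa - P_on) / (P_jump - P_on)
        if P_GPa < P_det:                                # steep ramp over a narrow band
            frac = (P_GPa - P_jump) / (P_det - P_jump)
            return r_slow + (r_det - r_slow) * frac**3
        return r_det                                     # constant at detonation

    for P in (1.0, 3.0, 8.0, 10.0, 12.0, 20.0):
        print(f"P = {P:5.1f} GPa  ->  rate = {burn_rate(P):6.3f} 1/us")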
Control of movement initiation underlies the development of balance
Ehrlich, David E.; Schoppik, David
2017-01-01
Balance arises from the interplay of external forces acting on the body and internally generated movements. Many animal bodies are inherently unstable, necessitating corrective locomotion to maintain stability. Understanding how developing animals come to balance remains a challenge. Here we study the interplay between environment, sensation, and action as balance develops in larval zebrafish. We first model the physical forces that challenge underwater balance and experimentally confirm that larvae are subject to constant destabilization. Larvae propel in swim bouts that, we find, tend to stabilize the body. We confirm the relationship between locomotion and balance by changing larval body composition, exacerbating instability and eliciting more frequent swimming. Intriguingly, developing zebrafish come to control the initiation of locomotion, swimming preferentially when unstable, thus restoring preferred postures. To test the sufficiency of locomotor-driven stabilization and the developing control of movement timing, we incorporate both into a generative model of swimming. Simulated larvae recapitulate observed postures and movement timing across early development, but only when locomotor-driven stabilization and control of movement initiation are both utilized. We conclude that the ability to move when unstable is the key developmental improvement to balance in larval zebrafish. Our work informs how emerging sensorimotor ability comes to impact how and why animals move when they do. PMID:28111151
Activin B promotes initiation and development of hair follicles in mice.
Jia, Qin; Zhang, Min; Kong, Yanan; Chen, Shixuan; Chen, Yinghua; Wang, Xueer; Zhang, Lei; Lang, Weiya; Zhang, Lu; Zhang, Lin
2013-01-01
Activin B has been reported to promote the regeneration of hair follicles during wound healing. However, its role in the development and life cycle of hair follicles has not been elucidated. In our study, the effect of activin B on mouse hair follicles of cultured and neonatal mouse skin was investigated. In these models, PBS or activin B (5, 10 or 50 ng/ml) was applied, and hair follicle development was monitored. Hair follicle initiation and development was examined using hematoxylin and eosin staining, alkaline phosphatase activity staining, Oil Red O+ staining, and the detection of TdT-mediated dUTP-biotin nick end-labeling cell apoptosis. Activin B was found to efficiently induce the initiation of hair follicles in the skin of both cultured and neonatal mice and to promote the development of hair follicles in neonatal mouse skin. Moreover, activin-B-treated hair follicles were observed to enter the anagen stage from the telogen stage and to remain in the anagen stage. These results demonstrate that activin B promotes the initiation and development of hair follicles in mice.
Analysis of GALE (Genesis of Atlantic Lows Experiment) Data
1989-12-01
being developed to accurately simulate and study the development of extratropical cyclones, which rapidly develop off the east coast of the U.S. and the...the model for the simulation of GALE storms. SAIC has worked with the NRL staff in the development of initialization schemes, including a vertical...at the 6th Extratropical Cyclone Workshop of the American Meteorological Society in Monterey, CA, June, 1987, entitled "A Model for the Simulation of
NASA Technical Reports Server (NTRS)
Shah, Ashwin
2001-01-01
A literature survey related to EBC/TBC (environmental barrier coating/thermal barrier coating) life models and failure mechanisms in EBC/TBC was completed, the initial work plan for the proposed EBC/TBC life prediction methods development was developed, and the finite element model for the thermal/stress analysis of the GRC-developed EBC system was prepared. A technical report for these activities is given in the subsequent sections.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adam, J. C.; Stephens, J. C.; Chung, Serena
As managers of agricultural and natural resources are confronted with uncertainties in global change impacts, the complexities associated with the interconnected cycling of nitrogen, carbon, and water present daunting management challenges. Existing models provide detailed information on specific sub-systems (land, air, water, economics, etc.). An increasing awareness of the unintended consequences of management decisions resulting from interconnectedness of these sub-systems, however, necessitates coupled regional earth system models (EaSMs). Decision makers’ needs and priorities can be integrated into the model design and development processes to enhance decision-making relevance and "usability" of EaSMs. BioEarth is a current research initiative with a focus on the U.S. Pacific Northwest region that explores the coupling of multiple stand-alone EaSMs to generate usable information for resource decision-making. Direct engagement between model developers and non-academic stakeholders involved in resource and environmental management decisions throughout the model development process is a critical component of this effort. BioEarth utilizes a "bottom-up" approach, upscaling a catchment-scale model to basin and regional scales, as opposed to the "top-down" approach of downscaling global models utilized by most other EaSM efforts. This paper describes the BioEarth initiative and highlights opportunities and challenges associated with coupling multiple stand-alone models to generate usable information for agricultural and natural resource decision-making.
NASA Astrophysics Data System (ADS)
Singh, Nidhi; Avery, Mitchell A.; McCurdy, Christopher R.
2007-09-01
Mycobacterium tuberculosis 1-deoxy-D-xylulose-5-phosphate reductoisomerase (MtDXR) is a potential target for antitubercular chemotherapy. In the absence of its crystallographic structure, our aim was to develop a structural model of MtDXR. This will allow us to gain early insight into the structure and function of the enzyme and its likely binding to ligands and cofactors and thus facilitate structure-based inhibitor design. To achieve this goal, initial models of MtDXR were generated using MODELER. The best-quality model was refined using a series of minimizations and molecular dynamics simulations. A protein-ligand complex was also developed from the initial homology model of the target protein by including information about the known ligand as spatial restraints and optimizing the mutual interactions between the ligand and the binding site. The final model was evaluated on the basis of its ability to explain several site-directed mutagenesis data. Furthermore, a comparison of the homology model with the X-ray structure published in the final stages of the project shows excellent agreement and validates the approach. The knowledge gained from the current study should prove useful in the design and development of inhibitors as potential novel therapeutic agents against tuberculosis by either de novo drug design or virtual screening of large chemical databases.
Sánchez, Lucas; Chaouiya, Claudine
2016-05-26
Primary sex determination in placental mammals is a very well studied developmental process. Here, we aim to investigate the currently established scenario and to assess its adequacy to fully recover the observed phenotypes, in the wild type and perturbed situations. Computational modelling allows clarifying network dynamics, elucidating crucial temporal constraints as well as interplay between core regulatory modules. Relying on a comprehensive review of the literature, we define a logical model that integrates the current knowledge of the regulatory network controlling this developmental process. Our analysis indicates the necessity for some genes to operate at distinct functional thresholds and for specific developmental conditions to ensure the reproducibility of the sexual pathways followed by bi-potential gonads developing into either testes or ovaries. Our model thus allows studying the dynamics of wild type and mutant XX and XY gonads. Furthermore, the model analysis reveals that the gonad sexual fate results from the operation of two sub-networks associated respectively with an initiation and a maintenance phase. At the core of the process is the resolution of two connected feedback loops: the mutual inhibition of Sox9 and β-catenin at the initiation phase, which in turn affects the mutual inhibition between Dmrt1 and Foxl2, at the maintenance phase. Three developmental signals related to the temporal activity of those sub-networks are required: a signal that determines Sry activation, marking the beginning of the initiation phase, and two further signals that define the transition from the initiation to the maintenance phases, by inhibiting the Wnt4 signalling pathway on the one hand, and by activating Foxl2 on the other hand. Our model reproduces a wide range of experimental data reported for the development of wild type and mutant gonads. It also provides formal support for crucial aspects of the gonad sexual development and predicts gonadal phenotypes for mutations not tested yet.
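As a toy illustration of the initiation-phase switch described here, the following Boolean caricature captures only the Sry trigger and the Sox9 versus β-catenin mutual antagonism; it is far coarser than the published multi-valued logical model, and the update rules (including the assumption that the Sry/Sox9 pathway suppresses β-catenin) are simplifications for demonstration.

    # A deliberately tiny Boolean caricature of the initiation-phase switch:
    # Sry activates Sox9, Sox9 and beta-catenin antagonize each other, and the
    # winner commits the bi-potential gonad to the testis or ovary path.
    # These rules are an illustration only, not the published logical model.

    def step(state, sry):
        sox9, bcat = state["Sox9"], state["Bcat"]
        return {
            # turned on by Sry, kept on by itself, blocked by beta-catenin
            "Sox9": (sry or sox9) and not bcat,
            # Wnt4/beta-catenin stays on only if both Sry and Sox9 are off
            # (assumed simplification of the pathway's repression)
            "Bcat": (not sox9) and (not sry),
        }

    def simulate(sry, n_steps=6):
        state = {"Sox9": False, "Bcat": False}    # bi-potential starting point
        for _ in range(n_steps):
            state = step(state, sry)
        return "testis-like" if state["Sox9"] else "ovary-like"

    print("XY (Sry on): ", simulate(sry=True))
    print("XX (Sry off):", simulate(sry=False))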
Ignjatović, Aleksandra; Stojanović, Miodrag; Milošević, Zoran; Anđelković Apostolović, Marija
2017-12-02
The interest in developing risk models in medicine is not only appealing but also associated with many obstacles in different aspects of predictive model development. Initially, the association of biomarkers, or of several markers, with a specific outcome was proven by statistical significance, but novel and demanding questions required the development of new and more complex statistical techniques. Progress of statistical analysis in biomedical research can best be observed through the history of the Framingham study and the development of the Framingham score. Evaluation of predictive models rests on the combined results of several metrics. Using logistic regression and Cox proportional hazards regression analysis, the calibration test, and ROC curve analysis should be mandatory and eliminatory, and the central place should be taken by some new statistical techniques. In order to obtain complete information related to a new marker in the model, there is a recent recommendation to use reclassification tables by calculating the net reclassification index and the integrated discrimination improvement. Decision curve analysis is a novel method for evaluating the clinical usefulness of a predictive model. It may be noted that customizing and fine-tuning of the Framingham risk score initiated the development of statistical analysis. A clinically applicable predictive model should be a trade-off between all the abovementioned statistical metrics, a trade-off between calibration and discrimination, accuracy and decision-making, costs and benefits, and quality and quantity of the patient's life.
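For readers unfamiliar with the reclassification metrics mentioned above, here is a minimal sketch of the categorical net reclassification index; the risk categories and outcomes are made-up data, and a real analysis would also report confidence intervals and the integrated discrimination improvement.

    # Minimal sketch of the categorical net reclassification index (NRI):
    # given risk categories from an old and a new model plus observed
    # outcomes, NRI adds the net fraction of events moved up and the net
    # fraction of non-events moved down.  Data below are invented.

    def nri(old_cat, new_cat, outcome):
        events     = [(o, n) for o, n, y in zip(old_cat, new_cat, outcome) if y == 1]
        non_events = [(o, n) for o, n, y in zip(old_cat, new_cat, outcome) if y == 0]
        up_e   = sum(n > o for o, n in events)     / len(events)
        down_e = sum(n < o for o, n in events)     / len(events)
        up_n   = sum(n > o for o, n in non_events) / len(non_events)
        down_n = sum(n < o for o, n in non_events) / len(non_events)
        return (up_e - down_e) + (down_n - up_n)

    # risk categories: 0 = low, 1 = intermediate, 2 = high (hypothetical cohort)
    old     = [0, 1, 1, 2, 0, 1, 2, 0, 1, 2]
    new     = [1, 2, 1, 2, 0, 0, 1, 0, 2, 2]
    outcome = [1, 1, 0, 1, 0, 0, 0, 0, 1, 1]   # 1 = event occurred

    print("NRI = %.3f" % nri(old, new, outcome))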
NASA Astrophysics Data System (ADS)
Bradley, J. A.; Anesio, A. M.; Singarayer, J. S.; Heath, M. R.; Arndt, S.
2015-10-01
SHIMMER (Soil biogeocHemIcal Model for Microbial Ecosystem Response) is a new numerical modelling framework designed to simulate microbial dynamics and biogeochemical cycling during initial ecosystem development in glacier forefield soils. However, it is also transferable to other extreme ecosystem types (such as desert soils or the surface of glaciers). The rationale for model development arises from decades of empirical observations in glacier forefields, and enables a quantitative and process focussed approach. Here, we provide a detailed description of SHIMMER, test its performance in two case study forefields: the Damma Glacier (Switzerland) and the Athabasca Glacier (Canada) and analyse sensitivity to identify the most sensitive and unconstrained model parameters. Results show that the accumulation of microbial biomass is highly dependent on variation in microbial growth and death rate constants, Q10 values, the active fraction of microbial biomass and the reactivity of organic matter. The model correctly predicts the rapid accumulation of microbial biomass observed during the initial stages of succession in the forefields of both the case study systems. Primary production is responsible for the initial build-up of labile substrate that subsequently supports heterotrophic growth. However, allochthonous contributions of organic matter, and nitrogen fixation, are important in sustaining this productivity. The development and application of SHIMMER also highlights aspects of these systems that require further empirical research: quantifying nutrient budgets and biogeochemical rates, exploring seasonality and microbial growth and cell death. This will lead to increased understanding of how glacier forefields contribute to global biogeochemical cycling and climate under future ice retreat.
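The sketch below gives a flavour of the sensitivities the abstract highlights (growth and death rate constants, Q10 values, substrate supply) using a single heterotrophic biomass pool growing on labile carbon; the structure and every parameter value are assumptions for illustration and are much simpler than SHIMMER itself.

    import numpy as np

    # Bare-bones sketch of the kind of process SHIMMER resolves: heterotrophic
    # biomass growing on a labile-carbon pool, with growth and death rate
    # constants scaled by a Q10 temperature response.  Parameter values are
    # assumptions for illustration, not the published SHIMMER calibration.

    def q10_scale(rate_ref, T, T_ref=10.0, Q10=2.0):
        return rate_ref * Q10 ** ((T - T_ref) / 10.0)

    def simulate(years=20.0, dt=0.01):
        B, C = 0.01, 5.0                            # biomass, labile C (g C m^-2, assumed)
        mu_ref, d_ref, Ks, Y = 2.0, 0.5, 1.0, 0.5   # 1/yr, 1/yr, g C m^-2, yield
        input_C = 2.0                               # assumed external C supply, g C m^-2 yr^-1
        t = 0.0
        while t < years:
            T = 2.0 + 8.0 * max(0.0, np.sin(2 * np.pi * t))  # crude seasonal cycle, deg C
            mu = q10_scale(mu_ref, T) * C / (Ks + C)          # Monod uptake
            d = q10_scale(d_ref, T)
            dB = (Y * mu - d) * B
            dC = input_C - mu * B + 0.5 * d * B               # part of dead biomass recycled
            B, C = max(B + dB * dt, 0.0), max(C + dC * dt, 0.0)
            t += dt
        return B, C

    B, C = simulate()
    print(f"biomass after 20 yr: {B:.2f} g C m^-2, labile C pool: {C:.2f} g C m^-2")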
Greenberg, Alexandra; Kiddell-Monroe, Rachel
2016-09-14
In recent years, the world has witnessed the tragic outcomes of multiple global health crises. From Ebola to high prices to antibiotic resistance, these events highlight the fundamental constraints of the current biomedical research and development (R&D) system in responding to patient needs globally. To mitigate this lack of responsiveness, over 100 self-identified "alternative" R&D initiatives have emerged in the past 15 years. To begin to make sense of this panoply of initiatives working to overcome the constraints of the current system, UAEM began an extensive, though not comprehensive, mapping of the alternative biomedical R&D landscape. We developed a two-phase approach: (1) an investigation, via the RE:Route Mapping, of both existing and proposed initiatives that claim to offer an alternative approach to R&D, and (2) evaluation of those initiatives to determine which are in fact achieving increased access to and innovation in medicines. Through phase 1, the RE:Route Mapping, we examined 81 initiatives that claim to redress the inequity perpetuated by the current system via one of five commonly recognized mechanisms necessary for truly alternative R&D. Preliminary analysis of phase 1 provides the following conclusions: 1. No initiative presents a completely alternative model of biomedical R&D. 2. The majority of initiatives focus on developing incentives for drug discovery. 3. The majority of initiatives focus on rare diseases or diseases of the poor and marginalized. 4. There is an increasing emphasis on the use of push, pull, pool, collaboration and open mechanisms alongside the concept of delinkage in alternative R&D. 5. There is a trend towards public funding and launching of initiatives by the Global South. Given the RE:Route Mapping's inevitable limitations and the assumptions made in its methodology, it is not intended to be the final word on a constantly evolving and complex field; however, its findings are significant. The Mapping's value lies in its timely and unique insight into the importance of ongoing efforts to develop a new global framework for biomedical R&D. As we progress to phase 2, an evaluation tool for initiatives focused on identifying which approaches have truly achieved increased innovation and access for patients, we aim to demonstrate that there are a handful of initiatives which represent some, but not all, of the building blocks for a new approach to R&D. Through this mapping and our forthcoming evaluation, UAEM aims to initiate an evidence-based conversation around a truly alternative biomedical R&D model that serves people rather than profits.
Mathematical model of solar radiation based on climatological data from NASA SSE
NASA Astrophysics Data System (ADS)
Obukhov, S. G.; Plotnikov, I. A.; Masolov, V. G.
2018-05-01
An original model of solar radiation arriving at an arbitrarily oriented surface has been developed. A distinctive feature of the model is that it uses numerical values of the atmospheric transparency index and the surface albedo from the NASA SSE database as initial data. The model is developed in the MatLab/Simulink environment to predict the main characteristics of solar radiation for any geographical point in Russia, including those for territories with no regular actinometric observations.
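A compact sketch of the kind of transposition step such a model performs is shown below: it takes a clearness (atmospheric transparency) index and a ground albedo, the type of values available from NASA SSE, and estimates global irradiance on a tilted, south-facing surface with the isotropic-sky (Liu-Jordan) approach. The diffuse-fraction split is a crude linear stand-in for the empirical correlations such models normally use, and all inputs are examples rather than values from the paper.

    import math

    GSC = 1367.0   # solar constant, W/m^2

    def tilted_irradiance(lat_deg, day_of_year, hour, kt, albedo, tilt_deg):
        phi, beta = math.radians(lat_deg), math.radians(tilt_deg)
        decl = math.radians(23.45 * math.sin(math.radians(360.0 * (284 + day_of_year) / 365.0)))
        omega = math.radians(15.0 * (hour - 12.0))             # hour angle
        cos_z = (math.sin(phi) * math.sin(decl)
                 + math.cos(phi) * math.cos(decl) * math.cos(omega))
        if cos_z <= 0.0:
            return 0.0                                          # sun below horizon
        g0 = GSC * (1.0 + 0.033 * math.cos(math.radians(360.0 * day_of_year / 365.0)))
        g_h = kt * g0 * cos_z                                   # global horizontal
        fd = min(1.0, max(0.15, 1.0 - 1.1 * kt))                # assumed diffuse fraction
        g_dif, g_dir = fd * g_h, (1.0 - fd) * g_h
        # incidence on a south-facing tilted plane (northern hemisphere)
        cos_i = (math.sin(decl) * math.sin(phi - beta)
                 + math.cos(decl) * math.cos(phi - beta) * math.cos(omega))
        rb = max(cos_i, 0.0) / cos_z
        return (g_dir * rb
                + g_dif * (1.0 + math.cos(beta)) / 2.0
                + g_h * albedo * (1.0 - math.cos(beta)) / 2.0)

    # Example: latitude 56.5 N in mid-June, noon, clearness index 0.55, snow-free ground.
    print("%.0f W/m^2" % tilted_irradiance(56.5, 172, 12.0, 0.55, 0.2, 45.0))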
Maine Tidal Power Initiative: Environmental Impact Protocols For Tidal Power
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peterson, Michael Leroy; Zydlewski, Gayle Barbin; Xue, Huijie
2014-02-02
The Maine Tidal Power Initiative (MTPI), an interdisciplinary group of engineers, biologists, oceanographers, and social scientists, has been conducting research to evaluate tidal energy resources and better understand the potential effects and impacts of marine hydro-kinetic (MHK) development on the environment and local community. Project efforts include: 1) resource assessment, 2) development of initial device design parameters using scale model tests, 3) baseline environmental studies and monitoring, and 4) human and community responses. This work included in-situ measurement of the environmental and social response to the pre-commercial Turbine Generator Unit (TGU®) developed by Ocean Renewable Power Company (ORPC) as well as considering the path forward for smaller community scale projects.
NASA Astrophysics Data System (ADS)
Mezzacappa, A.; Calder, A. C.; Bruenn, S. W.; Blondin, J. M.; Guidry, M. W.; Strayer, M. R.; Umar, A. S.
1998-01-01
We couple two-dimensional hydrodynamics to realistic one-dimensional multigroup flux-limited diffusion neutrino transport to investigate proto-neutron star convection in core-collapse supernovae, and more specifically, the interplay between its development and neutrino transport. Our initial conditions, time-dependent boundary conditions, and neutrino distributions for computing neutrino heating, cooling, and deleptonization rates are obtained from one-dimensional simulations that implement multigroup flux-limited diffusion and one-dimensional hydrodynamics. The development and evolution of proto-neutron star convection are investigated for both 15 and 25 M⊙ models, representative of the two classes of stars with compact and extended iron cores, respectively. For both models, in the absence of neutrino transport, the angle-averaged radial and angular convection velocities in the initial Ledoux unstable region below the shock after bounce achieve their peak values in ~20 ms, after which they decrease as the convection in this region dissipates. The dissipation occurs as the gradients are smoothed out by convection. This initial proto-neutron star convection episode seeds additional convectively unstable regions farther out beneath the shock. The additional proto-neutron star convection is driven by successive negative entropy gradients that develop as the shock, in propagating out after core bounce, is successively strengthened and weakened by the oscillating inner core. The convection beneath the shock distorts its sphericity, but on the average the shock radius is not boosted significantly relative to its radius in our corresponding one-dimensional models. In the presence of neutrino transport, proto-neutron star convection velocities are too small relative to bulk inflow velocities to result in any significant convective transport of entropy and leptons. This is evident in our two-dimensional entropy snapshots, which in this case appear spherically symmetric. The peak angle-averaged radial and angular convection velocities are orders of magnitude smaller than they are in the corresponding "hydrodynamics-only" models. A simple analytical model supports our numerical results, indicating that the inclusion of neutrino transport reduces the entropy-driven (lepton-driven) convection growth rates and asymptotic velocities by a factor ~3 (50) at the neutrinosphere and a factor ~250 (1000) at ρ = 10^12 g cm^-3, for both our 15 and 25 M⊙ models. Moreover, when transport is included, the initial postbounce entropy gradient is smoothed out by neutrino diffusion, whereas the initial lepton gradient is maintained by electron capture and neutrino escape near the neutrinosphere. Despite the maintenance of the lepton gradient, proto-neutron star convection does not develop over the 100 ms duration typical of all our simulations, except in the instance where "low-test" initial conditions are used, which are generated by core-collapse and bounce simulations that neglect neutrino-electron scattering and ion-ion screening corrections to neutrino-nucleus elastic scattering. Models favoring the development of proto-neutron star convection either by starting with more favorable, albeit artificial (low-test), initial conditions or by including transport corrections that were ignored in our "fiducial" models were considered. Our conclusions nonetheless remained the same.
Evidence of proto-neutron star convection in our two-dimensional entropy snapshots was minimal, and, as in our fiducial models, the angle-averaged convective velocities when neutrino transport was included remained orders of magnitude smaller than their counterparts in the corresponding hydrodynamics-only models.
Life extending control for rocket engines
NASA Technical Reports Server (NTRS)
Lorenzo, C. F.; Saus, J. R.; Ray, A.; Carpino, M.; Wu, M.-K.
1992-01-01
The concept of life extending control is defined. A brief discussion of current fatigue life prediction methods is given and the need for an alternative life prediction model based on a continuous functional relationship is established. Two approaches to life extending control are considered: (1) the implicit approach which uses cyclic fatigue life prediction as a basis for control design; and (2) the continuous life prediction approach which requires a continuous damage law. Progress on an initial formulation of a continuous (in time) fatigue model is presented. Finally, nonlinear programming is used to develop initial results for life extension for a simplified rocket engine (model).
Survey Design for a Statewide Multimodal Transportation Forecasting Model
DOT National Transportation Integrated Search
1992-02-01
In 1990, the NMSHTD initiated an ambitious and long-term research project. The project was to define the process for and undertake the development of a statewide multimodal transportation forecasting model. The project commenced in April, 1991...
Watershed and Economic Data InterOperability (WEDO)
The annual public meeting of the Federal Interagency Steering Committee on Multimedia Environmental Modeling (ISCMEM) will convene to discuss some of the latest developments in environmental modeling applications, tools and frameworks, as well as new operational initiatives for F...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Savic, Vesna; Hector, Louis G.; Ezzat, Hesham
This paper presents an overview of a four-year project focused on development of an integrated computational materials engineering (ICME) toolset for third generation advanced high-strength steels (3GAHSS). Following a brief look at ICME as an emerging discipline within the Materials Genome Initiative, technical tasks in the ICME project will be discussed. Specific aims of the individual tasks are multi-scale, microstructure-based material model development using state-of-the-art computational and experimental techniques, forming, toolset assembly, design optimization, integration and technical cost modeling. The integrated approach is initially illustrated using a 980 grade transformation induced plasticity (TRIP) steel, subject to a two-step quenching and partitioning (Q&P) heat treatment, as an example.
Precision pointing and control of flexible spacecraft
NASA Technical Reports Server (NTRS)
Bantell, M. H., Jr.
1987-01-01
The problem and long term objectives for the precision pointing and control of flexible spacecraft are given. The four basic objectives are stated in terms of two principle tasks. Under Task 1, robust low order controllers, improved structural modeling methods for control applications and identification methods for structural dynamics are being developed. Under Task 2, a lab test experiment for verification of control laws and system identification algorithms is being developed. For Task 1, work has focused on robust low order controller design and some initial considerations for structural modeling in control applications. For Task 2, work has focused on experiment design and fabrication, along with sensor selection and initial digital controller implementation. Conclusions are given.
2007-07-31
nanoscale materials for cancer diagnostics, imaging agents, and therapeutics. Recently NCL has extended its work to in vivo models and testing by... THE NATIONAL NANOTECHNOLOGY INITIATIVE: Research and Development Leading to a Revolution in Technology and Industry, Supplement to the President’s FY... clear national goals for Federal science and technology investments in areas ranging from nanotechnology and health research to improving
A commercially viable virtual reality knee arthroscopy training system.
McCarthy, A D; Hollands, R J
1998-01-01
Arthroscopy is a minimally invasive form of surgery used to inspect joints. It is complex to learn, yet current training methods appear inadequate, thus negating the potential benefits to the patient. This paper describes the development and initial assessment of a cost-effective virtual-reality-based system for training surgeons in arthroscopy of the knee. The system runs on a PC. Initial assessments by surgeons have been positive, and current developments in deformable models are described.
Extension of Liouville Formalism to Postinstability Dynamics
NASA Technical Reports Server (NTRS)
Zak, Michail
2003-01-01
A mathematical formalism has been developed for predicting the postinstability motions of a dynamic system governed by a system of nonlinear equations and subject to initial conditions. Previously, there was no general method for prediction and mathematical modeling of postinstability behaviors (e.g., chaos and turbulence) in such a system. The formalism of nonlinear dynamics does not afford means to discriminate between stable and unstable motions: an additional stability analysis is necessary for such discrimination. However, an additional stability analysis does not suggest any modifications of a mathematical model that would enable the model to describe postinstability motions efficiently. The most important type of instability that necessitates a postinstability description is associated with positive Lyapunov exponents. Such an instability leads to exponential growth of small errors in initial conditions or, equivalently, exponential divergence of neighboring trajectories. The development of the present formalism was undertaken in an effort to remove positive Lyapunov exponents. The means chosen to accomplish this is coupling of the governing dynamical equations with the corresponding Liouville equation that describes the evolution of the flow of error probability. The underlying idea is to suppress the divergences of different trajectories that correspond to different initial conditions, without affecting a target trajectory, which is one that starts with prescribed initial conditions.
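The instability the formalism targets can be illustrated numerically: the largest Lyapunov exponent of a simple nonlinear system, estimated as the average logarithmic stretching rate along a trajectory. The logistic map below is only a stand-in for "a dynamic system governed by nonlinear equations"; a positive exponent signals the exponential divergence of neighbouring initial conditions that the Liouville coupling is intended to suppress.

    import math

    # Estimate the largest Lyapunov exponent of the logistic map x -> r x (1 - x)
    # as the mean of log|d/dx [r x (1 - x)]| along a long trajectory.

    def lyapunov_logistic(r, x0=0.4, n_transient=1000, n_samples=100000):
        x = x0
        for _ in range(n_transient):          # discard the transient
            x = r * x * (1.0 - x)
        total = 0.0
        for _ in range(n_samples):
            x = r * x * (1.0 - x)
            total += math.log(abs(r * (1.0 - 2.0 * x)))   # local stretching factor
        return total / n_samples

    for r in (2.8, 3.5, 4.0):
        print(f"r = {r}: lambda ~= {lyapunov_logistic(r):+.3f}")

For r = 2.8 and 3.5 the exponent is negative (stable fixed point and periodic orbit); for r = 4.0 it is positive (about ln 2), the regime where neighbouring trajectories separate exponentially.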
Test and Analysis Correlation for a Y-Joint Specimen for a Composite Cryotank
NASA Technical Reports Server (NTRS)
Mason, Brian H.; Sleight, David W.; Grenoble, Ray
2015-01-01
The Composite Cryotank Technology Demonstration (CCTD) project under NASA's Game Changing Development Program (GCDP) developed space technologies using advanced composite materials. Under CCTD, NASA funded the Boeing Company to design and test a number of element-level joint specimens as a precursor to a 2.4-m diameter composite cryotank. Preliminary analyses indicated that the y-joint in the cryotank had low margins of safety; hence the y-joint was considered to be a critical design region. The y-joint design includes a softening strip wedge to reduce localized shear stresses at the skirt/dome interface. In this paper, NASA-developed analytical models will be correlated with the experimental results of a series of positive-peel y-joint specimens from Boeing tests. Initial analytical models over-predicted the experimental strain gage readings in the far-field region by approximately 10%. The over-prediction was attributed to uncertainty in the elastic properties of the laminate and a mismatch between the thermal expansion of the strain gages and the laminate. The elastic properties of the analytical model were adjusted to account for the strain gage differences. The experimental strain gages also indicated a large non-linear effect in the softening strip region that was not predicted by the analytical model. This non-linear effect was attributed to delamination initiating in the softening strip region at below 20% of the failure load for the specimen. Because the specimen was contained in a thermally insulated box during cryogenic testing to failure, delamination initiation and progression was not visualized during the test. Several possible failure initiation locations were investigated, and a most likely failure scenario was determined that correlated well with the experimental data. The most likely failure scenario corresponded to damage initiating in the softening strip and delamination extending to the grips at final failure.
The United States Environmental Protection Agency's (EPA) National Exposure Research Laboratory (NERL) has initiated a project to improve the methodology for modeling urban-scale human exposure to mobile source emissions. The modeling project has started by considering the nee...
The United States Environmental Protection Agency's National Exposure Research Laboratory has initiated a project to improve the methodology for modeling human exposure to motor vehicle emissions. The overall project goal is to develop improved methods for modeling the source t...
DOT National Transportation Integrated Search
1994-10-31
The Volpe Center first estimated an inter-regional auto trip model as part of its effort to assess the market feasibility of maglev for the National Maglev Initiative (NMI). The original intent was to develop a direct demand model for estimating inte...
IVI governance structure : enabling research and development
DOT National Transportation Integrated Search
1999-11-01
This Intelligent Vehicle Initiative (IVI) Governance Model is comprised of four programs. The first is "Enabling Research and Development", designed to provide a forum for industry and government to establish, prioritize, and evaluate IVI goals and r...
Evolutionary models of rotating dense stellar systems: challenges in software and hardware
NASA Astrophysics Data System (ADS)
Fiestas, Jose
2016-02-01
We present evolutionary models of rotating self-gravitating systems (e.g. globular clusters, galaxy cores). These models are characterized by the presence of initial axisymmetry due to rotation. Central black hole seeds are alternatively included in our models, and black hole growth due to consumption of stellar matter is simulated until the central potential dominates the kinematics in the core. The goal is to study the long-term evolution (~ Gyr) of relaxed dense stellar systems that deviate from spherical symmetry, their morphology and final kinematics. With this purpose, we developed a 2D Fokker-Planck analytical code, whose results we confirm by detailed N-body techniques, applying a high-performance code developed for GPU machines. We compare our models to available observations of galactic rotating globular clusters, and conclude that initial rotation modifies significantly the shape and lifetime of these systems and cannot be neglected in studying the evolution of globular clusters, and the galaxy itself.
Thermosolutal convection and macrosegregation in dendritic alloys
NASA Technical Reports Server (NTRS)
Poirier, David R.; Heinrich, J. C.
1993-01-01
A mathematical model of solidification, that simulates the formation of channel segregates or freckles, is presented. The model simulates the entire solidification process, starting with the initial melt to the solidified cast, and the resulting segregation is predicted. Emphasis is given to the initial transient, when the dendritic zone begins to develop and the conditions for the possible nucleation of channels are established. The mechanisms that lead to the creation and eventual growth or termination of channels are explained in detail and illustrated by several numerical examples. A finite element model is used for the simulations. It uses a single system of equations to deal with the all-liquid region, the dendritic region, and the all-solid region. The dendritic region is treated as an anisotropic porous medium. The algorithm uses the bilinear isoparametric element, with a penalty function approximation and a Petrov-Galerkin formulation. The major task was to develop the solidification model. In addition, other tasks that were performed in conjunction with the modeling of dendritic solidification are briefly described.
A Healthy Communities Initiative in Rural Alberta: Building Rural Capacity for Health.
ERIC Educational Resources Information Center
GermAnn, Kathy; Smith, Neale; Littlejohns, Lori Baugh
Efforts of health professionals are shifting away from programs that "deliver health" toward those that build the capacity of communities to work together to create healthy places. The Healthy Communities Initiative (HCI) is a community development model in central Alberta (Canada) that involves the creation of a widely shared vision of…
Kindergarten Readiness Impacts of the Arkansas Better Chance State Prekindergarten Initiative
ERIC Educational Resources Information Center
Hustedt, Jason T.; Jung, Kwanghee; Barnett, W. Steven; Williams, Tonya
2015-01-01
Enrollment in state-funded pre-K programs prior to kindergarten entry has become increasingly common. As each state develops its own model for pre-K, rigorous studies of the impacts of state-specific programs are needed. This study investigates impacts of the Arkansas Better Chance (ABC) initiative at kindergarten entry using a…
The Common Core State Standards Initiative: an Overview
ERIC Educational Resources Information Center
Watt, Michael G.
2011-01-01
The purpose of this study was to evaluate decision making in the Common Core State Standards Initiative as the change process moved from research, development and diffusion activities to adoption of the Common Core State Standards by the states. A decision-oriented evaluation model was used to describe the four stages of planning, structuring,…
Origin and initiation mechanisms of neuroblastoma.
Tsubota, Shoma; Kadomatsu, Kenji
2018-05-01
Neuroblastoma is an embryonal malignancy that affects normal development of the adrenal medulla and paravertebral sympathetic ganglia in early childhood. Extensive studies have revealed the molecular characteristics of human neuroblastomas, including abnormalities at genome, epigenome and transcriptome levels. However, neuroblastoma initiation mechanisms and even its origin are long-standing mysteries. In this review article, we summarize the current knowledge about normal development of putative neuroblastoma sources, namely sympathoadrenal lineage of neural crest cells and Schwann cell precursors that were recently identified as the source of adrenal chromaffin cells. A plausible origin of enigmatic stage 4S neuroblastoma is also discussed. With regard to the initiation mechanisms, we review genetic abnormalities in neuroblastomas and their possible association to initiation mechanisms. We also summarize evidences of neuroblastoma initiation observed in genetically engineered animal models, in which epigenetic alterations were involved, including transcriptomic upregulation by N-Myc and downregulation by polycomb repressive complex 2. Finally, several in vitro experimental methods are proposed that hopefully will accelerate our comprehension of neuroblastoma initiation. Thus, this review summarizes the state-of-the-art knowledge about the mechanisms of neuroblastoma initiation, which is critical for developing new strategies to cure children with neuroblastoma.
Hou, Li; Xie, Jianchun; Zhao, Jian; Zhao, Mengyao; Fan, Mengdie; Xiao, Qunfei; Liang, Jingjing; Chen, Feng
2017-10-01
To explore initial Maillard reaction pathways and mechanisms for maximal formation of meaty flavors in heated cysteine-xylose-glycine systems, model reactions with synthesized initial Maillard intermediates, Gly-Amadori, TTCA (2-threityl-thiazolidine-4-carboxylic acids) and Cys-Amadori, were investigated. Relative reactivities were characterized by spectrophotometrically monitoring the development of colorless degradation intermediates and browning reaction products. Aroma compounds formed were determined by solid-phase microextraction combined with GC-MS and GC-olfactometry. Gly-Amadori showed the fastest reaction, followed by Cys-Amadori, then TTCA. Free glycine accelerated reaction of TTCA, whereas cysteine inhibited that of Gly-Amadori due to association forming relatively stable thiazolidines. Cys-Amadori/Gly had the highest reactivity in development of both meaty flavors and brown products. TTCA/Gly favored yielding meaty flavors, whereas Gly-Amadori/Cys favored generation of brown products. Conclusively, initial formation of TTCA and the pathway involving TTCA with glycine were more applicable to efficiently produce processed-meat flavorings in a cysteine-xylose-glycine system. Copyright © 2017 Elsevier Ltd. All rights reserved.
Nuclear and Non-Nuclear Airblast Effects.
1984-02-14
algorithms. The above methodology has been applied to a series of test problems initiated by a spherical high-explosive (HE) detonation in air. An...used, together with a real-air equation of state, to follow the development of an explosion initialized with the 1-kton standard as it reflects from the...interior. Stage (1) is not contained in our model, since the weapon mass greatly exceeds the mass of air contained within the initial explosion radius
Orion Parachute Riser Cutter Development
NASA Technical Reports Server (NTRS)
Oguz, Sirri; Salazar, Frank
2011-01-01
This paper presents the tests and analytical approach used in the development of a steel riser cutter for the CEV Parachute Assembly System (CPAS) used on the Orion crew module. Figure 1 shows the riser cutter and the steel riser bundle, which consists of six individual cables. Due to the highly compressed schedule, the initial unavailability of the riser material and the Orion Forward Bay mechanical constraints, JSC primarily relied on a combination of internal ballistics analysis and LS-DYNA simulation for this project. Various one-dimensional internal ballistics codes that use a standard equation of state and conservation of energy have been commonly used in the development of CAD devices for initial first-order estimates and as an enhancement to the test program. While these codes are very accurate for propellant performance prediction, they usually lack a fully defined kinematic model for dynamic predictions. A simple piston device can easily and accurately be modeled using an equation of motion. However, the accuracy of analytical models is greatly reduced on more complicated devices with complex external loads, nonlinear trajectories or unique unlocking features. A 3D finite element model of a CAD device with all critical features included can vastly improve the analytical ballistic predictions when it is used as a supplement to the ballistic code. During this project, an LS-DYNA structural 3D model was used to predict the riser resisting load that was needed for the ballistic code. A Lagrangian model with eroding elements, shown in Figure 2, was used for the blade, steel riser and the anvil. The riser material failure strain was fine-tuned by matching the dent depth on the anvil with the actual test data. The LS-DYNA model was also utilized to optimize the blade tip design for the most efficient cut. In parallel, the propellant type and amount were determined using the CADPROG internal ballistics code. Initial test results showed a good match with LS-DYNA and CADPROG simulations. The final paper will present a detailed roadmap from initial ballistic modeling and LS-DYNA simulation to the performance testing. A blade shape optimization study will also be presented.
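As a first-order illustration of the "simple piston device modeled using an equation of motion" mentioned above, the sketch below expands a propellant gas adiabatically behind a piston and integrates the resulting motion against a constant resisting load. It is not CADPROG and none of the numbers correspond to CPAS hardware; they are assumed example values.

    # Toy equation-of-motion model of a gas-driven piston: propellant gas at an
    # initial pressure expands adiabatically and accelerates the piston against
    # a constant resisting load.  All parameters are assumed examples.

    GAMMA = 1.25            # assumed effective ratio of specific heats for the gas

    def piston_stroke(p0=150e6, v0=2e-6, area=1e-4, mass=0.05,
                      resist=2e3, stroke=0.03, dt=1e-7):
        """Return (time, velocity) when the piston completes its stroke."""
        x, v, t = 0.0, 0.0, 0.0
        while x < stroke:
            vol = v0 + area * x
            p = p0 * (v0 / vol) ** GAMMA          # adiabatic expansion of the gas
            a = (p * area - resist) / mass        # net force / piston mass
            v = max(v + a * dt, 0.0)              # piston is not pushed backwards here
            x += v * dt
            t += dt
            if t > 0.1:                           # safety stop for a stalled piston
                break
        return t, v

    t, v = piston_stroke()
    print(f"stroke completed in {t*1e3:.2f} ms at {v:.1f} m/s")

A 1-D ballistics code would replace the fixed adiabatic expansion with a propellant burn model and an energy balance, and a structural simulation would supply the resisting-load curve in place of the constant used here.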
Fatigue life prediction modeling for turbine hot section materials
NASA Technical Reports Server (NTRS)
Halford, G. R.; Meyer, T. G.; Nelson, R. S.; Nissley, D. M.; Swanson, G. A.
1989-01-01
A major objective of the fatigue and fracture efforts under the NASA Hot Section Technology (HOST) program was to significantly improve the analytic life prediction tools used by the aeronautical gas turbine engine industry. This was achieved in the areas of high-temperature thermal and mechanical fatigue of bare and coated high-temperature superalloys. The cyclic crack initiation and propagation resistance of nominally isotropic polycrystalline and highly anisotropic single crystal alloys were addressed. Life prediction modeling efforts were devoted to creep-fatigue interaction, oxidation, coatings interactions, multiaxiality of stress-strain states, mean stress effects, cumulative damage, and thermomechanical fatigue. The fatigue crack initiation life models developed to date include the Cyclic Damage Accumulation (CDA) and the Total Strain Version of Strainrange Partitioning (TS-SRP) for nominally isotropic materials, and the Tensile Hysteretic Energy Model for anisotropic superalloys. A fatigue model is being developed based upon the concepts of Path-Independent Integrals (PII) for describing cyclic crack growth under complex nonlinear response at the crack tip due to thermomechanical loading conditions. A micromechanistic oxidation crack extension model was derived. The models are described and discussed.
Segment-based acoustic models for continuous speech recognition
NASA Astrophysics Data System (ADS)
Ostendorf, Mari; Rohlicek, J. R.
1993-07-01
This research aims to develop new and more accurate stochastic models for speaker-independent continuous speech recognition, by extending previous work in segment-based modeling and by introducing a new hierarchical approach to representing intra-utterance statistical dependencies. These techniques, which are more costly than traditional approaches because of the large search space associated with higher order models, are made feasible through rescoring a set of HMM-generated N-best sentence hypotheses. We expect these different modeling techniques to result in improved recognition performance over that achieved by current systems, which handle only frame-based observations and assume that these observations are independent given an underlying state sequence. In the fourth quarter of the project, we have completed the following: (1) ported our recognition system to the Wall Street Journal task, a standard task in the ARPA community; (2) developed an initial dependency-tree model of intra-utterance observation correlation; and (3) implemented baseline language model estimation software. Our initial results on the Wall Street Journal task are quite good and represent significantly improved performance over most HMM systems reporting on the Nov. 1992 5k vocabulary test set.
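A skeletal version of the N-best rescoring strategy described above is sketched here: an HMM front end proposes hypotheses, and each is re-scored by combining a more expensive segment-model acoustic score with a language-model score before re-ranking. The scoring functions are stubs with invented numbers; only the combination and re-ranking logic is meant to be illustrative.

    # Skeleton of N-best rescoring: combine a segment-model acoustic score with
    # a language-model score and a word penalty, then re-rank the hypotheses.
    # The per-hypothesis scores below are made up for illustration.

    def segment_acoustic_score(hyp):
        # stand-in for the costly segment-based acoustic model
        return {"the cat sat": -120.5, "the cats at": -123.0, "a cat sat": -125.1}[hyp]

    def language_model_score(hyp):
        # stand-in for a log-probability language model
        return {"the cat sat": -8.2, "the cats at": -11.7, "a cat sat": -9.0}[hyp]

    def rescore(nbest, lm_weight=2.0, word_penalty=0.5):
        scored = []
        for hyp, hmm_score in nbest:
            total = (segment_acoustic_score(hyp)
                     + lm_weight * language_model_score(hyp)
                     - word_penalty * len(hyp.split()))
            scored.append((total, hyp))
        return max(scored)[1]            # best hypothesis after re-ranking

    # hypothetical HMM-generated N-best list (hypothesis, first-pass score)
    nbest = [("the cats at", -130.0), ("the cat sat", -130.4), ("a cat sat", -133.2)]
    print("re-ranked best hypothesis:", rescore(nbest))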
Fatigue life prediction modeling for turbine hot section materials
NASA Technical Reports Server (NTRS)
Halford, G. R.; Meyer, T. G.; Nelson, R. S.; Nissley, D. M.; Swanson, G. A.
1988-01-01
A major objective of the fatigue and fracture efforts under the Hot Section Technology (HOST) program was to significantly improve the analytic life prediction tools used by the aeronautical gas turbine engine industry. This was achieved in the areas of high-temperature thermal and mechanical fatigue of bare and coated high-temperature superalloys. The cyclic crack initiation and propagation resistance of nominally isotropic polycrystalline and highly anisotropic single crystal alloys were addressed. Life prediction modeling efforts were devoted to creep-fatigue interaction, oxidation, coatings interactions, multiaxiality of stress-strain states, mean stress effects, cumulative damage, and thermomechanical fatigue. The fatigue crack initiation life models developed to date include the Cyclic Damage Accumulation (CDA) and the Total Strain Version of Strainrange Partitioning (TS-SRP) for nominally isotropic materials, and the Tensile Hysteretic Energy Model for anisotropic superalloys. A fatigue model is being developed based upon the concepts of Path-Independent Integrals (PII) for describing cyclic crack growth under complex nonlinear response at the crack tip due to thermomechanical loading conditions. A micromechanistic oxidation crack extension model was derived. The models are described and discussed.
Initial proposition of kinematics model for selected karate actions analysis
NASA Astrophysics Data System (ADS)
Hachaj, Tomasz; Koptyra, Katarzyna; Ogiela, Marek R.
2017-03-01
The motivation for this paper is to initially propose and evaluate two new kinematics models that were developed to describe motion capture (MoCap) data of karate techniques. We decided to develop this novel proposition to create a model that is capable of handling action descriptions both from multimedia and professional MoCap hardware. For the evaluation we used 25-joint data of karate technique recordings acquired with Kinect version 2, consisting of MoCap recordings of two professional sport (black belt) instructors and masters of Oyama Karate. We selected the following actions for initial analysis: left-handed furi-uchi punch, right leg hiza-geri kick, right leg yoko-geri kick and left-handed jodan-uke block. Based on the evaluation we performed, we conclude that both proposed kinematics models seem to be convenient methods for karate action description. Of the two proposed variable models, the global one seems more useful for further work, because in the case of the considered punches its variables appear less correlated and may also be easier to interpret owing to the single reference coordinate system. Principal components analysis also proved to be a reliable way to examine the quality of the kinematics models, and plots of the variables in principal component space nicely present the dependences between variables.
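The principal-components check mentioned above can be sketched as follows: build a matrix of kinematic variables across repetitions of a technique, run PCA, and inspect the explained variance and the loadings of each variable on the leading components. The data here are synthetic stand-ins for joint angles or velocities; only the procedure reflects the evaluation described.

    import numpy as np

    # PCA on a repetitions-by-variables matrix of kinematic features.
    # The data are synthetic (two hidden movement "modes" plus noise);
    # only the procedure is meant to be illustrative.

    rng = np.random.default_rng(0)
    n_reps, n_vars = 40, 6                       # repetitions x kinematic variables
    latent = rng.normal(size=(n_reps, 2))
    mixing = rng.normal(size=(2, n_vars))
    X = latent @ mixing + 0.1 * rng.normal(size=(n_reps, n_vars))

    Xc = X - X.mean(axis=0)                      # centre each variable
    cov = np.cov(Xc, rowvar=False)
    eigval, eigvec = np.linalg.eigh(cov)
    order = np.argsort(eigval)[::-1]
    eigval, eigvec = eigval[order], eigvec[:, order]

    explained = eigval / eigval.sum()
    print("variance explained by PC1, PC2: %.2f, %.2f" % (explained[0], explained[1]))
    print("PC1 loadings (one weight per kinematic variable):")
    print(np.round(eigvec[:, 0], 2))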
Model Error Estimation for the CPTEC Eta Model
NASA Technical Reports Server (NTRS)
Tippett, Michael K.; daSilva, Arlindo
1999-01-01
Statistical data assimilation systems require the specification of forecast and observation error statistics. Forecast error is due to model imperfections and differences between the initial condition and the actual state of the atmosphere. Practical four-dimensional variational (4D-Var) methods try to fit the forecast state to the observations and assume that the model error is negligible. Here, with a number of simplifying assumptions, a framework is developed for isolating the model error given the forecast error at two lead-times. Two definitions are proposed for the Talagrand ratio tau, the fraction of the forecast error due to model error rather than initial condition error. Data from the CPTEC Eta Model running operationally over South America are used to calculate forecast error statistics and lower bounds for tau.
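The abstract does not spell out the two definitions of tau; under the simplifying assumption that model error and initial-condition error accumulate approximately independently, one natural form (an illustration, not necessarily the authors') is

    \tau \;=\; \frac{\big\langle \| e_{\mathrm{model}} \|^{2} \big\rangle}{\big\langle \| e_{\mathrm{forecast}} \|^{2} \big\rangle},
    \qquad
    \big\langle \| e_{\mathrm{forecast}} \|^{2} \big\rangle \;\approx\;
    \big\langle \| e_{\mathrm{model}} \|^{2} \big\rangle \;+\; \big\langle \| e_{\mathrm{ic}} \|^{2} \big\rangle ,

so that tau ranges from 0 (forecast error dominated by growth of the initial-condition error) to 1 (dominated by model error), and comparing forecast error statistics at two lead-times provides the information needed to bound it.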
ERIC Educational Resources Information Center
Klieger, Aviva; Oster-Levinz, Anat
2015-01-01
Apprenticeship and professional development schools (PDSs) are two models for teacher education. The mentors that are the focus for this research completed their initial teacher training through one of these models and now mentor in PDSs. The paper reports on how the way in which they were trained as student teachers influenced their role…
ERIC Educational Resources Information Center
Hadfield, Mark; Jopling, Michael
2014-01-01
This paper discusses the development of a model targeted at non-specialist practitioners implementing innovations that involve information and communication technology (ICT) in education. It is based on data from a national evaluation of ICT-based projects in initial teacher education, which included a large-scale questionnaire survey and six…
Multilingual Phoneme Models for Rapid Speech Processing System Development
2006-09-01
processes are used to develop an Arabic speech recognition system starting from monolingual English models, International Phonetic Association (IPA...clusters. It was found that multilingual bootstrapping methods outperform monolingual English bootstrapping methods on the Arabic evaluation data initially...
NASA Astrophysics Data System (ADS)
Liu, Yin; Zhang, Wei
2016-12-01
This study develops a proper way to incorporate Atmospheric Infrared Sounder (AIRS) ozone data into the bogus data assimilation (BDA) initialization scheme for improving hurricane prediction. First, the observation operator at the model levels with the highest correlation coefficients is established to assimilate AIRS ozone data, based on the correlation between total column ozone and potential vorticity (PV) over the 400 to 50 hPa layer. Second, AIRS ozone data act as an augmentation to a BDA procedure using a four-dimensional variational (4D-Var) data assimilation system. Case studies of several hurricanes are performed to demonstrate the effectiveness of the bogus and ozone data assimilation (BODA) scheme. The statistical result indicates that assimilating AIRS ozone data at 4, 5, or 6 model levels can produce a significant improvement in hurricane track and intensity prediction, with reasonable computation time for the hurricane initialization. Moreover, a detailed analysis of how the BODA scheme affects hurricane prediction is conducted for Hurricane Earl (2010). It is found that the new scheme developed in this study generates significant adjustments in the initial conditions (ICs) from the lower levels to the upper levels, compared with the BDA scheme. With the BODA scheme, hurricane development is found to be much more sensitive to the number of ozone data assimilation levels. In particular, the experiment with the assimilation of AIRS ozone data at a proper number of model levels shows great capabilities in reproducing the intensity and intensity changes of Hurricane Earl, as well as improving the track prediction. These results suggest that AIRS ozone data convey valuable meteorological information in the upper troposphere, which can be assimilated into a numerical model to improve hurricane initialization when the low-level bogus data are included.
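The first step described above, selecting the model levels whose potential vorticity correlates best with column ozone and building a linear ozone observation operator from them, can be sketched as follows. The training pairs are synthetic; in the study they would be collocated model PV profiles and AIRS ozone retrievals, and the resulting operator would enter the 4D-Var cost function.

    import numpy as np

    # Rank model levels by PV / column-ozone correlation, then fit a linear
    # observation operator from PV at the selected levels to column ozone.
    # The data below are synthetic stand-ins for collocated PV and AIRS ozone.

    rng = np.random.default_rng(1)
    levels_hPa = np.array([400, 300, 250, 200, 150, 100, 70, 50])
    n_samples = 500

    pv = rng.normal(size=(n_samples, levels_hPa.size))
    weights_true = np.array([0.1, 0.3, 0.6, 1.0, 0.9, 0.7, 0.2, 0.1])
    ozone = pv @ weights_true + 0.5 * rng.normal(size=n_samples)

    # 1) rank levels by correlation with column ozone
    corr = np.array([np.corrcoef(pv[:, k], ozone)[0, 1] for k in range(levels_hPa.size)])
    best = np.argsort(np.abs(corr))[::-1][:5]          # keep e.g. the top 5 levels
    print("selected levels (hPa):", levels_hPa[np.sort(best)])

    # 2) fit a linear observation operator  ozone ~= H . pv(selected levels) + b
    A = np.column_stack([pv[:, best], np.ones(n_samples)])
    coef, *_ = np.linalg.lstsq(A, ozone, rcond=None)
    H, b = coef[:-1], coef[-1]
    print("operator weights:", np.round(H, 2), " bias:", round(b, 2))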
Hand-held microwave search detector
NASA Astrophysics Data System (ADS)
Daniels, David J.; Philippakis, Mike
2005-05-01
This paper describes the further development of a patented, novel, low cost, microwave search detector using noise radar technology operating in the 27-40 GHz range of frequencies, initially reported in SPIE 2004. Initial experiments have shown that plastic explosives, ceramics and plastic material hidden on the body can be detected with the system. This paper considers the basic physics of the technique, reports on the development of an initial prototype system for hand search of suspects, and addresses the work carried out on optimisation of PD and FAR. The radar uses a novel lens system, and the design and modelling of this lens for optimum depth of field of focus will be reported.
The Co-Development of Parenting Stress and Childhood Internalizing and Externalizing Problems.
Stone, Lisanne L; Mares, Suzanne H W; Otten, Roy; Engels, Rutger C M E; Janssens, Jan M A M
Although the detrimental influence of parenting stress on child problem behavior is well established, it remains unknown how these constructs affect each other over time. In accordance with a transactional model, this study investigates how the development of internalizing and externalizing problems is related to the development of parenting stress in children aged 4-9. Mothers of 1582 children participated in three one-year interval data waves. Internalizing and externalizing problems as well as parenting stress were assessed by maternal self-report. Interrelated development of parenting with internalizing and externalizing problems was examined using Latent Growth Modeling. Directionality of effects was further investigated by using cross-lagged models. Parenting stress and externalizing problems showed a decrease over time, whereas internalizing problems remained stable. Initial levels of parenting stress were related to initial levels of both internalizing and externalizing problems. Decreases in parenting stress were related to larger decreases in externalizing problems and to the (stable) course of internalizing problems. Some evidence for reciprocity was found such that externalizing problems were associated with parenting stress and vice versa over time, specifically for boys. Our findings support the transactional model in explaining psychopathology.
Mathematical Modeling of Intestinal Iron Absorption Using Genetic Programming
Colins, Andrea; Gerdtzen, Ziomara P.; Nuñez, Marco T.; Salgado, J. Cristian
2017-01-01
Iron is a trace metal, key for the development of living organisms. Its absorption process is complex and highly regulated at the transcriptional, translational and systemic levels. Recently, the internalization of the DMT1 transporter has been proposed as an additional regulatory mechanism at the intestinal level, associated with the mucosal block phenomenon. The short-term effect of iron exposure in apical uptake and initial absorption rates was studied in Caco-2 cells at different apical iron concentrations, using both an experimental approach and a mathematical modeling framework. This is the first report of short-term studies for this system. A non-linear behavior in the apical uptake dynamics was observed, which does not follow the classic saturation dynamics of traditional biochemical models. We propose a method for developing mathematical models for complex systems, based on a genetic programming algorithm. The algorithm is aimed at obtaining models with a high predictive capacity, and considers an additional parameter fitting stage and an additional jackknife stage for estimating the generalization error. We developed a model for the iron uptake system with a higher predictive capacity than classic biochemical models. This was observed both with the apical uptake dataset used for generating the model and with an independent initial rates dataset used to test the predictive capacity of the model. The model obtained is a function of time and the initial apical iron concentration, with a linear component that captures the global tendency of the system, and a non-linear component that can be associated with the movement of DMT1 transporters. The model presented in this paper allows the detailed analysis and interpretation of experimental data, and identification of key relevant components for this complex biological process. This general method holds great potential for application to the elucidation of biological mechanisms and their key components in other complex systems. PMID:28072870
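As a minimal illustration of the jackknife stage mentioned above, the following generic leave-one-out sketch estimates a model's generalization error by refitting with each observation left out in turn. It is not the authors' genetic-programming code; the quadratic stand-in model and the synthetic data are assumptions.

```python
import numpy as np

def jackknife_error(x, y, fit, predict):
    """Leave-one-out (jackknife) estimate of mean squared prediction error."""
    n = len(x)
    errors = np.empty(n)
    for i in range(n):
        mask = np.arange(n) != i
        params = fit(x[mask], y[mask])             # refit without sample i
        errors[i] = (y[i] - predict(params, x[i])) ** 2
    return errors.mean()

# Example: a simple polynomial standing in for a genetic-programming candidate model
fit = lambda x, y: np.polyfit(x, y, deg=2)
predict = lambda p, xi: np.polyval(p, xi)

rng = np.random.default_rng(1)
x = np.linspace(0, 10, 30)
y = 0.5 * x + 2 * x / (1 + x) + rng.normal(scale=0.1, size=30)
print("jackknife MSE:", jackknife_error(x, y, fit, predict))
```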
Optical systems integrated modeling
NASA Technical Reports Server (NTRS)
Shannon, Robert R.; Laskin, Robert A.; Brewer, SI; Burrows, Chris; Epps, Harlan; Illingworth, Garth; Korsch, Dietrich; Levine, B. Martin; Mahajan, Vini; Rimmer, Chuck
1992-01-01
An integrated modeling capability that provides the tools by which entire optical systems and instruments can be simulated and optimized is a key technology development, applicable to all mission classes, especially astrophysics. Many of the future missions require optical systems that are physically much larger than anything flown before and yet must retain the characteristic sub-micron diffraction limited wavefront accuracy of their smaller precursors. It is no longer feasible to follow the path of 'cut and test' development; the sheer scale of these systems precludes many of the older techniques that rely upon ground evaluation of full size engineering units. The ability to accurately model (by computer) and optimize the entire flight system's integrated structural, thermal, and dynamic characteristics is essential. Two distinct integrated modeling capabilities are required. These are an initial design capability and a detailed design and optimization system. The content of an initial design package is shown. It would be a modular, workstation based code which allows preliminary integrated system analysis and trade studies to be carried out quickly by a single engineer or a small design team. A simple concept for a detailed design and optimization system is shown. This is a linkage of interface architecture that allows efficient interchange of information between existing large specialized optical, control, thermal, and structural design codes. The computing environment would be a network of large mainframe machines and its users would be project level design teams. More advanced concepts for detailed design systems would support interaction between modules and automated optimization of the entire system. Technology assessment and development plans for integrated package for initial design, interface development for detailed optimization, validation, and modeling research are presented.
NASA Astrophysics Data System (ADS)
Rauser, F.
2013-12-01
We present results from the German BMBF initiative 'High Definition Cloud and Precipitation for advancing Climate Prediction - HD(CP)2'. This initiative addresses most of the problems discussed in this session in one unified approach: cloud physics, convection, boundary layer development, radiation and subgrid variability are approached in one organizational framework. HD(CP)2 merges both the observation and the high performance computing / model development communities to tackle a shared problem: how to improve the understanding of the most important subgrid-scale processes of cloud and precipitation physics, and how to utilize this knowledge for improved climate predictions. HD(CP)2 is a coordinated initiative to (i) realize, (ii) evaluate, and (iii) statistically characterize and exploit (for the purpose of both parameterization development and cloud/precipitation feedback analysis) ultra-high-resolution (100 m in the horizontal, 10-50 m in the vertical) regional hind-casts over time periods (3-15 y) and spatial scales (1000-1500 km) that are climatically meaningful. HD(CP)2 thus consists of three elements (the model development and simulations, their observational evaluation, and exploitation/synthesis to advance CP prediction), and its first three-year phase started on 1 October 2012. As a central part of HD(CP)2, the HD(CP)2 Observational Prototype Experiment (HOPE) was carried out in spring 2013. In this campaign, high-resolution measurements with a multitude of instruments from all major centers in Germany were carried out in a limited domain, to allow for unprecedented resolution and precision in the observation of microphysical parameters at a resolution that will allow for the evaluation and improvement of ultra-high-resolution models. At the same time, a local-area version of the new climate model ICON of the Max Planck Institute and the German weather service has been developed that allows for LES-type simulations at high resolution over limited domains. The advantage of modifying an existing, evolving climate model is to share insights from high-resolution runs directly with the large-scale modelers and to allow for easy intercomparison and evaluation later on. In this presentation, we will give a short overview of HD(CP)2, show results from the observation campaign HOPE and the LES simulations of the same domain and conditions, and discuss how these will lead to an improved understanding and evaluation background for the efforts to improve fast physics in our climate model.
Darius M. Adams; Ralph J. Alig; J.M. Callaway; Bruce A. McCarl; Steven M. Winnett
1996-01-01
The Forest and Agricultural Sector Optimization Model (FASOM) is a dynamic, nonlinear programming model of the forest and agricultural sectors in the United States. The FASOM model initially was developed to evaluate welfare and market impacts of alternative policies for sequestering carbon in trees but also has been applied to a wider range of forest and agricultural...
Atmospheric Electrical Modeling in Support of the NASA F-106 Storm Hazards Project
NASA Technical Reports Server (NTRS)
Helsdon, John H., Jr.
1988-01-01
A recently developed storm electrification model (SEM) is used to investigate the operating environment of the F-106 airplane during the NASA Storm Hazards Project. The model is 2-D, time dependent, and uses a bulk-water microphysical parameterization scheme. Electric charges and fields are included, and the model is fully coupled dynamically, microphysically and electrically. One flight showed that a high electric field developed at the aircraft's operating altitude (28 kft) and that a strong electric field would also be found below 20 kft; however, this low-altitude, high-field region was associated with the presence of small hail, posing a hazard to the aircraft. An operational procedure to increase the frequency of low-altitude lightning strikes was suggested. To further the understanding of lightning within the cloud environment, a parameterization of the lightning process was included in the SEM. It accounted for the initiation, propagation, termination, and charge redistribution associated with an intracloud discharge. Finally, a randomized lightning propagation scheme was developed, and the effects of cloud particles on the initiation of lightning were investigated.
The puzzling interpretation of NIR indices: The case of NaI2.21
NASA Astrophysics Data System (ADS)
Röck, B.; Vazdekis, A.; La Barbera, F.; Peletier, R. F.; Knapen, J. H.; Allende-Prieto, C.; Aguado, D. S.
2017-11-01
We present a detailed study of the Na I line strength index centred in the K band at 22 100 Å (NaI2.21 hereafter) relying on different samples of early-type galaxies. Consistent with previous studies, we find that the observed line strength indices cannot be fit by state-of-the-art scaled-solar stellar population models, even using our newly developed models in the near infrared (NIR). The models clearly underestimate the large NaI2.21 values measured for most early-type galaxies. However, we develop an Na-enhanced version of our newly developed models in the NIR, which - together with the effect of a bottom-heavy initial mass function - yield NaI2.21 indices in the range of the observations. Therefore, we suggest a scenario in which the combined effect of [Na/Fe] enhancement and a bottom-heavy initial mass function are mainly responsible for the large NaI2.21 indices observed for most early-type galaxies. To a smaller extent, also [C/Fe] enhancement might contribute to the large observed NaI2.21 values.
Modeling of transitional flows
NASA Technical Reports Server (NTRS)
Lund, Thomas S.
1988-01-01
An effort directed at developing improved transitional models was initiated. The focus of this work was concentrated on the critical assessment of a popular existing transitional model developed by McDonald and Fish in 1972. The objective of this effort was to identify the shortcomings of the McDonald-Fish model and to use the insights gained to suggest modifications or alterations of the basic model. In order to evaluate the transitional model, a compressible boundary layer code was required. Accordingly, a two-dimensional compressible boundary layer code was developed. The program was based on a three-point fully implicit finite difference algorithm where the equations were solved in an uncoupled manner with second order extrapolation used to evaluate the non-linear coefficients. Iteration was offered as an option if the extrapolation error could not be tolerated. The differencing scheme was arranged to be second order in both spatial directions on an arbitrarily stretched mesh. A variety of boundary condition options were implemented including specification of an external pressure gradient, specification of a wall temperature distribution, and specification of an external temperature distribution. Overall the results of the initial phase of this work indicate that the McDonald-Fish model does a poor job at predicting the details of the turbulent flow structure during the transition region.
NASA Astrophysics Data System (ADS)
Alfat, Sayahdin; Kimura, Masato; Firihu, Muhammad Zamrun; Rahmat
2018-05-01
In engineering, investigating the effect of shape in elastic materials is important: shape changes the elasticity and surface energy and can increase crack propagation in the material. A two-dimensional mathematical model was developed to investigate elasticity and surface energy in elastic materials using an adaptive finite element method. In addition, the crack propagation behavior was observed for each material. The governing equations were based on the phase-field crack propagation model developed by Takaishi and Kimura. Four domain shapes were considered, all with the same material properties (Young's modulus E = 70 GPa and Poisson's ratio ν = 0.334). The assumptions were: (1) homogeneous and isotropic material; (2) no initial crack at t = 0; (3) zero initial displacement ([u1, u2] = 0) at the initial condition (t = 0); and (4) a simulation length of t = 5 with time step Δt = 0.005. Mode I/II (mixed-mode) crack propagation was used for the numerical investigation. The results accurately capture the changes in energy and the crack propagation behavior. In future work, this research can be extended to more complex phenomena and domains, and shape optimization can be investigated with the model.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jason D. Hales; Veena Tikare
2014-04-01
The Used Fuel Disposition (UFD) program has initiated a project to develop a hydride formation modeling tool using a hybrid Potts-phase field approach. The Potts model is incorporated in the SPPARKS code from Sandia National Laboratories. The phase field model is provided through MARMOT from Idaho National Laboratory.
A Latent Transition Analysis Model for Assessing Change in Cognitive Skills
ERIC Educational Resources Information Center
Li, Feiming; Cohen, Allan; Bottge, Brian; Templin, Jonathan
2016-01-01
Latent transition analysis (LTA) was initially developed to provide a means of measuring change in dynamic latent variables. In this article, we illustrate the use of a cognitive diagnostic model, the DINA model, as the measurement model in a LTA, thereby demonstrating a means of analyzing change in cognitive skills over time. An example is…
X-1 to X-Wings: Developing a Parametric Cost Model
NASA Technical Reports Server (NTRS)
Sterk, Steve; McAtee, Aaron
2015-01-01
In today's cost-constrained environment, NASA needs an X-Plane database and parametric cost model that can quickly provide rough order of magnitude predictions of cost from initial concept to first flight of potential X-Plane aircraft. This paper takes a look at the steps taken in developing such a model and reports the results. The challenges encountered in the collection of historical data and recommendations for future database management are discussed. A step-by-step discussion of the development of Cost Estimating Relationships (CERs) is then covered.
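A common form for a cost estimating relationship is a power law, cost = a * driver^b, fit by linear regression in log space. The sketch below illustrates the idea with entirely hypothetical driver and cost values; it is not the X-Plane database or the paper's CERs.

```python
import numpy as np

def fit_power_law_cer(driver, cost):
    """Fit cost = a * driver**b by least squares in log-log space."""
    b, log_a = np.polyfit(np.log(driver), np.log(cost), deg=1)
    return np.exp(log_a), b

# Hypothetical cost driver (e.g., empty weight in lb) and notional historical costs ($M)
weight = np.array([1200.0, 3400.0, 5600.0, 9800.0, 15000.0])
cost = np.array([18.0, 42.0, 61.0, 95.0, 130.0])
a, b = fit_power_law_cer(weight, cost)
print(f"CER: cost ~= {a:.3f} * weight^{b:.3f}")
print("rough order of magnitude estimate for a 7000 lb concept:", a * 7000.0 ** b)
```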
Second Generation Crop Yield Models Review
NASA Technical Reports Server (NTRS)
Hodges, T. (Principal Investigator)
1982-01-01
Second generation yield models, including crop growth simulation models and plant process models, may be suitable for large area crop yield forecasting in the yield model development project. Subjective and objective criteria for model selection are defined and models which might be selected are reviewed. Models may be selected to provide submodels as input to other models; for further development and testing; or for immediate testing as forecasting tools. A plant process model may range in complexity from several dozen submodels simulating (1) energy, carbohydrates, and minerals; (2) change in biomass of various organs; and (3) initiation and development of plant organs, to a few submodels simulating key physiological processes. The most complex models cannot be used directly in large area forecasting but may provide submodels which can be simplified for inclusion into simpler plant process models. Both published and unpublished models which may be used for development or testing are reviewed. Several other models, currently under development, may become available at a later date.
The Virtual Physiological Human - a European initiative for in silico human modelling -.
Viceconti, Marco; Clapworthy, Gordon; Van Sint Jan, Serge
2008-12-01
The Virtual Physiological Human (VPH) is an initiative, strongly supported by the European Commission (EC), that seeks to develop an integrated model of human physiology at multiple scales from the whole body through the organ, tissue, cell and molecular levels to the genomic level. VPH had its beginnings in 2005 with informal discussions amongst like-minded scientists which led to the STEP project, a Coordination Action funded by the EC that began in early 2006. The STEP project greatly accelerated the progress of the VPH and proved to be a catalyst for wide-ranging discussions within Europe and for outreach activities designed to develop a broad international approach to the huge scientific and technological challenges involved in this area. This paper provides an overview of the VPH and the developments it has engendered in the rapidly expanding worldwide activities associated with the physiome. It then uses one particular project, the Living Human Project, to illustrate the type of advances that are taking place to further the aims of the VPH and similar initiatives worldwide.
Specifications of Standards in Systems and Synthetic Biology.
Schreiber, Falk; Bader, Gary D; Golebiewski, Martin; Hucka, Michael; Kormeier, Benjamin; Le Novère, Nicolas; Myers, Chris; Nickerson, David; Sommer, Björn; Waltemath, Dagmar; Weise, Stephan
2015-09-04
Standards shape our everyday life. From nuts and bolts to electronic devices and technological processes, standardised products and processes are all around us. Standards have technological and economic benefits, such as making information exchange, production, and services more efficient. However, novel, innovative areas often either lack proper standards, or documents about standards in these areas are not available from a centralised platform or formal body (such as the International Standardisation Organisation). Systems and synthetic biology is a relatively novel area, and it is only in the last decade that the standardisation of data, information, and models related to systems and synthetic biology has become a community-wide effort. Several open standards have been established and are under continuous development as a community initiative. COMBINE, the 'COmputational Modeling in BIology' NEtwork, has been established as an umbrella initiative to coordinate and promote the development of the various community standards and formats for computational models. Two types of meetings are held yearly: HARMONY (Hackathons on Resources for Modeling in Biology), hackathon-type meetings focused on developing support for the standards, and COMBINE forums, workshop-style events with oral presentations, discussions, posters, and breakout sessions for further developing the standards. For more information see http://co.mbine.org/. So far, the different standards have been published and made accessible through the standards' web pages or preprint services. The aim of this special issue is to provide a single, easily accessible and citable platform for the publication of standards in systems and synthetic biology. This special issue is intended to serve as a central access point to standards and related initiatives in systems and synthetic biology; it will be published annually to provide an opportunity for standard development groups to communicate updated specifications.
Vaught, Jimmie; Rogers, Joyce; Carolin, Todd; Compton, Carolyn
2011-01-01
The preservation of high-quality biospecimens and associated data for research purposes is being performed in a variety of academic, government, and industrial settings. Often these are multimillion dollar operations, yet despite these sizable investments, the economics of biobanking initiatives is not well understood. Fundamental business principles must be applied to the development and operation of such resources to ensure their long-term sustainability and maximize their impact. The true costs of developing and maintaining operations, which may have a variety of funding sources, must be better understood. Among the issues that must be considered when building a biobank economic model are: understanding the market need for the particular type of biobank under consideration and understanding and efficiently managing the biobank's "value chain," which includes costs for case collection, tissue processing, storage management, sample distribution, and infrastructure and administration. By using these value chain factors, a Total Life Cycle Cost of Ownership (TLCO) model may be developed to estimate all costs arising from owning, operating, and maintaining a large centralized biobank. The TLCO approach allows for a better delineation of a biobank's variable and fixed costs, data that will be needed to implement any cost recovery program. This article represents an overview of the efforts made recently by the National Cancer Institute's Office of Biorepositories and Biospecimen Research as part of its effort to develop an appropriate cost model and cost recovery program for the cancer HUman Biobank (caHUB) initiative. All of these economic factors are discussed in terms of maximizing caHUB's potential for long-term sustainability but have broad applicability to the wide range of biobanking initiatives that currently exist.
Dong, Min; McGann, Patrick T; Mizuno, Tomoyuki; Ware, Russell E; Vinks, Alexander A
2016-04-01
Hydroxyurea has emerged as the primary disease-modifying therapy for patients with sickle cell anaemia (SCA). The laboratory and clinical benefits of hydroxyurea are optimal at maximum tolerated dose (MTD), but the current empirical dose escalation process often takes up to 12 months. The purpose of this study was to develop a pharmacokinetic-guided dosing strategy to reduce the time required to reach hydroxyurea MTD in children with SCA. Pharmacokinetic (PK) data from the HUSTLE trial (NCT00305175) were used to develop a population PK model using non-linear mixed effects modelling (nonmem 7.2). A D-optimal sampling strategy was developed to estimate individual PK and hydroxyurea exposure (area under the concentration-time curve (AUC)). The initial AUC target was derived from HUSTLE clinical data and defined as the mean AUC at MTD. PK profiles were best described by a one-compartment model with Michaelis-Menten elimination and transit absorption. Body weight and cystatin C were identified as significant predictors of hydroxyurea clearance. The following clinically feasible sampling times are included in a new prospective protocol: pre-dose (baseline), 15-20 min, 50-60 min and 3 h after an initial 20 mg/kg oral dose. The mean target AUC(0,∞) for initial dose titration was 115 mg·h/L. We developed a PK model-based individualized dosing strategy for the prospective Therapeutic Response Evaluation and Adherence Trial (TREAT, ClinicalTrials.gov NCT02286154). This approach has the potential to optimize the dose titration of hydroxyurea therapy for children with SCA, such that the clinical benefits at MTD are achieved more quickly. © 2015 The British Pharmacological Society.
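A hedged sketch of the model structure described above follows: a one-compartment model with Michaelis-Menten elimination fed by a chain of transit absorption compartments. Every parameter value below (number of transit compartments, rate constants, volume, body weight, dose) is an illustrative placeholder, not a published HUSTLE estimate.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Placeholder parameters (illustrative only, not the population estimates)
N_TRANSIT = 3      # number of transit compartments
KTR = 2.0          # transit rate constant (1/h)
VMAX = 30.0        # maximum elimination rate (mg/h)
KM = 15.0          # Michaelis constant (mg/L)
V = 20.0           # central volume of distribution (L)

def rhs(t, y):
    """y = [transit_1, ..., transit_N, amount_central] in mg."""
    transit, a_central = y[:N_TRANSIT], y[-1]
    dydt = np.empty_like(y)
    dydt[0] = -KTR * transit[0]
    for i in range(1, N_TRANSIT):
        dydt[i] = KTR * (transit[i - 1] - transit[i])
    conc = a_central / V
    dydt[-1] = KTR * transit[-1] - VMAX * conc / (KM + conc)
    return dydt

dose_mg = 20.0 * 30.0                      # 20 mg/kg for a hypothetical 30 kg child
y0 = np.zeros(N_TRANSIT + 1)
y0[0] = dose_mg                            # dose enters the first transit compartment
sol = solve_ivp(rhs, (0.0, 24.0), y0, max_step=0.1)
conc = sol.y[-1] / V                       # central concentration (mg/L)
auc = float(np.sum(np.diff(sol.t) * (conc[:-1] + conc[1:]) / 2.0))   # trapezoid rule
print(f"simulated AUC(0-24h): {auc:.1f} mg*h/L")
```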
Developing Practice: Teaching Teachers Today for Tomorrow
ERIC Educational Resources Information Center
Mays, Tony John
2011-01-01
This paper argues that the development of classroom practice is central to the purpose of the IPET (initial professional education and training) of teachers. Notwithstanding the growing use of ICTs (information and communication technologies), both in teacher development and school classrooms, the normative modeling of appropriate contact-based…
NASA Astrophysics Data System (ADS)
Lee, H.; Seo, D.; Koren, V.
2008-12-01
A prototype 4DVAR (four-dimensional variational) data assimilator for gridded Sacramento soil-moisture accounting and kinematic-wave routing models in the Hydrology Laboratory's Research Distributed Hydrologic Model (HL-RDHM) has been developed. The prototype assimilates streamflow and in-situ soil moisture data and adjusts gridded precipitation and climatological potential evaporation data to reduce uncertainty in the model initial conditions for improved monitoring and prediction of streamflow and soil moisture at the outlet and interior locations within the catchment. Due to the large number of degrees of freedom involved, data assimilation (DA) into distributed hydrologic models is complex. To understand and assess the sensitivity of the performance of DA to uncertainties in the model initial conditions and in the data, two synthetic experiments have been carried out in an ensemble framework. Results from the synthetic experiments shed much light on the potential and limitations of DA in distributed models. For initial real-world assessment, the prototype DA has also been applied to the headwater basin at Eldon near the Oklahoma-Arkansas border. We present these results and describe the next steps.
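For reference, the cost function minimized in a strong-constraint 4D-Var scheme of this kind can be written in standard notation (a generic form, not the specific HL-RDHM implementation) as

\[
J(\mathbf{x}_0) = \tfrac{1}{2}\,(\mathbf{x}_0 - \mathbf{x}^b)^{\mathrm{T}} \mathbf{B}^{-1} (\mathbf{x}_0 - \mathbf{x}^b)
+ \tfrac{1}{2} \sum_{k=0}^{K} \big[ H_k(M_{0\rightarrow k}(\mathbf{x}_0)) - \mathbf{y}_k \big]^{\mathrm{T}} \mathbf{R}_k^{-1} \big[ H_k(M_{0\rightarrow k}(\mathbf{x}_0)) - \mathbf{y}_k \big],
\]

where \(\mathbf{x}_0\) is the control vector (here the adjusted initial states and forcings), \(\mathbf{x}^b\) the background, \(M_{0\rightarrow k}\) the model integration to time k, \(H_k\) the observation operator mapping to the streamflow and soil moisture observations \(\mathbf{y}_k\), and \(\mathbf{B}\) and \(\mathbf{R}_k\) the background and observation error covariances.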
Zhang, Miaomiao; Wells, William M; Golland, Polina
2017-10-01
We present an efficient probabilistic model of anatomical variability in a linear space of initial velocities of diffeomorphic transformations and demonstrate its benefits in clinical studies of brain anatomy. To overcome the computational challenges of the high dimensional deformation-based descriptors, we develop a latent variable model for principal geodesic analysis (PGA) based on a low dimensional shape descriptor that effectively captures the intrinsic variability in a population. We define a novel shape prior that explicitly represents principal modes as a multivariate complex Gaussian distribution on the initial velocities in a bandlimited space. We demonstrate the performance of our model on a set of 3D brain MRI scans from the Alzheimer's Disease Neuroimaging Initiative (ADNI) database. Our model yields a more compact representation of group variation at substantially lower computational cost than state-of-the-art methods such as tangent space PCA (TPCA) and probabilistic principal geodesic analysis (PPGA), which operate in the high dimensional image space. Copyright © 2017 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
Weinberg, David H.; Gott, J. Richard, III; Melott, Adrian L.
1987-01-01
Many models for the formation of galaxies and large-scale structure assume a spectrum of random phase (Gaussian), small-amplitude density fluctuations as initial conditions. In such scenarios, the topology of the galaxy distribution on large scales relates directly to the topology of the initial density fluctuations. Here a quantitative measure of topology - the genus of contours in a smoothed density distribution - is described and applied to numerical simulations of galaxy clustering, to a variety of three-dimensional toy models, and to a volume-limited sample of the CfA redshift survey. For random phase distributions the genus of density contours exhibits a universal dependence on threshold density. The clustering simulations show that a smoothing length of 2-3 times the mass correlation length is sufficient to recover the topology of the initial fluctuations from the evolved galaxy distribution. Cold dark matter and white noise models retain a random phase topology at shorter smoothing lengths, but massive neutrino models develop a cellular topology.
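For reference, the random phase (Gaussian) genus curve against which such measurements are compared has the standard analytic form

\[
g(\nu) \;\propto\; (1 - \nu^{2})\, e^{-\nu^{2}/2},
\]

where \(\nu\) is the density threshold in units of the standard deviation of the smoothed field; positive genus near \(\nu = 0\) indicates a sponge-like topology, while negative genus at large \(|\nu|\) indicates isolated clusters or voids.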
Quasi steady-state aerodynamic model development for race vehicle simulations
NASA Astrophysics Data System (ADS)
Mohrfeld-Halterman, J. A.; Uddin, M.
2016-01-01
Presented in this paper is a procedure to develop a high fidelity quasi steady-state aerodynamic model for use in race car vehicle dynamic simulations. Developed to fit quasi steady-state wind tunnel data, the aerodynamic model is regressed against three independent variables: front ground clearance, rear ride height, and yaw angle. An initial dual range model is presented and then further refined to reduce the model complexity while maintaining a high level of predictive accuracy. The model complexity reduction decreases the required amount of wind tunnel data thereby reducing wind tunnel testing time and cost. The quasi steady-state aerodynamic model for the pitch moment degree of freedom is systematically developed in this paper. This same procedure can be extended to the other five aerodynamic degrees of freedom to develop a complete six degree of freedom quasi steady-state aerodynamic model for any vehicle.
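As a rough illustration of regressing an aerodynamic coefficient against the three independent variables named above (front ground clearance, rear ride height, and yaw angle), the sketch below fits a low-order polynomial response surface to synthetic data. It is not the authors' dual-range model; the ranges, coefficients, and noise level are invented.

```python
import numpy as np

def design_matrix(hf, hr, yaw):
    """Quadratic response surface in front clearance, rear ride height, and yaw."""
    return np.column_stack([
        np.ones_like(hf), hf, hr, yaw,
        hf**2, hr**2, yaw**2, hf*hr, hf*yaw, hr*yaw,
    ])

rng = np.random.default_rng(2)
hf = rng.uniform(20, 60, 300)      # front ground clearance (mm)
hr = rng.uniform(40, 90, 300)      # rear ride height (mm)
yaw = rng.uniform(-6, 6, 300)      # yaw angle (deg)
# Synthetic pitch-moment coefficient standing in for wind tunnel data
cm = 0.02 - 0.0015*hf + 0.0011*hr + 0.004*yaw**2 + rng.normal(scale=0.002, size=300)

X = design_matrix(hf, hr, yaw)
coeffs, *_ = np.linalg.lstsq(X, cm, rcond=None)
pred = X @ coeffs
print("RMS residual:", np.sqrt(np.mean((cm - pred)**2)))
```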
ImTK: an open source multi-center information management toolkit
NASA Astrophysics Data System (ADS)
Alaoui, Adil; Ingeholm, Mary Lou; Padh, Shilpa; Dorobantu, Mihai; Desai, Mihir; Cleary, Kevin; Mun, Seong K.
2008-03-01
The Information Management Toolkit (ImTK) Consortium is an open source initiative to develop robust, freely available tools related to the information management needs of basic, clinical, and translational research. An open source framework and agile programming methodology can enable distributed software development while an open architecture will encourage interoperability across different environments. The ISIS Center has conceptualized a prototype data sharing network that simulates a multi-center environment based on a federated data access model. This model includes the development of software tools to enable efficient exchange, sharing, management, and analysis of multimedia medical information such as clinical information, images, and bioinformatics data from multiple data sources. The envisioned ImTK data environment will include an open architecture and data model implementation that complies with existing standards such as Digital Imaging and Communications in Medicine (DICOM), Health Level 7 (HL7), and the technical framework and workflow defined by the Integrating the Healthcare Enterprise (IHE) Information Technology Infrastructure initiative, mainly the Cross Enterprise Document Sharing (XDS) specifications.
Operations and support cost modeling of conceptual space vehicles
NASA Technical Reports Server (NTRS)
Ebeling, Charles
1994-01-01
The University of Dayton is pleased to submit this annual report to the National Aeronautics and Space Administration (NASA) Langley Research Center which documents the development of an operations and support (O&S) cost model as part of a larger life cycle cost (LCC) structure. It is intended for use during the conceptual design of new launch vehicles and spacecraft. This research is being conducted under NASA Research Grant NAG-1-1327. This research effort changes the focus from that of the first two years, in which a reliability and maintainability model was developed, to the initial development of an operations and support life cycle cost model. Cost categories were initially patterned after NASA's three-axis work breakdown structure consisting of a configuration axis (vehicle), a function axis, and a cost axis. A revised cost element structure (CES), which is currently under study by NASA, was used to establish the basic cost elements used in the model. While the focus of the effort was on operations and maintenance costs and other recurring costs, the computerized model allowed for other cost categories such as RDT&E and production costs to be addressed. Secondary tasks performed concurrent with the development of the costing model included support and upgrades to the reliability and maintainability (R&M) model. The primary result of the current research has been a methodology and a computer implementation of the methodology to provide for timely operations and support cost analysis during the conceptual design activities.
2012-01-01
Background The Danish Multiple Sclerosis Society initiated a large-scale bridge building and integrative treatment project to take place from 2004–2010 at a specialized Multiple Sclerosis (MS) hospital. In this project, a team of five conventional health care practitioners and five alternative practitioners was set up to work together in developing and offering individualized treatments to 200 people with MS. The purpose of this paper is to present results from the six year treatment collaboration process regarding the development of an integrative treatment model. Discussion The collaborative work towards an integrative treatment model for people with MS involved six steps: 1) Working with an initial model; 2) Unfolding the different treatment philosophies; 3) Discussing the elements of the Intervention-Mechanism-Context-Outcome scheme (the IMCO-scheme); 4) Phrasing the common assumptions for an integrative MS program theory; 5) Developing the integrative MS program theory; and 6) Building the integrative MS treatment model. The model includes important elements of the different treatment philosophies represented in the team and thereby describes a common understanding of the complexity of the courses of treatment. Summary An integrative team of practitioners has developed an integrative model for combined treatments of people with Multiple Sclerosis. The model unites different treatment philosophies and focuses on process-oriented factors and the strengthening of the patients' resources and competences on a physical, an emotional and a cognitive level. PMID:22524586
Impact of the Alzheimer's Disease Neuroimaging Initiative, 2004 to 2014.
Weiner, Michael W; Veitch, Dallas P; Aisen, Paul S; Beckett, Laurel A; Cairns, Nigel J; Cedarbaum, Jesse; Donohue, Michael C; Green, Robert C; Harvey, Danielle; Jack, Clifford R; Jagust, William; Morris, John C; Petersen, Ronald C; Saykin, Andrew J; Shaw, Leslie; Thompson, Paul M; Toga, Arthur W; Trojanowski, John Q
2015-07-01
The Alzheimer's Disease Neuroimaging Initiative (ADNI) was established in 2004 to facilitate the development of effective treatments for Alzheimer's disease (AD) by validating biomarkers for AD clinical trials. We searched for ADNI publications using established methods. ADNI has (1) developed standardized biomarkers for use in clinical trial subject selection and as surrogate outcome measures; (2) standardized protocols for use across multiple centers; (3) initiated worldwide ADNI; (4) inspired initiatives investigating traumatic brain injury and post-traumatic stress disorder in military populations, and depression, respectively, as an AD risk factor; (5) acted as a data-sharing model; (6) generated data used in over 600 publications, leading to the identification of novel AD risk alleles, and an understanding of the relationship between biomarkers and AD progression; and (7) inspired other public-private partnerships developing biomarkers for Parkinson's disease and multiple sclerosis. ADNI has made myriad impacts in its first decade. A competitive renewal of the project in 2015 would see the use of newly developed tau imaging ligands, and the continued development of recruitment strategies and outcome measures for clinical trials. Copyright © 2015 The Alzheimer's Association. All rights reserved.
Crowell, Sheila E.; Beauchaine, Theodore P.; Linehan, Marsha M.
2009-01-01
Over the past several decades, research has focused increasingly on developmental precursors to psychological disorders that were previously assumed to emerge only in adulthood. This change in focus follows from the recognition that complex transactions between biological vulnerabilities and psychosocial risk factors shape emotional and behavioral development beginning at conception. To date, however, empirical research on the development of borderline personality is extremely limited. Indeed, in the decade since M. M. Linehan initially proposed a biosocial model of the development of borderline personality disorder, there have been few attempts to test the model among at-risk youth. In this review, diverse literatures are reviewed that can inform understanding of the ontogenesis of borderline pathology, and testable hypotheses are proposed to guide future research with at-risk children and adolescents. One probable pathway is identified that leads to borderline personality disorder; it begins with early vulnerability, expressed initially as impulsivity and followed by heightened emotional sensitivity. These vulnerabilities are potentiated across development by environmental risk factors that give rise to more extreme emotional, behavioral, and cognitive dysregulation. PMID:19379027
A Damage Model for the Simulation of Delamination in Advanced Composites under Variable-Mode Loading
NASA Technical Reports Server (NTRS)
Turon, A.; Camanho, P. P.; Costa, J.; Davila, C. G.
2006-01-01
A thermodynamically consistent damage model is proposed for the simulation of progressive delamination in composite materials under variable-mode ratio. The model is formulated in the context of Damage Mechanics. A novel constitutive equation is developed to model the initiation and propagation of delamination. A delamination initiation criterion is proposed to assure that the formulation can account for changes in the loading mode in a thermodynamically consistent way. The formulation accounts for crack closure effects to avoid interfacial penetration of two adjacent layers after complete decohesion. The model is implemented in a finite element formulation, and the numerical predictions are compared with experimental results obtained in both composite test specimens and structural components.
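Mixed-mode delamination models of this kind commonly compare the energy release rate against a mode-dependent critical value through an interaction criterion. A widely used form is the Benzeggagh-Kenane expression, shown here as a representative example rather than necessarily the paper's exact formulation:

\[
G_c = G_{Ic} + \left(G_{IIc} - G_{Ic}\right) \left(\frac{G_{\mathrm{shear}}}{G_T}\right)^{\eta},
\qquad G_T = G_I + G_{\mathrm{shear}},
\]

where \(\eta\) is a material parameter fitted to mixed-mode test data; delamination propagates when the total energy release rate \(G_T\) reaches \(G_c\) for the current mode ratio.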
NASA Technical Reports Server (NTRS)
Volk, Tyler
1992-01-01
The goal of this research is to develop a progressive series of mathematical models for the CELSS hydroponic crops. These models will systematize the experimental findings from the crop researchers in the CELSS Program into a form useful to investigate system-level considerations, for example, dynamic studies of the CELSS Initial Reference Configurations. The crop models will organize data from different crops into a common modeling framework. This is the fifth semiannual report for this project. The following topics are discussed: (1) use of field crop models to explore phasic control of CELSS crops for optimizing yield; (2) seminar presented at Purdue CELSS NSCORT; and (3) paper submitted on analysis of bioprocessing of inedible plant materials.
NASA Astrophysics Data System (ADS)
Divayana, D. G. H.; Adiarta, A.; Abadi, I. B. G. S.
2018-01-01
The aim of this research was to create an initial design for a CSE-UCLA evaluation model modified with the Weighted Product method for evaluating digital library services at computer colleges in Bali. The method used was developmental research, following the Borg and Gall design model. The result obtained from the research conducted earlier this month was a rough sketch of a Weighted Product-based CSE-UCLA evaluation model; the design provides a general overview of the stages of the Weighted Product-based CSE-UCLA evaluation model used to optimize the digital library services at the computer colleges in Bali.
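The Weighted Product step referred to above can be illustrated with a minimal multi-criteria scoring sketch; the criteria, weights, and scores below are hypothetical and are not the evaluation instrument itself.

```python
import numpy as np

def weighted_product_scores(matrix, weights, cost_criteria=()):
    """Weighted Product method: S_i = prod_j x_ij ** w_j (negative w_j for cost criteria)."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                               # normalize weights to sum to 1
    signs = np.ones_like(w)
    signs[list(cost_criteria)] = -1.0             # cost criteria get negative exponents
    s = np.prod(np.asarray(matrix, dtype=float) ** (signs * w), axis=1)
    return s / s.sum()                            # relative preference vector

# Hypothetical digital-library aspects scored on four criteria (the last one is a cost)
scores = [[4, 3, 5, 2],
          [3, 4, 4, 3],
          [5, 4, 3, 4]]
weights = [0.3, 0.2, 0.3, 0.2]
print(weighted_product_scores(scores, weights, cost_criteria=[3]))
```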
Computerized Adaptive Assessment of Personality Disorder: Introducing the CAT-PD Project
Simms, Leonard J.; Goldberg, Lewis R.; Roberts, John E.; Watson, David; Welte, John; Rotterman, Jane H.
2011-01-01
Assessment of personality disorders (PD) has been hindered by reliance on the problematic categorical model embodied in the most recent Diagnostic and Statistical Manual of Mental Disorders (DSM), lack of consensus among alternative dimensional models, and inefficient measurement methods. This article describes the rationale for and early results from an NIMH-funded, multi-year study designed to develop an integrative and comprehensive model and efficient measure of PD trait dimensions. To accomplish these goals, we are in the midst of a five-phase project to develop and validate the model and measure. The results of Phase 1 of the project, which was focused on developing the PD traits to be assessed and the initial item pool, resulted in a candidate list of 59 PD traits and an initial item pool of 2,589 items. Data collection and structural analyses in community and patient samples will inform the ultimate structure of the measure, and computerized adaptive testing (CAT) will permit efficient measurement of the resultant traits. The resultant Computerized Adaptive Test of Personality Disorder (CAT-PD) will be well positioned as a measure of the proposed DSM-5 PD traits. Implications for both applied and basic personality research are discussed. PMID:22804677
An animal model of tinnitus: a decade of development.
Jastreboff, P J; Sasaki, C T
1994-01-01
Although tinnitus affects approximately 9 million people in the United States, a cure remains elusive and the mechanisms of its origin are speculative. The crucial obstacle in tinnitus research has been the lack of an animal model. Over the last decade we have been creating such a model by combining a variety of methodologies, including a behavioral component, to allow for the detection of tinnitus perception. Initially, 2-deoxyglucose had been used to map changes in the metabolic activity after unilateral destruction of the cochlea. It has been found that the initial decrease of the metabolic rate in the auditory nuclei recovered to preoperative values, which could be attributable to the development of tinnitus. The spontaneous activity of single units recorded from the inferior colliculus before and after salicylate administration revealed an increase of discharges, which might reflect the presence of salicylate-induced tinnitus. Recent data have confirmed, and further elaborated this observation, including the discovery of abnormal, epileptic-like, neuronal activity. Finally, the authors have developed a behavioral model of tinnitus, tested it extensively, and used it to measure tinnitus pitch and loudness. The model is presently used for investigating the hypotheses for the mechanisms of tinnitus.
Organosolv delignification of Eucalyptus globulus: Kinetic study of autocatalyzed ethanol pulping
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oliet, M.; Rodriguez, F.; Santos, A.
2000-01-01
The autocatalyzed delignification of Eucalyptus globulus in 50% ethanol (w/w) was modeled as the irreversible and consecutive dissolution of initial, bulk, and residual lignin. Their respective contributions to total lignin were estimated as 9, 75, and 16%. Isothermal pulping experiments were carried out to evaluate an empirical kinetic model among eight proposals corresponding to different reaction schemes. The calculated activation energy was found to be 96.5, 98.5, and 40.8 kJ/mol for initial, bulk, and residual delignification, respectively. The influence of hydrogen ion concentration was expressed by a power-law function model. The kinetic model developed here was validated using data from nonisothermal pulping runs.
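A minimal sketch of a three-fraction, first-order delignification scheme with Arrhenius rate constants follows. For simplicity the fractions are treated here as independent first-order pools, a simplification of the consecutive scheme in the abstract; the activation energies and fraction shares are those reported above, while the pre-exponential factors and temperature are placeholders.

```python
import numpy as np

R = 8.314e-3          # gas constant, kJ/(mol*K)

# Activation energies from the abstract (kJ/mol); pre-exponential factors are placeholders
EA = {"initial": 96.5, "bulk": 98.5, "residual": 40.8}
A0 = {"initial": 3.0e10, "bulk": 3.5e9, "residual": 1.5e2}    # 1/min, illustrative only
FRACTION = {"initial": 0.09, "bulk": 0.75, "residual": 0.16}  # share of total lignin

def residual_lignin(t_min, temp_K):
    """Fraction of total lignin remaining after t_min at temp_K (parallel first-order pools)."""
    total = 0.0
    for phase in EA:
        k = A0[phase] * np.exp(-EA[phase] / (R * temp_K))     # Arrhenius rate constant
        total += FRACTION[phase] * np.exp(-k * t_min)
    return total

for t in (0, 30, 60, 120):
    print(t, "min ->", round(residual_lignin(t, temp_K=458.0), 3))   # ~185 degrees C
```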
NASA Astrophysics Data System (ADS)
Temnikov, A. G.; Chernenskii, L. L.; Orlov, A. V.; Lysov, N. Yu.; Belova, O. S.; Gerastenok, T. K.; Zhuravkova, D. S.
2017-12-01
We have experimentally studied how arrays of model coarse hydrometeors influence the initiation and propagation of discharge between an artificial-thunderstorm cell of negative or positive polarity and the ground. It is established for the first time that the probability of initiation and stimulation of a channeled discharge between negatively or positively charged cloud and the ground significantly depends on the shape and size of coarse hydrometeors occurring near the thunderstorm cell boundaries. The obtained results can be used in developing methods for the artificial initiation of the cloud-ground type lightning of both polarities and targeted discharge of thunderstorm clouds.
Fatigue and fracture: Overview
NASA Technical Reports Server (NTRS)
Halford, G. R.
1984-01-01
A brief overview of the status of the fatigue and fracture programs is given. The programs involve the development of appropriate analytic material behavior models for cyclic stress-strain-temperature-time response, cyclic crack initiation, and cyclic crack propagation. The underlying thrust of these programs is the development and verification of workable engineering methods for the calculation, in advance of service, of the local cyclic stress-strain response at the critical life-governing location in hot section components, and the resultant crack initiation and crack growth lifetimes.
The development of structure in the expanding universe
NASA Technical Reports Server (NTRS)
Silk, J.; White, S. D.
1978-01-01
A model for clustering in an expanding universe is developed based on an application of the coagulation equation to the collision and aggregation of bound condensations. While the growth rate of clustering is determined by the rate at which density fluctuations reach the nonlinear regime and therefore depends on the initial fluctuation spectrum, the mass spectrum rapidly approaches a self-similar limiting form. This form is determined by the tidal processes which lead to the merging of condensations, and is not dependent on initial conditions.
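The coagulation equation referred to above is the Smoluchowski equation; in its standard continuous form (generic notation, with the kernel K absorbing the tidal merging physics discussed in the abstract) it reads

\[
\frac{\partial n(m,t)}{\partial t} = \frac{1}{2} \int_0^{m} K(m', m - m')\, n(m',t)\, n(m - m',t)\, dm'
\;-\; n(m,t) \int_0^{\infty} K(m, m')\, n(m',t)\, dm',
\]

where \(n(m,t)\,dm\) is the number density of bound condensations with mass in \([m, m+dm]\) and \(K(m,m')\) is the rate at which condensations of masses m and m' collide and merge.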
NASA Technical Reports Server (NTRS)
Lapenta, William M.; Bradshaw, Tom; Burks, Jason; Darden, Chris; Dembek, Scott
2003-01-01
It is well known that numerical warm season quantitative precipitation forecasts lack significant skill for numerous reasons. Some are related to the model: it may lack physical processes required to realistically simulate convection, or the numerical algorithms and dynamics employed may not be adequate. Others are related to initialization: mesoscale features play an important role in convective initiation, and atmospheric observation systems are incapable of properly depicting the three-dimensional stability structure at the mesoscale. The purpose of this study is to determine if a mesoscale model initialized with a diabatic initialization scheme can improve short-term (0 to 12 h) warm season quantitative precipitation forecasts in the Southeastern United States. The Local Analysis and Prediction System (LAPS) developed at the Forecast Systems Laboratory is used to diabatically initialize the Pennsylvania State University/National Center for Atmospheric Research (PSU/NCAR) Mesoscale Model version 5 (MM5). The SPoRT Center runs LAPS operationally on an hourly cycle to produce analyses on a 15 km grid covering the eastern 2/3 of the United States. The 20 km National Centers for Environmental Prediction (NCEP) Rapid Update Cycle analyses are used for the background fields. Standard observational data are acquired from MADIS, with GOES and CRAFT NEXRAD data acquired from in-house feeds. The MM5 is configured on a 140 x 140, 12 km grid centered on Huntsville, Alabama. Preliminary results indicate that MM5 runs initialized with LAPS produce improved 6 and 12 h QPF threat scores compared with those initialized with the NCEP RUC.
Multiscale Materials Modeling in an Industrial Environment.
Weiß, Horst; Deglmann, Peter; In 't Veld, Pieter J; Cetinkaya, Murat; Schreiner, Eduard
2016-06-07
In this review, we sketch the materials modeling process in industry. We show that predictive and fast modeling is a prerequisite for successful participation in research and development processes in the chemical industry. Stable and highly automated workflows suitable for handling complex systems are a must. In particular, we review approaches to build and parameterize soft matter systems. By satisfying these prerequisites, efficiency for the development of new materials can be significantly improved, as exemplified here for formulation polymer development. This is in fact in line with recent Materials Genome Initiative efforts sponsored by the US government. Valuable contributions to product development are possible today by combining existing modeling techniques in an intelligent fashion, provided modeling and experiment work hand in hand.
Application of Interface Technology in Progressive Failure Analysis of Composite Panels
NASA Technical Reports Server (NTRS)
Sleight, D. W.; Lotts, C. G.
2002-01-01
A progressive failure analysis capability using interface technology is presented. The capability has been implemented in the COMET-AR finite element analysis code developed at the NASA Langley Research Center and is demonstrated on composite panels. The composite panels are analyzed for damage initiation and propagation from initial loading to final failure using a progressive failure analysis capability that includes both geometric and material nonlinearities. Progressive failure analyses are performed on conventional models and interface technology models of the composite panels. Analytical results and the computational effort of the analyses are compared for the conventional models and interface technology models. The analytical results predicted with the interface technology models are in good correlation with the analytical results using the conventional models, while significantly reducing the computational effort.
NASA Technical Reports Server (NTRS)
Khazanov, G. V.; Gamayunov, K. V.; Jordanova, V. K.; Krivorutsky, E. N.; Whitaker, Ann F. (Technical Monitor)
2001-01-01
Initial results from the newly developed model of interacting ring current ions and ion cyclotron waves are presented. The model is described by a system of two coupled kinetic equations: one equation describes the ring current ion dynamics, and the other describes the wave evolution. Such a system gives a self-consistent description of the ring current ions and ion cyclotron waves in a quasilinear approach. Calculating ion-wave relationships on a global scale under non-steady-state conditions during the May 2-5, 1998 storm, we present data at three time cuts around the initial, main, and late recovery phases of the May 4, 1998 storm. The structure and dynamics of the ring current proton precipitating flux regions and the wave-active regions are discussed in detail.
Development and initial validation of the Impression Motivation in Sport Questionnaire-Team.
Payne, Simon Mark; Hudson, Joanne; Akehurst, Sally; Ntoumanis, Nikos
2013-06-01
Impression motivation is an important individual difference variable that has been under-researched in sport psychology, partly due to having no appropriate measure. This study was conducted to design a measure of impression motivation in team-sport athletes. Construct validity checks decreased the initial pool of items, factor analysis (n = 310) revealed the structure of the newly developed scale, and exploratory structural equation modeling procedures (n = 406) resulted in a modified scale that retained theoretical integrity and psychometric parsimony. This process produced a 15-item, 4-factor model; the Impression Motivation in Sport Questionnaire-Team (IMSQ-T) is forwarded as a valid measure of the respondent's dispositional strength of motivation to use self-presentation in striving for four distinct interpersonal objectives: self-development, social identity development, avoidance of negative outcomes, and avoidance of damaging impressions. The availability of this measure has contributed to theoretical development, will facilitate research, and offers a tool for use in applied settings.
Paving the critical path: how can clinical pharmacology help achieve the vision?
Lesko, L J
2007-02-01
It has been almost 3 years since the launch of the FDA critical path initiative following the publication of the paper "Innovation or Stagnation: Challenges and Opportunities on the Critical Path of New Medical Product Development." The initiative was intended to create a sense of urgency within the drug development enterprise to address the so-called "productivity problem" in modern drug development. Clinical pharmacologists are strategically aligned with solutions designed to reduce late-phase clinical trial failures to show adequate efficacy and/or safety. This article reviews some of the ways that clinical pharmacologists can lead and implement change in the drug development process. It includes a discussion of model-based, semi-mechanistic drug development, drug/disease models that facilitate informed clinical trial designs and optimal dosing, the qualification process and criteria for new biomarkers and surrogate endpoints, approaches to streamlining clinical trials, and new types of interaction between industry and FDA such as the end-of-phase 2A and voluntary genomic data submission meetings, respectively.
NASA Technical Reports Server (NTRS)
Klumpar, D. M. (Principal Investigator)
1982-01-01
The status of the initial testing of the modeling procedure developed to compute the magnetic fields at satellite orbit due to current distributions in the ionosphere and magnetosphere is reported. The modeling technique utilizes a linear current element representation of the large scale space-current system.
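A linear-current-element representation of this kind can be evaluated at a satellite location with the Biot-Savart law. The sketch below is a generic implementation with made-up element geometry and current, not the reported modeling procedure.

```python
import numpy as np

MU0_OVER_4PI = 1.0e-7          # T*m/A

def biot_savart(obs, starts, ends, currents):
    """Magnetic field (T) at point obs from straight current elements.

    Each element is approximated by its midpoint: dB = mu0/4pi * I (dl x r_vec) / r^3.
    """
    b = np.zeros(3)
    for p0, p1, cur in zip(starts, ends, currents):
        dl = p1 - p0
        r_vec = obs - 0.5 * (p0 + p1)               # from element midpoint to observer
        r = np.linalg.norm(r_vec)
        b += MU0_OVER_4PI * cur * np.cross(dl, r_vec) / r**3
    return b

# Toy example: segments of a line current standing in for a field-aligned current sheet
z = np.linspace(-1.0e6, 1.0e6, 21)                  # metres
starts = np.column_stack([np.zeros(20), np.zeros(20), z[:-1]])
ends = np.column_stack([np.zeros(20), np.zeros(20), z[1:]])
currents = np.full(20, 1.0e6)                       # 1 MA, illustrative
obs = np.array([5.0e5, 0.0, 0.0])                   # observation point 500 km off the line
print("B at observation point (nT):", biot_savart(obs, starts, ends, currents) * 1e9)
```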
VISIONS2 Learning for Life Initiative. Workplace Literacy Implementation Model.
ERIC Educational Resources Information Center
Walsh, Chris L.; Ferguson, Susan E.; Taylor, Mary Lou
This document presents a model for implementing workplace literacy education that focuses on giving front-line workers or first-line workers basic skills instruction and an appreciation for lifelong learning. The introduction presents background information on the model, which was developed during a partnership between a technical college and an…
Stratospheric chemistry and transport
NASA Technical Reports Server (NTRS)
Prather, Michael; Garcia, Maria M.
1990-01-01
A Chemical Tracer Model (CTM) that can use wind field data generated by the General Circulation Model (GCM) is developed to implement chemistry in the three dimensional GCM of the middle atmosphere. Initially, chemical tracers with simple first order losses such as N2O are used. Successive models are to incorporate more complex ozone chemistry.
ERIC Educational Resources Information Center
Goswami, Usha
1993-01-01
Three experiments on vowel decoding involving primary school children partially tested an interactive model of reading acquisition. The model suggests that children begin learning to read by establishing orthographic recognition units for words that have phonological underpinning that is initially at the onset-rime level but that becomes…
Learning Goal Orientation, Formal Mentoring, and Leadership Competence in HRD: A Conceptual Model
ERIC Educational Resources Information Center
Kim, Sooyoung
2007-01-01
Purpose: The purpose of this paper is to suggest a conceptual model of formal mentoring as a leadership development initiative including "learning goal orientation", "mentoring functions", and "leadership competencies" as key constructs of the model. Design/methodology/approach: Some empirical studies, though there are not many, will provide…
Torn in Two: An Examination of Elementary School Counselors' Perceptions on Self-Efficacy
ERIC Educational Resources Information Center
Sesto, Casper
2013-01-01
The American School Counselor Association (ASCA; The ASCA National Model: A Framework for School Counseling Programs, 2005) developed the ASCA National Model to define the prescribed roles and functions of the professional school counselor. Although the national model initially defines school counselors' roles, counselors find it difficult to…
Preface: Special issue of Atmospheric Environment for AQMEII
In December 2008, a handful of European and North American scientists got together to discuss a possible collaboration on the evaluation of regional-scale air quality models. This led to the development of the Air Quality Model Evaluation International Initiative (AQMEII) with th...
NASA Technical Reports Server (NTRS)
Case, Jonathan; Blottman, Pete; Hoeth, Brian; Oram, Timothy
2006-01-01
The Weather Research and Forecasting (WRF) model is the next generation community mesoscale model designed to enhance collaboration between the research and operational sectors. The NWS as a whole has begun a transition toward WRF as the mesoscale model of choice to use as a tool in making local forecasts. Currently, both the National Weather Service in Melbourne, FL (NWS MLB) and the Spaceflight Meteorology Group (SMG) are running the Advanced Regional Prediction System (ARPS) Data Analysis System (ADAS) every 15 minutes over the Florida peninsula to produce high-resolution diagnostics supporting their daily operations. In addition, the NWS MLB and SMG have used ADAS to provide initial conditions for short-range forecasts from the ARPS numerical weather prediction (NWP) model. Both NWS MLB and SMG have derived great benefit from the maturity of ADAS, and would like to use ADAS for providing initial conditions to WRF. In order to assist in this WRF transition effort, the Applied Meteorology Unit (AMU) was tasked to configure and implement an operational version of WRF that uses output from ADAS for the model initial conditions. Both agencies asked the AMU to develop a framework that allows the ADAS initial conditions to be incorporated into the WRF Environmental Modeling System (EMS) software. Developed by the NWS Science Operations Officer (SOO) Science and Training Resource Center (STRC), the EMS is a complete, full physics, NWP package that incorporates dynamical cores from both the National Center for Atmospheric Research's Advanced Research WRF (ARW) and the National Centers for Environmental Prediction's Non-Hydrostatic Mesoscale Model (NMM) into a single end-to-end forecasting system. The EMS performs nearly all pre- and postprocessing and can be run automatically to obtain external grid data for WRF boundary conditions, run the model, and convert the data into a format that can be readily viewed within the Advanced Weather Interactive Processing System. The EMS has also incorporated the WRF Standard Initialization (SI) graphical user interface (GUI), which allows the user to set up the domain, dynamical core, resolution, etc., with ease. In addition to the SI GUI, the EMS contains a number of configuration files with extensive documentation to help the user select the appropriate input parameters for model physics schemes, integration timesteps, etc. Therefore, because of its streamlined capability, it is quite advantageous to configure ADAS to provide initial condition data to the EMS software. One of the biggest potential benefits of configuring ADAS for ingest into the EMS is that the analyses could be used to initialize either the ARW or NMM. Currently, the ARPS/ADAS software has a conversion routine only for the ARW dynamical core. However, since the NMM runs about 2.5 times faster than the ARW, it is quite advantageous to be able to run an ADAS/NMM configuration operationally due to the increased efficiency.
Computational toxicology is a new research initiative being developed within the Office of Research and Development (ORD) of the US Environmental Protection Agency (EPA). Operationally, it is defined as the application of mathematical and computer models together with molecular c...
The Measure of Adolescent Heterosocial Competence: Development and Initial Validation
ERIC Educational Resources Information Center
Grover, Rachel L.; Nangle, Douglas W.; Zeff, Karen R.
2005-01-01
We developed and began construct validation of the Measure of Adolescent Heterosocial Competence (MAHC), a self-report instrument assessing the ability to negotiate effectively a range of challenging other-sex social interactions. Development followed the Goldfried and D'Zurilla (1969) behavioral-analytic model for assessing competence.…
A 10-Year Mechatronics Curriculum Development Initiative: Relevance, Content, and Results--Part II
ERIC Educational Resources Information Center
Krishnan, M.; Das, S.; Yost, S. A.
2010-01-01
This paper describes the second and third phases of a comprehensive mechatronics curriculum development effort. They encompass the development of two advanced mechatronics courses ("Simulation and Modeling of Mechatronic Systems" and "Sensors and Actuators for Mechatronic Systems"), the formulation of a Mechatronics concentration, and offshoot…
Mykkänen, Juha; Virkanen, Hannu; Tuomainen, Mika
2013-01-01
The governance of large eHealth initiatives requires traceability of many requirements and design decisions. We provide a model which we use to conceptually analyze the variability of several enterprise architecture (EA) elements throughout the extended lifecycle of development goals, using interrelated projects associated with the national ePrescription initiative in Finland.
Developing Interview Skills and Visual Literacy: A New Model of Engagement for Academic Libraries
ERIC Educational Resources Information Center
Denda, Kayo
2015-01-01
This case study presents a cocurricular initiative at the Margery Somers Foster Center at Rutgers University Libraries in New Brunswick, NJ. The initiative resulted in an interview workshop for the course Knowledge and Power, a "mission course" of the Douglass Residential College. This discussion-based workshop uses visual and multimedia…
ERIC Educational Resources Information Center
White, Bradford R.; Colaninno, Carol E.; Doll, Mimi; Lewandowski, Holly
2017-01-01
The Early Childhood Innovation Zones initiative was established by Illinois Action for Children (IAFC), with guidance from the Illinois Governor's Office for Early Childhood Development. Funded by a Race to the Top--Early Learning Challenge grant, this initiative supported capacity building efforts for organizations working with young children in…
ERIC Educational Resources Information Center
Downs, Colleen Thelma
2010-01-01
A life sciences undergraduate apprenticeship initiative was run during the vacations at a South African university. In particular, the initiative aimed to increase the number of students from disadvantaged backgrounds. Annually 12-18 undergraduate biology students were apprenticed to various institutions during the January and July vacations from…
ERIC Educational Resources Information Center
Coatsworth, J. Douglas; Conroy, David E.
2009-01-01
This study tested a sequential process model linking youth sport coaching climates (perceived coach behaviors and perceived need satisfaction) to youth self-perceptions (perceived competence and global self-esteem) and youth development outcomes (initiative, identity reflection, identity exploration). A sample of 119 youth between the ages of 10…
An imputed forest composition map for New England screened by species range boundaries
Matthew J. Duveneck; Jonathan R. Thompson; B. Tyler Wilson
2015-01-01
Initializing forest landscape models (FLMs) to simulate changes in tree species composition requires accurate fine-scale forest attribute information mapped continuously over large areas. Nearest-neighbor imputation maps, maps developed from multivariate imputation of field plots, have high potential for use as the initial condition within FLMs, but the tendency for...
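As an illustration of the imputation step described in this record, the sketch below assigns a plot-measured attribute to map pixels from each pixel's nearest field plot in predictor space; the predictors, attribute, and values are hypothetical stand-ins, not the study's actual variables or method details.

    import numpy as np
    from sklearn.neighbors import NearestNeighbors

    # Hypothetical field plots: predictor columns (e.g., elevation in m, summer NDVI)
    # and the attribute to impute (e.g., basal area in m^2/ha).
    plot_predictors = np.array([[120.0, 0.61], [340.0, 0.55], [610.0, 0.42]])
    plot_basal_area = np.array([18.2, 11.7, 6.3])

    # Hypothetical map pixels described by the same predictors.
    pixel_predictors = np.array([[150.0, 0.60], [580.0, 0.45]])

    # k=1 nearest-neighbor imputation: each pixel inherits the attribute of its
    # most similar plot; a range-boundary screen could then mask species that
    # fall outside their mapped range, as in the title of this study.
    nn = NearestNeighbors(n_neighbors=1).fit(plot_predictors)
    _, idx = nn.kneighbors(pixel_predictors)
    imputed = plot_basal_area[idx.ravel()]
    print(imputed)  # per-pixel initial-condition attribute for an FLM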
ERIC Educational Resources Information Center
Mumford, Michael D.; And Others
A multivariate modeling approach was developed to assess the impact of changes in aptitude requirement minimums on U.S. Air Force technical training outcomes. Initially, interviews were conducted with technical training personnel to identify significant student inputs, course content, and training outcome variables. Measures of these variables…
ERIC Educational Resources Information Center
Radnor, Hilary
The Moderation and Assessment Project, South West, was an outgrowth of the Technical and Vocational Educational Initiative of the government of the United Kingdom that attempted to develop more courses with vocational relevance for adolescents. Growing from research projects under the Moderation and Assessment project, a new model of moderation is…
ERIC Educational Resources Information Center
Clench, Hugh; King, Brian Smyth
2014-01-01
This paper describes the development of an online training model for teachers and teaching assistants working with students with special educational needs. Originally developed as part of a government funded initiative in the UK, the model has been successfully applied in other contexts, most notably in New South Wales, Australia where it has had…
McDonald, Richard; Nelson, Jonathan; Kinzel, Paul; Conaway, Jeffrey S.
2006-01-01
The Multi-Dimensional Surface-Water Modeling System (MD_SWMS) is a Graphical User Interface for surface-water flow and sediment-transport models. The capabilities of MD_SWMS for developing models include: importing raw topography and other ancillary data; building the numerical grid and defining initial and boundary conditions; running simulations; visualizing results; and comparing results with measured data.
Developing an Information Resources Management Curriculum.
ERIC Educational Resources Information Center
Montie, Irene C.
1983-01-01
Discusses the development of an Information Resources Management (IRM) curriculum by the IRM Curriculum Advisory Committee established by the Graduate School, United States Department of Agriculture. Initial activities, models proposed for the program (standards, skills, users, operational), course selection, and structural proposals considered…
DOT National Transportation Integrated Search
2010-08-31
In 2007 the Alabama Department of Transportation (ALDOT) in cooperation with the Montgomery Area : Metropolitan Planning Organization (MPO) and Auburn University initiated a research project to explore : the potential of developing an integrated tran...
Wang, Yang; Xu, Zhidong; Mao, Jian -Hua; ...
2015-06-08
Background: Lung cancer is the leading cause of morbidity and death worldwide. Although the available lung cancer animal models have been informative and further propel our understanding of human lung cancer, they still do not fully recapitulate the complexities of human lung cancer. The pathogenesis of lung cancer remains highly elusive because of its aggressive biologic nature and considerable heterogeneity, compared to other cancers. The association of Cul4A amplification with aggressive tumor growth and poor prognosis has been suggested. Our previous study suggested that Cul4A is oncogenic in vitro, but its oncogenic role in vivo has not been studied. Methods: Viral delivery approaches have been used extensively to model cancer in mouse models. In our experiments, we used Cre-recombinase-induced overexpression of the Cul4A gene in transgenic mice to study the role of Cul4A in lung tumor initiation and progression and have developed a new model of lung tumor development in mice harboring a conditionally expressed allele of Cul4A. Results: Here we show that the use of a recombinant adenovirus expressing Cre-recombinase ("AdenoCre") to induce Cul4A overexpression in the lungs of mice allows control of the timing and multiplicity of tumor initiation. Following our mouse models, we are able to study the potential role of Cul4A in the development and progression of pulmonary adenocarcinoma as well. Conclusion: Our findings indicate that Cul4A is oncogenic in vivo, and this mouse model is a tool for understanding the mechanisms of Cul4A in human cancers and for testing experimental therapies targeting Cul4A.
Bowers, Janice E.
1996-01-01
Should a platyopuntia expend all areolar meristems in flower production, no new cladodes could be produced, and further reproductive effort and vegetative growth would cease. To investigate the trade-off between flower and cladode production, the numbers of flowers, fruits, and cladodes were monitored for 4 years on 30 Opuntia engelmannii Salm-Dyck plants on Tumamoc Hill, Tucson, Arizona. Plant size controlled the number of flowers initiated each spring. The proportion of flowers that developed (i.e., did not abort) was perhaps determined by December-February rainfall in the months before bloom, with more being developed in the wettest years. Models based on different ratios of initiated cladodes to initiated flowers demonstrated that continued high investment in flowers and fruits would eventually terminate reproduction altogether; therefore periods of high sexual reproduction should alternate with periods of high vegetative growth. In the first 3 years of this study, the ratio of new cladodes to initiated flowers was low, showing a high investment in sexual reproduction. As suggested by the model, the population recouped this investment in the fourth year, when the number of new cladodes was nearly 3 times the 1992-1994 mean, and the number of initiated flowers was only 73% of the 3-year mean.
Subduction initiation and Obduction: insights from analog models
NASA Astrophysics Data System (ADS)
Agard, P.; Zuo, X.; Funiciello, F.; Bellahsen, N.; Faccenna, C.; Savva, D.
2013-12-01
Subduction initiation and obduction are two poorly constrained geodynamic processes which are interrelated in a number of natural settings. Subduction initiation can be viewed as the result of a regional-scale change in plate convergence partitioning between the set of existing subduction (and collision or obduction) zones worldwide. Intraoceanic subduction initiation may also ultimately lead to obduction of dense oceanic "ophiolites" atop light continental plates. A classic example is the short-lived Peri-Arabic obduction, which took place along thousands of km almost synchronously (within ~5-10 myr), from Turkey to Oman, while the subduction zone beneath Eurasia became temporarily jammed. We herein present analog models designed to study both processes and more specifically (1) subduction initiation through the partitioning of deformation between two convergent zones (a preexisting and a potential one) and, as a consequence, (2) the possible development of obduction, which has so far never been modeled. These models explore the mechanisms of subduction initiation and obduction and test various triggering hypotheses (i.e., plate acceleration, slab crossing the 660 km discontinuity, ridge subduction; Agard et al., 2007). The experimental setup comprises an upper mantle modelled as a low-viscosity transparent Newtonian glucose syrup filling a rigid Plexiglas tank and high-viscosity silicone plates. Convergence is simulated by pushing on a piston at one end of the model with plate-tectonic-like velocities (1-10 cm/yr) onto (i) a continental margin, (ii) a weakness zone with variable resistance and dip (W), (iii) an oceanic plate - with or without a spreading ridge, (iv) a subduction zone (S) dipping away from the piston and (v) an upper active continental margin, below which the oceanic plate is being subducted at the start of the experiment (as for the Oman case). Several configurations were tested over thirty-five parametric experiments. Special emphasis was placed on comparing different types of weakness zone (W) and the extent of mechanical coupling across them, particularly when plates were accelerated. Measurements of displacements and internal deformation allow for a very precise and reproducible tracking of deformation. Experiments consistently demonstrate that subduction initiation chiefly depends on how the overall shortening (or convergence) is partitioned between the weakness zone (W) and the preexisting subduction zone (S). Part of the deformation is transferred to W as soon as the increased coupling across S results in 5-10% of the convergence being transferred to the upper plate. Whether obduction develops further depends on the effective strength of W. Results (1) constrain the range of physical conditions required for subduction initiation and obduction to develop/nucleate and (2) underline the key role of acceleration for triggering obduction, rather than ridge subduction or slab resistance to penetration at the 660 km discontinuity. [Agard P., Jolivet L., Vrielynck B., Burov E. & Monié P., 2007. Plate acceleration: the obduction trigger? Earth and Planetary Science Letters, 258, 428-441.]
Initial development of 5D COGENT
NASA Astrophysics Data System (ADS)
Cohen, R. H.; Lee, W.; Dorf, M.; Dorr, M.
2015-11-01
COGENT is a continuum gyrokinetic edge code being developed by the Edge Simulation Laboratory (ESL) collaboration. Work to date has been primarily focussed on a 4D (axisymmetric) version that models transport properties of edge plasmas. We have begun development of an initial 5D version to study edge turbulence, with initial focus on kinetic effects on blob dynamics and drift-wave instability in a shearless magnetic field. We are employing compiler directives and preprocessor macros to create a single source code that can be compiled in 4D or 5D, which helps to ensure consistency of physics representation between the two versions. A key aspect of COGENT is the employment of mapped multi-block grid capability to handle the complexity of divertor geometry. It is planned to eventually exploit this capability to handle magnetic shear, through a series of successively skewed unsheared grid blocks. The initial version has an unsheared grid and will be used to explore the degree to which a radial domain must be block decomposed. We report on the status of code development and initial tests. Work performed for USDOE, at LLNL under contract DE-AC52-07NA27344.
NASA Astrophysics Data System (ADS)
Jing, R.; Lin, N.; Emanuel, K.; Vecchi, G. A.; Knutson, T. R.
2017-12-01
A Markov environment-dependent hurricane intensity model (MeHiM) is developed to simulate the climatology of hurricane intensity given the surrounding large-scale environment. The model considers three unobserved discrete states representing respectively storm's slow, moderate, and rapid intensification (and deintensification). Each state is associated with a probability distribution of intensity change. The storm's movement from one state to another, regarded as a Markov chain, is described by a transition probability matrix. The initial state is estimated with a Bayesian approach. All three model components (initial intensity, state transition, and intensity change) are dependent on environmental variables including potential intensity, vertical wind shear, midlevel relative humidity, and ocean mixing characteristics. This dependent Markov model of hurricane intensity shows a significant improvement over previous statistical models (e.g., linear, nonlinear, and finite mixture models) in estimating the distributions of 6-h and 24-h intensity change, lifetime maximum intensity, and landfall intensity, etc. Here we compare MeHiM with various dynamical models, including a global climate model [High-Resolution Forecast-Oriented Low Ocean Resolution model (HiFLOR)], a regional hurricane model (Geophysical Fluid Dynamics Laboratory (GFDL) hurricane model), and a simplified hurricane dynamic model [Coupled Hurricane Intensity Prediction System (CHIPS)] and its newly developed fast simulator. The MeHiM developed based on the reanalysis data is applied to estimate the intensity of simulated storms to compare with the dynamical-model predictions under the current climate. The dependences of hurricanes on the environment under current and future projected climates in the various models will also be compared statistically.
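A minimal sketch of the Markov structure described here (hidden intensification states evolving through a transition matrix, each state drawing a 6-h intensity change from its own distribution) is given below; the transition probabilities and state parameters are placeholders, not fitted MeHiM values.

    import numpy as np

    rng = np.random.default_rng(0)

    # Placeholder 3-state transition matrix (rows sum to 1); states 0, 1, 2
    # stand for slow, moderate, and rapid intensification/deintensification.
    P = np.array([[0.80, 0.15, 0.05],
                  [0.20, 0.60, 0.20],
                  [0.10, 0.30, 0.60]])

    # Placeholder per-state 6-h intensity-change distributions (mean, std in kt).
    dI_params = [(0.0, 2.0), (4.0, 3.0), (10.0, 4.0)]

    def simulate_track(v0_kt=35.0, n_steps=40, state=0):
        """Step a synthetic storm's intensity forward in 6-h increments."""
        v = [v0_kt]
        for _ in range(n_steps):
            state = rng.choice(3, p=P[state])      # Markov state transition
            mu, sigma = dI_params[state]
            v.append(max(v[-1] + rng.normal(mu, sigma), 0.0))
        return np.array(v)

    print(simulate_track().max())  # lifetime maximum intensity of one synthetic storm

In the fitted model, the transition matrix, the per-state distributions, and the initial-state probabilities would all be conditioned on potential intensity, vertical wind shear, midlevel humidity, and ocean mixing rather than held fixed.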
Temporal Evolution of the Plasma Sheath Surrounding Solar Cells in Low Earth Orbit
NASA Technical Reports Server (NTRS)
Willis, Emily M.; Pour, Maria Z. A.
2017-01-01
Initial results from the PIC simulation and the LEM simulation have been presented. The PIC simulation results show that more detailed study is required to refine the ISS solar array current collection model and to understand the development of the current collection in time. The initial results from the LEM demonstrate that it is possible the transients are caused by solar array interaction with the environment, but there are presently too many assumptions in the model to be certain. Continued work on the PIC simulation will provide valuable information on the development of the barrier potential, which will allow refinement of the LEM simulation and a better understanding of the causes and effects of the transients.
Klika, Václav; Gaffney, Eamonn A; Chen, Ying-Chun; Brown, Cameron P
2016-09-01
There is a long history of mathematical and computational modelling with the objective of understanding the mechanisms governing cartilage's remarkable mechanical performance. Nonetheless, despite sophisticated modelling development, simulations of cartilage have consistently lagged behind structural knowledge and thus the relationship between structure and function in cartilage is not fully understood. However, in the most recent generation of studies, there is an emerging confluence between our structural knowledge and the structure represented in cartilage modelling. This raises the prospect of further refinement in our understanding of cartilage function and also the initiation of an engineering-level understanding for how structural degradation and ageing relates to cartilage dysfunction and pathology, as well as informing the potential design of prospective interventions. Aimed at researchers entering the field of cartilage modelling, we thus review the basic principles of cartilage models, discussing the underlying physics and assumptions in relatively simple settings, whilst presenting the derivation of relatively parsimonious multiphase cartilage models consistent with our discussions. We proceed to consider modern developments that start aligning the structure captured in the models with observed complexities. This emphasises the challenges associated with constitutive relations, boundary conditions, parameter estimation and validation in cartilage modelling programmes. Consequently, we further detail how both experimental interrogations and modelling developments can be utilised to investigate and reduce such difficulties before summarising how cartilage modelling initiatives may improve our understanding of cartilage ageing, pathology and intervention. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
Calibration strategies for a groundwater model in a highly dynamic alpine floodplain
Foglia, L.; Burlando, P.; Hill, Mary C.; Mehl, S.
2004-01-01
Most surface flows to the 20-km-long Maggia Valley in Southern Switzerland are impounded and the valley is being investigated to determine environmental flow requirements. The aim of the investigation is the development of a modelling framework that simulates the dynamics of the groundwater, hydrologic, and ecologic systems. Because of the multi-scale nature of the modelling framework, large-scale models are first developed to provide the boundary conditions for more detailed models of reaches that are of ecological importance. We describe here the initial (large-scale) groundwater/surface water model and its calibration in relation to initial and boundary conditions. A MODFLOW-2000 model was constructed to simulate the interaction of groundwater and surface water and was developed parsimoniously to avoid modelling artefacts and parameter inconsistencies. Model calibration includes two steady-state conditions, with and without recharge to the aquifer from the adjoining hillslopes. Parameters are defined to represent areal recharge, hydraulic conductivity of the aquifer (up to 5 classes), and streambed hydraulic conductivity. Model performance was investigated following two system representations. The first representation assumed unknown flow input at the northern end of the groundwater domain and unknown lateral inflow. The second representation used simulations of the lateral flow obtained by means of a raster-based, physically oriented and continuous-in-time rainfall-runoff (R-R) model. Results based on these two representations are compared and discussed.
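The calibration described above amounts to adjusting a small set of parameters (areal recharge, aquifer hydraulic-conductivity classes, streambed conductivity) to minimize a weighted misfit between simulated and observed heads and flows. The sketch below shows that objective with a toy stand-in for the forward model; the numbers and the head response are invented for illustration and do not represent the MODFLOW-2000 model of the Maggia Valley.

    import numpy as np
    from scipy.optimize import least_squares

    # Hypothetical observed heads (m) at three wells and their weights (1/sigma).
    obs_heads = np.array([312.4, 309.8, 305.1])
    weights = np.array([1.0, 1.0, 0.5])

    def forward_model(params):
        """Toy stand-in for a MODFLOW run: simulated heads for a given
        log10 hydraulic conductivity and areal recharge rate."""
        log10_K, recharge = params
        K = 10.0 ** log10_K
        # Illustrative response: more recharge raises heads, higher K lowers them.
        return 300.0 + 50.0 * recharge / K + np.array([2.0, 0.0, -3.0])

    def weighted_residuals(params):
        return weights * (forward_model(params) - obs_heads)

    fit = least_squares(weighted_residuals, x0=[-3.0, 0.001])
    print(fit.x)  # calibrated log10(K) and recharge for the toy problem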
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mankovich, N.J.; Lambert, T.; Zrimec, T.
A project is underway to develop automated methods of fusing cerebral magnetic resonance angiography (MRA) and x-ray angiography (XRA) for creating accurate visualizations used in planning treatment of vascular disease. The authors have developed a vascular phantom suitable for testing segmentation and fusion algorithms with either derived images (pseudo-MRA/pseudo-XRA) or actual MRA or XRA image sequences. The initial unilateral arterial phantom design, based on normal human anatomy, contains 48 tapering vascular segments with lumen diameters from 2.5 millimeter to 0.25 millimeter. The initial phantom used rapid prototyping technology (stereolithography) with a 0.9 millimeter vessel wall fabricated in an ultraviolet-cured plastic. The model fabrication resulted in a hollow vessel model comprising the internal carotid artery, the ophthalmic artery, and the proximal segments of the anterior, middle, and posterior cerebral arteries. The complete model was fabricated but the model's lumen could not be cleared for vessels with less than 1 millimeter diameter. Measurements of selected vascular outer diameters as judged against the CAD specification showed an accuracy of 0.14 mm and precision (standard deviation) of 0.15 mm. The plastic vascular model produced provides a fixed geometric framework for the evaluation of imaging protocols and the development of algorithms for both segmentation and fusion.
NASA AVOSS Fast-Time Wake Prediction Models: User's Guide
NASA Technical Reports Server (NTRS)
Ahmad, Nash'at N.; VanValkenburg, Randal L.; Pruis, Matthew
2014-01-01
The National Aeronautics and Space Administration (NASA) is developing and testing fast-time wake transport and decay models to safely enhance the capacity of the National Airspace System (NAS). The fast-time wake models are empirical algorithms used for real-time predictions of wake transport and decay based on aircraft parameters and ambient weather conditions. The aircraft dependent parameters include the initial vortex descent velocity and the vortex pair separation distance. The atmospheric initial conditions include vertical profiles of temperature or potential temperature, eddy dissipation rate, and crosswind. The current distribution includes the latest versions of the APA (3.4) and the TDP (2.1) models. This User's Guide provides detailed information on the model inputs, file formats, and the model output. An example of a model run and a brief description of the Memphis 1995 Wake Vortex Dataset is also provided.
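The aircraft-dependent parameters mentioned in this record (initial vortex descent velocity and vortex pair separation) follow from classical wake-vortex relations for an elliptically loaded wing, sketched below; this is the textbook formulation rather than the exact parameterization used inside APA 3.4 or TDP 2.1.

    import math

    def initial_wake_parameters(mass_kg, span_m, airspeed_mps, rho=1.225, g=9.81):
        """Classical initial conditions for a wake-vortex pair (elliptic loading)."""
        b0 = math.pi * span_m / 4.0                        # vortex pair separation (m)
        gamma0 = mass_kg * g / (rho * airspeed_mps * b0)   # initial circulation (m^2/s)
        w0 = gamma0 / (2.0 * math.pi * b0)                 # initial descent velocity (m/s)
        return b0, gamma0, w0

    # Illustrative numbers for a heavy transport on approach.
    print(initial_wake_parameters(mass_kg=250_000, span_m=60.0, airspeed_mps=75.0))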
From good ideas to actions: a model-driven community collaborative to prevent childhood obesity.
Huberty, Jennifer L; Balluff, Mary; O'Dell, Molly; Peterson, Kerri
2010-01-01
Activate Omaha Kids, a community collaborative, was designed, implemented, and evaluated with the aim of preventing childhood obesity in the Omaha community. Activate Omaha Kids brought together key stakeholders and community leaders to create a community coalition. The coalition's aim was to oversee a long-term sustainable approach to preventing obesity. Following a planning phase, a business plan was developed that prioritized best practices to be implemented in Omaha. The business plan was developed using the Ecological Model, Health Policy Model, and Robert Wood Johnson Foundation Active Living by Design 5P model. The three models helped the community identify target populations and activities that then created a single model for sustainable change. Twenty-four initiatives were identified, over one million dollars in funding was secured, and evaluation strategies were identified. By using the models from the initial steps through evaluation, a clear facilitation of the process was possible, and the result was a comprehensive, feasible plan. The use of the models to design a strategic plan was pivotal in building a sustainable coalition that can achieve measurable improvements in the health of children and prove replicable over time.
Active numerical model of human body for reconstruction of falls from height.
Milanowicz, Marcin; Kędzior, Krzysztof
2017-01-01
Falls from height constitute the largest group of incidents out of approximately 90,000 occupational accidents occurring each year in Poland. Reconstruction of the exact course of a fall from height is generally difficult due to lack of sufficient information from the accident scene. This usually results in several contradictory versions of an incident and impedes, for example, determination of the liability in a judicial process. In similar situations, in many areas of human activity, researchers apply numerical simulation. They use it to model physical phenomena to reconstruct their real course over time; e.g. numerical human body models are frequently used for investigation and reconstruction of road accidents. However, they are validated in terms of specific road traffic accidents and are considerably limited when applied to the reconstruction of other types of accidents. The objective of the study was to develop an active numerical human body model to be used for reconstruction of accidents associated with falling from height. Development of the model involved extension and adaptation of the existing Pedestrian human body model (available in the MADYMO package database) for the purposes of reconstruction of falls from height by taking into account the human reaction to the loss of balance. The model was developed by using the results of experimental tests of the initial phase of the fall from height. The active numerical human body model covering 28 sets of initial conditions related to various human reactions to the loss of balance was developed. The application of the model was illustrated by using it to reconstruct a real fall from height. From among the 28 sets of initial conditions, those whose application made it possible to reconstruct the most probable version of the incident were selected. The selection was based on comparison of the results of the reconstruction with information contained in the accident report. Results in the form of estimated injuries overlap with the real injuries sustained by the casualty. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Multiscale Mathematics for Biomass Conversion to Renewable Hydrogen
DOE Office of Scientific and Technical Information (OSTI.GOV)
Plechac, Petr
2016-03-01
The overall objective of this project was to develop multiscale models for understanding and eventually designing complex processes for renewables. To the best of our knowledge, our work is the first attempt at modeling complex reacting systems whose performance relies on underlying multiscale mathematics, and at developing rigorous mathematical techniques and computational algorithms to study such models. Our specific application lies at the heart of the biofuels initiatives of DOE and entails modeling of catalytic systems, to enable economic, environmentally benign, and efficient conversion of biomass into either hydrogen or valuable chemicals.
Gupta, Rajesh; Patel, Rajan; Murty, Naganand; Panicker, Rahul; Chen, Jane
2015-02-01
Relative to drugs, diagnostics, and vaccines, efforts to develop other global health technologies, such as medical devices, are limited and often focus on the short-term goal of prototype development instead of the long-term goal of a sustainable business model. To develop a medical device to address neonatal hypothermia for use in resource-limited settings, we turned to principles of design theory: (1) define the problem with consideration of appropriate integration into relevant health policies, (2) identify the users of the technology and the scenarios in which the technology would be used, and (3) use a highly iterative product design and development process that incorporates the perspective of the user of the technology at the outset and addresses scalability. In contrast to our initial idea, to create a single device, the process guided us to create two separate devices, both strikingly different from current solutions. We offer insights from our initial experience that may be helpful to others engaging in global health technology development.
A numerical solution of the problem of crown forest fire initiation and spread
NASA Astrophysics Data System (ADS)
Marzaeva, S. I.; Galtseva, O. V.
2018-05-01
The mathematical model of forest fire was based on an analysis of known experimental data and uses concepts and methods from reactive media mechanics. The study takes into account the mutual interaction of the forest fires and three-dimensional atmosphere flows. The research is done by means of mathematical modeling of physical processes. It is based on numerical solution of Reynolds equations for chemical components and equations of energy conservation for gaseous and condensed phases. It is assumed that the forest during a forest fire can be modeled as a two-temperature multiphase non-deformable porous reactive medium. A discrete analog for the system of equations was obtained by means of the control volume method. The developed model of forest fire initiation and spreading would make it possible to obtain a detailed picture of the variation in the velocity, temperature and chemical species concentration fields with time. The mathematical model and the results of the calculation make it possible to evaluate critical conditions of forest fire initiation and spread, which allows the model to be applied to the development of means for preventing fires.
Larkindale, Jane; Abresch, Richard; Aviles, Enrique; Bronson, Abby; Chin, Janice; Furlong, Pat; Gordish-Dressman, Heather; Habeeb-Louks, Elizabeth; Henricson, Erik; Kroger, Hans; Lynn, Charles; Lynn, Stephen; Martin, Dana; Nuckolls, Glen; Rooney, William; Romero, Klaus; Sweeney, Lee; Vandenborne, Krista; Walter, Glenn; Wolff, Jodi; Wong, Brenda; McDonald, Craig M; Duchenne Regulatory Science Consortium Imaging-Dmd Consortium And The Cinrg Investigators, Members Of The
2017-01-12
The Duchenne Regulatory Science Consortium (D-RSC) was established to develop tools to accelerate drug development for DMD. The resulting tools are anticipated to meet validity requirements outlined by qualification/endorsement pathways at both the U.S. Food and Drug Administration (FDA) and the European Medicines Agency (EMA), and will be made available to the drug development community. The initial goals of the consortium include the development of a disease progression model that would be used to forecast changes in clinically meaningful endpoints and thereby inform clinical trial protocol development and data analysis. Methods: In April of 2016 the consortium and other experts met to formulate plans for the development of the model. Conclusions: Here we report the results of the meeting, and discussion as to the form of the model that we plan to move forward to develop, after input from the regulatory authorities.
NASA Technical Reports Server (NTRS)
Hanagud, S.; Uppaluri, B.
1975-01-01
This paper describes a methodology for making cost effective fatigue design decisions. The methodology is based on a probabilistic model for the stochastic process of fatigue crack growth with time. The development of a particular model for the stochastic process is also discussed in the paper. The model is based on the assumption of continuous time and discrete space of crack lengths. Statistical decision theory and the developed probabilistic model are used to develop the procedure for making fatigue design decisions on the basis of minimum expected cost or risk function and reliability bounds. Selections of initial flaw size distribution, NDT, repair threshold crack lengths, and inspection intervals are discussed.
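A minimal sketch of that kind of model follows: crack lengths occupy discrete bins, growth is a Markov transition between bins, and the resulting failure probability feeds an expected-cost comparison of inspection intervals. The transition rate, costs, detection parameters, and flaw-size distribution are hypothetical placeholders rather than values from the paper.

    import numpy as np

    n_states = 5                 # discrete crack-length bins; last bin = failure
    growth_rate = 0.02           # placeholder per-hour probability of growing one bin

    # One-step transition matrix: a crack stays put or grows one bin; failure absorbs.
    P = np.eye(n_states) * (1 - growth_rate) + np.eye(n_states, k=1) * growth_rate
    P[-1, -1] = 1.0

    # Placeholder initial flaw-size distribution: most cracks start small.
    p0 = np.array([0.90, 0.07, 0.02, 0.01, 0.00])

    def expected_cost(T, life=10_000, c_inspect=1.0, c_failure=1_000.0,
                      detect_from=2, p_detect=0.9):
        """Expected cost when inspecting every T hours: detected cracks at or
        above bin `detect_from` are repaired back to the smallest bin."""
        prob, cost = p0.copy(), 0.0
        step = np.linalg.matrix_power(P, T)
        for _ in range(life // T):
            prob = prob @ step                           # crack growth for T hours
            cost += c_inspect
            detected = prob[detect_from:-1] * p_detect   # failures are not repairable
            prob[detect_from:-1] -= detected
            prob[0] += detected.sum()
        return cost + prob[-1] * c_failure

    for T in (500, 1000, 2000):
        print(T, round(expected_cost(T), 3))

Choosing the interval with the lowest expected cost, subject to a bound on the terminal failure probability, mirrors the minimum expected cost and reliability-bound framing described in the abstract.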
Morales Urrea, Diego Alberto; Haure, Patricia Mónica; García Einschlag, Fernando Sebastián; Contreras, Edgardo Martín
2018-05-09
Enzymatic decolourization of azo-dyes could be a cost-competitive alternative compared to physicochemical or microbiological methods. Stoichiometric and kinetic features of peroxidase-mediated decolourization of azo-dyes by hydrogen peroxide (P) are central for design purposes. In this work, a modified version of the Dunford mechanism of peroxidases was developed. The proposed model takes into account the inhibition of peroxidases by high concentrations of P, the substrate-dependent catalatic activity of peroxidases (e.g. the decomposition of P to water and oxygen), the generation of oxidation products (OP) and the effect of pH on the decolourization kinetics of the azo-dye Orange II (OII). To obtain the parameters of the proposed model, two series of experiments were performed. In the first set, the effects of initial P concentration (0.01-0.12 mM) and pH (5-10) on the decolourization degree were studied at a constant initial OII concentration (0.045 mM). Obtained results showed that at pH 9-10 and low initial P concentrations, the consumption of P was mainly to oxidize OII. From the proposed model, an expression for the decolourization degree was obtained. In the second set of experiments, the effect of the initial concentrations of OII (0.023-0.090 mM), P (0.02-4.7 mM), HRP (34-136 mg/L) and pH (5-10) on the initial specific decolourization rate (q0) was studied. As a general rule, a noticeable increase in q0 was observed for pHs higher than 7. For a given pH, q0 increased as a function of the initial OII concentration. In addition, there was an inhibitory effect of high P concentrations on q0. To assess the possibility of reusing the enzyme, repeated additions of OII and P were performed. Results showed that the enzyme remained active after six reuse cycles. A satisfactory accordance between the change of the absorbance during these experiments and absorbances calculated using the proposed model was obtained. Considering that this set of data was not used during the fitting procedure of the model, the agreement between predicted and experimental absorbances provides a powerful validation of the model developed in the present work.
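For illustration only, a simplified rate law with the same qualitative structure (saturation in dye, inhibition at high peroxide, and a catalatic loss of peroxide) can be integrated as below; the rate expression and constants are placeholders chosen to show the structure, not the fitted modified-Dunford model or the parameters reported in this work.

    import numpy as np
    from scipy.integrate import solve_ivp

    # Placeholder constants (illustrative, in mM and 1/min).
    k_cat, K_D, K_P, K_i = 5.0, 0.02, 0.01, 0.5
    k_catalatic = 0.05      # peroxide lost to the catalatic side reaction
    E = 0.001               # enzyme concentration (mM)

    def rates(t, y):
        dye, perox = y
        # Dye oxidation: saturating in dye, Haldane-type inhibition at high peroxide.
        r_dec = k_cat * E * (dye / (K_D + dye)) * (perox / (K_P + perox + perox**2 / K_i))
        # Peroxide is consumed by decolourization and by the catalatic reaction.
        return [-r_dec, -r_dec - k_catalatic * perox]

    sol = solve_ivp(rates, (0.0, 60.0), [0.045, 0.05])   # initial OII and P (mM), 60 min
    dye0, dye_end = 0.045, sol.y[0, -1]
    print(f"decolourization degree: {100 * (1 - dye_end / dye0):.1f}%")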
An Approach to Verification and Validation of a Reliable Multicasting Protocol
NASA Technical Reports Server (NTRS)
Callahan, John R.; Montgomery, Todd L.
1994-01-01
This paper describes the process of implementing a complex communications protocol that provides reliable delivery of data in multicast-capable, packet-switching telecommunication networks. The protocol, called the Reliable Multicasting Protocol (RMP), was developed incrementally using a combination of formal and informal techniques in an attempt to ensure the correctness of its implementation. Our development process involved three concurrent activities: (1) the initial construction and incremental enhancement of a formal state model of the protocol machine; (2) the initial coding and incremental enhancement of the implementation; and (3) model-based testing of iterative implementations of the protocol. These activities were carried out by two separate teams: a design team and a V&V team. The design team built the first version of RMP with limited functionality to handle only nominal requirements of data delivery. In a series of iterative steps, the design team added new functionality to the implementation while the V&V team kept the state model in fidelity with the implementation. This was done by generating test cases based on suspected errant or off-nominal behaviors predicted by the current model. If the execution of a test was different between the model and implementation, then the differences helped identify inconsistencies between the model and implementation. The dialogue between both teams drove the co-evolution of the model and implementation. Testing served as the vehicle for keeping the model and implementation in fidelity with each other. This paper describes (1) our experiences in developing our process model; and (2) three example problems found during the development of RMP.
Symons, Jennifer E; Fyhrie, David P; Hawkins, David A; Upadhyaya, Shrinivasa K; Stover, Susan M
2015-02-26
Race surfaces have been associated with the incidence of racehorse musculoskeletal injury, the leading cause of racehorse attrition. Optimal race surface mechanical behaviors that minimize injury risk are unknown. Computational models are an economical method to determine optimal mechanical behaviors. Previously developed equine musculoskeletal models utilized ground reaction floor models designed to simulate a stiff, smooth floor appropriate for a human gait laboratory. Our objective was to develop a computational race surface model (two force-displacement functions, one linear and one nonlinear) that reproduced experimental race surface mechanical behaviors for incorporation in equine musculoskeletal models. Soil impact tests were simulated in a musculoskeletal modeling environment and compared to experimental force and displacement data collected during initial and repeat impacts at two racetracks with differing race surfaces - (i) dirt and (ii) synthetic. Best-fit model coefficients (7 total) were compared between surface types and initial and repeat impacts using a mixed model ANCOVA. Model simulation results closely matched empirical force, displacement and velocity data (mean R2 = 0.930-0.997). Many model coefficients were statistically different between surface types and impacts. Principal component analysis of model coefficients showed systematic differences based on surface type and impact. In the future, the race surface model may be used in conjunction with the previously developed equine musculoskeletal models to understand the effects of race surface mechanical behaviors on limb dynamics, and to determine race surface mechanical behaviors that reduce the incidence of racehorse musculoskeletal injury through modulation of limb dynamics. Copyright © 2015 Elsevier Ltd. All rights reserved.
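The two force-displacement functions mentioned above (one linear, one nonlinear) can be illustrated by fitting a generic linear-plus-power-law form to impact data, as in the sketch below; the functional forms, data, and starting values are placeholders, since the paper's actual functions and the seven fitted coefficients are not reproduced here.

    import numpy as np
    from scipy.optimize import curve_fit

    # Hypothetical impact data: surface displacement (m) and ground reaction force (N).
    disp = np.array([0.000, 0.005, 0.010, 0.015, 0.020, 0.025])
    force = np.array([0.0, 650.0, 1500.0, 2600.0, 4000.0, 5800.0])

    def surface_model(x, k, a, n):
        # Linear elastic component plus a power-law stiffening component.
        return k * x + a * np.power(x, n)

    coeffs, _ = curve_fit(surface_model, disp, force, p0=[1e5, 1e6, 2.0],
                          bounds=([0.0, 0.0, 1.0], [1e7, 1e9, 4.0]))
    k, a, n = coeffs
    print(f"k = {k:.3g} N/m, a = {a:.3g}, n = {n:.2f}")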
NASA Astrophysics Data System (ADS)
James, I.
2016-10-01
Electro-optically (EO) guided surface-to-air missiles (SAM) have developed to use ultraviolet (UV) wavebands supplementary to the more common infrared (IR) wavebands. Missiles such as the US Stinger have been around for some time but are not considered a proliferation risk. The Chinese FN-16 and Russian SA-29 (Verba) are considered a much higher proliferation risk. As a result, models of the missile seekers must be developed to understand the characteristics of the seeker and the potential performance enhancements that are included. Therefore, the purpose of this paper is to introduce the steps that have been taken to characterise and model these missiles. It begins by outlining some of the characteristics of the threats, the key elements of a UV scene, the potential choice of waveband for a detector, the initial modelling work to represent the UV detector of the missile and presents initial results. The modelling shows that the UV detection range of a typical aircraft is dependent on both the size of the aircraft and its reflectivity. However, the strength of this correlation is less than expected. As a result, further work is required to model more seeker types and to investigate what is causing the weak correlations found in these initial investigations. In addition, there needs to be further study of the sensitivities of the model to other variables, such as the modelled detectivity of the detector and the signal-to-noise ratio assumed. Overall, the outcome of this work will be to provide specifications for aircraft size and reflectivity that limit the effectiveness of the UV channels.
Magnetohydrodynamic modelling of exploding foil initiators
NASA Astrophysics Data System (ADS)
Neal, William
2015-06-01
Magnetohydrodynamic (MHD) codes are currently being developed, and used, to predict the behaviour of electrically-driven flyer-plates. These codes are of particular interest to the design of exploding foil initiator (EFI) detonators but there is a distinct lack of comparison with high-fidelity experimental data. This study aims to compare a MHD code with a collection of temporally and spatially resolved diagnostics including PDV, dual-axis imaging and streak imaging. The results show the code's excellent representation of the flyer-plate launch and highlight features within the experiment that the model fails to capture.
NASA Technical Reports Server (NTRS)
Vlahopoulos, Nickolas; Lyle, Karen H.; Burley, Casey L.
1998-01-01
An algorithm for generating appropriate velocity boundary conditions for an acoustic boundary element analysis from the kinematics of an operating propeller is presented. It constitutes the initial phase of integrating sophisticated rotorcraft models into a conventional boundary element analysis. Currently, the pressure field is computed by a linear approximation. An initial validation of the developed process was performed by comparing numerical results to test data for the external acoustic pressure on the surface of a tilt-rotor aircraft for one flight condition.
Initial test of MITA/DIMM with an operational CBP system
NASA Astrophysics Data System (ADS)
Baldwin, Kevin; Hanna, Randall; Brown, Andrea; Brown, David; Moyer, Steven; Hixson, Jonathan G.
2018-05-01
The MITA (Motion Imagery Task Analyzer) project was conceived by CBP OA (Customs and Border Protection - Office of Acquisition) and executed by JHU/APL (Johns Hopkins University/Applied Physics Laboratory) and CERDEC NVESD MSD (Communications and Electronics Research Development Engineering Command Night Vision and Electronic Sensors Directorate Modeling and Simulation Division). The intent was to develop an efficient methodology whereby imaging system performance could be quickly and objectively characterized in a field setting. The initial design, development, and testing spanned a period of approximately 18 months, with the initial project coming to a conclusion after testing of the MITA system in June 2017 with a fielded CBP system. The NVESD contribution to MITA was a set of thermally heated target resolution boards deployed at a range close to the sensor and, when possible, at range with the targets of interest. JHU/APL developed a laser DIMM (Differential Image Motion Monitor) system designed to measure the optical turbulence present along the line of sight of the imaging system during the time of image collection. The imagery collected of the target board was processed to calculate the in situ system resolution. This in situ imaging system resolution and the time-correlated turbulence measured by the DIMM system were used in NV-IPM (Night Vision Integrated Performance Model) to calculate the theoretical imaging system performance. Overall, this proved the MITA concept feasible. However, MITA is still in the initial phases of development and requires further verification and validation to ensure accuracy and reliability of both the instrument and the imaging system performance predictions.
Associating putative molecular initiating events (MIE) with downstream cell signaling pathways and modeling fetal exposure kinetics is an important challenge for integration in developmental systems toxicology. Here, we describe an integrative systems toxicology model for develop...
NASA Astrophysics Data System (ADS)
Scherr, Rachel E.; Robertson, Amy D.
2015-06-01
We observe teachers in professional development courses about energy constructing mechanistic accounts of energy transformations. We analyze a case in which teachers investigating adiabatic compression develop a model of the transformation of kinetic energy to thermal energy. Among their ideas is the idea that thermal energy is generated as a byproduct of individual particle collisions, which is represented in science education research literature as an obstacle to learning. We demonstrate that in this instructional context, the idea that individual particle collisions generate thermal energy is not an obstacle to learning, but instead is productive: it initiates intellectual progress. Specifically, this idea initiates the reconciliation of the teachers' energy model with mechanistic reasoning about adiabatic compression, and leads to a canonically correct model of the transformation of kinetic energy into thermal energy. We claim that the idea's productivity is influenced by features of our particular instructional context, including the instructional goals of the course, the culture of collaborative sense making, and the use of certain representations of energy.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Uematsu, Hitoshi; Yamamoto, Toru; Izutsu, Sadayuki
1990-06-01
A reactivity-initiated event is a design-basis accident for the safety analysis of boiling water reactors. It is defined as a rapid transient of reactor power caused by a reactivity insertion of over $1.0 due to a postulated drop or abnormal withdrawal of the control rod from the core. Strong space-dependent feedback effects are associated with the local power increase due to control rod movement. A realistic treatment of the core status in a transient by a code with a detailed core model is recommended in evaluating this event. A three-dimensional transient code, ARIES, has been developed to meet this need. The code simulates the event with three-dimensional neutronics, coupled with multichannel thermal hydraulics, based on a nonequilibrium separated flow model. The experimental data obtained in reactivity accident tests performed with the SPERT III-E core are used to verify the entire code, including thermal-hydraulic models.
Translation and cultural adaptation for Brazil of the Developing Nurses' Thinking model
Jensen, Rodrigo; da Cruz, Diná de Almeida Lopes Monteiro; Tesoro, Mary Gay; Lopes, Maria Helena Baena de Moraes
2014-01-01
Objectives: to translate and culturally adapt to Brazilian Portuguese the Developing Nurses' Thinking model, used as a strategy for teaching clinical reasoning. Method: the translation and cultural adaptation were undertaken through initial translation, synthesis of the translations, back-translation, evaluation by a committee of specialists and a pre-test with 33 undergraduate nursing students. Results: the stages of initial translation, synthesis of the translations and back-translation were undertaken satisfactorily, small adjustments being needed. In the evaluation of the translated version by the committee of specialists, all the items obtained agreement over 80% in the first round of evaluation and in the pre-test with the students, so the model was shown to be fit for purpose. Conclusion: the use of the model as a complementary strategy in the teaching of diagnostic reasoning is recommended, with a view to the training of nurses who are more aware regarding the diagnostic task and the importance of patient safety. PMID:26107825
Advanced superposition methods for high speed turbopump vibration analysis
NASA Technical Reports Server (NTRS)
Nielson, C. E.; Campany, A. D.
1981-01-01
The small, high pressure Mark 48 liquid hydrogen turbopump was analyzed and dynamically tested to determine the cause of high speed vibration at an operating speed of 92,400 rpm. This approaches the design point operating speed of 95,000 rpm. The initial dynamic analysis in the design stage and subsequent further analysis of the rotor only dynamics failed to predict the vibration characteristics found during testing. An advanced procedure for dynamics analysis was used in this investigation. The procedure involves developing accurate dynamic models of the rotor assembly and casing assembly by finite element analysis. The dynamically instrumented assemblies are independently rap tested to verify the analytical models. The verified models are then combined by modal superposition techniques to develop a completed turbopump model where dynamic characteristics are determined. The results of the dynamic testing and analysis obtained are presented and methods of moving the high speed vibration characteristics to speeds above the operating range are recommended. Recommendations for use of these advanced dynamic analysis procedures during initial design phases are given.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kress, Joel David
The development and scale up of cost effective carbon capture processes is of paramount importance to enable the widespread deployment of these technologies to significantly reduce greenhouse gas emissions. The U.S. Department of Energy initiated the Carbon Capture Simulation Initiative (CCSI) in 2011 with the goal of developing a computational toolset that would enable industry to more effectively identify, design, scale up, operate, and optimize promising concepts. The first half of the presentation will introduce the CCSI Toolset consisting of basic data submodels, steady-state and dynamic process models, process optimization and uncertainty quantification tools, an advanced dynamic process control framework, and high-resolution filtered computational fluid dynamics (CFD) submodels. The second half of the presentation will describe a high-fidelity model of a mesoporous silica supported, polyethylenimine (PEI)-impregnated solid sorbent for CO2 capture. The sorbent model includes a detailed treatment of transport and amine-CO2-H2O interactions based on quantum chemistry calculations. Using a Bayesian approach for uncertainty quantification, we calibrate the sorbent model to thermogravimetric (TGA) data.
Atmospheric model development in support of SEASAT. Volume 2: Analysis models
NASA Technical Reports Server (NTRS)
Langland, R. A.
1977-01-01
As part of the SEASAT program of NASA, two sets of analysis programs were developed for the Jet Propulsion Laboratory. One set of programs produces 63 x 63 horizontal mesh analyses on a polar stereographic grid. The other set produces 187 x 187 third mesh analyses. The parameters analyzed include sea surface temperature, sea level pressure and twelve levels of upper air temperature, height and wind analyses. The analysis output is used to initialize the primitive equation forecast models.
2017-06-06
environments may be injured or killed from the primary blast wave, thermal pulse and ionizing radiation. Burn casualties surviving the initial blast wave are... develop casualty estimation models for improvised nuclear device (IND) scenarios. The HSRDIPT team has developed health effects models of radiation, burn
Atmosphere Behavior in Gas-Closed Mouse-Algal Systems: An Experimental and Modelling Study
NASA Technical Reports Server (NTRS)
Averner, M. M.; Moore, B., III; Bartholomew, I.; Wharton, R.
1985-01-01
A dual approach of mathematical modelling and laboratory experimentation aimed at examining the gas exchange characteristics of artificial animal/plant systems closed to the ambient atmosphere was initiated. The development of control techniques and management strategies for maintaining the atmospheric levels of carbon dioxide and oxygen at physiological levels is examined. A mathematical model simulating the atmospheric behavior in these systems was developed and an experimental gas closed system was constructed. These systems are described and preliminary results are presented.
Developing a laser shockwave model for characterizing diffusion bonded interfaces
NASA Astrophysics Data System (ADS)
Lacy, Jeffrey M.; Smith, James A.; Rabin, Barry H.
2015-03-01
The US National Nuclear Security Administration has a Global Threat Reduction Initiative (GTRI) with the goal of reducing the worldwide use of high-enriched uranium (HEU). A salient component of that initiative is the conversion of research reactors from HEU to low enriched uranium (LEU) fuels. An innovative fuel is being developed to replace HEU in high-power research reactors. The new LEU fuel is a monolithic fuel made from a U-Mo alloy foil encapsulated in Al-6061 cladding. In order to support the fuel qualification process, the Laser Shockwave Technique (LST) is being developed to characterize the clad-clad and fuel-clad interface strengths in fresh and irradiated fuel plates. LST is a non-contact method that uses lasers for the generation and detection of large amplitude acoustic waves to characterize interfaces in nuclear fuel plates. However, because the deposition of laser energy into the containment layer on a specimen's surface is intractably complex, the shock wave energy is inferred from the surface velocity measured on the backside of the fuel plate and the depth of the impression left on the surface by the high pressure plasma pulse created by the shock laser. To help quantify the stresses generated at the interfaces, a finite element method (FEM) model is being utilized. This paper will report on initial efforts to develop and validate the model by comparing numerical and experimental results for back surface velocities and front surface depressions in a single aluminum plate representative of the fuel cladding.
NASA Technical Reports Server (NTRS)
Klumpar, D. M. (Principal Investigator)
1982-01-01
Efforts in support of the development of a model of the magnetic fields due to ionospheric and magnetospheric electrical currents are discussed. Specifically, progress made in reading MAGSAT tapes and plotting the deviation of the measured magnetic field components with respect to a spherical harmonic model of the main geomagnetic field is reported. Initial tests of the modeling procedure developed to compute the ionosphere/magnetosphere-induced fields at satellite orbit are also described. The modeling technique utilizes a linear current element representation of the large scale current system.
Modeling of circulating fluidized beds for post-combustion carbon capture
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, A.; Shadle, L.; Miller, D.
2011-01-01
A compartment based model for a circulating fluidized bed reactor has been developed based on experimental observations of riser hydrodynamics. The model uses a cluster based approach to describe the two-phase behavior of circulating fluidized beds. Fundamental mass balance equations have been derived to describe the movement of both gas and solids through the system. Additional work is being performed to develop the correlations required to describe the hydrodynamics of the system. Initial testing of the model with experimental data shows promising results and highlights the importance of including end effects within the model.
Tisa, Farhana; Davoody, Meysam; Abdul Raman, Abdul Aziz; Daud, Wan Mohd Ashri Wan
2015-01-01
The efficiency of phenol degradation via Fenton reaction using a mixture of heterogeneous goethite catalyst with homogeneous ferrous ion was analyzed as a function of three independent variables: initial concentration of phenol (60 to 100 mg/L), weight ratio of initial concentration of phenol to that of H2O2 (1:6 to 1:14), and weight ratio of initial concentration of goethite catalyst to that of H2O2 (1:0.3 to 1:0.7). More than 90% of phenol removal and more than 40% of TOC removal were achieved within 60 minutes of reaction. Two separate models were developed using artificial neural networks to predict degradation percentage by a combination of Fe3+ and Fe2+ catalyst. Five operational parameters were employed as inputs while phenol degradation and TOC removal were considered as outputs of the developed models. Satisfactory agreement was observed between testing data and the predicted values (R2 = 0.9214 for phenol and R2 = 0.9082 for TOC). PMID:25849556
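To make the neural-network step concrete, the sketch below fits a small feed-forward regressor mapping five scaled operating parameters to phenol and TOC removal; the synthetic data, network size, and use of scikit-learn are assumptions for illustration, not the authors' setup.

```python
# Minimal sketch (not the authors' exact setup): a feed-forward ANN regressor
# mapping five operating parameters to phenol and TOC removal percentages.
# The synthetic data below stand in for the experimental design points.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(120, 5))                  # five scaled operational inputs
y = np.column_stack([                                     # two outputs: phenol and TOC removal (%)
    90 + 5 * X[:, 0] - 3 * X[:, 1] + rng.normal(0, 1, 120),
    40 + 8 * X[:, 2] - 2 * X[:, 3] + rng.normal(0, 1, 120),
])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
ann = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0).fit(X_tr, y_tr)
pred = ann.predict(X_te)
print("R2 phenol:", r2_score(y_te[:, 0], pred[:, 0]))
print("R2 TOC   :", r2_score(y_te[:, 1], pred[:, 1]))
```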
Combat Wound Initiative program.
Stojadinovic, Alexander; Elster, Eric; Potter, Benjamin K; Davis, Thomas A; Tadaki, Doug K; Brown, Trevor S; Ahlers, Stephen; Attinger, Christopher E; Andersen, Romney C; Burris, David; Centeno, Jose; Champion, Hunter; Crumbley, David R; Denobile, John; Duga, Michael; Dunne, James R; Eberhardt, John; Ennis, William J; Forsberg, Jonathan A; Hawksworth, Jason; Helling, Thomas S; Lazarus, Gerald S; Milner, Stephen M; Mullick, Florabel G; Owner, Christopher R; Pasquina, Paul F; Patel, Chirag R; Peoples, George E; Nissan, Aviram; Ring, Michael; Sandberg, Glenn D; Schaden, Wolfgang; Schultz, Gregory S; Scofield, Tom; Shawen, Scott B; Sheppard, Forest R; Stannard, James P; Weina, Peter J; Zenilman, Jonathan M
2010-07-01
The Combat Wound Initiative (CWI) program is a collaborative, multidisciplinary, and interservice public-private partnership that provides personalized, state-of-the-art, and complex wound care via targeted clinical and translational research. The CWI uses a bench-to-bedside approach to translational research, including the rapid development of a human extracorporeal shock wave therapy (ESWT) study in complex wounds after establishing the potential efficacy, biologic mechanisms, and safety of this treatment modality in a murine model. Additional clinical trials include the prospective use of clinical data, serum and wound biomarkers, and wound gene expression profiles to predict wound healing/failure and additional clinical patient outcomes following combat-related trauma. These clinical research data are analyzed using machine-based learning algorithms to develop predictive treatment models to guide clinical decision-making. Future CWI directions include additional clinical trials and study centers and the refinement and deployment of our genetically driven, personalized medicine initiative to provide patient-specific care across multiple medical disciplines, with an emphasis on combat casualty care.
Rural Development in the United States: Connecting Theory, Practice, and Possibilities.
ERIC Educational Resources Information Center
Galston, William A.; Baehler, Karen J.
This book synthesizes and analyzes much of the theoretical and practical literature on rural economic development and related issues from the past two decades with the aim of initiating construction of a new model for U.S. rural development policy. Part I emphasizes the national and global context within which U.S. rural development must take…
ERIC Educational Resources Information Center
van der Lans, Rikkert M.; van de Grift, Wim J. C. M.; van Veen, K.
2018-01-01
This study connects descriptions of effective teaching with descriptions of teacher development to advance an initial understanding of how effective teaching may develop. The study's main premise is that descriptions of effective teaching develop cumulatively where more basic teaching strategies and behaviors are required before teachers may…
NASA Astrophysics Data System (ADS)
Mirtadjieva, K. T.; Nuritdinov, S. N.; Ruzibaev, J. K.; Khalid, Muhammad
2011-06-01
This is an examination of the gravitational instability of the major large-scale perturbation modes for a fixed value of the azimuthal wave number m = 1 in nonlinearly nonstationary disk models with isotropic and anisotropic velocity diagrams for the purpose of explaining the displacement of the nucleus away from the geometric center (lopsidedness) in spiral galaxies. Nonstationary analogs of the dispersion relations for these perturbation modes are obtained. Critical diagrams of the initial virial ratio are constructed from the rotation parameters for the models in each case. A comparative analysis is made of the instability growth rates for the major horizontal perturbation modes in terms of two models, and it is found that, on the average, the instability growth rate for the m = 1 mode with a radial wave number N = 3 almost always has a clear advantage relative to the other modes. An analysis of these results shows that if the initial total kinetic energy in an isotropic model is no more than 12.4% of the initial potential energy, then, regardless of the value of the rotation parameter Ω, an instability of the radial motions always occurs and causes the nucleus to shift away from the geometrical center. This instability is aperiodic when Ω = 0 and is oscillatory when Ω ≠ 0 . For the anisotropic model, this kind of structure involving the nucleus develops when the initial total kinetic energy in the model is no more than 30.6% of the initial potential energy.
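The quoted thresholds can be restated compactly as a condition on the initial virial ratio; the notation below is assumed for illustration and is not taken verbatim from the paper.

```latex
% Hedged restatement of the quoted instability thresholds (assumed notation):
\frac{T_{\mathrm{kin}}(0)}{\lvert W(0) \rvert} \;\le\; 0.124 \quad \text{(isotropic model)},
\qquad
\frac{T_{\mathrm{kin}}(0)}{\lvert W(0) \rvert} \;\le\; 0.306 \quad \text{(anisotropic model)}
```

with the radial-motion instability, and hence the off-center nucleus, developing whenever the relevant condition holds: aperiodically for Ω = 0 and as an oscillatory mode for Ω ≠ 0.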
Interactions among Genes Regulating Ovule Development in Arabidopsis Thaliana
Baker, S. C.; Robinson-Beers, K.; Villanueva, J. M.; Gaiser, J. C.; Gasser, C. S.
1997-01-01
The INNER NO OUTER (INO) and AINTEGUMENTA (ANT) genes are essential for ovule integument development in Arabidopsis thaliana. Ovules of ino mutants initiate two integument primordia, but the outer integument primordium forms on the opposite side of the ovule from the normal location and undergoes no further development. The inner integument appears to develop normally, resulting in erect, unitegmic ovules that resemble those of gymnosperms. ino plants are partially fertile and produce seeds with altered surface topography, demonstrating a lineage dependence in development of the testa. ant mutations affect initiation of both integuments. The strongest of five new ant alleles we have isolated produces ovules that lack integuments and fail to complete megasporogenesis. ant mutations also affect flower development, resulting in narrow petals and the absence of one or both lateral stamens. Characterization of double mutants between ant, ino and other mutations affecting ovule development has enabled the construction of a model for genetic control of ovule development. This model proposes parallel independent regulatory pathways for a number of aspects of this process, a dependence on the presence of an inner integument for development of the embryo sac, and the existence of additional genes regulating ovule development. PMID:9093862
DOT National Transportation Integrated Search
2009-03-01
A research study was initiated by the Louisiana Department of Transportation and Development (LADOTD) in conjunction with the Federal Highway Administration (FHWA) to evaluate the overall performance and effectiveness of LADOTD's Pavement Manage...
ERIC Educational Resources Information Center
Immaculate Heart Coll., Los Angeles, CA.
Immaculate Heart College, Los Angeles, California developed a self-initiated and self-directed curriculum in the Teacher Preparation Program. The curriculum was based on a spiral planning model. Emphasis was placed on continuous evaluation, exploration of the learning experience, development of experimental teacher training experiences in the…
Qian, Yun; Yan, Huiping; Berg, Larry K.; ...
2016-10-28
Accuracy of turbulence parameterization in representing Planetary Boundary Layer (PBL) processes in climate models is critical for predicting the initiation and development of clouds, air quality issues, and underlying surface-atmosphere-cloud interactions. In this study, we 1) evaluate WRF model-simulated spatial patterns of precipitation and surface fluxes, as well as vertical profiles of potential temperature, humidity, moist static energy and moisture tendency terms as simulated by WRF at various spatial resolutions and with PBL, surface layer and shallow convection schemes against measurements, 2) identify model biases by examining the moisture tendency terms contributed by PBL and convection processes through nudging experiments, and 3) evaluate the dependence of modeled surface latent heat (LH) fluxes on PBL and surface layer schemes over the tropical ocean. The results show that PBL and surface parameterizations have surprisingly large impacts on precipitation, convection initiation and surface moisture fluxes over tropical oceans. All of the parameterizations tested tend to overpredict moisture in PBL and free atmosphere, and consequently result in larger moist static energy and precipitation. Moisture nudging tends to suppress the initiation of convection and reduces the excess precipitation. The reduction in precipitation bias in turn reduces the surface wind and LH flux biases, which suggests that the model drifts at least partly because of a positive feedback between precipitation and surface fluxes. The updated shallow convection scheme KF-CuP tends to suppress the initiation and development of deep convection, consequently decreasing precipitation. The Eta surface layer scheme predicts more reasonable LH fluxes and the LH-Wind Speed relationship than the MM5 scheme, especially when coupled with the MYJ scheme. By examining various parameterization schemes in WRF, we identify sources of biases and weaknesses of current PBL, surface layer and shallow convection schemes in reproducing PBL processes, the initiation of convection and intra-seasonal variability of precipitation.
Burnside, Elizabeth S.; Lee, Sandra J.; Bennette, Carrie; Near, Aimee M.; Alagoz, Oguzhan; Huang, Hui; van den Broek, Jeroen J.; Kim, Joo Yeon; Ergun, Mehmet A.; van Ravesteyn, Nicolien T.; Stout, Natasha K.; de Koning, Harry J.; Mandelblatt, Jeanne S.
2017-01-01
Background: There are no publicly available tools designed specifically to assist policy makers to make informed decisions about the optimal ages of breast cancer screening initiation for different populations of US women. Objective: To use three established simulation models to develop a web-based tool called Mammo OUTPuT. Methods: The simulation models use the 1970 US birth cohort and common parameters for incidence, digital screening performance, and treatment effects. Outcomes include breast cancers diagnosed, breast cancer deaths averted, breast cancer mortality reduction, false-positive mammograms, benign biopsies, and overdiagnosis. The Mammo OUTPuT tool displays these outcomes for combinations of age at screening initiation (every year from 40 to 49), annual versus biennial interval, lifetime versus 10-year horizon, and breast density, compared to waiting to start biennial screening at age 50 and continuing to 74. The tool was piloted by decision makers (n = 16) who completed surveys. Results: The tool demonstrates that benefits in the 40s increase linearly with earlier initiation age, without a specific threshold age. Likewise, the harms of screening increase monotonically with earlier ages of initiation in the 40s. The tool also shows users how the balance of benefits and harms varies with breast density. Surveys revealed that 100% of users (16/16) liked the appearance of the site; 94% (15/16) found the tool helpful; and 94% (15/16) would recommend the tool to a colleague. Conclusions: This tool synthesizes a representative subset of the most current CISNET (Cancer Intervention and Surveillance Modeling Network) simulation model outcomes to provide policy makers with quantitative data on the benefits and harms of screening women in the 40s. Ultimate decisions will depend on program goals, the population served, and informed judgments about the weight of benefits and harms. PMID:29376135
Failed oceanic transform models: experience of shaking the tree
NASA Astrophysics Data System (ADS)
Gerya, Taras
2017-04-01
In geodynamics, numerical modeling is often used as a trial-and-error tool, which does not necessarily require full understanding or even a correct concept for a modeled phenomenon. Paradoxically, in order to understand an enigmatic process one should simply try to model it based on some initial assumptions, which need not even be correct… The reason is that our intuition is not always well "calibrated" for understanding of geodynamic phenomena, which develop on space- and timescales that are very different from our everyday experience. We often have much better ideas about the physical laws governing geodynamic processes than about how these laws should interact on geological space- and timescales. From this perspective, numerical models, in which these physical laws are self-consistently implemented, can gradually calibrate our intuition by exploring which scenarios are physically sensible and which are not. I personally went through this painful learning path many times, and one noteworthy example was my 3D numerical modeling of oceanic transform faults. As I understand in retrospect, my initial literature-inspired concept of how and why transform faults form and evolve was thermomechanically inconsistent and based on two main assumptions (by the way, both were incorrect!): (1) oceanic transforms are directly inherited from the continental rifting and breakup stages, and (2) they represent plate fragmentation structures having a peculiar extension-parallel orientation due to the stress rotation caused by thermal contraction of the oceanic lithosphere. During one year (!) of high-resolution thermomechanical numerical experiments exploring various physics (including the very computationally demanding thermal contraction) I systematically observed how my initially prescribed extension-parallel weak transform faults connecting ridge segments rotated away from their original orientation and were converted into oblique ridge sections… This was really an epic failure! However, at the very same time, some pseudo-2D "side-models" with an initial straight ridge and an ad hoc strain-weakened rheology, which were run out of curiosity, suddenly showed spontaneous development of ridge curvature… A fraction of these models showed spontaneous development of orthogonal ridge-transform patterns by rotation of oblique ridge sections toward the extension-parallel direction to accommodate asymmetric plate accretion. The latter was controlled by detachment faults stabilized by strain weakening. Further exploration of these "side-models" resulted in a complete change of my concept of oceanic transforms: they are not plate fragmentation but rather plate growth structures, stabilized by continuous plate accretion and rheological weakening of deforming rocks (Gerya, 2010, 2013). The conclusion is: keep shaking the tree and the banana will fall… Gerya, T. (2010) Dynamical instability produces transform faults at mid-ocean ridges. Science, 329, 1047-1050. Gerya, T.V. (2013) Three-dimensional thermomechanical modeling of oceanic spreading initiation and evolution. Phys. Earth Planet. Interiors, 214, 35-52.
Comparison of CdZnTe neutron detector models using MCNP6 and Geant4
NASA Astrophysics Data System (ADS)
Wilson, Emma; Anderson, Mike; Prendergasty, David; Cheneler, David
2018-01-01
The production of accurate detector models is of high importance in the development and use of detectors. Initially, MCNP and Geant were developed to specialise in neutral particle models and accelerator models, respectively; there is now a greater overlap of the capabilities of both, and it is therefore useful to produce comparative models to evaluate detector characteristics. In a collaboration between Lancaster University, UK, and Innovative Physics Ltd., UK, models have been developed in both MCNP6 and Geant4 of Cadmium Zinc Telluride (CdZnTe) detectors developed by Innovative Physics Ltd. Herein, a comparison is made of the relative strengths of MCNP6 and Geant4 for modelling neutron flux and secondary γ-ray emission. Given the increasing overlap of the modelling capabilities of MCNP6 and Geant4, it is worthwhile to comment on differences in results for simulations which have similarities in terms of geometries and source configurations.
Repositioning the knee joint in human body FE models using a graphics-based technique.
Jani, Dhaval; Chawla, Anoop; Mukherjee, Sudipto; Goyal, Rahul; Vusirikala, Nataraju; Jayaraman, Suresh
2012-01-01
Human body finite element models (FE-HBMs) are available in standard occupant or pedestrian postures. There is a need to have FE-HBMs in the same posture as a crash victim or to be configured in varying postures. Developing FE models for all possible positions is not practically viable. The current work aims at obtaining a posture-specific human lower extremity model by reconfiguring an existing one. A graphics-based technique was developed to reposition the lower extremity of an FE-HBM by specifying the flexion-extension angle. Elements of the model were segregated into rigid (bones) and deformable components (soft tissues). The bones were rotated about the flexion-extension axis followed by rotation about the longitudinal axis to capture the twisting of the tibia. The desired knee joint movement was thus achieved. Geometric heuristics were then used to reposition the skin. A mapping defined over the space between bones and the skin was used to regenerate the soft tissues. Mesh smoothing was then done to augment mesh quality. The developed method permits control over the kinematics of the joint and maintains the initial mesh quality of the model. For some critical areas (in the joint vicinity) where element distortion is large, mesh smoothing is done to improve mesh quality. A method to reposition the knee joint of a human body FE model was developed. Repositioning of the model from 9 degrees of flexion to 90 degrees of flexion in just a few seconds, without subjective intervention, was demonstrated. Because the mesh quality of the repositioned model was maintained to a predefined level (typically to the level of a well-made model in the initial configuration), the model was suitable for subsequent simulations.
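The bone-rotation step lends itself to a short sketch: rigid nodes are rotated about the flexion-extension axis and then about the longitudinal axis, about the joint center. The axis definitions, angles, and node coordinates below are illustrative assumptions, not the published implementation.

```python
# Sketch of repositioning rigid bone nodes: rotate about the flexion-extension
# axis, then about the longitudinal axis to mimic tibial twist. Axes, angles,
# and coordinates are illustrative assumptions only.
import numpy as np

def rotation_matrix(axis, angle_deg):
    """Rodrigues' rotation matrix for a unit axis and angle in degrees."""
    a = np.asarray(axis, dtype=float)
    a = a / np.linalg.norm(a)
    t = np.radians(angle_deg)
    K = np.array([[0, -a[2], a[1]],
                  [a[2], 0, -a[0]],
                  [-a[1], a[0], 0]])
    return np.eye(3) + np.sin(t) * K + (1 - np.cos(t)) * (K @ K)

def reposition(nodes, joint_center, flexion_axis, long_axis, flexion_deg, twist_deg):
    """Rotate rigid-body nodes about the joint center: flexion first, then twist."""
    R = rotation_matrix(long_axis, twist_deg) @ rotation_matrix(flexion_axis, flexion_deg)
    return (nodes - joint_center) @ R.T + joint_center

# Example: move tibia nodes from 9 deg to 90 deg of flexion with a small twist.
tibia_nodes = np.array([[0.0, 0.0, -0.1], [0.02, 0.0, -0.4]])
new_nodes = reposition(tibia_nodes, joint_center=np.zeros(3),
                       flexion_axis=[1, 0, 0], long_axis=[0, 0, 1],
                       flexion_deg=81.0, twist_deg=5.0)
print(new_nodes)
```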
Development and modeling of a more efficient frangible separation joint
NASA Astrophysics Data System (ADS)
Renfro, Steven L.; Harris, Gary N.; Olson, Steven L.
1993-06-01
A low-cost, robust, and contamination-free separation system for spacecraft or launch vehicle stage and fairing separation was developed, which includes a frangible joint to sever an aluminum extrusion and to control contamination. The installed joint uses a sealing manifold to provide redundant initiation transfer between Flexible Confined Detonating Cord assemblies and HNS-IA loaded cups on the ends of the HNS-IIA Mild Detonating Fuse. A shock matching model of the system was developed, and the margin of joint severance, contamination control of the system, and correlation of the model are demonstrated.
An examination of data quality on QSAR Modeling in regards ...
The development of QSAR models is critically dependent on the quality of available data. As part of our efforts to develop public platforms to provide access to predictive models, we have attempted to discriminate the influence of the quality versus quantity of data available to develop and validate QSAR models. We have focused our efforts on the widely used EPISuite software that was initially developed over two decades ago and, specifically, on the PHYSPROP dataset used to train the EPISuite prediction models. This presentation will review our approaches to examining key datasets, the delivery of curated data and the development of machine-learning models for thirteen separate property endpoints of interest to environmental science. We will also review how these data will be made freely accessible to the community via a new “chemistry dashboard”. This abstract does not reflect U.S. EPA policy. presentation at UNC-CH.
DOT National Transportation Integrated Search
2017-11-01
This report presents the results of a project to develop a truck vehicle/fuel decision choice model for California and to use that model to make initial projections of truck sales by technology out to 2050. The report also describes the linkage of th...
The effects of the canopy created velocity inflection in the wake development
NASA Astrophysics Data System (ADS)
Agafonova, O.; Avramenko, A.; Chaudhari, A.; Hellsten, A.
2016-06-01
The aim of this paper is to study the effects of forest on the turbine wakes. Initially, the ACL (actuator line) model as well as a Canopy model are validated with the experiments separately. The models are further applied to simulate the flow over two wind turbines in a row located within the forest.
Industry-Wide Workshop on Computational Turbulence Modeling
NASA Technical Reports Server (NTRS)
Shabbir, Aamir (Compiler)
1995-01-01
This publication contains the presentations made at the Industry-Wide Workshop on Computational Turbulence Modeling which took place on October 6-7, 1994. The purpose of the workshop was to initiate the transfer of technology developed at Lewis Research Center to industry and to discuss the current status and the future needs of turbulence models in industrial CFD.
Initial Comparison of Single Cylinder Stirling Engine Computer Model Predictions with Test Results
NASA Technical Reports Server (NTRS)
Tew, R. C., Jr.; Thieme, L. G.; Miao, D.
1979-01-01
A Stirling engine digital computer model developed at NASA Lewis Research Center was configured to predict the performance of the GPU-3 single-cylinder rhombic drive engine. Revisions to the basic equations and assumptions are discussed. Model predictions are compared with the early results of the Lewis Research Center GPU-3 tests.
A Memory Model of Depression: An Analysis of Cognition, Development and Emotion.
ERIC Educational Resources Information Center
Rehm, Lynn P.; Naus, Mary J.
In recent years a number of models of depression have been proposed. Many of them have incorporated cognitive constructs to explain vulnerability, initiation, maintenance, and recovery from depression. In light of the wealth of experimental and clinical knowledge about depression, these models can be seen as having a limited focus and scope.…
"Modeling" Youth Work: Logic Models, Neoliberalism, and Community Praxis
ERIC Educational Resources Information Center
Carpenter, Sara
2016-01-01
This paper examines the use of logic models in the development of community initiatives within the AmeriCorps program. AmeriCorps is the civilian national service programme in the U.S., operating as a grants programme to local governments and not-for-profit organisations and providing low-cost labour to address pressing issues of social…
NASA Instrument Cost/Schedule Model
NASA Technical Reports Server (NTRS)
Habib-Agahi, Hamid; Mrozinski, Joe; Fox, George
2011-01-01
NASA's Office of Independent Program and Cost Evaluation (IPCE) has established a number of initiatives to improve its cost and schedule estimating capabilities. One of these initiatives has resulted in the JPL-developed NASA Instrument Cost Model (NICM). NICM is a cost and schedule estimator that contains: a system level cost estimation tool; a subsystem level cost estimation tool; a database of cost and technical parameters of over 140 previously flown remote sensing and in-situ instruments; a schedule estimator; a set of rules to estimate cost and schedule by life cycle phases (B/C/D); and a novel tool for developing joint probability distributions for cost and schedule risk (Joint Confidence Level (JCL)). This paper describes the development and use of NICM, including the data normalization processes, data mining methods (cluster analysis, principal components analysis, regression analysis and bootstrap cross validation), the estimating equations themselves and a demonstration of the NICM tool suite.
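To illustrate one of the listed data-mining methods, the sketch below runs a bootstrap cross-validation of a simple log-linear cost-estimating relationship; the single mass predictor and the synthetic data are assumptions, not NICM's actual variables or database.

```python
# Sketch of bootstrap cross-validation for a log-linear cost-estimating
# relationship. The single predictor and the synthetic data are assumptions.
import numpy as np

rng = np.random.default_rng(1)
mass = rng.uniform(5, 200, 60)                            # kg (synthetic)
cost = 2.0 * mass**0.8 * np.exp(rng.normal(0, 0.2, 60))   # $M (synthetic)
X, y = np.log(mass), np.log(cost)

errors = []
for _ in range(1000):
    idx = rng.integers(0, len(X), len(X))                 # bootstrap sample (with replacement)
    oob = np.setdiff1d(np.arange(len(X)), idx)            # out-of-bag points for validation
    if oob.size == 0:
        continue
    b, a = np.polyfit(X[idx], y[idx], 1)                  # fit log(cost) = a + b*log(mass)
    errors.append(np.mean((y[oob] - (a + b * X[oob]))**2))

print("bootstrap CV estimate of prediction MSE (log space):", np.mean(errors))
```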
A Fast Technology Infusion Model for Aerospace Organizations
NASA Technical Reports Server (NTRS)
Shapiro, Andrew A.; Schone, Harald; Brinza, David E.; Garrett, Henry B.; Feather, Martin S.
2006-01-01
A multi-year Fast Technology Infusion initiative proposes a model for aerospace organizations to improve the cost-effectiveness by which they mature new, in-house developed software and hardware technologies for space mission use. The first year task under the umbrella of this initiative will provide the framework to demonstrate and document the fast infusion process. The viability of this approach will be demonstrated on two technologies developed in prior years with internal Jet Propulsion Laboratory (JPL) funding. One hardware technology and one software technology were selected for maturation within one calendar year or less. The overall objective is to achieve cost and time savings in the qualification of technologies. At the end of the recommended three-year effort, we will have demonstrated for six or more in-house developed technologies a clear path to insertion using a documented process that permits adaptation to a broad range of hardware and software projects.
Shim, Vickie B; Hunter, Peter J; Pivonka, Peter; Fernandez, Justin W
2011-12-01
The initiation of osteoarthritis (OA) has been linked to the onset and progression of pathologic mechanisms at the cartilage-bone interface. Most importantly, this degenerative disease involves cross-talk between the cartilage and subchondral bone environments, so an informative model should contain the complete complex. In order to evaluate this process, we have developed a multiscale model using the open-source ontologies developed for the Physiome Project with cartilage and bone descriptions at the cellular, micro, and macro levels. In this way, we can effectively model the influence of whole body loadings at the macro level and the influence of bone organization and architecture at the micro level, and have cell level processes that determine bone and cartilage remodeling. Cell information is then passed up the spatial scales to modify micro architecture and provide a macro spatial characterization of cartilage inflammation. We evaluate the framework by linking a common knee injury (anterior cruciate ligament deficiency) to proinflammatory mediators as a possible pathway to initiate OA. This framework provides a "virtual bone-cartilage" tool for evaluating hypotheses, treatment effects, and disease onset to inform and strengthen clinical studies.
Solar Occultation Retrieval Algorithm Development
NASA Technical Reports Server (NTRS)
Lumpe, Jerry D.
2004-01-01
This effort addresses the comparison and validation of currently operational solar occultation retrieval algorithms, and the development of generalized algorithms for future application to multiple platforms. Initial work included development of generalized forward model algorithms capable of simulating transmission data from the POAM II/III and SAGE II/III instruments. Work in the second quarter will focus on completion of the forward model algorithms, including accurate spectral characteristics for all instruments, and comparison of simulated transmission data with actual level 1 instrument data for specific occultation events.
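At the heart of such a forward model is the Beer-Lambert relation for slant-path transmission; a generic hedged form, with notation assumed here rather than taken from the project documentation, is:

```latex
% Generic occultation forward model (assumed notation):
T(\lambda, z_t) \;=\; \exp\!\Big(-\sum_i \int_{\mathrm{LOS}(z_t)} \sigma_i(\lambda, s)\, n_i(s)\, \mathrm{d}s \Big)
```

where T is the transmission at wavelength λ for a line of sight with tangent height z_t, and σ_i and n_i are the absorption cross section and number density of species i along the slant path s.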
Open Innovation at NASA: A New Business Model for Advancing Human Health and Performance Innovations
NASA Technical Reports Server (NTRS)
Davis, Jeffrey R.; Richard, Elizabeth E.; Keeton, Kathryn E.
2014-01-01
This paper describes a new business model for advancing NASA human health and performance innovations and demonstrates how open innovation shaped its development. A 45 percent research and technology development budget reduction drove formulation of a strategic plan grounded in collaboration. We describe the strategy execution, including adoption and results of open innovation initiatives, the challenges of cultural change, and the development of virtual centers and a knowledge management tool to educate and engage the workforce and promote cultural change.
Initial CGE Model Results Summary Exogenous and Endogenous Variables Tests
DOE Office of Scientific and Technical Information (OSTI.GOV)
Edwards, Brian Keith; Boero, Riccardo; Rivera, Michael Kelly
The following discussion presents initial results of tests of the most recent version of the National Infrastructure Simulation and Analysis Center Dynamic Computable General Equilibrium (CGE) model developed by Los Alamos National Laboratory (LANL). The intent of this effort is to test and assess the model's behavioral properties. The tests evaluated whether the predicted impacts are reasonable from a qualitative perspective, that is, whether a predicted change, be it an increase or decrease in other model variables, is consistent with prior economic intuition and expectations. One of the purposes of this effort is to determine whether model changes are needed in order to improve the model's behavior qualitatively and quantitatively.
Wells, Kirsty L.; Gaete, Marcia; Matalova, Eva; Deutsch, Danny; Rice, David; Tucker, Abigail S.
2013-01-01
Salivary glands provide an excellent model for the study of epithelial–mesenchymal interactions. We have looked at the interactions involved in the early initiation and development of murine salivary glands using classic recombination experiments and knockout mice. We show that salivary gland epithelium, at thickening and initial bud stages, is able to direct salivary gland development in non-gland pharyngeal arch mesenchyme at early stages. The early salivary gland epithelium is therefore able to induce gland development in non-gland tissue. This ability later shifts to the mesenchyme, with non-gland epithelium, such as from the limb bud, able to form a branching gland when combined with pseudoglandular stage gland mesenchyme. This shift appears to involve Fgf signalling, with signals from the epithelium inducing Fgf10 in the mesenchyme. Fgf10 then signals back to the epithelium to direct gland down-growth and bud development. These experiments highlight the importance of epithelial–mesenchymal signalling in gland initiation, controlling where, when and how many salivary glands form. PMID:24167707
David R. Montgomery; Kevin M. Schmidt; William E. Dietrich; Jim McKean
2009-01-01
The middle of a hillslope hollow in the Oregon Coast Range failed and mobilized as a debris flow during heavy rainfall in November 1996. Automated pressure transducers recorded high spatial variability of pore water pressure within the area that mobilized as a debris flow, which initiated where local upward flow from bedrock developed into overlying colluvium....
ERIC Educational Resources Information Center
Barahona, Malba
2017-01-01
The demonstrable potential of team teaching as a productive mechanism for developing collaborative teacher learning is now broadly understood in the field of teacher education. However, there is less evidence of the use of such collaborative teaching as a means of strengthening initial foreign/second language teacher education. This paper reports…
ERIC Educational Resources Information Center
Leggatt, Simon
2016-01-01
This case study describes the development process of a model using readily-available technology to facilitate collaboration, moderation and the dissemination of best practice in initial teacher training in the UK. Students, mentors, tutors and external examiners from a number of educational institutions in a UK, higher education-led Lifelong…
Junior Army Officer Retention Intentions: A Path Analytic Model
1991-07-01
theoretically useful only if they explain behavior that cannot be predicted within traditional expectancy and equity based motivational models. Scholl (1981), in...argue that long-tenured employees need to justify their behavioral commitment to the organization. They do this by developing more positive attitudes..."good" to "very poor" scale to rate opportunities for intrinsic work satisfaction (learn/develop skills, do interesting work, exercise initiative) in
A differential CDM model for fatigue of unidirectional metal matrix composites
NASA Technical Reports Server (NTRS)
Arnold, S. M.; Kruch, S.
1992-01-01
A multiaxial, isothermal, continuum damage mechanics (CDM) model for fatigue of a unidirectional metal matrix composite volume element is presented. The model is phenomenological, stress based, and assumes a single scalar internal damage variable, the evolution of which is anisotropic. The development of the fatigue damage model (i.e., the evolutionary law) is based on the definition of an initially transversely isotropic fatigue limit surface, a static fracture surface, and a normalized stress amplitude function. The anisotropy of these surfaces and function, and therefore of the model, is defined through physically meaningful invariants reflecting the local stress and material orientation. This transversely isotropic model is shown, when taken to its isotropic limit, to simplify directly to a previously developed and validated isotropic fatigue continuum damage model. Results of a nondimensional parametric study illustrate (1) the flexibility of the present formulation in attempting to characterize a class of composite materials, and (2) the capability of the formulation in predicting anticipated qualitative trends in the fatigue behavior of unidirectional metal matrix composites. Also, specific material parameters representing an initial characterization of the composite system SiC/Ti 15-3 and the matrix material (Ti 15-3) are reported.
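For orientation, a generic scalar CDM fatigue evolution law of the kind sketched above can be written as follows; this is a common illustrative form with assumed symbols, not necessarily the exact model of the report.

```latex
% Illustrative scalar damage evolution law (assumed form and symbols):
\frac{\mathrm{d}D}{\mathrm{d}N} \;=\; \left[ \frac{\Delta\hat{\sigma}(\boldsymbol{\sigma}, \mathbf{d})}{M\,(1 - D)} \right]^{\beta},
\qquad 0 \le D \le 1
```

where D is the scalar damage variable, N the cycle count, Δσ̂ a normalized stress-amplitude measure built from invariants of the local stress and the fiber direction d, and M, β are material parameters; the fatigue limit and static fracture surfaces bound the range over which this evolution is active.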
A socio-technical model to explore urban water systems scenarios.
de Haan, Fjalar J; Ferguson, Briony C; Deletic, Ana; Brown, Rebekah R
2013-01-01
This article reports on the ongoing work and research involved in the development of a socio-technical model of urban water systems. Socio-technical means the model is not so much concerned with the technical or biophysical aspects of urban water systems, but rather with the social and institutional implications of the urban water infrastructure and vice versa. A socio-technical model, in the view put forward in this article, produces scenarios of different urban water servicing solutions gaining or losing influence in meeting water-related societal needs, like potable water, drainage, environmental health and amenity. The urban water system is parameterised with vectors of the relative influence of each servicing solution. The model is a software implementation of the Multi-Pattern Approach, a theory on societal systems, like urban water systems, and how these develop and go through transitions under various internal and external conditions. Acknowledging that social dynamics comes with severe and non-reducible uncertainties, the model is set up to be exploratory, meaning that for any initial condition several possible future scenarios are produced. This article gives a concise overview of the necessary theoretical background, the model architecture and some initial test results using a drainage example.
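A minimal sketch of the exploratory idea: the system state is a vector of relative influences of servicing solutions, perturbed and renormalized along many alternative trajectories from one initial condition. The solution names, update rule, and noise scale are illustrative assumptions, not the model's actual dynamics.

```python
# Sketch: exploratory scenarios over a vector of relative influences of urban
# water servicing solutions. Names, update rule, and noise scale are assumptions.
import numpy as np

solutions = ["centralised_supply", "rainwater_tanks", "stormwater_harvesting", "recycling"]
initial = np.array([0.70, 0.10, 0.10, 0.10])     # relative influence, sums to 1

def one_scenario(state, steps, rng):
    traj = [state.copy()]
    for _ in range(steps):
        drift = rng.normal(0.0, 0.02, size=state.size)   # uncertain socio-technical change
        state = np.clip(state + drift, 1e-6, None)
        state = state / state.sum()                      # keep relative influences normalised
        traj.append(state.copy())
    return np.array(traj)

rng = np.random.default_rng(42)
scenarios = [one_scenario(initial, steps=50, rng=rng) for _ in range(200)]
final_share = np.array([s[-1] for s in scenarios])
print("mean final influence per solution:",
      dict(zip(solutions, final_share.mean(axis=0).round(3))))
```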
How do horizontal, frictional discontinuities affect reverse fault-propagation folding?
NASA Astrophysics Data System (ADS)
Bonanno, Emanuele; Bonini, Lorenzo; Basili, Roberto; Toscani, Giovanni; Seno, Silvio
2017-09-01
The development of new reverse faults and related folds is strongly controlled by the mechanical characteristics of the host rocks. In this study we analyze, using scaled physical models, how a specific kind of anisotropy, i.e. thin mechanical and frictional discontinuities, affects the development of reverse faults and of the associated folds. We perform analog modeling, introducing one or two initially horizontal, thin discontinuities above an initially blind fault dipping at 30° in one case and 45° in another, and then compare the results with those obtained from a fully isotropic model. The experimental results show that the occurrence of thin discontinuities affects both the development and the propagation of new faults and the shape of the associated folds. New faults 1) accelerate or decelerate their propagation depending on the location of the tips with respect to the discontinuities, 2) cross the discontinuities at a characteristic angle (∼90°), and 3) produce folds with different shapes, resulting not only from the dip of the new faults but also from their non-linear propagation history. Our results may have direct impact on future kinematic models, especially those aimed at reconstructing the tectonic history of faults that developed in layered rocks or in regions affected by pre-existing faults.
Holden, D J; Moore, K S; Holliday, J L
1998-06-01
This study investigates the development and implementation of health education strategies at the local level for a statewide breast and cervical cancer control program. Baseline data on these initiatives were collected from 88 local screening programs in North Carolina. Using the ecological model as a framework, health education initiatives were assessed and analyzed to determine the level of activity occurring at the local level and the comprehensiveness of programs. Types and levels of interventions used are described and initial analysis is provided of the impact these strategies are having on recruiting women from target populations into these screening programs. Specific examples illustrating the variety of interventions used at the individual, network, organizational and community levels, and the impact of certain variables, such as the use of local health education staff, on the comprehensiveness of interventions utilized, are provided. The importance to practitioners of establishing process indicators in assessing local initiatives and challenges to conducting evaluations of these strategies are also discussed.
Ecological sites: A useful tool for land management
Alicia N. Struckhoff; Douglas Wallace; Fred Young
2017-01-01
Developing ecological sites in Missouri is a multiagency, multidiscipline effort led by the Missouri Department of Conservation and the U.S. Department of Agriculture (USDA) Natural Resources Conservation Service. The methodology developed in Missouri has recently served as a model for ecological site development across the country and has aided in an initiative to...
Grant Proposal Development a la FLC (Faculty Learning Community) Mode
ERIC Educational Resources Information Center
Frantz, Pollyanne S.
2013-01-01
Although the Faculty Learning Community is not a new structure or initiative in the higher education arena, adapting this model for faculty development focused on grant proposal writing is relatively new. This article describes how the concept developed by Milt Cox of Miami University has been successfully modified and implemented twice on the…
Friendship Quality Scale: Conceptualization, Development and Validation
ERIC Educational Resources Information Center
Thien, Lei Mee; Razak, Nordin Abd; Jamil, Hazri
2012-01-01
The purpose of this study is twofold: (1) to initialize a new conceptualization of positive feature based Friendship Quality (FQUA) scale on the basis of four dimensions: Closeness, Help, Acceptance, and Safety; and (2) to develop and validate FQUA scale in the form of reflective measurement model. The scale development and validation procedures…
Life prediction and constitutive models for engine hot section anisotropic materials program
NASA Technical Reports Server (NTRS)
Nissley, D. M.; Meyer, T. G.
1992-01-01
This report presents the results from a 35 month period of a program designed to develop generic constitutive and life prediction approaches and models for nickel-based single crystal gas turbine airfoils. The program is composed of a base program and an optional program. The base program addresses the high temperature coated single crystal regime above the airfoil root platform. The optional program investigates the low temperature uncoated single crystal regime below the airfoil root platform including the notched conditions of the airfoil attachment. Both base and option programs involve experimental and analytical efforts. Results from uniaxial constitutive and fatigue life experiments of coated and uncoated PWA 1480 single crystal material form the basis for the analytical modeling effort. Four single crystal primary orientations were used in the experiments: (001), (011), (111), and (213). Specific secondary orientations were also selected for the notched experiments in the optional program. Constitutive models for an overlay coating and PWA 1480 single crystal material were developed based on isothermal hysteresis loop data and verified using thermomechanical (TMF) hysteresis loop data. A fatigue life approach and life models were selected for TMF crack initiation of coated PWA 1480. An initial life model used to correlate smooth and notched fatigue data obtained in the option program shows promise. Computer software incorporating the overlay coating and PWA 1480 constitutive models was developed.
Spin-up simulation behaviors in a climate model to build a basement of long-time simulation
NASA Astrophysics Data System (ADS)
Lee, J.; Xue, Y.; De Sales, F.
2015-12-01
It is essential to develop start-up information when conducting a long-time climate simulation. If the initial condition is already available from a previous simulation with the same type of model, this is not necessary; if not, the model needs a spin-up simulation to obtain an initial condition that is adjusted to and balanced with the model climatology. Otherwise, a full spin-up may take several years. Some model variables in the initial fields, such as deep soil temperature and temperature in deep ocean layers, affect the model's subsequent long-time simulation because of their long residual memories. To investigate the important factors in spin-up simulations for producing an atmospheric initial condition, we conducted two different spin-up simulations for the case in which no atmospheric condition is available from existing datasets. One simulation employed an atmospheric global circulation model (AGCM), namely the Global Forecast System (GFS) of the National Centers for Environmental Prediction (NCEP), while the other employed an atmosphere-ocean coupled global circulation model (CGCM), namely the Climate Forecast System (CFS) of NCEP. Both models share the atmospheric component and differ only in the ocean coupling, which in CFS is provided by the Modular Ocean Model version 4 (MOM4) of the Geophysical Fluid Dynamics Laboratory (GFDL). During a decade of spin-up simulation, prescribed sea-surface temperature (SST) fields for the target year were forced into the GFS on a daily basis, while the CFS ingested the ocean condition only at the first time step and ran freely for the rest of the period. Both models were forced with the CO2 concentration and solar constant of the target year. Our analyses of the spin-up results indicate that free interaction between the ocean and the atmosphere is more helpful for producing the initial condition for the target year than forcing with fixed SST. Since the GFS used prescribed forcing taken exactly from the target year, this result is unexpected. The detailed analysis will be discussed in this presentation.
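One practical way to judge whether such a spin-up has equilibrated is to track the year-to-year drift of a slowly varying field (for example, deep soil temperature) against a tolerance; the sketch below uses purely synthetic data and an assumed threshold.

```python
# Sketch: check spin-up equilibration by the year-to-year drift of the annual
# mean of a slow variable (e.g. deep soil temperature). Data here are synthetic.
import numpy as np

rng = np.random.default_rng(7)
years = 10
monthly = 280.0 + 2.0 * np.exp(-np.arange(years * 12) / 36.0) + rng.normal(0, 0.05, years * 12)

annual_mean = monthly.reshape(years, 12).mean(axis=1)
drift = np.abs(np.diff(annual_mean))            # K per year

tolerance = 0.05                                # assumed equilibration threshold (K/yr)
equilibrated = np.argmax(drift < tolerance) + 1 if np.any(drift < tolerance) else None
print("annual drift (K/yr):", drift.round(3))
print("first year meeting tolerance:", equilibrated)
```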
NASA Technical Reports Server (NTRS)
Turon, Albert; Camanho, Pedro P.; Costa, Josep; Davila, Carlos G.
2004-01-01
A thermodynamically consistent damage model for the simulation of progressive delamination under variable mode ratio is presented. The model is formulated in the context of Damage Mechanics (DM). The constitutive equations that result from the variation of the free energy with damage are used to model the initiation and propagation of delamination. A new delamination initiation criterion is developed to assure that the formulation can account for changes in the loading mode in a thermodynamically consistent way. Interfacial penetration of two adjacent layers after complete decohesion is prevented by the formulation of the free energy. The model is implemented into the commercial finite element code ABAQUS by means of a user-written decohesion element. Finally, the numerical predictions given by the model are compared with experimental results.
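For context, a commonly used mixed-mode onset criterion for decohesion elements is the quadratic stress interaction shown below; it is given as a generic illustration and is not the new initiation criterion proposed in the paper.

```latex
% Common quadratic interaction criterion for delamination onset (generic illustration):
\left( \frac{\langle \sigma_{33} \rangle}{N} \right)^{2}
+ \left( \frac{\sigma_{13}}{S} \right)^{2}
+ \left( \frac{\sigma_{23}}{S} \right)^{2} \;=\; 1
```

where N and S are the interfacial normal and shear strengths and the Macaulay bracket ⟨·⟩ indicates that compressive normal stress does not contribute to onset.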
Logistic Mixed Models to Investigate Implicit and Explicit Belief Tracking.
Lages, Martin; Scheel, Anne
2016-01-01
We investigated the proposition of a two-systems Theory of Mind in adults' belief tracking. A sample of N = 45 participants predicted the choice of one of two opponent players after observing several rounds in an animated card game. Three matches of this card game were played, and initial gaze direction on target and subsequent choice predictions were recorded for each belief task and participant. We conducted logistic regressions with mixed effects on the binary data and developed Bayesian logistic mixed models to infer implicit and explicit mentalizing in true belief and false belief tasks. Although logistic regressions with mixed effects predicted the data well, a Bayesian logistic mixed model with latent task- and subject-specific parameters gave a better account of the data. As expected, explicit choice predictions suggested a clear understanding of true and false beliefs (TB/FB). Surprisingly, however, model parameters for initial gaze direction also indicated belief tracking. We discuss why task-specific parameters for initial gaze directions are different from choice predictions yet reflect second-order perspective taking.
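A hedged sketch of the kind of logistic mixed model described, with symbols assumed here for illustration: for participant i on trial j,

```latex
% Generic logistic mixed model for binary belief-tracking data (assumed notation):
\operatorname{logit} \Pr(y_{ij} = 1) \;=\; \beta_0 + \beta_1\, \mathrm{FB}_{ij} + u_i + v_{t(j)},
\qquad u_i \sim \mathcal{N}(0, \sigma_u^2), \quad v_{t} \sim \mathcal{N}(0, \sigma_v^2)
```

where y_ij is the binary response (e.g., gaze on target or a correct choice prediction), FB_ij codes false- versus true-belief trials, and u_i, v_t are subject- and task-specific random effects.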
NASA Technical Reports Server (NTRS)
Gamayunov, K. V.; Khazanov, G. V.; Liemohn, M. W.; Fok, M.-C.; Ridley, A. J.
2009-01-01
Further development of our self-consistent model of interacting ring current (RC) ions and electromagnetic ion cyclotron (EMIC) waves is presented. This model incorporates large scale magnetosphere-ionosphere coupling and treats self-consistently not only EMIC waves and RC ions, but also the magnetospheric electric field, RC, and plasmasphere. Initial simulations indicate that the region beyond geostationary orbit should be included in the simulation of the magnetosphere-ionosphere coupling. Additionally, a self-consistent description, based on first principles, of the ionospheric conductance is required. These initial simulations further show that in order to model the EMIC wave distribution and wave spectral properties accurately, the plasmasphere should also be simulated self-consistently, since its fine structure requires as much care as that of the RC. Finally, an effect of the finite time needed to reestablish a new potential pattern throughout the ionosphere and to communicate between the ionosphere and the equatorial magnetosphere cannot be ignored.
NASA pyrotechnically actuated systems program
NASA Technical Reports Server (NTRS)
Schulze, Norman R.
1993-01-01
The Office of Safety and Mission Quality initiated a Pyrotechnically Actuated Systems (PAS) Program in FY-92 to address problems experienced with pyrotechnically actuated systems and devices used both on the ground and in flight. The PAS Program will provide the technical basis for NASA's projects to incorporate new technological developments in operational systems. The program will accomplish that objective by developing/testing current and new hardware designs for flight applications and by providing a pyrotechnic data base. This marks the first applied pyrotechnic technology program funded by NASA to address pyrotechnic issues. The PAS Program has been structured to address the results of a survey of pyrotechnic device and system problems with the goal of alleviating or minimizing their risks. Major program initiatives include the development of a Laser Initiated Ordnance System, a pyrotechnic systems data base, NASA Standard Initiator model, a NASA Standard Linear Separation System and a NASA Standard Gas Generator. The PAS Program sponsors annual aerospace pyrotechnic systems workshops.
A common evaluation framework for the African Health Initiative.
Bryce, Jennifer; Requejo, Jennifer Harris; Moulton, Lawrence H; Ram, Malathi; Black, Robert E
2013-01-01
The African Health Initiative includes highly diverse partnerships in five countries (Ghana, Mozambique, Rwanda, Tanzania, and Zambia), each of which is working to improve population health by strengthening health systems and to evaluate the results. One aim of the Initiative is to generate cross-site learning that can inform implementation in the five partnerships during the project period and identify lessons that may be generalizable to other countries in the region. Collaborators in the Initiative developed a common evaluation framework as a basis for this cross-site learning. This paper describes the components of the framework; this includes the conceptual model, core metrics to be measured in all sites, and standard guidelines for reporting on the implementation of partnership activities and contextual factors that may affect implementation, or the results it produces. We also describe the systems that have been put in place for data management, data quality assessments, and cross-site analysis of results. The conceptual model for the Initiative highlights points in the causal chain between health system strengthening activities and health impact where evidence produced by the partnerships can contribute to learning. This model represents an important advance over its predecessors by including contextual factors and implementation strength as potential determinants, and explicitly including equity as a component of both outcomes and impact. Specific measurement challenges include the prospective documentation of program implementation and contextual factors. Methodological issues addressed in the development of the framework include the aggregation of data collected using different methods and the challenge of evaluating a complex set of interventions being improved over time based on continuous monitoring and intermediate results.
Teacher Leader Model Standards: Implications for Preparation, Policy, and Practice
ERIC Educational Resources Information Center
Berg, Jill Harrison; Carver, Cynthia L.; Mangin, Melinda M.
2014-01-01
Teacher leadership is increasingly recognized as a resource for instructional improvement. Consequently, teacher leader initiatives have expanded rapidly despite limited knowledge about how to prepare and support teacher leaders. In this context, the "Teacher Leader Model Standards" represent an important development in the field. In…
Extended Relation Metadata for SCORM-Based Learning Content Management Systems
ERIC Educational Resources Information Center
Lu, Eric Jui-Lin; Horng, Gwoboa; Yu, Chia-Ssu; Chou, Ling-Ying
2010-01-01
To increase the interoperability and reusability of learning objects, Advanced Distributed Learning Initiative developed a model called Content Aggregation Model (CAM) to describe learning objects and express relationships between learning objects. However, the suggested relations defined in the CAM can only describe structure-oriented…
Emergence of Alpha and Gamma Like Rhythms in a Large Scale Simulation of Interacting Neurons
NASA Astrophysics Data System (ADS)
Gaebler, Philipp; Miller, Bruce
2007-10-01
At first glance, the electrical activity of the normal brain appears largely random. However, characteristic frequencies emerge during specific stages of sleep and during quiet waking states. This raises the question of whether current mathematical and computational models of interacting neurons can display similar behavior. A recent model developed by Eugene Izhikevich appears to succeed. However, early dynamical simulations used to detect these patterns were possibly compromised by an over-simplified initial condition and evolution algorithm. Using the same model but a more robust algorithm, we present our initial results here, showing that these patterns persist under a wide range of initial conditions. We employ spectral analysis of the firing patterns of a system of interacting excitatory and inhibitory neurons to demonstrate a bimodal spectrum centered on two frequencies in the ranges characteristic of alpha and gamma rhythms in the human brain.
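The abstract references the Izhikevich simple-neuron model. As a rough illustration of the kind of simulation and spectral analysis described, the sketch below wires up the published 800-excitatory/200-inhibitory example network and takes an FFT of the population firing rate; the network sizes, parameters, and analysis choices follow Izhikevich's 2003 example and are not the authors' actual code.

```python
import numpy as np

# Minimal Izhikevich-network sketch (after Izhikevich, 2003): 800 excitatory
# and 200 inhibitory neurons driven by noisy thalamic input. All parameter
# choices follow the published simple-model example, not the paper above.
rng = np.random.default_rng(0)
Ne, Ni = 800, 200
re, ri = rng.random(Ne), rng.random(Ni)
a = np.concatenate([0.02 * np.ones(Ne), 0.02 + 0.08 * ri])
b = np.concatenate([0.20 * np.ones(Ne), 0.25 - 0.05 * ri])
c = np.concatenate([-65 + 15 * re**2, -65 * np.ones(Ni)])
d = np.concatenate([8 - 6 * re**2, 2 * np.ones(Ni)])
S = np.hstack([0.5 * rng.random((Ne + Ni, Ne)), -rng.random((Ne + Ni, Ni))])

v = -65.0 * np.ones(Ne + Ni)   # membrane potential (mV)
u = b * v                      # recovery variable
T = 2000                       # simulation length in 1-ms steps
rate = np.zeros(T)             # population spike count per step

for t in range(T):
    I = np.concatenate([5 * rng.standard_normal(Ne), 2 * rng.standard_normal(Ni)])
    fired = v >= 30            # spike threshold (mV)
    rate[t] = fired.sum()
    v[fired] = c[fired]
    u[fired] += d[fired]
    I += S[:, fired].sum(axis=1)        # synaptic input from spiking neurons
    for _ in range(2):                  # two 0.5-ms Euler sub-steps, as in the original
        v += 0.5 * (0.04 * v**2 + 5 * v + 140 - u + I)
    u += a * (b * v - u)

# Power spectrum of the population firing rate; with these parameters one
# expects peaks near the alpha (~10 Hz) and gamma (~40 Hz) bands.
freqs = np.fft.rfftfreq(T, d=1e-3)      # 1-ms step -> frequencies in Hz
power = np.abs(np.fft.rfft(rate - rate.mean()))**2
band = (freqs > 1) & (freqs < 100)
peak = freqs[band][np.argmax(power[band])]
print(f"dominant frequency in the 1-100 Hz band: {peak:.1f} Hz")
```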
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hai Huang; Ben Spencer; Jason Hales
2014-10-01
A discrete element model (DEM) representation of coupled solid mechanics/fracturing and heat conduction processes has been developed and applied to explicitly simulate the random initiation and subsequent propagation of interacting thermal cracks in a ceramic nuclear fuel pellet during the initial rise to power and during power cycles. The DEM model predicts realistic early-life crack patterns, including both radial and circumferential cracks. Simulation results clearly demonstrate the formation of radial cracks during the initial power rise and the formation of circumferential cracks as the power is ramped down. In these simulations, additional early-life power cycles do not lead to the formation of new thermal cracks. They do, however, clearly indicate changes in the apertures of thermal cracks during later power cycles due to thermal expansion and shrinkage. The number of radial cracks increases with increasing power, which is consistent with experimental observations.
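For context on why the reported crack patterns arise, the short sketch below applies the textbook thermoelastic estimate for a solid cylinder with a parabolic radial temperature profile, which produces tensile hoop stress at the pellet rim during power-up (favoring radial cracks). The material values are rough, illustrative UO2-like numbers and are not taken from the paper's DEM simulations.

```python
# Back-of-the-envelope driver for the crack patterns described above, assuming
# a parabolic temperature profile (centerline hotter than the rim by delta_T)
# and the classic Timoshenko-type thermoelastic result for a solid cylinder.
# Property values are order-of-magnitude illustrations only.

E = 200e9        # Young's modulus, Pa (rough UO2-like value)
nu = 0.3         # Poisson's ratio
alpha = 1.0e-5   # thermal expansion coefficient, 1/K
delta_T = 500.0  # centerline-to-surface temperature difference, K

# Surface hoop stress for a parabolic profile: sigma = alpha*E*delta_T / (2*(1-nu))
sigma_hoop = alpha * E * delta_T / (2.0 * (1.0 - nu))
print(f"estimated surface hoop stress ~ {sigma_hoop / 1e6:.0f} MPa")
# Ceramic fuel fracture strengths are roughly of order 100 MPa, so cracking
# during the power rise is expected, consistent with the radial cracks
# reported above; reversing the gradient on power-down instead puts the rim
# in a state that favors circumferential cracking.
```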
Architecture for Integrated Medical Model Dynamic Probabilistic Risk Assessment
NASA Technical Reports Server (NTRS)
Jaworske, D. A.; Myers, J. G.; Goodenow, D.; Young, M.; Arellano, J. D.
2016-01-01
Probabilistic Risk Assessment (PRA) is a modeling tool used to predict potential outcomes of a complex system based on a statistical understanding of many initiating events. Using a Monte Carlo method, thousands of instances of the model are run and their outcomes collected. PRA is considered static, utilizing probabilities alone to calculate outcomes. Dynamic Probabilistic Risk Assessment (dPRA) is an advanced concept in which modeling predicts the outcomes of a complex system based not only on the probabilities of many initiating events, but also on a progression of dependencies that unfolds along a time line. Events are placed on a single time line by adding each event to a queue managed by a planner. Progression down the time line is guided by rules managed by a scheduler. The recently developed Integrated Medical Model (IMM) summarizes astronaut health as governed by the probabilities of medical events and mitigation strategies. Managing the software architecture process provides a systematic means of creating, documenting, and communicating a software design early in the development process. The software architecture process begins with establishing requirements, and the design is then derived from those requirements.
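As a rough illustration of the planner/scheduler/time-line structure described above, the sketch below samples initiating events onto a single time-ordered queue and resolves them with simple rules inside a Monte Carlo loop. All event names, rates, and rules are invented for illustration and are not drawn from the Integrated Medical Model.

```python
import heapq
import random

# Hypothetical per-crew-day event rates and mitigation success probabilities;
# purely illustrative, not IMM values.
MISSION_DAYS = 180
EVENT_RATES = {"headache": 0.02, "back_pain": 0.005}
MITIGATION_SUCCESS = {"headache": 0.95, "back_pain": 0.80}

def plan_timeline(rng):
    """Planner: sample initiating events and queue them on one time line."""
    queue = []
    for day in range(MISSION_DAYS):
        for event, rate in EVENT_RATES.items():
            if rng.random() < rate:
                heapq.heappush(queue, (day, event))
    return queue

def run_schedule(queue, rng):
    """Scheduler: walk the time line in order, applying simple resolution rules."""
    unresolved = 0
    while queue:
        day, event = heapq.heappop(queue)
        if rng.random() > MITIGATION_SUCCESS[event]:
            unresolved += 1   # rule: a failed mitigation counts as an open risk
    return unresolved

def monte_carlo(n_trials=10_000, seed=0):
    """Collect outcomes across many sampled time lines (the static-PRA analogue
    would use the probabilities alone, without the time-ordered queue)."""
    rng = random.Random(seed)
    outcomes = [run_schedule(plan_timeline(rng), rng) for _ in range(n_trials)]
    return sum(o > 0 for o in outcomes) / n_trials

if __name__ == "__main__":
    print(f"P(at least one unresolved event) ~ {monte_carlo():.3f}")
```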
On the Initiation Mechanism in Exploding Bridgewire and Laser Detonators
NASA Astrophysics Data System (ADS)
Stewart, D. Scott; Thomas, K.; Saenz, J.
2005-07-01
Since its invention by Los Alamos during the Manhattan Project era the exploding bridgewire detonator (EBW) has seen tremendous use and study. Recent development of a laser-powered device with detonation properties similar to an EBW is reviving interest in the basic physics of the Deflagration-to-Detonation (DDT) process in both of these devices,[1]. Cutback experiments using both laser interferometry and streak camera observations are providing new insight into the initiation mechanism in EBWs. These measurements are being correlated to a DDT model of compaction to detonation and shock to detonation developed previously by Xu and Stewart, [2]. The DDT model is incorporated into a high-resolution, multi-material model code for simulating the complete process. Model formulation and predictions against the test data will be discussed. REFS. [1] A. Munger, J. Kennedy, A. Akinci, and K. Thomas, "Dev. of a Laser Detonator" 30th Int. Pyrotechnics Seminar, Fort Collins, CO, (2004). [2] Xu, S. and Stewart, D. S. Deflagration to detonation transition in porous energetic materials: A model study. J. Eng. Math., 31, 143-172 (1997)