Sample records for process modeling efforts

  1. Effortful versus automatic emotional processing in schizophrenia: Insights from a face-vignette task.

    PubMed

    Patrick, Regan E; Rastogi, Anuj; Christensen, Bruce K

    2015-01-01

    Adaptive emotional responding relies on dual automatic and effortful processing streams. Dual-stream models of schizophrenia (SCZ) posit a selective deficit in neural circuits that govern goal-directed, effortful processes versus reactive, automatic processes. This imbalance suggests that when patients are confronted with competing automatic and effortful emotional response cues, they will exhibit diminished effortful responding and intact, possibly elevated, automatic responding compared to controls. This prediction was evaluated using a modified version of the face-vignette task (FVT). Participants viewed emotional faces (automatic response cue) paired with vignettes (effortful response cue) that signalled a different emotion category and were instructed to discriminate the manifest emotion. Patients made fewer vignette responses and more face responses than controls. However, the relationship between group and FVT responding was moderated by IQ and reading comprehension ability. These results replicate and extend previous research and provide tentative support for abnormal conflict resolution between automatic and effortful emotional processing predicted by dual-stream models of SCZ.

  2. Parallel computing for automated model calibration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burke, John S.; Danielson, Gary R.; Schulz, Douglas A.

    2002-07-29

    Natural resources model calibration is a significant burden on computing and staff resources in modeling efforts. Most assessments must consider multiple calibration objectives (for example, magnitude and timing of stream flow peak). An automated calibration process that allows real-time updating of data/models, freeing scientists to focus effort on improving models, is needed. We are in the process of building a fully featured multi-objective calibration tool capable of processing multiple models cheaply and efficiently using null cycle computing. Our parallel processing and calibration software routines have been written generically, but our focus has been on natural resources model calibration. So far, the natural resources models have been friendly to parallel calibration efforts in that they require no inter-process communication, need only a small amount of input data, and output only a small amount of statistical information for each calibration run. A typical auto calibration run might involve running a model 10,000 times with a variety of input parameters and summary statistical output. In the past, model calibration has been done against individual models for each data set. The individual model runs are relatively fast, ranging from seconds to minutes. The process was run on a single computer using a simple iterative process. We have completed two auto calibration prototypes and are currently designing a more feature-rich tool. Our prototypes have focused on running the calibration in a distributed computing cross-platform environment. They allow incorporation of 'smart' calibration parameter generation (using artificial intelligence processing techniques). Null cycle computing similar to SETI@home has also been a focus of our efforts. This paper details the design of the latest prototype and discusses our plans for the next revision of the software.
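
    The run pattern this record describes is embarrassingly parallel: each calibration run takes a small parameter set in and returns a small summary statistic out, with no inter-process communication. A minimal Python sketch of that pattern follows; the toy model function and single objective are hypothetical stand-ins, not the authors' software.

    ```python
    # Minimal sketch of parallel auto-calibration: run a model many times with
    # sampled parameters and keep the best fit. The 'model' and objective are
    # hypothetical stand-ins for a real natural-resources model.
    import random
    from concurrent.futures import ProcessPoolExecutor

    OBSERVED_PEAK = 42.0  # observed stream-flow peak (hypothetical units)

    def run_model(params):
        """One calibration run: small input in, small statistic out."""
        recession, storage = params
        simulated_peak = storage * (1.0 - recession)  # stand-in computation
        return abs(simulated_peak - OBSERVED_PEAK), params

    def main():
        random.seed(0)
        # e.g., 10,000 parameter sets, as in the abstract's typical run
        samples = [(random.uniform(0, 1), random.uniform(10, 100))
                   for _ in range(10_000)]
        with ProcessPoolExecutor() as pool:
            best_error, best_params = min(pool.map(run_model, samples,
                                                   chunksize=250))
        print(f"best params {best_params}, error {best_error:.3f}")

    if __name__ == "__main__":
        main()
    ```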

  3. Exploring Students' Reflective Thinking Practice, Deep Processing Strategies, Effort, and Achievement Goal Orientations

    ERIC Educational Resources Information Center

    Phan, Huy Phuong

    2009-01-01

    Recent research indicates that study processing strategies, effort, reflective thinking practice, and achievement goals are important factors contributing to the prediction of students' academic success. Very few studies have combined these theoretical orientations within one conceptual model. This study tested a conceptual model that included, in…

  4. Predictive Models for Semiconductor Device Design and Processing

    NASA Technical Reports Server (NTRS)

    Meyyappan, Meyya; Arnold, James O. (Technical Monitor)

    1998-01-01

    The device feature size continues to be on a downward trend, with a simultaneous upward trend in wafer size to 300 mm. For this reason, predictive models are needed more than ever before. At NASA Ames, a Device and Process Modeling effort has been initiated recently with a view to addressing these issues. Our activities cover sub-micron device physics, process and equipment modeling, computational chemistry and materials science. This talk will outline these efforts and emphasize the interaction among the various components. The device physics component is largely based on integrating quantum effects into device simulators. We have two parallel efforts, one based on a quantum mechanics approach and the second on a semiclassical hydrodynamics approach with quantum correction terms. Under the first approach, three different quantum simulators are being developed and compared: a nonequilibrium Green's function (NEGF) approach, a Wigner function approach, and a density matrix approach. In this talk, results using the various codes will be presented. Our process modeling work focuses primarily on epitaxy and etching, using first-principles models coupling reactor-level and wafer-level features. For the latter, we are using a novel approach based on Level Set theory. Sample results from this effort will also be presented.

  5. A Strategy for Autogeneration of Space Shuttle Ground Processing Simulation Models for Project Makespan Estimations

    NASA Technical Reports Server (NTRS)

    Madden, Michael G.; Wyrick, Roberta; O'Neill, Dale E.

    2005-01-01

    Space Shuttle Processing is a complicated and highly variable project. The planning and scheduling problem, categorized as a Resource Constrained - Stochastic Project Scheduling Problem (RC-SPSP), has a great deal of variability in the Orbiter Processing Facility (OPF) process flow from one flight to the next. Simulation modeling is a useful tool for estimating the makespan of the overall process. However, simulation requires a model to be developed, which is itself a labor- and time-consuming effort. With such a dynamic process, the model would often be out of synchronization with the actual process, limiting the applicability of the simulation answers in solving the actual estimation problem. Integration of TEAMS model-enabling software with our existing schedule program software is the basis of our solution. This paper explains the approach used to develop an auto-generated simulation model from planning and scheduling efforts and available data.

  6. The politics of participation in watershed modeling.

    PubMed

    Korfmacher, K S

    2001-02-01

    While researchers and decision-makers increasingly recognize the importance of public participation in environmental decision-making, there is less agreement about how to involve the public. One of the most controversial issues is how to involve citizens in producing scientific information. Although this question is relevant to many areas of environmental policy, it has come to the fore in watershed management. Increasingly, the public is becoming involved in the sophisticated computer modeling efforts that have been developed to inform watershed management decisions. These models typically have been treated as technical inputs to the policy process. However, model-building itself involves numerous assumptions, judgments, and decisions that are relevant to the public. This paper examines the politics of public involvement in watershed modeling efforts and proposes five guidelines for good practice for such efforts. Using these guidelines, I analyze four cases in which different approaches to public involvement in the modeling process have been attempted and make recommendations for future efforts to involve communities in watershed modeling. Copyright 2001 Springer-Verlag

  7. TEAMS Model Analyzer

    NASA Technical Reports Server (NTRS)

    Tijidjian, Raffi P.

    2010-01-01

    The TEAMS model analyzer is a supporting tool developed to work with models created with TEAMS (Testability, Engineering, and Maintenance System), which was developed by QSI. In an effort to reduce the time spent in the manual process that each TEAMS modeler must perform in preparing reports for model reviews, a new tool has been developed as an aid for models developed in TEAMS. The software allows for the viewing, reporting, and checking of TEAMS models that are checked into the TEAMS model database. The software allows the user to selectively view the model in a hierarchical tree outline that displays the components, failure modes, and ports. The reporting features allow the user to quickly gather statistics about the model and generate an input/output report covering all of the components. Rules can be automatically validated against the model, with a report generated containing the resulting inconsistencies. In addition to reducing manual effort, this software also provides an automated process framework for the Verification and Validation (V&V) effort that will follow development of these models. The aid of such an automated tool would have a significant impact on the V&V process.

  8. Numerical Modeling of Transport of Biomass Burning Emissions on South America

    NASA Technical Reports Server (NTRS)

    RibeirodeFreitas, Saulo

    2001-01-01

    Our research efforts have addressed theoretical and numerical modeling of source emissions and transport processes of trace gases and aerosols emitted by biomass burning in central Brazil and the Amazon basin. For this effort we coupled an Eulerian transport model with the mesoscale atmospheric model RAMS (Regional Atmospheric Modeling System).

  9. Parent Management Training-Oregon Model (PMTO™) in Mexico City: Integrating Cultural Adaptation Activities in an Implementation Model

    PubMed Central

    Baumann, Ana A.; Domenech Rodríguez, Melanie M.; Amador, Nancy G.; Forgatch, Marion S.; Parra-Cardona, J. Rubén

    2015-01-01

    This article describes the process of cultural adaptation at the start of the implementation of the Parent Management Training intervention-Oregon model (PMTO) in Mexico City. The implementation process was guided by the model, and the cultural adaptation of PMTO was theoretically guided by the cultural adaptation process (CAP) model. During the process of the adaptation, we uncovered the potential for the CAP to be embedded in the implementation process, taking into account broader training and economic challenges and opportunities. We discuss how cultural adaptation and implementation processes are inextricably linked and iterative, how maintaining a collaborative relationship with the treatment developer has guided our work and helped expand our research efforts, and how building human capital to implement PMTO in Mexico supported the implementation efforts of PMTO in other places in the United States. PMID:26052184

  10. Parent Management Training-Oregon Model (PMTO™) in Mexico City: Integrating Cultural Adaptation Activities in an Implementation Model.

    PubMed

    Baumann, Ana A; Domenech Rodríguez, Melanie M; Amador, Nancy G; Forgatch, Marion S; Parra-Cardona, J Rubén

    2014-03-01

    This article describes the process of cultural adaptation at the start of the implementation of the Parent Management Training intervention-Oregon model (PMTO) in Mexico City. The implementation process was guided by the model, and the cultural adaptation of PMTO was theoretically guided by the cultural adaptation process (CAP) model. During the process of the adaptation, we uncovered the potential for the CAP to be embedded in the implementation process, taking into account broader training and economic challenges and opportunities. We discuss how cultural adaptation and implementation processes are inextricably linked and iterative, how maintaining a collaborative relationship with the treatment developer has guided our work and helped expand our research efforts, and how building human capital to implement PMTO in Mexico supported the implementation efforts of PMTO in other places in the United States.

  11. An Example for Integrated Gas Turbine Engine Testing and Analysis Using Modeling and Simulation

    DTIC Science & Technology

    2006-12-01

    ... USAF Academy in a joint test and analysis effort of the F109 turbofan engine. This process uses a swirl investigation as a vehicle to exercise and demonstrate the approach ...

  12. Choosing the Optimum Mix of Duration and Effort in Education.

    ERIC Educational Resources Information Center

    Oosterbeek, Hessel

    1995-01-01

    Employs a simple economic model to analyze determinants of Dutch college students' expected study duration and weekly effort. Findings show that the duration/effort ratio is determined by the relative prices of these inputs into the learning process. A higher socioeconomic status increases the duration/effort ratio. Higher ability levels decrease…

  13. Jupiter Europa Orbiter Architecture Definition Process

    NASA Technical Reports Server (NTRS)

    Rasmussen, Robert; Shishko, Robert

    2011-01-01

    The proposed Jupiter Europa Orbiter mission, planned for launch in 2020, is using a new architectural process and framework tool to drive its model-based systems engineering effort. The process focuses on getting the architecture right before writing requirements and developing a point design. A new architecture framework tool provides for the structured entry and retrieval of architecture artifacts based on an emerging architecture meta-model. This paper describes the relationships among these artifacts and how they are used in the systems engineering effort. Some early lessons learned are discussed.

  14. A conceptual model to empower software requirements conflict detection and resolution with rule-based reasoning

    NASA Astrophysics Data System (ADS)

    Ahmad, Sabrina; Jalil, Intan Ermahani A.; Ahmad, Sharifah Sakinah Syed

    2016-08-01

    It is seldom technical issues that impede the process of eliciting software requirements. The involvement of multiple stakeholders usually leads to conflicts, and therefore conflict detection and resolution effort is crucial. This paper presents an improved conceptual model to assist the conflict detection and resolution effort, extending the ability of current models and improving overall performance. The significance of the new model is that it enables automated detection of conflicts and of their severity levels with rule-based reasoning.
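
    A conflict-detection rule base of the kind the abstract envisions can be sketched as predicates applied to each pair of requirements, with each predicate tagged with a severity level. The rule format, requirement attributes, and severities below are illustrative assumptions, not the paper's model.

    ```python
    # Sketch of rule-based requirements-conflict detection: apply each
    # (predicate, severity) rule to every pair of requirements. Attributes
    # and severity levels are invented for illustration.
    from itertools import combinations

    REQUIREMENTS = {
        "R1": {"resource": "bandwidth", "direction": "maximize"},
        "R2": {"resource": "bandwidth", "direction": "minimize"},
        "R3": {"resource": "latency", "direction": "minimize"},
    }

    def contradictory_goals(a, b):
        """Rule: the same resource is pulled in opposite directions."""
        return a["resource"] == b["resource"] and a["direction"] != b["direction"]

    RULES = [(contradictory_goals, "high")]  # a real rule base would be richer

    def detect_conflicts(reqs):
        conflicts = []
        for (ida, a), (idb, b) in combinations(reqs.items(), 2):
            for rule, severity in RULES:
                if rule(a, b):
                    conflicts.append((ida, idb, rule.__name__, severity))
        return conflicts

    print(detect_conflicts(REQUIREMENTS))
    # -> [('R1', 'R2', 'contradictory_goals', 'high')]
    ```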

  15. Artificial intelligence support for scientific model-building

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.

    1992-01-01

    Scientific model-building can be a time-intensive and painstaking process, often involving the development of large and complex computer programs. Despite the effort involved, scientific models cannot easily be distributed and shared with other scientists. In general, implemented scientific models are complex, idiosyncratic, and difficult for anyone but the original scientific development team to understand. We believe that artificial intelligence techniques can facilitate both the model-building and model-sharing process. In this paper, we overview our effort to build a scientific modeling software tool that aids the scientist in developing and using models. This tool includes an interactive intelligent graphical interface, a high-level domain specific modeling language, a library of physics equations and experimental datasets, and a suite of data display facilities.

  16. Preliminary development of digital signal processing in microwave radiometers

    NASA Technical Reports Server (NTRS)

    Stanley, W. D.

    1980-01-01

    Topics covered involve a number of closely related tasks, including: the development of several control loop and dynamic noise model computer programs for simulating microwave radiometer measurements; computer modeling of an existing stepped-frequency radiometer in an effort to determine its optimum operational characteristics; investigation of the classical second-order analog control loop to determine its ability to reduce the estimation error in a microwave radiometer; investigation of several digital signal processing unit designs; initiation of efforts to develop the hardware and software required to implement the digital signal processing unit; and investigation of the general characteristics and peculiarities of digitally processing noiselike microwave radiometer signals.

  17. Speech Perception as a Cognitive Process: The Interactive Activation Model.

    ERIC Educational Resources Information Center

    Elman, Jeffrey L.; McClelland, James L.

    Research efforts to model speech perception in terms of a processing system in which knowledge and processing are distributed over large numbers of highly interactive--but computationally primitive--elements are described in this report. After discussing the properties of speech that demand a parallel interactive processing system, the report…

  18. Software Engineering Laboratory (SEL) cleanroom process model

    NASA Technical Reports Server (NTRS)

    Green, Scott; Basili, Victor; Godfrey, Sally; Mcgarry, Frank; Pajerski, Rose; Waligora, Sharon

    1991-01-01

    The Software Engineering Laboratory (SEL) cleanroom process model is described. The term 'cleanroom' originates in the integrated circuit (IC) production process, where IC's are assembled in dust free 'clean rooms' to prevent the destructive effects of dust. When applying the clean room methodology to the development of software systems, the primary focus is on software defect prevention rather than defect removal. The model is based on data and analysis from previous cleanroom efforts within the SEL and is tailored to serve as a guideline in applying the methodology to future production software efforts. The phases that are part of the process model life cycle from the delivery of requirements to the start of acceptance testing are described. For each defined phase, a set of specific activities is discussed, and the appropriate data flow is described. Pertinent managerial issues, key similarities and differences between the SEL's cleanroom process model and the standard development approach used on SEL projects, and significant lessons learned from prior cleanroom projects are presented. It is intended that the process model described here will be further tailored as additional SEL cleanroom projects are analyzed.

  19. Towards Systematic Benchmarking of Climate Model Performance

    NASA Astrophysics Data System (ADS)

    Gleckler, P. J.

    2014-12-01

    The process by which climate models are evaluated has evolved substantially over the past decade, with the Coupled Model Intercomparison Project (CMIP) serving as a centralizing activity for coordinating model experimentation and enabling research. Scientists with a broad spectrum of expertise have contributed to the CMIP model evaluation process, resulting in many hundreds of publications that have served as a key resource for the IPCC process. For several reasons, efforts are now underway to further systematize some aspects of the model evaluation process. First, some model evaluation can now be considered routine and should not require "re-inventing the wheel" or a journal publication simply to update results with newer models. Second, the benefit of CMIP research to model development has not been optimal because the publication of results generally takes several years and is usually not reproducible for benchmarking newer model versions. And third, there are now hundreds of model versions and many thousands of simulations, but there is no community-based mechanism for routinely monitoring model performance changes. An important change in the design of CMIP6 can help address these limitations. CMIP6 will include a small set of standardized experiments as an ongoing exercise (CMIP "DECK": ongoing Diagnostic, Evaluation and Characterization of Klima), so that modeling groups can submit them at any time and not be overly constrained by deadlines. In this presentation, efforts to establish routine benchmarking of existing and future CMIP simulations will be described. To date, some benchmarking tools have been made available to all CMIP modeling groups to enable them to readily compare with CMIP5 simulations during the model development process. A natural extension of this effort is to make results from all CMIP simulations widely available, including the results from newer models as soon as the simulations become available for research. Making the results from routine performance tests readily accessible will help advance a more transparent model evaluation process.
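
    Routine benchmarking of the kind described here amounts to computing a fixed set of summary statistics of model output against a reference dataset, identically for every model version. The sketch below computes an area-weighted bias and RMSE on synthetic stand-in fields; it is an illustration of the idea, not the actual CMIP benchmarking tooling.

    ```python
    # Sketch of a routine benchmark statistic: area-weighted global bias and
    # RMSE of a simulated lat x lon field against a reference. Fields here are
    # synthetic stand-ins for model output and observations.
    import numpy as np

    def area_weights(lats_deg):
        """Cosine-of-latitude weights, normalized to sum to 1."""
        w = np.cos(np.deg2rad(lats_deg))
        return w / w.sum()

    def benchmark(model_field, obs_field, lats_deg):
        """Return (bias, rmse) for a lat x lon field."""
        w = area_weights(lats_deg)[:, None]  # broadcast across longitudes
        diff = model_field - obs_field
        n_lon = model_field.shape[1]
        bias = float((w * diff).sum() / n_lon)
        rmse = float(np.sqrt((w * diff**2).sum() / n_lon))
        return bias, rmse

    rng = np.random.default_rng(0)
    lats = np.linspace(-89.5, 89.5, 180)
    obs = rng.normal(288.0, 5.0, size=(180, 360))   # e.g., surface air temp (K)
    model = obs + rng.normal(0.5, 1.0, size=obs.shape)
    print(benchmark(model, obs, lats))              # bias near 0.5 K
    ```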

  20. The ABLe change framework: a conceptual and methodological tool for promoting systems change.

    PubMed

    Foster-Fishman, Pennie G; Watson, Erin R

    2012-06-01

    This paper presents a new approach to the design and implementation of community change efforts like a System of Care. Called the ABLe Change Framework, the model provides simultaneous attention to the content and process of the work, ensuring effective implementation and the pursuit of systems change. Three key strategies are employed in this model to ensure the integration of content and process efforts and effective mobilization of broad scale systems change: Systemic Action Learning Teams, Simple Rules, and Small Wins. In this paper we describe the ABLe Change Framework and present a case study in which we successfully applied this approach to one system of care effort in Michigan.

  1. Additive Manufacturing Modeling and Simulation A Literature Review for Electron Beam Free Form Fabrication

    NASA Technical Reports Server (NTRS)

    Seufzer, William J.

    2014-01-01

    Additive manufacturing is coming into industrial use and has several desirable attributes. Control of the deposition remains a complex challenge, and so this literature review was initiated to capture current modeling efforts in the field of additive manufacturing. This paper summarizes about 10 years of modeling and simulation related to both welding and additive manufacturing. The goals were to learn who is doing what in modeling and simulation, to summarize various approaches taken to create models, and to identify research gaps. Later sections in the report summarize implications for closed-loop control of the process, implications for local research efforts, and implications for local modeling efforts.

  2. Integrative modelling for One Health: pattern, process and participation

    PubMed Central

    Redding, D. W.; Wood, J. L. N.

    2017-01-01

    This paper argues for an integrative modelling approach for understanding zoonoses disease dynamics, combining process, pattern and participatory models. Each type of modelling provides important insights, but all are limited. Combining these in a ‘3P’ approach offers the opportunity for a productive conversation between modelling efforts, contributing to a ‘One Health’ agenda. The aim is not to come up with a composite model, but to seek synergies between perspectives, encouraging cross-disciplinary interactions. We illustrate our argument with cases from Africa, and in particular from our work on Ebola virus and Lassa fever virus. Combining process-based compartmental models with macroecological data offers a spatial perspective on potential disease impacts. However, without insights from the ground, the ‘black box’ of transmission dynamics, so crucial to model assumptions, may not be fully understood. We show how participatory modelling and ethnographic research of Ebola and Lassa fever can reveal social roles, unsafe practices, mobility and movement and temporal changes in livelihoods. Together with longer-term dynamics of change in societies and ecologies, all can be important in explaining disease transmission, and provide important complementary insights to other modelling efforts. An integrative modelling approach can therefore help improve disease control efforts and public health responses. This article is part of the themed issue ‘One Health for a changing world: zoonoses, ecosystems and human well-being’. PMID:28584172

  3. Complex network models reveal correlations among network metrics, exercise intensity and role of body changes in the fatigue process

    PubMed Central

    Pereira, Vanessa Helena; Gama, Maria Carolina Traina; Sousa, Filipe Antônio Barros; Lewis, Theodore Gyle; Gobatto, Claudio Alexandre; Manchado - Gobatto, Fúlvia Barros

    2015-01-01

    The aims of the present study were to analyze the fatigue process at distinct effort intensities and to investigate its occurrence as interactions among distinct body changes during exercise, using complex network models. For this, participants were submitted to four different running intensities until exhaustion, accomplished on a non-motorized treadmill using a tethered system. The intensities were selected according to the critical power model. Mechanical parameters (force, peak power, mean power, velocity and work), physiologically related parameters (heart rate, blood lactate, time until peak blood lactate concentration (lactate time), lean mass, anaerobic and aerobic capacities) and IPAQ score were obtained during exercise and used to construct four complex network models. Such models have both theoretical and mathematical value and enable us to perceive new insights that go beyond conventional analysis. From these, we ranked the influence of each node on the fatigue process. Our results show that nodes, links and network metrics are sensitive to the increase in effort intensity, with velocity a key factor for exercise maintenance at models/intensities 1 and 2 (longer effort times) and force and power at models 3 and 4, highlighting mechanical variables in the occurrence of exhaustion and even in training prescription applications. PMID:25994386

  4. Complex network models reveal correlations among network metrics, exercise intensity and role of body changes in the fatigue process

    NASA Astrophysics Data System (ADS)

    Pereira, Vanessa Helena; Gama, Maria Carolina Traina; Sousa, Filipe Antônio Barros; Lewis, Theodore Gyle; Gobatto, Claudio Alexandre; Manchado-Gobatto, Fúlvia Barros

    2015-05-01

    The aims of the present study were to analyze the fatigue process at distinct effort intensities and to investigate its occurrence as interactions among distinct body changes during exercise, using complex network models. For this, participants were submitted to four different running intensities until exhaustion, accomplished on a non-motorized treadmill using a tethered system. The intensities were selected according to the critical power model. Mechanical parameters (force, peak power, mean power, velocity and work), physiologically related parameters (heart rate, blood lactate, time until peak blood lactate concentration (lactate time), lean mass, anaerobic and aerobic capacities) and IPAQ score were obtained during exercise and used to construct four complex network models. Such models have both theoretical and mathematical value and enable us to perceive new insights that go beyond conventional analysis. From these, we ranked the influence of each node on the fatigue process. Our results show that nodes, links and network metrics are sensitive to the increase in effort intensity, with velocity a key factor for exercise maintenance at models/intensities 1 and 2 (longer effort times) and force and power at models 3 and 4, highlighting mechanical variables in the occurrence of exhaustion and even in training prescription applications.
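
    One common way to build such a network, sketched below on synthetic data, is to link variables whose pairwise correlation across trials exceeds a threshold and then rank each node by its weighted degree. This illustrates the general approach; it is not the authors' pipeline.

    ```python
    # Sketch of a correlation-based network over exercise variables, ranking
    # node influence by weighted degree. Data are synthetic stand-ins for the
    # mechanical and physiological measurements named in the abstract.
    import numpy as np

    VARS = ["force", "peak_power", "mean_power", "velocity",
            "heart_rate", "blood_lactate"]

    rng = np.random.default_rng(1)
    trials = rng.normal(size=(30, len(VARS)))                 # 30 trials x 6 vars
    trials[:, 1] = trials[:, 0] + 0.1 * rng.normal(size=30)   # power tracks force

    corr = np.corrcoef(trials, rowvar=False)                  # 6 x 6 matrix
    edges = (np.abs(corr) > 0.5) & ~np.eye(len(VARS), dtype=bool)

    # Weighted degree: sum of |r| over retained edges, one score per node
    degree = np.where(edges, np.abs(corr), 0.0).sum(axis=1)
    for name, score in sorted(zip(VARS, degree), key=lambda p: -p[1]):
        print(f"{name:13s} {score:.2f}")
    ```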

  5. Pathways to Problem Behaviors: Chaotic Homes, Parent and Child Effortful Control, and Parenting

    ERIC Educational Resources Information Center

    Valiente, Carlos; Lemery-Chalfant, Kathryn; Reiser, Mark

    2007-01-01

    Guided by Belsky's and Eisenberg, Cumberland, and Spinrad's heuristic models, we tested a process model with hypothesized paths from parents' effortful control (EC) and family chaos to indices of parenting to children's EC, and finally children's externalizing problem behavior. Parents reported on all constructs and children (N = 188; M age = 9.55…

  6. Examination of Bond Properties through Infrared Spectroscopy and Molecular Modeling in the General Chemistry Laboratory

    ERIC Educational Resources Information Center

    Csizmar, Clifford M.; Force, Dee Ann; Warner, Don L.

    2012-01-01

    A concerted effort has been made to increase the opportunities for undergraduate students to address scientific problems employing the processes used by practicing chemists. As part of this effort, an infrared (IR) spectroscopy and molecular modeling experiment was developed for the first-year general chemistry laboratory course. In the…

  7. Hindsight Bias Doesn't Always Come Easy: Causal Models, Cognitive Effort, and Creeping Determinism

    ERIC Educational Resources Information Center

    Nestler, Steffen; Blank, Hartmut; von Collani, Gernot

    2008-01-01

    Creeping determinism, a form of hindsight bias, refers to people's hindsight perceptions of events as being determined or inevitable. This article proposes, on the basis of a causal-model theory of creeping determinism, that the underlying processes are effortful, and hence creeping determinism should disappear when individuals lack the cognitive…

  8. Overview of the United States Department of Energy's ARM (Atmospheric Radiation Measurement) Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stokes, G.M.; Tichler, J.L.

    The Department of Energy (DOE) is initiating a major atmospheric research effort, the Atmospheric Radiation Measurement Program (ARM). The program is a key component of DOE's research strategy to address global climate change and is a direct continuation of DOE's decade-long effort to improve the ability of General Circulation Models (GCMs) to provide reliable simulations of regional and long-term climate change in response to increasing greenhouse gases. The effort is multi-disciplinary and multi-agency, involving universities, private research organizations and more than a dozen government laboratories. The objective of the ARM research is to provide an experimental testbed for the study of important atmospheric effects, particularly cloud and radiative processes, and to test parameterizations of these processes for use in atmospheric models. This effort will support the continued and rapid improvement of GCM predictive capability. 2 refs.

  9. Establishing a Framework for Community Modeling in Hydrologic Science: Recommendations from the CUAHSI CHyMP Initiative

    NASA Astrophysics Data System (ADS)

    Arrigo, J. S.; Famiglietti, J. S.; Murdoch, L. C.; Lakshmi, V.; Hooper, R. P.

    2012-12-01

    The Consortium of Universities for the Advancement of Hydrologic Science, Inc. (CUAHSI) continues a major effort towards supporting Community Hydrologic Modeling. From 2009 - 2011, the Community Hydrologic Modeling Platform (CHyMP) initiative held three workshops, the ultimate goal of which was to produce recommendations and an implementation plan to establish a community modeling program that enables comprehensive simulation of water anywhere on the North American continent. Such an effort would include connections to and advances in global climate models, biogeochemistry, and efforts of other disciplines that require an understanding of water patterns and processes in the environment. To achieve such a vision will require substantial investment in human and cyber-infrastructure and significant advances in the science of hydrologic modeling and spatial scaling. CHyMP concluded with a final workshop, held March 2011, and produced several recommendations. CUAHSI and the university community continue to advance community modeling and implement these recommendations through several related and follow-on efforts. Key results from the final 2011 workshop included agreement among participants that the community is ready to move forward with implementation. It is recognized that initial implementation of this larger effort can begin with simulation capabilities that currently exist, or that can be easily developed. CHyMP identified four key activities in support of community modeling: benchmarking, dataset evaluation and development, platform evaluation, and developing a national water model framework. Key findings included: 1) The community supported the idea of a National Water Model framework; a community effort is needed to explore what the ultimate implementation of a National Water Model is. A true community modeling effort would support the modeling of "water anywhere" and would include all relevant scales and processes. 2) Implementation of a community modeling program could initially focus on continental-scale modeling of water quantity (rather than quality). The goal of this initial model is the comprehensive description of water stores and fluxes in such a way as to permit linkage to GCMs, biogeochemical, ecological, and geomorphic models. This continental-scale focus allows systematic evaluation of our current state of knowledge and data, leverages existing efforts done by large-scale modelers, contributes to scientific discovery that informs globally and societally relevant questions, and provides an initial framework to evaluate hydrologic information relevant to other disciplines and a structure into which to incorporate other classes of hydrologic models. 3) Dataset development will be a key aspect of any successful national water model implementation. Our current knowledge of the subsurface is limiting our ability to truly integrate soil and groundwater into large-scale models, and to answer critical science questions with societal relevance (i.e., groundwater's influence on climate). 4) The CHyMP workshops and efforts to date have achieved collaboration between university scientists, government agencies and the private sector that must be maintained. Follow-on efforts in community modeling should aim at leveraging and maintaining this collaboration for maximum scientific and societal benefit.

  10. Examination of a Process Model of Adolescent Smoking Self-Change Efforts in Relation to Gender

    ERIC Educational Resources Information Center

    MacPherson, Laura; Myers, Mark G.

    2010-01-01

    Little information describes how adolescents change their smoking behavior. This study investigated the role of gender in the relationship of motivation and cognitive variables with adolescent smoking self-change efforts. Self-report and semi-structured interview data from a prospective study of smoking self-change efforts were examined among 98…

  11. High level cognitive information processing in neural networks

    NASA Technical Reports Server (NTRS)

    Barnden, John A.; Fields, Christopher A.

    1992-01-01

    Two related research efforts were addressed: (1) high-level connectionist cognitive modeling; and (2) local neural circuit modeling. The goals of the first effort were to develop connectionist models of high-level cognitive processes such as problem solving or natural language understanding, and to understand the computational requirements of such models. The goals of the second effort were to develop biologically realistic models of local neural circuits, and to understand the computational behavior of such models. In keeping with the nature of NASA's Innovative Research Program, all the work conducted under the grant was highly innovative. For instance, the following ideas, all summarized here, are contributions to the study of connectionist/neural networks: (1) the temporal-winner-take-all, relative-position encoding, and pattern-similarity association techniques; (2) the importation of logical combinators into connectionism; (3) the use of analogy-based reasoning as a bridge across the gap between the traditional symbolic paradigm and the connectionist paradigm; and (4) the application of connectionism to the domain of belief representation/reasoning. The work on local neural circuit modeling also departs significantly from the work of related researchers. In particular, its concentration on low-level neural phenomena that could support high-level cognitive processing is unusual within the area of biological local circuit modeling, and also serves to expand the horizons of the artificial neural net field.

  12. Potential Natural Vegetation of the Mississippi Alluvial Valley: Boeuf-Tensas Basin, Arkansas, Field Atlas

    DTIC Science & Technology

    2012-09-01

    ... under the auspices of federal and state research programs or in conjunction with Corps of Engineers project planning efforts. In the process, a ... in the field effort and assembled and processed the original project GIS data. Malcolm Williamson (Center for Advanced Spatial Technologies) ... further improve drainage. ... Using the PNV map as a model for restoration: The PNV mapping process was conceived as a way to ...

  13. Goal striving, goal attainment, and well-being: adapting and testing the self-concordance model in sport.

    PubMed

    Smith, Alison; Ntoumanis, Nikos; Duda, Joan

    2007-12-01

    Grounded in self-determination theory (Deci & Ryan, 1985) and the self-concordance model (Sheldon & Elliot, 1999), this study examined the motivational processes underlying goal striving in sport as well as the role of perceived coach autonomy support in the goal process. Structural equation modeling with a sample of 210 British athletes showed that autonomous goal motives positively predicted effort, which, in turn, predicted goal attainment. Goal attainment was positively linked to need satisfaction, which, in turn, predicted psychological well-being. Effort and need satisfaction were found to mediate the associations between autonomous motives and goal attainment and between attainment and well-being, respectively. Controlled motives negatively predicted well-being, and coach autonomy support positively predicted both autonomous motives and need satisfaction. Associations of autonomous motives with effort were not reducible to goal difficulty, goal specificity, or goal efficacy. These findings support the self-concordance model as a framework for further research on goal setting in sport.

  14. Java PathFinder: A Translator From Java to Promela

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus

    1999-01-01

    JAVA PATHFINDER, JPF, is a prototype translator from JAVA to PROMELA, the modeling language of the SPIN model checker. JPF is a product of a major effort by the Automated Software Engineering group at NASA Ames to make model checking technology part of the software process. Experience has shown that severe bugs can be found in final code using this technique, and that automated translation from a programming language to a modeling language like PROMELA can help reduce the effort required.

  15. Experiences Using Formal Methods for Requirements Modeling

    NASA Technical Reports Server (NTRS)

    Easterbrook, Steve; Lutz, Robyn; Covington, Rick; Kelly, John; Ampo, Yoko; Hamilton, David

    1996-01-01

    This paper describes three case studies in the lightweight application of formal methods to requirements modeling for spacecraft fault protection systems. The case studies differ from previously reported applications of formal methods in that formal methods were applied very early in the requirements engineering process, to validate the evolving requirements. The results were fed back into the projects, to improve the informal specifications. For each case study, we describe what methods were applied, how they were applied, how much effort was involved, and what the findings were. In all three cases, the formal modeling provided a cost effective enhancement of the existing verification and validation processes. We conclude that the benefits gained from early modeling of unstable requirements more than outweigh the effort needed to maintain multiple representations.

  16. History of research on modelling gypsy moth population ecology

    Treesearch

    J. J. Colbert

    1991-01-01

    History of research to develop models of gypsy moth population dynamics and some related studies is described. Empirical regression-based models are reviewed, and then the more comprehensive process models are discussed. Current model-related research efforts are introduced.

  17. Countering Extremism: An Understanding of the Problem, the Process and Some Solutions

    DTIC Science & Technology

    2015-06-01

    Radicalization into Terrorism.” 20 extremist movement “could not be developed , evolved or sustained over time and place” without the support of...efforts to research, develop , and implement a variety of methods and models to prevent and counter violent extremism (CVE). Despite these efforts...but how to better understand the radicalization process in order to develop effective strategies to prevent radicalization from occurring. Faced

  18. Animated-simulation modeling facilitates clinical-process costing.

    PubMed

    Zelman, W N; Glick, N D; Blackmore, C C

    2001-09-01

    Traditionally, the finance department has assumed responsibility for assessing process costs in healthcare organizations. To enhance process-improvement efforts, however, many healthcare providers need to include clinical staff in process cost analysis. Although clinical staff often use electronic spreadsheets to model the cost of specific processes, PC-based animated-simulation tools offer two major advantages over spreadsheets: they allow clinicians to interact more easily with the costing model so that it more closely represents the process being modeled, and they represent cost output as a cost range rather than as a single cost estimate, thereby providing more useful information for decision making.
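
    The key contrast the authors draw, a cost range rather than a single point estimate, is exactly what a stochastic simulation produces: sample uncertain step durations many times and report percentiles. The process steps, distributions, and rates in this sketch are invented for illustration.

    ```python
    # Sketch of simulation-based process costing: Monte Carlo over uncertain
    # step durations yields a cost range instead of a single spreadsheet-style
    # estimate. Steps, distributions, and rates are invented.
    import random
    import statistics

    STEPS = [  # (name, mean minutes, std dev, cost per minute)
        ("triage", 10, 3, 2.0),
        ("imaging", 25, 8, 4.5),
        ("review", 15, 5, 3.0),
    ]

    def one_visit_cost():
        total = 0.0
        for _name, mean, sd, rate in STEPS:
            minutes = max(0.0, random.gauss(mean, sd))  # truncate at zero
            total += minutes * rate
        return total

    random.seed(0)
    costs = sorted(one_visit_cost() for _ in range(10_000))
    lo, hi = costs[int(0.05 * len(costs))], costs[int(0.95 * len(costs))]
    print(f"median {statistics.median(costs):.0f}, 90% range {lo:.0f}-{hi:.0f}")
    ```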

  19. Mathematical modeling and SAR simulation multifunction SAR technology efforts

    NASA Technical Reports Server (NTRS)

    Griffin, C. R.; Estes, J. M.

    1981-01-01

    The orbital SAR (synthetic aperture radar) simulation data was used in several simulation efforts directed toward advanced SAR development. Efforts toward simulating an operational radar, simulation of antenna polarization effects, and simulation of SAR images at several different wavelengths are discussed. Avenues for improvements in the orbital SAR simulation and its application to the development of advanced digital radar data processing schemes are indicated.

  20. Implicit cognitive aggression among young male prisoners: Association with dispositional and current aggression.

    PubMed

    Ireland, Jane L; Adams, Christine

    2015-01-01

    The current study explores associations between implicit and explicit aggression in young adult male prisoners, seeking to apply the Reflection-Impulsive Model and indicate parity with elements of the General Aggression Model and social cognition. Implicit cognitive aggressive processing is not an area that has been examined among prisoners. Two hundred and sixty-two prisoners completed an implicit cognitive aggression measure (Puzzle Test) and explicit aggression measures, covering current behaviour (DIPC-R) and aggression disposition (AQ). It was predicted that dispositional aggression would be predicted by implicit cognitive aggression, and that implicit cognitive aggression would predict current engagement in aggressive behaviour. It was also predicted that more impulsive implicit cognitive processing would associate with aggressive behaviour whereas cognitively effortful implicit cognitive processing would not. Implicit aggressive cognitive processing was associated with increased dispositional aggression but not current reports of aggressive behaviour. Impulsive implicit cognitive processing of an aggressive nature predicted increased dispositional aggression whereas more cognitively effortful implicit cognitive aggression did not. The article concludes by outlining the importance of accounting for implicit cognitive processing among prisoners and the need to separate such processing into facets (i.e. impulsive vs. cognitively effortful). Implications for future research and practice in this novel area of study are indicated. Copyright © 2015 Elsevier Ltd. All rights reserved.

  1. Understanding and Predicting the Process of Software Maintenance Releases

    NASA Technical Reports Server (NTRS)

    Basili, Victor; Briand, Lionel; Condon, Steven; Kim, Yong-Mi; Melo, Walcelio L.; Valett, Jon D.

    1996-01-01

    One of the major concerns of any maintenance organization is to understand and estimate the cost of maintenance releases of software systems. Planning the next release so as to maximize the increase in functionality and the improvement in quality is vital to successful maintenance management. The objective of this paper is to present the results of a case study in which an incremental approach was used to better understand the effort distribution of releases and build a predictive effort model for software maintenance releases. This study was conducted in the Flight Dynamics Division (FDD) of NASA Goddard Space Flight Center (GSFC). This paper presents three main results: 1) a predictive effort model developed for the FDD's software maintenance release process; 2) measurement-based lessons learned about the maintenance process in the FDD; and 3) a set of lessons learned about the establishment of a measurement-based software maintenance improvement program. In addition, this study provides insights and guidelines for obtaining similar results in other maintenance organizations.
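
    A release-level predictive effort model of the kind reported here can be as simple as a least-squares fit from planned release content to hours. The sketch below uses invented release data and a generic model form; it is not the FDD's measurements or their actual model.

    ```python
    # Sketch of a predictive effort model for maintenance releases: ordinary
    # least squares from release characteristics to total effort hours.
    # The past-release data are invented for illustration.
    import numpy as np

    # columns: changed SLOC, number of change requests, effort hours
    past = np.array([
        [1200.0,  8.0,  410.0],
        [3400.0, 21.0, 1050.0],
        [ 800.0,  5.0,  300.0],
        [2600.0, 14.0,  820.0],
        [1900.0, 11.0,  640.0],
    ])
    X = np.column_stack([np.ones(len(past)), past[:, 0], past[:, 1]])
    y = past[:, 2]

    coef, *_ = np.linalg.lstsq(X, y, rcond=None)  # intercept, per-SLOC, per-CR
    planned = np.array([1.0, 2000.0, 12.0])       # next release, same ordering
    print(f"predicted effort: {planned @ coef:.0f} hours")
    ```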

  2. Business process modeling for the Virginia Department of Transportation : a demonstration with the integrated six-year improvement program and the statewide transportation improvement program.

    DOT National Transportation Integrated Search

    2005-01-01

    This effort demonstrates business process modeling to describe the integration of particular planning and programming activities of a state highway agency. The motivations to document planning and programming activities are that: (i) resources for co...

  3. Model Checking Verification and Validation at JPL and the NASA Fairmont IV and V Facility

    NASA Technical Reports Server (NTRS)

    Schneider, Frank; Easterbrook, Steve; Callahan, Jack; Montgomery, Todd

    1999-01-01

    We show how a technology transfer effort was carried out. The successful use of model checking on a pilot JPL flight project demonstrates the usefulness and the efficacy of the approach. The pilot project was used to model a complex spacecraft controller. Software design and implementation validation were carried out successfully. To suggest future applications we also show how the implementation validation step can be automated. The effort was followed by the formal introduction of the modeling technique as a part of the JPL Quality Assurance process.

  4. Hindsight bias doesn't always come easy: causal models, cognitive effort, and creeping determinism.

    PubMed

    Nestler, Steffen; Blank, Hartmut; von Collani, Gernot

    2008-09-01

    Creeping determinism, a form of hindsight bias, refers to people's hindsight perceptions of events as being determined or inevitable. This article proposes, on the basis of a causal-model theory of creeping determinism, that the underlying processes are effortful, and hence creeping determinism should disappear when individuals lack the cognitive resources to make sense of an outcome. In Experiments 1 and 2, participants were asked to read a scenario while they were under either low or high processing load. Participants who had the cognitive resources to make sense of the outcome perceived it as more probable and necessary than did participants under high processing load or participants who did not receive outcome information. Experiment 3 was designed to separate 2 postulated subprocesses and showed that the attenuating effect of processing load on hindsight bias is not due to a disruption of the retrieval of potential causal antecedents but to a disruption of their evaluation. Together the 3 experiments show that the processes underlying creeping determinism are effortful, and they highlight the crucial role of causal reasoning in the perception of past events. (c) 2008 APA, all rights reserved.

  5. Model for Simulating a Spiral Software-Development Process

    NASA Technical Reports Server (NTRS)

    Mizell, Carolyn; Curley, Charles; Nayak, Umanath

    2010-01-01

    A discrete-event simulation model, and a computer program that implements the model, have been developed as means of analyzing a spiral software-development process. This model can be tailored to specific development environments for use by software project managers in making quantitative cases for deciding among different software-development processes, courses of action, and cost estimates. A spiral process can be contrasted with a waterfall process, which is a traditional process that consists of a sequence of activities that include analysis of requirements, design, coding, testing, and support. A spiral process is an iterative process that can be regarded as a repeating modified waterfall process. Each iteration includes assessment of risk, analysis of requirements, design, coding, testing, delivery, and evaluation. A key difference between a spiral and a waterfall process is that a spiral process can accommodate changes in requirements at each iteration, whereas in a waterfall process, requirements are considered to be fixed from the beginning and, therefore, a waterfall process is not flexible enough for some projects, especially those in which requirements are not known at the beginning or may change during development. For a given project, a spiral process may cost more and take more time than does a waterfall process, but may better satisfy a customer's expectations and needs. Models for simulating various waterfall processes have been developed previously, but until now, there have been no models for simulating spiral processes. The present spiral-process-simulating model and the software that implements it were developed by extending a discrete-event simulation process model of the IEEE 12207 Software Development Process, which was built using commercially available software known as the Process Analysis Tradeoff Tool (PATT). Typical inputs to PATT models include industry-average values of product size (expressed as number of lines of code), productivity (number of lines of code per hour), and number of defects per source line of code. The user provides the number of resources, the overall percent of effort that should be allocated to each process step, and the number of desired staff members for each step. The output of PATT includes the size of the product, a measure of effort, a measure of rework effort, the duration of the entire process, and the numbers of injected, detected, and corrected defects as well as a number of other interesting features. In the development of the present model, steps were added to the IEEE 12207 waterfall process, and this model and its implementing software were made to run repeatedly through the sequence of steps, each repetition representing an iteration in a spiral process. Because the IEEE 12207 model is founded on a waterfall paradigm, it enables direct comparison of spiral and waterfall processes. The model can be used throughout a software-development project to analyze the project as more information becomes available. For instance, data from early iterations can be used as inputs to the model, and the model can be used to estimate the time and cost of carrying the project to completion.
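
    Since a spiral process is described here as a repeating modified waterfall, its cost behavior can be sketched as a loop over iterations in which requirement churn adds work that a single-pass waterfall would not absorb. All rates below are arbitrary illustrations, not the PATT model's industry-average inputs.

    ```python
    # Toy sketch of spiral vs. waterfall effort: each spiral iteration
    # re-plans, absorbing requirement changes, then builds and reworks a slice
    # of the product. All rates are arbitrary, not PATT's inputs.
    import random

    LINES_PER_HOUR = 20           # productivity
    DEFECTS_PER_KLOC = 8          # injected defects per 1,000 lines
    REWORK_HOURS_PER_DEFECT = 2

    def simulate(total_lines, iterations, change_rate, seed=0):
        random.seed(seed)
        effort = 0.0
        slice_lines = total_lines / iterations
        for _ in range(iterations):
            churn = slice_lines * random.uniform(0, change_rate)
            lines = slice_lines + churn                  # requirement churn
            effort += lines / LINES_PER_HOUR             # build effort
            defects = lines / 1000 * DEFECTS_PER_KLOC
            effort += defects * REWORK_HOURS_PER_DEFECT  # rework effort
        return effort

    print(f"spiral:    {simulate(50_000, iterations=5, change_rate=0.15):,.0f} h")
    print(f"waterfall: {simulate(50_000, iterations=1, change_rate=0.0):,.0f} h")
    ```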

  6. We favor formal models of heuristics rather than lists of loose dichotomies: a reply to Evans and Over

    PubMed Central

    Gigerenzer, Gerd

    2009-01-01

    In their comment on Marewski et al. (good judgments do not require complex cognition, 2009), Evans and Over (heuristic thinking and human intelligence: a commentary on Marewski, Gaissmaier and Gigerenzer, 2009) conjectured that heuristics can often lead to biases and are not error free. This is a most surprising critique. The computational models of heuristics we have tested allow for quantitative predictions of how many errors a given heuristic will make, and we and others have measured the amount of error by analysis, computer simulation, and experiment. This is clear progress over simply giving heuristics labels, such as availability, that do not allow for quantitative comparisons of errors. Evans and Over argue that the reason people rely on heuristics is the accuracy-effort trade-off. However, the comparison between heuristics and more effortful strategies, such as multiple regression, has shown that there are many situations in which a heuristic is more accurate with less effort. Finally, we do not see how the fast and frugal heuristics program could benefit from a dual-process framework unless the dual-process framework is made more precise. Instead, the dual-process framework could benefit if its two “black boxes” (Type 1 and Type 2 processes) were substituted by computational models of both heuristics and other processes. PMID:19784854
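
    Take-the-best is one of the formal heuristic models the fast-and-frugal program studies: check binary cues in order of validity and decide on the first cue that discriminates. The toy objects and cue ordering below are invented, simply to show why such a model permits exact error counts.

    ```python
    # Toy sketch of the take-the-best heuristic: inspect cues in order of
    # validity and choose on the first one that discriminates; guess otherwise.
    # Objects and cue ordering are invented for illustration.
    import random

    CUES = ["has_airport", "has_university", "is_capital"]  # validity order

    def take_the_best(a, b):
        """Return the object predicted 'larger' on the criterion."""
        for cue in CUES:
            if a[cue] != b[cue]:
                return a if a[cue] else b
        return random.choice([a, b])  # no cue discriminates: guess

    city_x = {"name": "X", "population": 900_000,
              "has_airport": 1, "has_university": 1, "is_capital": 0}
    city_y = {"name": "Y", "population": 200_000,
              "has_airport": 0, "has_university": 1, "is_capital": 0}

    random.seed(0)
    pick = take_the_best(city_x, city_y)  # decides on the first cue
    correct = pick["population"] == max(city_x["population"],
                                        city_y["population"])
    print(pick["name"], correct)          # errors are countable, run by run
    ```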

  7. Fecal indicator organism modeling and microbial source tracking in environmental waters: Chapter 3.4.6

    USGS Publications Warehouse

    Nevers, Meredith; Byappanahalli, Muruleedhara; Phanikumar, Mantha S.; Whitman, Richard L.

    2016-01-01

    Mathematical models have been widely applied to surface waters to estimate rates of settling, resuspension, flow, dispersion, and advection in order to calculate movement of particles that influence water quality. Of particular interest are the movement, survival, and persistence of microbial pathogens or their surrogates, which may contaminate recreational water, drinking water, or shellfish. Most models devoted to microbial water quality have been focused on fecal indicator organisms (FIO), which act as a surrogate for pathogens and viruses. Process-based modeling and statistical modeling have been used to track contamination events to source and to predict future events. The use of these two types of models requires different levels of expertise and input; process-based models rely on theoretical physical constructs to explain present conditions and biological distribution, while data-based, statistical models use extant paired data to do the same. The selection of the appropriate model and interpretation of results is critical to proper use of these tools in microbial source tracking. Integration of the modeling approaches could provide insight for tracking and predicting contamination events in real time. A review of modeling efforts reveals that process-based modeling has great promise for microbial source tracking efforts; further, combining the understanding of physical processes influencing FIO contamination developed with process-based models and molecular characterization of the population by gene-based (i.e., biological) or chemical markers may be an effective approach for locating sources and remediating contamination in order to protect human health better.
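
    The simplest process-based building block behind such models, sketched below, is one-dimensional advection of an indicator plume with first-order die-off. The numerical scheme and all parameter values are illustrative assumptions, not taken from the chapter's cited models.

    ```python
    # Sketch of a process-based FIO building block: 1-D advection with
    # first-order die-off, dC/dt + u dC/dx = -k C, on an explicit upwind grid.
    # Parameters are illustrative only.
    import numpy as np

    def advect_decay(c0, u=0.2, k=1e-4, dx=10.0, dt=20.0, steps=300):
        """March an initial concentration profile forward in time."""
        c = c0.astype(float).copy()
        assert u * dt / dx <= 1.0, "CFL condition for explicit upwind scheme"
        for _ in range(steps):
            c[1:] -= u * dt / dx * (c[1:] - c[:-1])  # upwind advection
            c *= np.exp(-k * dt)                     # first-order die-off
        return c

    profile = np.zeros(200)
    profile[1:11] = 1000.0        # e.g., E. coli near an outfall (CFU/100 mL)
    final = advect_decay(profile)
    print(f"plume peak now at cell {final.argmax()}, value {final.max():.0f}")
    ```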

  8. New Particle Formation & Growth in CMAQ-NPF: Application of Comprehensive Model Methods to Observations during CalNex & CARES

    EPA Science Inventory

    The formation and growth of new atmospheric ultrafine particles are exceedingly complex processes, and recent scientific efforts have tremendously advanced our understanding of them. This presentation describes the effort to apply this new knowledge to the CMAQ chemical transport mod...

  9. Integrated modeling of protein-coding genes in the Manduca sexta genome using RNA-Seq data from the biochemical model insect

    PubMed Central

    Cao, Xiaolong; Jiang, Haobo

    2015-01-01

    The genome sequence of Manduca sexta was recently determined using 454 technology. Cufflinks and MAKER2 were used to establish gene models in the genome assembly based on the RNA-Seq data and other species' sequences. Aided by the extensive RNA-Seq data from 50 tissue samples at various life stages, annotators around the world (including the present authors) have manually confirmed and improved a small percentage of the models after spending months of effort. While such collaborative efforts are highly commendable, many of the predicted genes still have problems which may hamper future research on this insect species. As a biochemical model representing lepidopteran pests, M. sexta has been used extensively to study insect physiological processes for over five decades. In this work, we assembled Manduca datasets Cufflinks 3.0, Trinity 4.0, and Oases 4.0 to assist the manual annotation efforts and the development of Official Gene Set (OGS) 2.0. To further improve annotation quality, we developed methods to evaluate gene models in the MAKER2, Cufflinks, Oases and Trinity assemblies and selected the best ones to constitute MCOT 1.0 after thorough crosschecking. MCOT 1.0 has 18,089 genes encoding 31,666 proteins: 32.8% match OGS 2.0 models perfectly or near perfectly, 11,747 differ considerably, and 29.5% are absent in OGS 2.0. Future automation of this process is anticipated to greatly reduce human effort in generating comprehensive, reliable models of structural genes in other genome projects where extensive RNA-Seq data are available. PMID:25612938

  10. Psychological and neural mechanisms associated with effort-related cardiovascular reactivity and cognitive control: An integrative approach.

    PubMed

    Silvestrini, Nicolas

    2017-09-01

    Numerous studies have assessed cardiovascular (CV) reactivity as a measure of effort mobilization during cognitive tasks. However, psychological and neural processes underlying effort-related CV reactivity are still relatively unclear. Previous research reliably found that CV reactivity during cognitive tasks is mainly determined by one region of the brain, the dorsal anterior cingulate cortex (dACC), and that this region is systematically engaged during cognitively demanding tasks. The present integrative approach builds on the research on cognitive control and its brain correlates that shows that dACC function can be related to conflict monitoring and integration of information related to task difficulty and success importance, two key variables in determining effort mobilization. In contrast, evidence also indicates that executive cognitive functioning is processed in more lateral regions of the prefrontal cortex. The resulting model suggests that, when automatic cognitive processes are insufficient to sustain behavior, the dACC determines the amount of required and justified effort according to task difficulty and success importance, which leads to proportional adjustments in CV reactivity and executive cognitive functioning. These propositions are discussed in relation to previous findings on effort-related CV reactivity and cognitive performance, new predictions for future studies, and relevance for other self-regulatory processes. Copyright © 2016 Elsevier B.V. All rights reserved.
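
    As a toy illustration of the model's central claim (effort tracks difficulty only while success is possible and the required effort is justified), consider the following sketch; the decision rule and thresholds are our simplification, not the paper's formal model.

        def predicted_effort(difficulty, success_importance, capacity=10.0):
            """Toy motivational-intensity rule: effort is proportional to
            difficulty unless the task is impossible or not worth it."""
            justified_maximum = success_importance   # stakes cap justified effort
            if difficulty > capacity or difficulty > justified_maximum:
                return 0.0                           # disengagement
            return difficulty

        # Raising success importance sustains effort on harder tasks.
        print(predicted_effort(difficulty=6.0, success_importance=4.0))  # 0.0
        print(predicted_effort(difficulty=6.0, success_importance=8.0))  # 6.0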

  11. Modeling and Evaluating Pilot Performance in NextGen: Review of and Recommendations Regarding Pilot Modeling Efforts, Architectures, and Validation Studies

    NASA Technical Reports Server (NTRS)

    Wickens, Christopher; Sebok, Angelia; Keller, John; Peters, Steve; Small, Ronald; Hutchins, Shaun; Algarin, Liana; Gore, Brian Francis; Hooey, Becky Lee; Foyle, David C.

    2013-01-01

    NextGen operations are associated with a variety of changes to the national airspace system (NAS) including changes to the allocation of roles and responsibilities among operators and automation, the use of new technologies and automation, additional information presented on the flight deck, and the entire concept of operations (ConOps). In the transition to NextGen airspace, aviation and air operations designers need to consider the implications of design or system changes on human performance and the potential for error. To ensure continued safety of the NAS, it will be necessary for researchers to evaluate design concepts and potential NextGen scenarios well before implementation. One approach for such evaluations is through human performance modeling. Human performance models (HPMs) provide effective tools for predicting and evaluating operator performance in systems. HPMs offer significant advantages over empirical, human-in-the-loop testing in that (1) they allow detailed analyses of systems that have not yet been built, (2) they offer great flexibility for extensive data collection, and (3) they do not require experimental participants, thus offering cost and time savings. HPMs differ in their ability to predict performance and safety with NextGen procedures, equipment and ConOps. Models also vary in terms of how they approach human performance (e.g., some focus on cognitive processing, others focus on discrete tasks performed by a human, while others consider perceptual processes), and in terms of their associated validation efforts. The objectives of this research effort were to support the Federal Aviation Administration (FAA) in identifying HPMs that are appropriate for predicting pilot performance in NextGen operations, to provide guidance on how to evaluate the quality of different models, and to identify gaps in pilot performance modeling research that could guide future research opportunities. This research effort is intended to help the FAA evaluate pilot modeling efforts and select appropriate tools for predicting pilot performance in NextGen operations.

  12. Perceived distributed effort in team ball sports.

    PubMed

    Beniscelli, Violeta; Tenenbaum, Gershon; Schinke, Robert Joel; Torregrosa, Miquel

    2014-01-01

    In this study, we explored the multifaceted concept of perceived mental and physical effort in team sport contexts where athletes must invest individual and shared efforts to reach a common goal. Semi-structured interviews were conducted with a convenience sample of 15 Catalan professional coaches (3 women and 12 men, 3 each from the following sports: volleyball, basketball, handball, soccer, and water polo) to gain their views of three perceived effort-related dimensions: physical, psychological, and tactical. From a theoretical thematic analysis, it was found that the perception of effort is closely related to how effort is distributed within the team. Moreover, coaches viewed physical effort in relation to the frequency and intensity of the players' involvement in the game. They identified psychological effort in situations where players pay attention to proper cues, and manage emotions under difficult circumstances. Tactical effort addressed the decision-making process of players and how they fulfilled their roles while taking into account the actions of their teammates and opponents. Based on these findings, a model of perceived distributed effort was developed, which delineates the elements that compose each of the aforementioned dimensions. Implications of perceived distributed effort in team coordination and shared mental models are discussed.

  13. Model-based verification and validation of the SMAP uplink processes

    NASA Astrophysics Data System (ADS)

    Khan, M. O.; Dubos, G. F.; Tirona, J.; Standley, S.

    Model-Based Systems Engineering (MBSE) is being used increasingly within the spacecraft design community because of its benefits when compared to document-based approaches. As the complexity of projects expands dramatically with continually increasing computational power and technology infusion, the time and effort needed for verification and validation (V&V) increases geometrically. Using simulation to perform design validation with system-level models earlier in the life cycle stands to bridge the gap between design of the system (based on system-level requirements) and verifying those requirements/validating the system as a whole. This case study stands as an example of how a project can validate a system-level design earlier in the project life cycle than traditional V&V processes by using simulation on a system model. Specifically, this paper describes how simulation was added to a system model of the Soil Moisture Active-Passive (SMAP) mission's uplink process. Also discussed are the advantages and disadvantages of the methods employed and the lessons learned, which are intended to benefit future model-based and simulation-based development efforts.
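
    To give a flavor of what adding simulation to a system model can look like, here is a minimal Monte Carlo sketch of a generic uplink timeline; the step names, durations, and the 9-hour latency requirement are invented and are not drawn from the actual SMAP design.

        import random

        # Hypothetical uplink steps with nominal durations in hours.
        STEPS = [("build command products", 2.0),
                 ("review and approve", 4.0),
                 ("radiate to spacecraft", 0.5),
                 ("verify on-board receipt", 1.0)]

        def simulate_uplink(rng, jitter=0.25):
            """One end-to-end pass with uniform duration uncertainty."""
            return sum(rng.uniform((1 - jitter) * d, (1 + jitter) * d)
                       for _, d in STEPS)

        rng = random.Random(42)
        runs = [simulate_uplink(rng) for _ in range(10_000)]
        p_meet = sum(t <= 9.0 for t in runs) / len(runs)  # toy latency requirement
        print(f"P(latency <= 9 h) = {p_meet:.3f}")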

  14. Wave-Sediment Interaction in Muddy Environments: A Field Experiment

    DTIC Science & Technology

    2007-01-01

    ...in Years 1 and 2 (2007-2008) and a data analysis and modeling effort in Year 3 (2009). ... "A System for Monitoring Wave-Sediment Interaction in..." ... project was to conduct a pilot field experiment to test instrumentation and data analysis procedures for the major field experiment effort scheduled in... (Chou et al., 1993; Foda et al., 1993). With the exception of liquefaction processes, these models assume a single, well-defined mud phase...

  15. Business process modeling for the Virginia Department of Transportation : a demonstration with the integrated six-year improvement program and the statewide transportation improvement program : executive summary.

    DOT National Transportation Integrated Search

    2005-01-01

    This effort demonstrates business process modeling to describe the integration of particular planning and programming activities of a state highway agency. The motivations to document planning and programming activities are that: (i) resources for co...

  16. Construction and Updating of Event Models in Auditory Event Processing

    ERIC Educational Resources Information Center

    Huff, Markus; Maurer, Annika E.; Brich, Irina; Pagenkopf, Anne; Wickelmaier, Florian; Papenmeier, Frank

    2018-01-01

    Humans segment the continuous stream of sensory information into distinct events at points of change. Between 2 events, humans perceive an event boundary. Present theories propose that changes in the sensory information trigger updating processes of the current event model. Increased encoding effort finally leads to a memory benefit at event…

  17. Model Transformation for a System of Systems Dependability Safety Case

    NASA Technical Reports Server (NTRS)

    Murphy, Judy; Driskell, Steve

    2011-01-01

    The presentation reviews the dependability and safety effort of NASA's Independent Verification and Validation Facility. Topics include: safety engineering process, applications to non-space environment, Phase I overview, process creation, sample SRM artifact, Phase I end result, Phase II model transformation, fault management, and applying Phase II to individual projects.

  18. Multi-Hypothesis Modelling Capabilities for Robust Data-Model Integration

    NASA Astrophysics Data System (ADS)

    Walker, A. P.; De Kauwe, M. G.; Lu, D.; Medlyn, B.; Norby, R. J.; Ricciuto, D. M.; Rogers, A.; Serbin, S.; Weston, D. J.; Ye, M.; Zaehle, S.

    2017-12-01

    Large uncertainty is often inherent in model predictions due to imperfect knowledge of how to describe the mechanistic processes (hypotheses) that a model is intended to represent. Yet this model hypothesis uncertainty (MHU) is often overlooked or informally evaluated, as methods to quantify and evaluate MHU are limited. MHU increases as models become more complex, because each additional process added to a model brings inherent MHU as well as parametric uncertainty. With the current trend of adding more processes to Earth System Models (ESMs), we are adding uncertainty that can be quantified for parameters but not for MHU. Model inter-comparison projects allow some consideration of hypothesis uncertainty, but in an ad hoc and non-independent fashion. This has stymied efforts to evaluate ecosystem models against data and to interpret the results mechanistically: because such models combine sub-models of many systems and processes, each of which may be conceptualised and represented mathematically in various ways, it is not simple to determine exactly why a model produces the results it does or to identify which model assumptions are key. We present a novel modelling framework, the multi-assumption architecture and testbed (MAAT), that automates the combination, generation, and execution of a model ensemble built with different representations of process. We argue that multi-hypothesis modelling needs to be considered in conjunction with other capabilities (e.g. the Predictive Ecosystem Analyser, PEcAn) and statistical methods (e.g. sensitivity analysis, data assimilation) to aid efforts in robust data-model integration and to enhance our predictive understanding of biological systems.
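
    The core mechanic of such a framework, exhaustively combining alternative representations of each process into an executable ensemble, can be sketched in a few lines; the two processes and their toy equations below are placeholders, not MAAT's actual science modules.

        from itertools import product

        # Two processes, each with competing hypotheses about its representation.
        photosynthesis = {"hypothesis_A": lambda light: 0.90 * light,
                          "hypothesis_B": lambda light: 0.75 * light ** 0.95}
        stomatal_conductance = {"hypothesis_X": lambda assim: 1.10 * assim,
                                "hypothesis_Y": lambda assim: 0.95 * assim + 0.1}

        def run_ensemble(light):
            """Execute every combination of process representations."""
            out = {}
            for (pn, pf), (sn, sf) in product(photosynthesis.items(),
                                              stomatal_conductance.items()):
                out[(pn, sn)] = sf(pf(light))
            return out

        # The spread across combinations is a direct picture of structural (MHU) spread.
        for combo, value in run_ensemble(light=100.0).items():
            print(combo, round(value, 2))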

  19. Magnetohydrodynamic Particle Acceleration Processes: SSX Experiments, Theory, and Astrophysical Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Michael R.

    2006-11-16

    Project Title: Magnetohydrodynamic Particle Acceleration Processes: SSX Experiments, Theory, and Astrophysical Applications. PI: Michael R. Brown, Swarthmore College. The purpose of the project was to provide theoretical and modeling support to the Swarthmore Spheromak Experiment (SSX). Accordingly, the theoretical effort was tightly integrated into the SSX experimental effort. During the grant period, Michael Brown and his experimental collaborators at Swarthmore, with assistance from W. Matthaeus as appropriate, made substantial progress in understanding the physics of SSX plasmas.

  20. Applying Modeling Tools to Ground System Procedures

    NASA Technical Reports Server (NTRS)

    Di Pasquale, Peter

    2012-01-01

    As part of a long-term effort to revitalize the Ground Systems (GS) Engineering Section practices, Systems Modeling Language (SysML) and Business Process Model and Notation (BPMN) have been used to model existing GS products and the procedures GS engineers use to produce them.

  1. Overview of the TriBITS Lifecycle Model: Lean/Agile Software Lifecycle Model for Research-based Computational Science and Engineering Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bartlett, Roscoe A; Heroux, Dr. Michael A; Willenbring, James

    2012-01-01

    Software lifecycles are becoming an increasingly important issue for computational science & engineering (CSE) software. The process by which a piece of CSE software begins life as a set of research requirements and then matures into a trusted high-quality capability is both commonplace and extremely challenging. Although an implicit lifecycle is obviously being used in any effort, the challenges of this process--respecting the competing needs of research vs. production--cannot be overstated. Here we describe a proposal for a well-defined software lifecycle process based on modern Lean/Agile software engineering principles. What we propose is appropriate for many CSE software projects that are initially heavily focused on research but also are expected to eventually produce usable high-quality capabilities. The model is related to TriBITS, a build, integration and testing system, which serves as a strong foundation for this lifecycle model, and aspects of this lifecycle model are ingrained in the TriBITS system. Indeed this lifecycle process, if followed, will enable large-scale sustainable integration of many complex CSE software efforts across several institutions.

  2. Modeling and Analysis of Global and Regional Climate Change in Relation to Atmospheric Hydrologic Processes

    NASA Technical Reports Server (NTRS)

    Johnson, Donald R.

    1998-01-01

    The goal of this research is the continued development and application of global isentropic modeling and analysis capabilities to describe hydrologic processes and energy exchange in the climate system, and discern regional climate change. This work involves a combination of modeling and analysis efforts involving 4DDA datasets and simulations from the University of Wisconsin (UW) hybrid isentropic-sigma (theta-sigma) coordinate model and the GEOS GCM.

  3. Synergies Between Grace and Regional Atmospheric Modeling Efforts

    NASA Astrophysics Data System (ADS)

    Kusche, J.; Springer, A.; Ohlwein, C.; Hartung, K.; Longuevergne, L.; Kollet, S. J.; Keune, J.; Dobslaw, H.; Forootan, E.; Eicker, A.

    2014-12-01

    In the meteorological community, efforts converge towards the implementation of high-resolution (<12 km) data-assimilating regional climate modelling/monitoring systems based on numerical weather prediction (NWP) cores. This is driven by requirements of improving process understanding, better representation of land surface interactions, atmospheric convection, orographic effects, and better forecasting on shorter timescales. This is relevant for the GRACE community because (1) these models may provide improved atmospheric mass separation/de-aliasing and smaller topography-induced errors compared to global (ECMWF-Op, ERA-Interim) data; (2) they inherit high temporal resolution from NWP models; (3) there are parallel efforts towards improving the land surface component and coupling groundwater models, which may provide realistic hydrological mass estimates with sub-diurnal resolution; and (4) there are parallel efforts towards re-analyses, with the aim of providing consistent time series. (5) On the other hand, GRACE can help validate models and aids in the identification of processes needing improvement. A coupled atmosphere-land surface-groundwater modelling system is currently being implemented for the European CORDEX region at 12.5 km resolution, based on the TerrSysMP platform (COSMO-EU NWP, CLM land surface, and ParFlow groundwater models). We report results from Springer et al. (J. Hydrometeor., accepted) on validating the water cycle in COSMO-EU using GRACE together with precipitation, evapotranspiration and runoff data, confirming that the model does well at representing observations. We show that after GRACE-derived bias correction, basin-average hydrological conditions prior to 2002 can be reconstructed better than before. Next, comparing GRACE with CLM forced by EURO-CORDEX simulations allows identifying processes needing improvement in the model. Finally, we compare COSMO-EU atmospheric pressure, a proxy for mass corrections in satellite gravimetry, with ERA-Interim over Europe at timescales shorter/longer than 1 month, and at spatial scales below/above the ERA resolution. We find differences between the regional and global models to be more pronounced at high frequencies, with magnitudes at sub-grid and larger scales corresponding to 1-3 hPa (1-3 cm EWH), which is relevant for the assessment of post-GRACE concepts.
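
    A deliberately simple reading of the GRACE-derived bias correction mentioned above is a constant offset removed from the modelled storage anomalies over the common observation period; the actual correction applied in the study may be more elaborate.

        import numpy as np

        def grace_bias_correction(model_tws, grace_tws):
            """Remove the mean model-minus-GRACE offset computed over the
            overlap period; both inputs are anomaly time series in cm EWH."""
            bias = np.nanmean(model_tws - grace_tws)
            return model_tws - bias

        # The offset learned on the overlap period can then be applied to
        # pre-2002 model output, for which no GRACE observations exist.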

  4. Southeast Atmosphere Studies: learning from model-observation syntheses

    NASA Astrophysics Data System (ADS)

    Mao, Jingqiu; Carlton, Annmarie; Cohen, Ronald C.; Brune, William H.; Brown, Steven S.; Wolfe, Glenn M.; Jimenez, Jose L.; Pye, Havala O. T.; Ng, Nga Lee; Xu, Lu; McNeill, V. Faye; Tsigaridis, Kostas; McDonald, Brian C.; Warneke, Carsten; Guenther, Alex; Alvarado, Matthew J.; de Gouw, Joost; Mickley, Loretta J.; Leibensperger, Eric M.; Mathur, Rohit; Nolte, Christopher G.; Portmann, Robert W.; Unger, Nadine; Tosca, Mika; Horowitz, Larry W.

    2018-02-01

    Concentrations of atmospheric trace species in the United States have changed dramatically over the past several decades in response to pollution control strategies, shifts in domestic energy policy and economics, and economic development (and resulting emission changes) elsewhere in the world. Reliable projections of the future atmosphere require models to not only accurately describe current atmospheric concentrations, but to do so by representing chemical, physical and biological processes with conceptual and quantitative fidelity. Only through incorporation of the processes controlling emissions and chemical mechanisms that represent the key transformations among reactive molecules can models reliably project the impacts of future policy, energy and climate scenarios. Efforts to properly identify and implement the fundamental and controlling mechanisms in atmospheric models benefit from intensive observation periods, during which collocated measurements of diverse, speciated chemicals in both the gas and condensed phases are obtained. The Southeast Atmosphere Studies (SAS, including SENEX, SOAS, NOMADSS and SEAC4RS) conducted during the summer of 2013 provided an unprecedented opportunity for the atmospheric modeling community to come together to evaluate, diagnose and improve the representation of fundamental climate and air quality processes in models of varying temporal and spatial scales. This paper is aimed at discussing progress in evaluating, diagnosing and improving air quality and climate modeling using comparisons to SAS observations as a guide to thinking about improvements to mechanisms and parameterizations in models. The effort focused primarily on model representation of fundamental atmospheric processes that are essential to the formation of ozone, secondary organic aerosol (SOA) and other trace species in the troposphere, with the ultimate goal of understanding the radiative impacts of these species in the southeast and elsewhere. Here we address questions surrounding four key themes: gas-phase chemistry, aerosol chemistry, regional climate and chemistry interactions, and natural and anthropogenic emissions. We expect this review to serve as guidance for future modeling efforts.

  5. Southeast Atmosphere Studies: learning from model-observation syntheses

    PubMed Central

    Mao, Jingqiu; Carlton, Annmarie; Cohen, Ronald C.; Brune, William H.; Brown, Steven S.; Wolfe, Glenn M.; Jimenez, Jose L.; Pye, Havala O. T.; Ng, Nga Lee; Xu, Lu; McNeill, V. Faye; Tsigaridis, Kostas; McDonald, Brian C.; Warneke, Carsten; Guenther, Alex; Alvarado, Matthew J.; de Gouw, Joost; Mickley, Loretta J.; Leibensperger, Eric M.; Mathur, Rohit; Nolte, Christopher G.; Portmann, Robert W.; Unger, Nadine; Tosca, Mika; Horowitz, Larry W.

    2018-01-01

    Concentrations of atmospheric trace species in the United States have changed dramatically over the past several decades in response to pollution control strategies, shifts in domestic energy policy and economics, and economic development (and resulting emission changes) elsewhere in the world. Reliable projections of the future atmosphere require models to not only accurately describe current atmospheric concentrations, but to do so by representing chemical, physical and biological processes with conceptual and quantitative fidelity. Only through incorporation of the processes controlling emissions and chemical mechanisms that represent the key transformations among reactive molecules can models reliably project the impacts of future policy, energy and climate scenarios. Efforts to properly identify and implement the fundamental and controlling mechanisms in atmospheric models benefit from intensive observation periods, during which collocated measurements of diverse, speciated chemicals in both the gas and condensed phases are obtained. The Southeast Atmosphere Studies (SAS, including SENEX, SOAS, NOMADSS and SEAC4RS) conducted during the summer of 2013 provided an unprecedented opportunity for the atmospheric modeling community to come together to evaluate, diagnose and improve the representation of fundamental climate and air quality processes in models of varying temporal and spatial scales. This paper is aimed at discussing progress in evaluating, diagnosing and improving air quality and climate modeling using comparisons to SAS observations as a guide to thinking about improvements to mechanisms and parameterizations in models. The effort focused primarily on model representation of fundamental atmospheric processes that are essential to the formation of ozone, secondary organic aerosol (SOA) and other trace species in the troposphere, with the ultimate goal of understanding the radiative impacts of these species in the southeast and elsewhere. Here we address questions surrounding four key themes: gas-phase chemistry, aerosol chemistry, regional climate and chemistry interactions, and natural and anthropogenic emissions. We expect this review to serve as guidance for future modeling efforts.

  6. Southeast Atmosphere Studies: Learning from Model-Observation Syntheses

    NASA Technical Reports Server (NTRS)

    Mao, Jingqiu; Carlton, Annmarie; Cohen, Ronald C.; Brune, William H.; Brown, Steven S.; Wolfe, Glenn M.; Jimenez, Jose L.; Pye, Havala O. T.; Ng, Nga Lee; Xu, Lu; et al.

    2018-01-01

    Concentrations of atmospheric trace species in the United States have changed dramatically over the past several decades in response to pollution control strategies, shifts in domestic energy policy and economics, and economic development (and resulting emission changes) elsewhere in the world. Reliable projections of the future atmosphere require models to not only accurately describe current atmospheric concentrations, but to do so by representing chemical, physical and biological processes with conceptual and quantitative fidelity. Only through incorporation of the processes controlling emissions and chemical mechanisms that represent the key transformations among reactive molecules can models reliably project the impacts of future policy, energy and climate scenarios. Efforts to properly identify and implement the fundamental and controlling mechanisms in atmospheric models benefit from intensive observation periods, during which collocated measurements of diverse, speciated chemicals in both the gas and condensed phases are obtained. The Southeast Atmosphere Studies (SAS, including SENEX, SOAS, NOMADSS and SEAC4RS) conducted during the summer of 2013 provided an unprecedented opportunity for the atmospheric modeling community to come together to evaluate, diagnose and improve the representation of fundamental climate and air quality processes in models of varying temporal and spatial scales. This paper is aimed at discussing progress in evaluating, diagnosing and improving air quality and climate modeling using comparisons to SAS observations as a guide to thinking about improvements to mechanisms and parameterizations in models. The effort focused primarily on model representation of fundamental atmospheric processes that are essential to the formation of ozone, secondary organic aerosol (SOA) and other trace species in the troposphere, with the ultimate goal of understanding the radiative impacts of these species in the southeast and elsewhere. Here we address questions surrounding four key themes: gas-phase chemistry, aerosol chemistry, regional climate and chemistry interactions, and natural and anthropogenic emissions. We expect this review to serve as guidance for future modeling efforts.

  7. Autonomy Support and Intrinsic Goal Progress Expectancy and Its Links to Longitudinal Study Effort and Subjective Wellbeing: The Differential Mediating Effect of Intrinsic and Identified Regulations and the Moderator Effects of Effort and Intrinsic Goals

    ERIC Educational Resources Information Center

    Waaler, Rune; Halvari, Halgeir; Skjesol, Knut; Bagoien, Tor Egil

    2013-01-01

    The authors tested a self-determination theory (Deci & Ryan, 2000) process model of subjective wellbeing among students at Norwegian Folk High Schools. In this model the authors hypothesized that students' intrinsic goal progress expectancy in the chosen study activity and perceived autonomy support from teachers would be positively associated…

  8. Simulation and Modeling Efforts to Support Decision Making in Healthcare Supply Chain Management

    PubMed Central

    Lazarova-Molnar, Sanja

    2014-01-01

    Recently, most healthcare organizations have focused their attention on reducing the cost of their supply chain management (SCM) by improving the efficiency of the associated decision-making processes. The availability of products through healthcare SCM is often a matter of life or death to the patient; therefore, trial-and-error approaches are not an option in this environment. Simulation and modeling (SM) has been presented as an alternative approach for supply chain managers in healthcare organizations to test solutions and to support decision making processes associated with various SCM problems. This paper presents and analyzes past SM efforts to support decision making in healthcare SCM and identifies the key challenges associated with healthcare SCM modeling. We also present and discuss emerging technologies to meet these challenges. PMID:24683333
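
    A classic instance of the kind of SCM question that simulation answers without trial and error is stockout risk under a reorder policy; the demand distribution and policy parameters in this sketch are invented.

        import random

        def simulate_stockouts(reorder_point, order_up_to, days=365,
                               lead_time_days=2, seed=1):
            """Count unmet daily demand under a simple reorder policy."""
            rng = random.Random(seed)
            on_hand, pipeline, stockouts = order_up_to, [], 0
            for day in range(days):
                on_hand += sum(q for t, q in pipeline if t == day)  # receive orders
                pipeline = [(t, q) for t, q in pipeline if t > day]
                demand = rng.randint(0, 8)                          # hypothetical usage
                stockouts += max(0, demand - on_hand)
                on_hand = max(0, on_hand - demand)
                if on_hand <= reorder_point and not pipeline:       # place an order
                    pipeline.append((day + lead_time_days, order_up_to - on_hand))
            return stockouts

        # Compare candidate policies in silico before committing to one on a ward.
        print(simulate_stockouts(reorder_point=10, order_up_to=40))
        print(simulate_stockouts(reorder_point=20, order_up_to=60))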

  9. Experiences Using Lightweight Formal Methods for Requirements Modeling

    NASA Technical Reports Server (NTRS)

    Easterbrook, Steve; Lutz, Robyn; Covington, Rick; Kelly, John; Ampo, Yoko; Hamilton, David

    1997-01-01

    This paper describes three case studies in the lightweight application of formal methods to requirements modeling for spacecraft fault protection systems. The case studies differ from previously reported applications of formal methods in that formal methods were applied very early in the requirements engineering process, to validate the evolving requirements. The results were fed back into the projects, to improve the informal specifications. For each case study, we describe what methods were applied, how they were applied, how much effort was involved, and what the findings were. In all three cases, formal methods enhanced the existing verification and validation processes, by testing key properties of the evolving requirements, and helping to identify weaknesses. We conclude that the benefits gained from early modeling of unstable requirements more than outweigh the effort needed to maintain multiple representations.

  10. Higher Order Chemistry Models in the CFD Simulation of Laser-Ablated Carbon Plumes

    NASA Technical Reports Server (NTRS)

    Greendyke, R. B.; Creel, J. R.; Payne, B. T.; Scott, C. D.

    2005-01-01

    Production of single-walled carbon nanotubes (SWNT) has taken place for a number of years and by a variety of methods such as laser ablation, chemical vapor deposition, and arc-jet ablation. Yet, little is actually understood about the exact chemical kinetics and processes that occur in SWNT formation. In recent years, NASA Johnson Space Center has devoted considerable effort to the experimental evaluation of the laser ablation production process for SWNT originally developed at Rice University. To fully understand the nature of the laser ablation process, it is necessary to understand the development of the carbon plume dynamics within the laser ablation oven. The present work continues previous efforts to model plume dynamics using computational fluid dynamics (CFD). The ultimate goal of the work is to improve understanding of the laser ablation process, and through that improved understanding, refine the laser ablation production of SWNT.

  11. The Influence of Effortful Thought and Cognitive Proficiencies on the Conjunction Fallacy: Implications for Dual-Process Theories of Reasoning and Judgment.

    PubMed

    Scherer, Laura D; Yates, J Frank; Baker, S Glenn; Valentine, Kathrene D

    2017-06-01

    Human judgment often violates normative standards, and virtually no judgment error has received as much attention as the conjunction fallacy. Judgment errors have historically served as evidence for dual-process theories of reasoning, insofar as these errors are assumed to arise from reliance on a fast and intuitive mental process, and are corrected via effortful deliberative reasoning. In the present research, three experiments tested the notion that conjunction errors are reduced by effortful thought. Predictions based on three different dual-process theory perspectives were tested: lax monitoring, override failure, and the Tripartite Model. Results indicated that participants higher in numeracy were less likely to make conjunction errors, but this association only emerged when participants engaged in two-sided reasoning, as opposed to one-sided or no reasoning. Confidence was higher for incorrect as opposed to correct judgments, suggesting that participants were unaware of their errors.

  12. Model-Based Verification and Validation of the SMAP Uplink Processes

    NASA Technical Reports Server (NTRS)

    Khan, M. Omair; Dubos, Gregory F.; Tirona, Joseph; Standley, Shaun

    2013-01-01

    This case study stands as an example of how a project can validate a system-level design earlier in the project life cycle than traditional V&V processes by using simulation on a system model. Specifically, this paper describes how simulation was added to a system model of the Soil Moisture Active-Passive (SMAP) mission's uplink process. Also discussed are the advantages and disadvantages of the methods employed and the lessons learned, which are intended to benefit future model-based and simulation-based V&V development efforts.

  13. Computational Models of Anterior Cingulate Cortex: At the Crossroads between Prediction and Effort.

    PubMed

    Vassena, Eliana; Holroyd, Clay B; Alexander, William H

    2017-01-01

    In the last two decades the anterior cingulate cortex (ACC) has become one of the most investigated areas of the brain. Extensive neuroimaging evidence suggests countless functions for this region, ranging from conflict and error coding, to social cognition, pain and effortful control. In response to this burgeoning amount of data, a proliferation of computational models has tried to characterize the neurocognitive architecture of ACC. Early seminal models provided a computational explanation for a relatively circumscribed set of empirical findings, mainly accounting for EEG and fMRI evidence. More recent models have focused on ACC's contribution to effortful control. In parallel to these developments, several proposals attempted to explain within a single computational framework a wider variety of empirical findings that span different cognitive processes and experimental modalities. Here we critically evaluate these modeling attempts, highlighting the continued need to reconcile the array of disparate ACC observations within a coherent, unifying framework.

  14. The Nucleation and Growth of Protein Crystals

    NASA Technical Reports Server (NTRS)

    Pusey, Marc

    2004-01-01

    Obtaining crystals of suitable size and high quality continues to be a major bottleneck in macromolecular crystallography. Currently, structural genomics efforts are achieving on average about a 10% success rate in going from purified protein to a deposited crystal structure. Growth of crystals in microgravity was proposed as a means of overcoming size and quality problems, which subsequently led to a major NASA effort in microgravity crystal growth, with the agency also funding research into understanding the process. Studies of the macromolecule crystal nucleation and growth process were carried out in a number of labs in an effort to understand what affected the resultant crystal quality on Earth, and how microgravity improved the process. Based upon experimental evidence, as well as simple starting assumptions, we have proposed that crystal nucleation occurs by a series of discrete self-assembly steps, which 'set' the underlying crystal symmetry. This talk will review the model developed in our laboratory for how crystals nucleate and grow, along with its origins, and will then present, with preliminary data, how we propose to use this model to improve the success rate for obtaining crystals from a given protein.

  15. Test of a potential link between analytic and nonanalytic category learning and automatic, effortful processing.

    PubMed

    Tracy, J I; Pinsk, M; Helverson, J; Urban, G; Dietz, T; Smith, D J

    2001-08-01

    The link between automatic and effortful processing and nonanalytic and analytic category learning was evaluated in a sample of 29 college undergraduates using declarative memory, semantic category search, and pseudoword categorization tasks. Automatic and effortful processing measures were hypothesized to be associated with nonanalytic and analytic categorization, respectively. Results suggested that, contrary to prediction, strong criterion-attribute (analytic) responding on the pseudoword categorization task was associated with strong automatic, implicit memory encoding of frequency-of-occurrence information. Data are discussed in terms of the possibility that criterion-attribute category knowledge, once established, may be expressed with few attentional resources. The data indicate that attention resource requirements, even for the same stimuli and task, vary depending on the category rule system utilized. Also, the automaticity emerging from familiarity with analytic category exemplars is very different from the automaticity arising from extensive practice on a semantic category search task. The data do not support any simple mapping of analytic and nonanalytic forms of category learning onto the automatic and effortful processing dichotomy and challenge simple models of brain asymmetries for such procedures. Copyright 2001 Academic Press.

  16. Integrated modeling of protein-coding genes in the Manduca sexta genome using RNA-Seq data from the biochemical model insect.

    PubMed

    Cao, Xiaolong; Jiang, Haobo

    2015-07-01

    The genome sequence of Manduca sexta was recently determined using 454 technology. Cufflinks and MAKER2 were used to establish gene models in the genome assembly based on the RNA-Seq data and other species' sequences. Aided by the extensive RNA-Seq data from 50 tissue samples at various life stages, annotators around the world (including the present authors) have manually confirmed and improved a small percentage of the models after months of effort. While such collaborative efforts are highly commendable, many of the predicted genes still have problems that may hamper future research on this insect species. As a biochemical model representing lepidopteran pests, M. sexta has been used extensively to study insect physiological processes for over five decades. In this work, we assembled Manduca datasets Cufflinks 3.0, Trinity 4.0, and Oases 4.0 to assist the manual annotation efforts and development of Official Gene Set (OGS) 2.0. To further improve annotation quality, we developed methods to evaluate gene models in the MAKER2, Cufflinks, Oases and Trinity assemblies and selected the best ones to constitute MCOT 1.0 after thorough crosschecking. MCOT 1.0 has 18,089 genes encoding 31,666 proteins: 32.8% match OGS 2.0 models perfectly or near perfectly, 11,747 differ considerably, and 29.5% are absent in OGS 2.0. Future automation of this process is anticipated to greatly reduce the human effort needed to generate comprehensive, reliable models of structural genes in other genome projects where extensive RNA-Seq data are available. Copyright © 2015 Elsevier Ltd. All rights reserved.

  17. Demographics of reintroduced populations: estimation, modeling, and decision analysis

    USGS Publications Warehouse

    Converse, Sarah J.; Moore, Clinton T.; Armstrong, Doug P.

    2013-01-01

    Reintroduction can be necessary for recovering populations of threatened species. However, the success of reintroduction efforts has been poorer than many biologists and managers would hope. To increase the benefits gained from reintroduction, management decision making should be couched within formal decision-analytic frameworks. Decision analysis is a structured process for informing decision making that recognizes that all decisions have a set of components—objectives, alternative management actions, predictive models, and optimization methods—that can be decomposed, analyzed, and recomposed to facilitate optimal, transparent decisions. Because the outcome of interest in reintroduction efforts is typically population viability or related metrics, models used in decision analysis efforts for reintroductions will need to include population models. In this special section of the Journal of Wildlife Management, we highlight examples of the construction and use of models for informing management decisions in reintroduced populations. In this introductory contribution, we review concepts in decision analysis, population modeling for analysis of decisions in reintroduction settings, and future directions. Increased use of formal decision analysis, including adaptive management, has great potential to inform reintroduction efforts. Adopting these practices will require close collaboration among managers, decision analysts, population modelers, and field biologists.
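
    Because the predictive model inside such a decision analysis is typically a population model, the skeleton is compact: simulate each candidate action and choose the one that minimizes predicted extinction risk. All vital rates and release sizes below are invented.

        import random

        def extinction_risk(n_released, survival=0.8, fecundity=0.4,
                            years=10, reps=2000, seed=0):
            """Monte Carlo probability that a released cohort dies out."""
            rng = random.Random(seed)
            extinct = 0
            for _ in range(reps):
                n = n_released
                for _ in range(years):
                    survivors = sum(rng.random() < survival for _ in range(n))
                    recruits = sum(rng.random() < fecundity for _ in range(survivors))
                    n = survivors + recruits
                    if n == 0:
                        break
                extinct += n == 0
            return extinct / reps

        actions = {"release 20": 20, "release 50": 50}  # alternative actions
        best = min(actions, key=lambda a: extinction_risk(actions[a]))
        print(best)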

  18. Objective Assessment of Listening Effort: Coregistration of Pupillometry and EEG.

    PubMed

    Miles, Kelly; McMahon, Catherine; Boisvert, Isabelle; Ibrahim, Ronny; de Lissa, Peter; Graham, Petra; Lyxell, Björn

    2017-01-01

    Listening to speech in noise is effortful, particularly for people with hearing impairment. While it is known that effort is related to a complex interplay between bottom-up and top-down processes, the cognitive and neurophysiological mechanisms contributing to effortful listening remain unknown. Therefore, a reliable physiological measure to assess effort remains elusive. This study aimed to determine whether pupil dilation and alpha power change, two physiological measures suggested to index listening effort, assess similar processes. Listening effort was manipulated by parametrically varying spectral resolution (16- and 6-channel noise vocoding) and speech reception thresholds (SRT; 50% and 80%) while 19 young, normal-hearing adults performed a speech recognition task in noise. Results of off-line sentence scoring showed discrepancies between the target SRTs and the true performance obtained during the speech recognition task. For example, in the SRT80% condition, participants scored an average of 64.7%. Participants' true performance levels were therefore used for subsequent statistical modelling. Results showed that both measures appeared to be sensitive to changes in spectral resolution (channel vocoding), while only pupil dilation was also significantly related to participants' true performance levels (%) and task accuracy (i.e., whether the response was correctly or partially recalled). The two measures were not correlated, suggesting they each may reflect different cognitive processes involved in listening effort. This combination of findings contributes to a growing body of research aiming to develop an objective measure of listening effort.

  19. Natural hazard modeling and uncertainty analysis [Chapter 2]

    Treesearch

    Matthew Thompson; Jord J. Warmink

    2017-01-01

    Modeling can play a critical role in assessing and mitigating risks posed by natural hazards. These modeling efforts generally aim to characterize the occurrence, intensity, and potential consequences of natural hazards. Uncertainties surrounding the modeling process can have important implications for the development, application, evaluation, and interpretation of...

  20. Hydrological processes and model representation: impact of soft data on calibration

    Treesearch

    J.G. Arnold; M.A. Youssef; H. Yen; M.J. White; A.Y. Sheshukov; A.M. Sadeghi; D.N. Moriasi; J.L. Steiner; Devendra Amatya; R.W. Skaggs; E.B. Haney; J. Jeong; M. Arabi; P.H. Gowda

    2015-01-01

    Hydrologic and water quality models are increasingly used to determine the environmental impacts of climate variability and land management. Due to differing model objectives and differences in monitored data, there are currently no universally accepted procedures for model calibration and validation in the literature. In an effort to develop accepted model calibration...

  1. Implementation of new pavement performance prediction models in PMIS : report

    DOT National Transportation Integrated Search

    2012-08-01

    Pavement performance prediction models and maintenance and rehabilitation (M&R) optimization processes enable managers and engineers to plan and prioritize pavement M&R activities in a cost-effective manner. This report describes TxDOT's effort...

  2. Freight data architecture business process, logical data model, and physical data model.

    DOT National Transportation Integrated Search

    2014-09-01

    This document summarizes the study team's efforts to establish data-sharing partnerships and relay the lessons learned. In addition, it provides information on a prototype freight data architecture and supporting description and specifications ...

  3. An explanatory model of academic achievement based on aptitudes, goal orientations, self-concept and learning strategies.

    PubMed

    Miñano Pérez, Pablo; Castejón Costa, Juan-Luis; Gilar Corbí, Raquel

    2012-03-01

    As a result of studies examining factors involved in the learning process, various structural models have been developed to explain the direct and indirect effects that occur between the variables in these models. The objective was to evaluate a structural model of cognitive and motivational variables predicting academic achievement, including general intelligence, academic self-concept, goal orientations, effort and learning strategies. The sample comprised 341 Spanish students in the first year of compulsory secondary education. Different tests and questionnaires were used to evaluate each variable, and Structural Equation Modelling (SEM) was applied to contrast the relationships of the initial model. The proposed model had a satisfactory fit, and all the hypothesised relationships were significant. General intelligence was the variable most able to explain academic achievement. Also important was the direct influence of academic self-concept on achievement, goal orientations and effort, as well as the mediating ability of effort and learning strategies between academic goals and final achievement.
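
    Full SEM estimation requires dedicated software, but the flavor of a direct path plus a mediated path can be shown with ordinary least squares on synthetic standardized scores; the path coefficients below are invented and are not the study's estimates.

        import numpy as np

        def ols(y, X):
            """OLS with intercept; returns [intercept, slopes...]."""
            X1 = np.column_stack([np.ones(len(y)), X])
            beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
            return beta

        rng = np.random.default_rng(0)
        n = 341                                # matches the sample size above
        intelligence = rng.normal(size=n)
        effort = 0.3 * intelligence + rng.normal(scale=0.9, size=n)
        achievement = (0.5 * intelligence + 0.2 * effort
                       + rng.normal(scale=0.8, size=n))

        a = ols(effort, intelligence[:, None])[1]                         # IQ -> effort
        b = ols(achievement, np.column_stack([intelligence, effort]))[2]  # effort -> achievement
        print("indirect effect of intelligence via effort:", a * b)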

  4. A coupled modelling effort to study the fate of contaminated sediments downstream of the Coles Hill deposit, Virginia, USA

    NASA Astrophysics Data System (ADS)

    Castro-Bolinaga, C. F.; Zavaleta, E. R.; Diplas, P.

    2015-03-01

    This paper presents the preliminary results of a coupled modelling effort to study the fate of tailings (radioactive waste by-product) downstream of the Coles Hill uranium deposit located in Virginia, USA. The implementation of the overall modelling process includes a one-dimensional hydraulic model to qualitatively characterize the sediment transport process under severe flooding conditions downstream of the potential mining site, a two-dimensional ANSYS Fluent model to simulate the release of tailings from a containment cell located partially above the local ground surface into the nearby streams, and a one-dimensional finite-volume sediment transport model to examine the propagation of a tailings sediment pulse in the river network located downstream. The findings of this investigation aim to assist in estimating the potential impacts that tailings would have if they were transported into rivers and reservoirs located downstream of the Coles Hill deposit that serve as municipal drinking water supplies.
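
    The third component, a one-dimensional finite-volume model of a propagating tailings pulse, can be sketched with a first-order upwind scheme; the reach length, velocity, and pulse shape are illustrative, and no deposition or decay term is included.

        import numpy as np

        def advect_pulse(c, u, dx, dt, steps):
            """First-order upwind finite-volume advection (u > 0).
            Stable when the CFL number u*dt/dx <= 1."""
            cfl = u * dt / dx
            assert 0 < cfl <= 1.0, "CFL condition violated"
            for _ in range(steps):
                c = c - cfl * (c - np.roll(c, 1))
                c[0] = 0.0                      # clean water enters upstream
            return c

        x = np.linspace(0.0, 10_000.0, 501)            # 10 km reach, dx = 20 m
        pulse = np.exp(-((x - 1_000.0) / 200.0) ** 2)  # relative concentration
        later = advect_pulse(pulse.copy(), u=0.5, dx=20.0, dt=20.0, steps=600)
        print("pulse peak has moved to x =", x[np.argmax(later)], "m")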

  5. The Importance of Uncertainty and Sensitivity Analysis in Process-based Models of Carbon and Nitrogen Cycling in Terrestrial Ecosystems with Particular Emphasis on Forest Ecosystems — Selected Papers from a Workshop Organized by the International Society for Ecological Modelling (ISEM) at the Third Biennial Meeting of the International Environmental Modelling and Software Society (IEMSS) in Burlington, Vermont, USA, August 9-13, 2006

    USGS Publications Warehouse

    Larocque, Guy R.; Bhatti, Jagtar S.; Liu, Jinxun; Ascough, James C.; Gordon, Andrew M.

    2008-01-01

    Many process-based models of carbon (C) and nitrogen (N) cycles have been developed for terrestrial ecosystems, including forest ecosystems. They address many basic issues of ecosystem structure and functioning, such as the role of internal feedback in ecosystem dynamics. The critical factor in these phenomena is scale, as these processes operate at scales from the minute (e.g. particulate pollution impacts on trees and other organisms) to the global (e.g. climate change). Research efforts remain important to improve the capability of such models to better represent the dynamics of terrestrial ecosystems, including the C, nutrient (e.g. N), and water cycles. Existing models are sufficiently well advanced to help decision makers develop sustainable management policies and planning of terrestrial ecosystems, as they make realistic predictions when used appropriately. However, decision makers must be aware of their limitations by having the opportunity to evaluate the uncertainty associated with process-based models (Smith and Heath, 2001; Allen et al., 2004). The variation in scale of issues currently being addressed by modelling efforts makes the evaluation of uncertainty a daunting task.

  6. A survey of Applied Psychological Services' models of the human operator

    NASA Technical Reports Server (NTRS)

    Siegel, A. I.; Wolf, J. J.

    1979-01-01

    A historical perspective is presented in terms of the major features and status of two families of computer simulation models in which the human operator plays the primary role. Both task-oriented and message-oriented models are included. Two other recent efforts, which deal with visual information processing, are also summarized. They involve not whole-model development but a family of subroutines customized to add human aspects to existing models. A global diagram of the generalized model development/validation process is presented and related to 15 criteria for model evaluation.

  7. Using large-scale genome variation cohorts to decipher the molecular mechanism of cancer.

    PubMed

    Habermann, Nina; Mardin, Balca R; Yakneen, Sergei; Korbel, Jan O

    2016-01-01

    Characterizing genomic structural variations (SVs) in the human genome remains challenging, and there is a growing interest to understand somatic SVs occurring in cancer, a disease of the genome. A havoc-causing SV process known as chromothripsis scars the genome when localized chromosome shattering and repair occur in a one-off catastrophe. Recent efforts led to the development of a set of conceptual criteria for the inference of chromothripsis events in cancer genomes and to the development of experimental model systems for studying this striking DNA alteration process in vitro. We discuss these approaches, and additionally touch upon current "Big Data" efforts that employ hybrid cloud computing to enable studies of numerous cancer genomes in an effort to search for commonalities and differences in molecular DNA alteration processes in cancer. Copyright © 2016. Published by Elsevier SAS.

  8. The Influence of Mark-Recapture Sampling Effort on Estimates of Rock Lobster Survival

    PubMed Central

    Kordjazi, Ziya; Frusher, Stewart; Buxton, Colin; Gardner, Caleb; Bird, Tomas

    2016-01-01

    Five annual capture-mark-recapture surveys on Jasus edwardsii were used to evaluate the effect of sample size and fishing effort on the precision of estimated survival probability. Datasets of different numbers of individual lobsters (ranging from 200 to 1,000 lobsters) were created by random subsampling from each annual survey. This process of random subsampling was also used to create 12 datasets of different levels of effort based on three levels of the number of traps (15, 30 and 50 traps per day) and four levels of the number of sampling days (2, 4, 6 and 7 days). The most parsimonious Cormack-Jolly-Seber (CJS) model for estimating survival probability shifted from a constant model towards sex-dependent models with increasing sample size and effort. A sample of 500 lobsters or 50 traps used on four consecutive sampling days was required to obtain precise survival estimates for males and females separately. A reduced sampling effort of 30 traps over four sampling days was sufficient if a survival estimate for both sexes combined was adequate for management of the fishery. PMID:26990561
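
    For readers unfamiliar with the CJS machinery behind these comparisons, the constant-parameter likelihood is compact enough to sketch; this is the textbook formulation rather than the authors' code, and the capture histories are made up.

        import numpy as np
        from scipy.optimize import minimize

        def cjs_negloglik(theta, histories):
            """Negative log-likelihood of a CJS model with constant survival
            (phi) and capture probability (p), conditioned on first capture."""
            phi, p = 1.0 / (1.0 + np.exp(-np.asarray(theta)))  # logit scale
            T = len(histories[0])
            chi = np.ones(T)        # chi[t] = Pr(never seen after t | alive at t)
            for t in range(T - 2, -1, -1):
                chi[t] = (1 - phi) + phi * (1 - p) * chi[t + 1]
            ll = 0.0
            for h in histories:
                first, last = h.index(1), T - 1 - h[::-1].index(1)
                for t in range(first + 1, last + 1):
                    ll += np.log(phi) + np.log(p if h[t] else 1 - p)
                ll += np.log(chi[last])
            return -ll

        histories = [(1, 0, 1, 1, 0), (1, 1, 0, 0, 0), (0, 1, 1, 1, 1)]  # toy data
        fit = minimize(cjs_negloglik, x0=[0.0, 0.0], args=(histories,))
        phi_hat, p_hat = 1.0 / (1.0 + np.exp(-fit.x))
        print(f"phi = {phi_hat:.2f}, p = {p_hat:.2f}")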

  9. NASTRAN analysis of Tokamak vacuum vessel using interactive graphics

    NASA Technical Reports Server (NTRS)

    Miller, A.; Badrian, M.

    1978-01-01

    Isoparametric quadrilateral and triangular elements were used to represent the vacuum vessel shell structure. For toroidally symmetric loadings, MPCs were employed across model boundaries and rigid format 24 was invoked. Nonsymmetric loadings required the use of the cyclic symmetry analysis available with rigid format 49. NASTRAN served as an important analysis tool in the Tokamak design effort by providing a reliable means for assessing structural integrity. Interactive graphics were employed in the finite element model generation and in the post-processing of results. It was felt that model generation and checkout with interactive graphics reduced the modelling effort and debugging man-hours significantly.

  10. Physical Modeling for Processing Geosynchronous Imaging Fourier Transform Spectrometer (GIFTS) Hyperspectral Data

    DTIC Science & Technology

    2003-09-30

    Physical Modeling for Processing Geosynchronous Imaging Fourier Transform Spectrometer (GIFTS) Hyperspectral Data. Dr. Allen H.-L. Huang ...ssec.wisc.edu. Award Number: N000140110850. Grant Number: 144KE70. http://www.ssec.wisc.edu/gifts/navy/ LONG-TERM GOALS: This Office of Naval ...objective of this DoD research effort is to develop and demonstrate a fully functional GIFTS hyperspectral data processing system with the potential for a...

  11. Cognitive Support During High-Consequence Episodes of Care in Cardiovascular Surgery.

    PubMed

    Conboy, Heather M; Avrunin, George S; Clarke, Lori A; Osterweil, Leon J; Christov, Stefan C; Goldman, Julian M; Yule, Steven J; Zenati, Marco A

    2017-03-01

    Despite significant efforts to reduce preventable adverse events in medical processes, such events continue to occur at unacceptable rates. This paper describes a computer science approach that uses formal process modeling to provide situationally aware monitoring and management support to medical professionals performing complex processes. These process models represent both normative and non-normative situations, and are validated by rigorous automated techniques such as model checking and fault tree analysis, in addition to careful review by experts. Context-aware Smart Checklists are then generated from the models, providing cognitive support during high-consequence surgical episodes. The approach is illustrated with a case study in cardiovascular surgery.

  12. SLS Navigation Model-Based Design Approach

    NASA Technical Reports Server (NTRS)

    Oliver, T. Emerson; Anzalone, Evan; Geohagan, Kevin; Bernard, Bill; Park, Thomas

    2018-01-01

    The SLS Program chose to implement a Model-based Design and Model-based Requirements approach for managing component design information and system requirements. This approach differs from previous large-scale design efforts at Marshall Space Flight Center where design documentation alone conveyed information required for vehicle design and analysis and where extensive requirements sets were used to scope and constrain the design. The SLS Navigation Team has been responsible for the Program-controlled Design Math Models (DMMs), which describe and represent the performance of the Inertial Navigation System (INS) and the Rate Gyro Assemblies (RGAs) used by Guidance, Navigation, and Controls (GN&C). The SLS Navigation Team is also responsible for the navigation algorithms. The navigation algorithms are delivered for implementation on the flight hardware as a DMM. For the SLS Block 1-B design, the additional GPS Receiver hardware is managed as a DMM at the vehicle design level. This paper provides a discussion of the processes and methods used to engineer, design, and coordinate engineering trades and performance assessments using SLS practices as applied to the GN&C system, with a particular focus on the Navigation components. These include composing system requirements, requirements verification, model development, model verification and validation, and modeling and analysis approaches. The Model-based Design and Requirements approach does not reduce the effort associated with the design process versus previous processes used at Marshall Space Flight Center. Instead, the approach takes advantage of overlap between the requirements development and management process, and the design and analysis process by efficiently combining the control (i.e. the requirement) and the design mechanisms. The design mechanism is the representation of the component behavior and performance in design and analysis tools. The focus in the early design process shifts from the development and management of design requirements to the development of usable models, model requirements, and model verification and validation efforts. The models themselves are represented in C/C++ code and accompanying data files. Under the idealized process, potential ambiguity in specification is reduced because the model must be implementable, unlike a requirement, which is not necessarily subject to this constraint. Further, the models are shown to emulate the hardware during validation. For models developed by the Navigation Team, a common interface/standalone environment was developed. The common environment allows for easy implementation in design and analysis tools. Mechanisms such as unit test cases ensure implementation as the developer intended. The model verification and validation process provides a very high level of component design insight. The origin and implementation of the SLS variant of Model-based Design are described from the perspective of the SLS Navigation Team. The format of the models and the requirements are described. The Model-based Design approach has many benefits but is not without potential complications. Key lessons learned associated with the implementation of the Model-based Design approach and process, from infancy through verification and certification, are discussed.

  13. Correcting Inadequate Model Snow Process Descriptions Dramatically Improves Mountain Hydrology Simulations

    NASA Astrophysics Data System (ADS)

    Pomeroy, J. W.; Fang, X.

    2014-12-01

    The vast effort in hydrology devoted to parameter calibration as a means to improve model performance assumes that the models concerned are not fundamentally wrong. By focussing on finding optimal parameter sets and ascribing poor model performance to parameter or data uncertainty, these efforts may fail to consider the need to improve models with more intelligent descriptions of hydrological processes. To test this hypothesis, a flexible physically based hydrological model including a full suite of snow hydrology processes as well as warm season, hillslope and groundwater hydrology was applied to Marmot Creek Research Basin, Canadian Rocky Mountains, where excellent driving meteorology and basin biophysical descriptions exist. Model parameters were set from values found in the basin or from similar environments; no parameters were calibrated. The model was tested against snow surveys and streamflow observations. The model used algorithms that describe snow redistribution, sublimation and forest canopy effects on snowmelt and evaporative processes that are rarely implemented in hydrological models. To investigate the contribution of these processes to model predictive capability, the model was "falsified" by deleting parameterisations for forest canopy snow mass and energy, blowing snow, intercepted rain evaporation, and sublimation. Model falsification by ignoring forest canopy processes contributed to a large increase in SWE errors for forested portions of the research basin, with RMSE increasing from 19 to 55 mm and mean bias (MB) increasing from 0.004 to 0.62. In the alpine tundra portion, removing blowing snow processes resulted in an increase in model SWE MB from 0.04 to 2.55 on north-facing slopes and from -0.006 to -0.48 on south-facing slopes. Eliminating these algorithms degraded streamflow prediction, with the Nash–Sutcliffe efficiency dropping from 0.58 to 0.22 and MB increasing from 0.01 to 0.09. These results show dramatic model improvements from including snow redistribution and melt processes associated with wind transport and forest canopies. As most hydrological models do not currently include these processes, it is suggested that modellers first improve the realism of model structures before trying to optimise what are inherently inadequate simulations of hydrology.
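
    As a concrete reference for the skill scores quoted above, the following sketch computes RMSE, mean bias, and Nash–Sutcliffe efficiency for a toy SWE series. The paper does not state its exact definitions; mean bias is assumed here to be normalized by the observed mean, which would account for dimensionless values such as 2.55.

    ```python
    # Sketch of the skill scores quoted above; definitions are assumptions.
    import numpy as np

    def rmse(sim, obs):
        return float(np.sqrt(np.mean((sim - obs) ** 2)))

    def mean_bias(sim, obs):
        # Assumed normalized form: (mean simulated - mean observed) / mean observed
        return float((np.mean(sim) - np.mean(obs)) / np.mean(obs))

    def nash_sutcliffe(sim, obs):
        return float(1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - np.mean(obs)) ** 2))

    obs = np.array([120.0, 150.0, 90.0, 60.0])   # observed SWE, mm (synthetic)
    sim = np.array([100.0, 160.0, 95.0, 40.0])   # simulated SWE, mm (synthetic)
    print(rmse(sim, obs), mean_bias(sim, obs), nash_sutcliffe(sim, obs))
    ```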

  14. Diving into Cold Pools

    NASA Astrophysics Data System (ADS)

    van den Heever, S. C.; Grant, L. D.; Drager, A. J.

    2017-12-01

    Cold pools play a significant role in convective storm initiation, organization and longevity. Given their role in convective life cycles, recent efforts have been focused on improving the representation of cold pool processes within weather forecast models, as well as on developing cold pool parameterizations in order to better represent their impacts within global climate models. Understanding the physical processes governing cold pool formation, intensity and dissipation is therefore critical to these efforts. Cold pool characteristics are influenced by numerous factors, including those associated with precipitation formation and evaporation, variations in the environmental moisture and shear, and land surface interactions. The focus of this talk will be on the manner in which the surface characteristics and associated processes impact cold pool genesis and dissipation. In particular, the results from high-resolution modeling studies focusing on the role of sensible and latent heat fluxes, soil moisture and SST will be presented. The results from a recent field campaign examining cold pools over northern Colorado will also be discussed.

  15. The DEPICT model for participatory qualitative health promotion research analysis piloted in Canada, Zambia and South Africa

    PubMed Central

    Flicker, Sarah; Nixon, Stephanie A.

    2015-01-01

    Health promotion researchers are increasingly conducting Community-Based Participatory Research in an effort to reduce health disparities. Despite efforts towards greater inclusion, research teams continue to regularly exclude diverse representation from data analysis efforts. The DEPICT model for collaborative qualitative analysis is a democratic approach to enhancing rigour through inclusion of diverse stakeholders. It is broken down into six sequential steps. Strong leadership, coordination and facilitation skills are needed; however, the process is flexible enough to adapt to most environments and varying levels of expertise. Including diverse stakeholders on an analysis team can enrich data analysis and provide more nuanced understandings of complicated health problems. PMID:24418997

  16. Information Technology. DOD Needs to Strengthen Management of Its Statutorily Mandated Software and System Process Improvement Efforts

    DTIC Science & Technology

    2009-09-01

    ASD(NII)/CIO: Assistant Secretary of Defense for Networks and Information Integration/Chief Information Officer; CMMI: Capability Maturity Model Integration... a Web-based portal to share knowledge about software process-related methodologies, such as the SEI's Capability Maturity Model Integration (CMMI), the SEI's IDEAL(SM) model, and Lean Six Sigma. For example, the portal features content areas such as software acquisition management, the SEI CMMI...

  17. Nonlinear ultrasonic pulsed measurements and applications to metal processing and fatigue

    NASA Astrophysics Data System (ADS)

    Yost, William T.; Cantrell, John H.; Na, Jeong K.

    2001-04-01

    Nonlinear ultrasonics research at NASA-Langley Research Center emphasizes development of experimental techniques and modeling, with applications to metal fatigue and metals processing. This review work includes a summary of results from our recent efforts in technique refinement, modeling of fatigue related microstructure contributions, and measurements on fatigued turbine blades. Also presented are data on 17-4PH and 410-Cb stainless steels. The results are in good agreement with the models.

  18. The efficacy of using inventory data to develop optimal diameter increment models

    Treesearch

    Don C. Bragg

    2002-01-01

    Most optimal tree diameter growth models have arisen through either the conceptualization of physiological processes or the adaptation of empirical increment models. However, surprisingly little effort has been invested in the melding of these approaches even though it is possible to develop theoretically sound, computationally efficient optimal tree growth models...

  19. Parameter Estimates in Differential Equation Models for Chemical Kinetics

    ERIC Educational Resources Information Center

    Winkel, Brian

    2011-01-01

    We discuss the need for devoting time in differential equations courses to modelling and the completion of the modelling process with efforts to estimate the parameters in the models using data. We estimate the parameters present in several differential equation models of chemical reactions of order n, where n = 0, 1, 2, and apply more general…
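
    A minimal sketch of the kind of exercise described: recovering the rate constant of a first-order (n = 1) reaction from noisy concentration data. The data are synthetic, and the use of scipy.optimize.curve_fit is one reasonable choice, not necessarily the article's.

    ```python
    # Estimate the rate constant k for a first-order reaction, dC/dt = -k C,
    # whose closed-form solution is C(t) = C0 * exp(-k t).
    import numpy as np
    from scipy.optimize import curve_fit

    def first_order(t, k, c0):
        return c0 * np.exp(-k * t)

    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 10.0, 25)
    c_obs = first_order(t, 0.35, 2.0) + rng.normal(0.0, 0.02, t.size)  # synthetic data

    (k_hat, c0_hat), _ = curve_fit(first_order, t, c_obs, p0=(0.1, 1.0))
    print(f"estimated k = {k_hat:.3f}, C0 = {c0_hat:.3f}")  # close to 0.35 and 2.0
    ```

    The zeroth- and second-order cases (C = C0 - kt and 1/C = 1/C0 + kt) fit the same way with their own closed forms.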

  20. Method and Tool for Design Process Navigation and Automatic Generation of Simulation Models for Manufacturing Systems

    NASA Astrophysics Data System (ADS)

    Nakano, Masaru; Kubota, Fumiko; Inamori, Yutaka; Mitsuyuki, Keiji

    Manufacturing system designers should concentrate on designing and planning manufacturing systems instead of spending their efforts on creating the simulation models to verify the design. This paper proposes a method and its tool to navigate the designers through the engineering process and generate the simulation model automatically from the design results. The design agent also supports collaborative design projects among different companies or divisions with distributed engineering and distributed simulation techniques. The idea was implemented and applied to a factory planning process.

  1. Humor Facilitates Text Comprehension: Evidence from Eye Movements

    ERIC Educational Resources Information Center

    Ferstl, Evelyn C.; Israel, Laura; Putzar, Lisa

    2017-01-01

    One crucial property of verbal jokes is that the punchline usually contains an incongruency that has to be resolved by updating the situation model representation. In the standard pragmatic model, these processes are considered to require cognitive effort. However, only few studies compared jokes to texts requiring a situation model revision…

  2. A Comprehensive Expectancy Motivation Model: Implications for Adult Education and Training.

    ERIC Educational Resources Information Center

    Howard, Kenneth W.

    1989-01-01

    The Comprehensive Expectancy Motivation Model is based on valence-instrumentality-expectancy theory. It describes expectancy motivation as part of a larger process that includes past experience, motivation, effort, performance, reward, and need satisfaction. The model has significant implications for the design, marketing, and delivery of adult…

  3. How Meaning Is Born.

    ERIC Educational Resources Information Center

    Hunt, Madgie Mae

    In an effort to create a multilevel, interactive, and hypothesis-based model of the reading comprehension process that bridges interdisciplinary gaps in the theory of learning, this report focuses on descriptions of cognitive processes developed in the fields of cognitive psychology, artificial intelligence, sociolinguistics, linguistics, and…

  4. Thermodynamic analysis and subscale modeling of space-based orbit transfer vehicle cryogenic propellant resupply

    NASA Technical Reports Server (NTRS)

    Defelice, David M.; Aydelott, John C.

    1987-01-01

    The resupply of cryogenic propellants is an enabling technology for space-based orbit transfer vehicles. As part of NASA Lewis's ongoing efforts in microgravity fluid management, thermodynamic analysis and subscale modeling techniques were developed to support an on-orbit test bed for cryogenic fluid management technologies. Analytical results have shown that subscale experimental modeling of liquid resupply can be used to validate analytical models when the appropriate target temperature is selected to relate the model to its prototype system. Further analyses were used to develop a thermodynamic model of the tank chilldown process, which is required prior to the no-vent fill operation. These efforts were incorporated into two FORTRAN programs which were used to present preliminary analytical results.

  5. Cost-engineering modeling to support rapid concept development of an advanced infrared satellite system

    NASA Astrophysics Data System (ADS)

    Bell, Kevin D.; Dafesh, Philip A.; Hsu, L. A.; Tsuda, A. S.

    1995-12-01

    Current architectural and design trade techniques often carry unaffordable alternatives late into the decision process. Early decisions made during the concept exploration and development (CE&D) phase will drive the cost of a program more than any other phase of development; thus, designers must be able to assess both the performance and cost impacts of their early choices. The Space Based Infrared System (SBIRS) cost engineering model (CEM) described in this paper is an end-to-end process integrating engineering and cost expertise through commonly available spreadsheet software, allowing for concurrent design engineering and cost estimation to identify and balance system drivers and so reduce acquisition costs. The automated interconnectivity between subsystem models using spreadsheet software allows for the quick and consistent assessment of system design impacts and relative cost impacts due to requirement changes. It differs from most CEM efforts attempted in the past in that it incorporates more detailed spacecraft and sensor payload models, and it has been applied to determine the cost drivers for an advanced infrared satellite system acquisition. The CEM is comprised of integrated, detailed engineering and cost estimating relationships describing performance, design, and cost parameters. Detailed models have been developed to evaluate design parameters for the spacecraft bus and sensor; both step-stare and scanner sensor types incorporate models of the focal plane array, optics, processing, thermal, communications, and mission performance. The current CEM effort has provided visibility into requirements, design, and cost drivers for system architects and decision makers to determine the configuration of an infrared satellite architecture that meets essential requirements cost-effectively. In general, the methodology described in this paper consists of process building blocks that can be tailored to the needs of many applications. Descriptions of the spacecraft and payload subsystem models provide insight into The Aerospace Corporation's expertise and the scope of the SBIRS concept development effort.
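
    As an illustration of the building blocks such a model chains together, the sketch below implements a single parametric cost estimating relationship (CER) and propagates a design change through it. The power-law form is standard in parametric costing; the coefficients and numbers are invented, not SBIRS values.

    ```python
    # Illustrative power-law CER; coefficients are made up for illustration.
    def subsystem_cost(mass_kg: float, a: float = 1.2, b: float = 0.7) -> float:
        """First-unit cost in $M as a power law of subsystem mass."""
        return a * mass_kg ** b

    # Roll a design change straight through to cost, as the spreadsheet CEM does:
    baseline = subsystem_cost(150.0)
    heavier = subsystem_cost(180.0)   # e.g., a requirement change adds 30 kg
    print(f"delta cost: {heavier - baseline:+.2f} $M")
    ```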

  6. Creative Thinking Processes: The Past and the Future

    ERIC Educational Resources Information Center

    Mumford, Michael D.; McIntosh, Tristan

    2017-01-01

    For more than one hundred years, students of creativity, including seminal efforts published in the "Journal of Creative Behavior," have sought to identify the key processes people must execute to produce creative problem solutions. In recent years, we have seen a consensual model of key creative thinking processes being accepted by the…

  7. Activational and effort-related aspects of motivation: neural mechanisms and implications for psychopathology

    PubMed Central

    Yohn, Samantha E.; López-Cruz, Laura; San Miguel, Noemí; Correa, Mercè

    2016-01-01

    Motivation has been defined as the process that allows organisms to regulate their internal and external environment, and control the probability, proximity and availability of stimuli. As such, motivation is a complex process that is critical for survival, which involves multiple behavioural functions mediated by a number of interacting neural circuits. Classical theories of motivation suggest that there are both directional and activational aspects of motivation, and activational aspects (i.e. speed and vigour of both the instigation and persistence of behaviour) are critical for enabling organisms to overcome work-related obstacles or constraints that separate them from significant stimuli. The present review discusses the role of brain dopamine and related circuits in behavioural activation, exertion of effort in instrumental behaviour, and effort-related decision-making, based upon both animal and human studies. Impairments in behavioural activation and effort-related aspects of motivation are associated with psychiatric symptoms such as anergia, fatigue, lassitude and psychomotor retardation, which cross multiple pathologies, including depression, schizophrenia, and Parkinson’s disease. Therefore, this review also attempts to provide an interdisciplinary approach that integrates findings from basic behavioural neuroscience, behavioural economics, clinical neuropsychology, psychiatry, and neurology, to provide a coherent framework for future research and theory in this critical field. Although dopamine systems are a critical part of the brain circuitry regulating behavioural activation, exertion of effort, and effort-related decision-making, mesolimbic dopamine is only one part of a distributed circuitry that includes multiple neurotransmitters and brain areas. Overall, there is a striking similarity between the brain areas involved in behavioural activation and effort-related processes in rodents and in humans. Animal models of effort-related decision-making are highly translatable to humans, and an emerging body of evidence indicates that alterations in effort-based decision-making are evident in several psychiatric and neurological disorders. People with major depression, schizophrenia, and Parkinson’s disease show evidence of decision-making biases towards a lower exertion of effort. Translational studies linking research with animal models, human volunteers, and clinical populations are greatly expanding our knowledge about the neural basis of effort-related motivational dysfunction, and it is hoped that this research will ultimately lead to improved treatment for motivational and psychomotor symptoms in psychiatry and neurology. PMID:27189581

  8. Technology transfer into the solid propulsion industry

    NASA Technical Reports Server (NTRS)

    Campbell, Ralph L.; Thomson, Lawrence J.

    1995-01-01

    This paper is a survey of the waste minimization efforts of industries outside of aerospace for possible applications in the manufacture of solid rocket motors (SRM) for NASA. The Redesigned Solid Rocket Motor (RSRM) manufacturing plan was used as the model for processes involved in the production of an SRM. A literature search was conducted to determine the recycling, waste minimization, and waste treatment methods used in the commercial sector that might find application in SRM production. Manufacturers, trade organizations, and professional associations were also contacted. Waste minimization efforts for current processes and replacement technologies, which might reduce the amount or severity of the wastes generated in SRM production, were investigated. An overview of the results of this effort is presented in this paper.

  9. Effort Deficits and Depression: The Influence of Anhedonic Depressive Symptoms on Cardiac Autonomic Activity During a Mental Challenge

    PubMed Central

    Silvia, Paul J.; Nusbaum, Emily C.; Eddington, Kari M.; Beaty, Roger E.; Kwapil, Thomas R.

    2014-01-01

    Motivational approaches to depression emphasize the role of dysfunctional motivational dynamics, particularly diminished reward and incentive processes associated with anhedonia. A study examined how anhedonic depressive symptoms, measured continuously across a wide range of severity, influenced the physiological mobilization of effort during a cognitive task. Using motivational intensity theory as a guide, we expected that the diminished incentive value associated with anhedonic depressive symptoms would reduce effort during a “do your best” challenge (also known as an unfixed or self-paced challenge), in which effort is a function of the value of achieving the task’s goal. Using impedance cardiography, two cardiac autonomic responses were assessed: pre-ejection period (PEP), a measure of sympathetic activity and our primary measure of interest, and respiratory sinus arrhythmia (RSA), a measure of parasympathetic activity. As expected, PEP slowed from baseline to task as anhedonic depressive symptoms increased (as measured with the DASS Depression scale), indicating diminished effort-related sympathetic activity. No significant effects appeared for RSA. The findings support motivational intensity theory as a translational model of effort processes in depression and clarify some inconsistent effects of depressive symptoms on effort-related physiology found in past work. PMID:25431505

  10. DebriSat Fragment Characterization System and Processing Status

    NASA Technical Reports Server (NTRS)

    Rivero, M.; Shiotani, B.; M. Carrasquilla; Fitz-Coy, N.; Liou, J. C.; Sorge, M.; Huynh, T.; Opiela, J.; Krisko, P.; Cowardin, H.

    2016-01-01

    The DebriSat project is a continuing effort sponsored by NASA and DoD to update existing break-up models using data obtained from hypervelocity impact tests performed to simulate on-orbit collisions. After the impact tests, a team at the University of Florida has been working to characterize the fragments in terms of their mass, size, shape, color and material content. The focus of the post-impact effort has been the collection of 2 mm and larger fragments resulting from the hypervelocity impact test. To date, in excess of 125K fragments have been recovered, approximately 40K more than the 85K fragments predicted by the existing models. While the fragment collection activities continue, there has been a transition to the characterization of the recovered fragments. Since the start of the characterization effort, the focus has been on the use of automation to (i) expedite the fragment characterization process and (ii) minimize the effects of human subjectivity on the results; e.g., automated data entry processes were developed and implemented to minimize errors during transcription of the measurement data. At all steps of the process, however, there is human oversight to ensure the integrity of the data. Additionally, repeatability and reproducibility tests have been developed and implemented to ensure that the instruments used in the characterization process are accurate and properly calibrated.

  11. A Model of Auditory-Cognitive Processing and Relevance to Clinical Applicability.

    PubMed

    Edwards, Brent

    2016-01-01

    Hearing loss and cognitive function interact in both a bottom-up and top-down relationship. Listening effort is tied to these interactions, and models have been developed to explain their relationship. The Ease of Language Understanding model in particular has gained considerable attention in its explanation of the effect of signal distortion on speech understanding. Signal distortion can also affect auditory scene analysis ability, however, resulting in a distorted auditory scene that can affect cognitive function, listening effort, and the allocation of cognitive resources. These effects are explained through an addition to the Ease of Language Understanding model. This model can be generalized to apply to all sounds, not only speech, representing the increased effort required for auditory environmental awareness and other nonspeech auditory tasks. While the authors have measures of speech understanding and cognitive load to quantify these interactions, they are lacking measures of the effect of hearing aid technology on auditory scene analysis ability and how effort and attention varies with the quality of an auditory scene. Additionally, the clinical relevance of hearing aid technology on cognitive function and the application of cognitive measures in hearing aid fittings will be limited until effectiveness is demonstrated in real-world situations.

  12. Neurocomputational mechanisms underlying subjective valuation of effort costs

    PubMed Central

    Giehl, Kathrin; Sillence, Annie

    2017-01-01

    In everyday life, we have to decide whether it is worth exerting effort to obtain rewards. Effort can be experienced in different domains, with some tasks requiring significant cognitive demand and others being more physically effortful. The motivation to exert effort for reward is highly subjective and varies considerably across the different domains of behaviour. However, very little is known about the computational or neural basis of how different effort costs are subjectively weighed against rewards. Is there a common, domain-general system of brain areas that evaluates all costs and benefits? Here, we used computational modelling and functional magnetic resonance imaging (fMRI) to examine the mechanisms underlying value processing in both the cognitive and physical domains. Participants were trained on two novel tasks that parametrically varied either cognitive or physical effort. During fMRI, participants indicated their preferences between a fixed low-effort/low-reward option and a variable higher-effort/higher-reward offer for each effort domain. Critically, reward devaluation by both cognitive and physical effort was subserved by a common network of areas, including the dorsomedial and dorsolateral prefrontal cortex, the intraparietal sulcus, and the anterior insula. Activity within these domain-general areas also covaried negatively with reward and positively with effort, suggesting an integration of these parameters within these areas. Additionally, the amygdala appeared to play a unique, domain-specific role in processing the value of rewards associated with cognitive effort. These results are the first to reveal the neurocomputational mechanisms underlying subjective cost–benefit valuation across different domains of effort and provide insight into the multidimensional nature of motivation. PMID:28234892
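
    The paper's exact equations are not reproduced in the abstract; a common form in this literature, shown below as a hedged sketch, is parabolic effort discounting combined with a softmax choice rule.

    ```python
    # Sketch of one common effort-discounting model (an assumption, not
    # necessarily the authors' formulation).
    import numpy as np

    def subjective_value(reward, effort, k):
        # Parabolic effort cost: SV = R - k * E^2
        return reward - k * effort ** 2

    def p_accept_offer(reward, effort, k, beta, sv_baseline):
        # Softmax probability of choosing the higher-effort/higher-reward offer
        sv_offer = subjective_value(reward, effort, k)
        return 1.0 / (1.0 + np.exp(-beta * (sv_offer - sv_baseline)))

    # Example: an 8-credit offer at effort level 3 versus a 1-credit low-effort baseline
    print(p_accept_offer(reward=8.0, effort=3.0, k=0.5, beta=1.0, sv_baseline=1.0))
    ```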

  13. Modeling gypsy moth seasonality

    Treesearch

    J. A. Logan; D. R. Gray

    1991-01-01

    Maintaining an appropriate seasonality is perhaps the most basic ecological requisite for insects living in temperate environments. The basic ecological importance of seasonality is enough to justify expending considerable effort to accurately model the processes involved. For insects of significant economic consequence, seasonality assumes additional importance...

  14. Integrating Surface Modeling into the Engineering Design Graphics Curriculum

    ERIC Educational Resources Information Center

    Hartman, Nathan W.

    2006-01-01

    It has been suggested there is a knowledge base that surrounds the use of 3D modeling within the engineering design process and correspondingly within engineering design graphics education. While solid modeling receives a great deal of attention and discussion relative to curriculum efforts, and rightly so, surface modeling is an equally viable 3D…

  15. Infrared Algorithm Development for Ocean Observations with EOS/MODIS

    NASA Technical Reports Server (NTRS)

    Brown, Otis B.

    1997-01-01

    Efforts continue under this contract to develop algorithms for the computation of sea surface temperature (SST) from MODIS infrared measurements. This effort includes radiative transfer modeling, comparison of in situ and satellite observations, development and evaluation of processing and networking methodologies for algorithm computation and data accession, evaluation of surface validation approaches for IR radiances, development of experimental instrumentation, and participation in MODIS project-related activities. Activities in this contract period have focused on radiative transfer modeling, evaluation of atmospheric correction methodologies, field campaigns, analysis of field data, and participation in MODIS meetings.
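
    For orientation, the classic "split-window" retrieval form that underlies this family of infrared SST algorithms is sketched below. The coefficients are placeholders; operational values are regressed against in situ (buoy) matchups.

    ```python
    # Split-window SST sketch; coefficients are illustrative, not operational.
    def split_window_sst(t11, t12, a0=1.0, a1=0.95, a2=2.5):
        """SST (deg C) from 11 and 12 micron brightness temperatures (deg C).

        The T11 - T12 difference corrects for water-vapor absorption.
        """
        return a0 + a1 * t11 + a2 * (t11 - t12)

    print(split_window_sst(t11=18.2, t12=17.1))  # warm, moderately moist case
    ```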

  16. A review of nuclear thermal propulsion carbide fuel corrosion and key issues

    NASA Technical Reports Server (NTRS)

    Pelaccio, Dennis G.; El-Genk, Mohamed S.

    1994-01-01

    Corrosion (mass loss) of carbide nuclear fuels due to their exposure to hot hydrogen in nuclear thermal propulsion engine systems greatly impacts the performance, thrust-to-weight and life of such systems. This report provides an overview of key issues and processes associated with the corrosion of carbide materials. Additionally, past pertinent development reactor test observations, as well as related experimental work and analysis modeling efforts are reviewed. At the conclusion, recommendations are presented, which provide the foundation for future corrosion modeling and verification efforts.

  17. An adapted yield criterion for the evolution of subsequent yield surfaces

    NASA Astrophysics Data System (ADS)

    Küsters, N.; Brosius, A.

    2017-09-01

    In numerical analysis of sheet metal forming processes, the anisotropic material behaviour is often modelled with isotropic work hardening and an average Lankford coefficient. In contrast, experimental observations show an evolution of the Lankford coefficients, which can be associated with a yield surface change due to kinematic and distortional hardening. Commonly, extensive efforts are carried out to describe these phenomena. In this paper an isotropic material model based on the Yld2000-2d criterion is adapted with an evolving yield exponent in order to change the yield surface shape. The yield exponent is linked to the accumulative plastic strain. This change has the effect of a rotating yield surface normal. As the normal is directly related to the Lankford coefficient, the change can be used to model the evolution of the Lankford coefficient during yielding. The paper will focus on the numerical implementation of the adapted material model for the FE-code LS-Dyna, mpi-version R7.1.2-d. A recently introduced identification scheme [1] is used to obtain the parameters for the evolving yield surface and will be briefly described for the proposed model. The suitability for numerical analysis will be discussed for deep drawing processes in general. Efforts for material characterization and modelling will be compared to other common yield surface descriptions. Besides experimental efforts and achieved accuracy, the potential of flexibility in material models and the risk of ambiguity during identification are of major interest in this paper.
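
    To make the core idea concrete, the sketch below lets the yield exponent evolve with accumulated plastic strain and shows its effect through an isotropic plane-stress Hosford function, a simpler stand-in for the full Yld2000-2d criterion. The saturation law and its constants are hypothetical.

    ```python
    # Evolving yield exponent: a higher/lower exponent changes the yield-surface
    # shape, rotating the surface normal and hence the predicted Lankford value.
    import numpy as np

    def exponent(eps_p, a0=8.0, a_inf=5.0, c=10.0):
        # Hypothetical saturation law linking the exponent to accumulated strain
        return a_inf + (a0 - a_inf) * np.exp(-c * eps_p)

    def hosford_equiv(s1, s2, a):
        # Isotropic plane-stress Hosford equivalent stress (reduces to von Mises at a = 2)
        return (0.5 * (abs(s1) ** a + abs(s2) ** a + abs(s1 - s2) ** a)) ** (1.0 / a)

    for eps_p in (0.0, 0.05, 0.2):
        a = exponent(eps_p)
        print(f"eps_p={eps_p:.2f}  a={a:.2f}  "
              f"equiv stress at (300, 150) MPa = {hosford_equiv(300.0, 150.0, a):.1f}")
    ```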

  18. Use of a business excellence model to improve conservation programs.

    PubMed

    Black, Simon; Groombridge, Jim

    2010-12-01

    The current shortfall in effectiveness within conservation biology is illustrated by increasing interest in "evidence-based conservation," whose proponents have identified the need to benchmark conservation initiatives against actions that lead to proven positive effects. The effectiveness of conservation policies, approaches, and evaluation is under increasing scrutiny, and in these areas models of excellence used in business could prove valuable. Typically, conservation programs require years of effort and involve rigorous long-term implementation processes. Successful balance of long-term efforts alongside the achievement of short-term goals is often compromised by management or budgetary constraints, a situation also common in commercial businesses. "Business excellence" is an approach many companies have used over the past 20 years to ensure continued success. Various business excellence evaluations have been promoted that include concepts that could be adapted and applied in conservation programs. We describe a conservation excellence model that shows how scientific processes and results can be aligned with financial and organizational measures of success. We applied the model to two well-documented species conservation programs. In the first, the Po'ouli program, several aspects of improvement were identified, such as more authority for decision making in the field and better integration of habitat management and population recovery processes. The second example, the black-footed ferret program, could have benefited from leadership effort to reduce bureaucracy and to encourage use of best-practice species recovery approaches. The conservation excellence model enables greater clarity in goal setting, more-effective identification of job roles within programs, better links between technical approaches and measures of biological success, and more-effective use of resources. The model could improve evaluation of a conservation program's effectiveness and may be used to compare different programs, for example during reviews of project performance by sponsoring organizations. © 2010 Society for Conservation Biology.

  19. The DEPICT model for participatory qualitative health promotion research analysis piloted in Canada, Zambia and South Africa.

    PubMed

    Flicker, Sarah; Nixon, Stephanie A

    2015-09-01

    Health promotion researchers are increasingly conducting Community-Based Participatory Research in an effort to reduce health disparities. Despite efforts towards greater inclusion, research teams continue to regularly exclude diverse representation from data analysis efforts. The DEPICT model for collaborative qualitative analysis is a democratic approach to enhancing rigour through inclusion of diverse stakeholders. It is broken down into six sequential steps. Strong leadership, coordination and facilitation skills are needed; however, the process is flexible enough to adapt to most environments and varying levels of expertise. Including diverse stakeholders on an analysis team can enrich data analysis and provide more nuanced understandings of complicated health problems. © The Author (2014). Published by Oxford University Press.

  20. Observation-Oriented Modeling: Going beyond "Is It All a Matter of Chance"?

    ERIC Educational Resources Information Center

    Grice, James W.; Yepez, Maria; Wilson, Nicole L.; Shoda, Yuichi

    2017-01-01

    An alternative to null hypothesis significance testing is presented and discussed. This approach, referred to as observation-oriented modeling, is centered on model building in an effort to explicate the structures and processes believed to generate a set of observations. In terms of analysis, this novel approach complements traditional methods…

  1. Landscape-based population viability models demonstrate importance of strategic conservation planning for birds

    Treesearch

    Thomas W. Bonnot; Frank R. Thompson; Joshua J. Millspaugh; D. Todd Jones-Farland

    2013-01-01

    Efforts to conserve regional biodiversity in the face of global climate change, habitat loss and fragmentation will depend on approaches that consider population processes at multiple scales. By combining habitat and demographic modeling, landscape-based population viability models effectively relate small-scale habitat and landscape patterns to regional population...

  2. Multi-scale Modeling of Chromosomal DNA in Living Cells

    NASA Astrophysics Data System (ADS)

    Spakowitz, Andrew

    The organization and dynamics of chromosomal DNA play a pivotal role in a range of biological processes, including gene regulation, homologous recombination, replication, and segregation. Establishing a quantitative theoretical model of DNA organization and dynamics would be valuable in bridging the gap between the molecular-level packaging of DNA and genome-scale chromosomal processes. Our research group utilizes analytical theory and computational modeling to establish a predictive theoretical model of chromosomal organization and dynamics. In this talk, I will discuss our efforts to develop multi-scale polymer models of chromosomal DNA that are both sufficiently detailed to address specific protein-DNA interactions while capturing experimentally relevant time and length scales. I will demonstrate how these modeling efforts are capable of quantitatively capturing aspects of behavior of chromosomal DNA in both prokaryotic and eukaryotic cells. This talk will illustrate that capturing dynamical behavior of chromosomal DNA at various length scales necessitates a range of theoretical treatments that accommodate the critical physical contributions that are relevant to in vivo behavior at these disparate length and time scales. National Science Foundation, Physics of Living Systems Program (PHY-1305516).

  3. OPC model generation procedure for different reticle vendors

    NASA Astrophysics Data System (ADS)

    Jost, Andrew M.; Belova, Nadya; Callan, Neal P.

    2003-12-01

    The challenge of delivering acceptable semiconductor products to customers in a timely fashion becomes more difficult as design complexity increases. The requirements of current-generation designs tax OPC engineers more than ever before, since late availability of high-quality OPC models can delay new process qualifications or lead to respins, which add to the upward-spiraling costs of new reticle sets, extend time-to-market, and disappoint customers. In their efforts to extend the printability of new designs, OPC engineers generally focus on the data-to-wafer path, ignoring data-to-mask effects almost entirely. However, it is unknown whether reticle makers' disparate processes truly yield comparable reticles, even with identical tools. This raises the question of whether a single OPC model is applicable to all reticle vendors. LSI Logic has developed a methodology for quantifying vendor-to-vendor reticle manufacturing differences and adapting OPC models for use at several reticle vendors. This approach allows LSI Logic to easily adapt existing OPC models for use with several reticle vendors and obviates the generation of unnecessary models, allowing OPC engineers to focus their efforts on the most critical layers.

  4. Toward a Visualization-Supported Workflow for Cyber Alert Management using Threat Models and Human-Centered Design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Franklin, Lyndsey; Pirrung, Megan A.; Blaha, Leslie M.

    Cyber network analysts follow complex processes in their investigations of potential threats to their network. Much research is dedicated to providing automated tool support in the effort to make their tasks more efficient, accurate, and timely. This tool support comes in a variety of implementations, from machine learning algorithms that monitor streams of data to visual analytic environments for exploring rich and noisy data sets. Cyber analysts, however, often speak of a need for tools which help them merge the data they already have and help them establish appropriate baselines against which to compare potential anomalies. Furthermore, existing threat models that cyber analysts regularly use to structure their investigations are not often leveraged in support tools. We report on our work with cyber analysts to understand their analytic process and how one such model, the MITRE ATT&CK Matrix [32], is used to structure their analytic thinking. We present our efforts to map the specific data needed by analysts into the threat model to inform our eventual visualization designs. We examine the data mapping for gaps where the threat model is under-supported by either data or tools, and we discuss these gaps as potential design spaces for future research efforts. We also discuss the design of a prototype tool that combines machine-learning and visualization components to support cyber analysts working with this threat model.
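
    A minimal sketch of the data-to-threat-model mapping idea: record which data sources cover which ATT&CK techniques, then surface the unsupported techniques as gaps. The technique IDs are real ATT&CK identifiers; the data-source inventory is invented for illustration.

    ```python
    # Map available data sources onto ATT&CK techniques and report gaps.
    coverage = {
        "T1110 Brute Force": ["auth_logs"],
        "T1046 Network Service Discovery": ["netflow", "ids_alerts"],
        "T1041 Exfiltration Over C2 Channel": [],   # no supporting data yet
    }

    gaps = [tech for tech, sources in coverage.items() if not sources]
    print("Techniques with no supporting data:", gaps)
    ```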

  5. Determining the compactive effort required to model pavement voids using the Corps of Engineers gyratory testing machine.

    DOT National Transportation Integrated Search

    1997-11-01

    Various agencies have used the Corps of Engineers gyratory testing machine (GTM) to design and test asphalt mixes. Materials properties such as shear strength and strain are measured during the compaction process. However, a compaction process duplic...

  6. The Practice of Information Processing Model in the Teaching of Cognitive Strategies

    ERIC Educational Resources Information Center

    Ozel, Ali

    2009-01-01

    This research seeks to determine how the teaching of learning strategies differs depending on the time that first-grade primary school teachers devote to forming an information-processing framework in students. The process, which draws on the efforts of 260 teachers in this direction, examines whether the adequate…

  7. Effects of Individual Differences and Job Characteristics on the Psychological Health of Italian Nurses

    PubMed Central

    Zurlo, Maria Clelia; Vallone, Federica; Smith, Andrew P.

    2018-01-01

    The Demand Resources and Individual Effects Model (DRIVE Model) is a transactional model that integrates the Demands-Control-Support and Effort-Reward Imbalance models, emphasising the role of individual characteristics (Coping Strategies; Overcommitment) and job characteristics (Job Demands, Social Support, Decision Latitude, Skill Discretion, Effort, Rewards) in the work-related stress process. The present study aimed to test the DRIVE Model in a sample of 450 Italian nurses and to compare findings with those of a study conducted in a sample of UK nurses. A questionnaire composed of the Ways of Coping Checklist-Revised (WCCL-R), the Job Content Questionnaire (JCQ), the ERI Test, and the Hospital Anxiety and Depression Scale (HADS) was used. Data supported the application of the DRIVE Model to the Italian context, showing significant associations of the individual characteristics of Problem-focused, Seek Advice and Wishful Thinking coping strategies and the job characteristics of Job Demands, Skill Discretion, Decision Latitude, and Effort with perceived levels of Anxiety and Depression. Effort represented the best predictor of psychological health conditions among Italian nurses, and Social Support significantly moderated the effects of Job Demands on perceived levels of Anxiety. The comparison study showed significant differences in the risk profiles of Italian and UK nurses. Findings were discussed in order to define focused interventions to promote nurses’ wellbeing.

  8. Community-based benchmarking of the CMIP DECK experiments

    NASA Astrophysics Data System (ADS)

    Gleckler, P. J.

    2015-12-01

    A diversity of community-based efforts are independently developing "diagnostic packages" with little or no coordination between them. A short list of examples includes NCAR's Climate Variability Diagnostics Package (CVDP), ORNL's International Land Model Benchmarking (ILAMB), LBNL's Toolkit for Extreme Climate Analysis (TECA), PCMDI's Metrics Package (PMP), the EU EMBRACE ESMValTool, the WGNE MJO diagnostics package, and the CFMIP diagnostics. The full value of these efforts cannot be realized without some coordination. As a first step, a WCRP effort has initiated a catalog to document candidate packages that could potentially be applied in a "repeat-use" fashion to all simulations contributed to the CMIP DECK (Diagnostic, Evaluation and Characterization of Klima) experiments. Some coordination of community-based diagnostics also has the potential to improve how CMIP modeling groups analyze their simulations during model development. The fact that most modeling groups now maintain a "CMIP compliant" data stream means that, in principle, they could readily adopt a set of well-organized diagnostic capabilities specifically designed to operate on CMIP DECK experiments without much effort. Ultimately, a detailed listing of and access to analysis codes that are demonstrated to work "out of the box" with CMIP data could enable model developers (and others) to select the codes they wish to implement in-house, potentially enabling more systematic evaluation during the model development process.

  9. Developing a Modeling Framework for Ecosystem Forecasting: The Lake Michigan Pilot

    EPA Science Inventory

    Recent multi-party efforts to coordinate modeling activities that support ecosystem management decision-making in the Great Lakes have resulted in the recommendation to convene an interagency working group that will develop a pilot approach for Lake Michigan. The process will br...

  10. A Thermo-Poromechanics Finite Element Model for Predicting Arterial Tissue Fusion

    NASA Astrophysics Data System (ADS)

    Fankell, Douglas P.

    This work provides modeling efforts and supplemental experimental work performed towards the ultimate goal of modeling heat transfer, mass transfer, and deformation occurring in biological tissue, in particular during arterial fusion and cutting. Developing accurate models of these processes accomplishes two goals. First, accurate models would enable engineers to design devices to be safer and less expensive. Second, the mechanisms behind tissue fusion and cutting are widely unknown; models with the ability to accurately predict physical phenomena occurring in the tissue will allow for insight into the underlying mechanisms of the processes. This work presents three aims and the efforts in achieving them, leading to an accurate model of tissue fusion and more broadly the thermo-poromechanics (TPM) occurring within biological tissue. Chapters 1 and 2 provide the motivation for developing accurate TPM models of biological tissue and an overview of previous modeling efforts. In Chapter 3, a coupled thermo-structural finite element (FE) model with the ability to predict arterial cutting is offered. From the work presented in Chapter 3, it became obvious a more detailed model was needed. Chapter 4 meets this need by presenting small strain TPM theory and its implementation in an FE code. The model is then used to simulate thermal tissue fusion. These simulations show the model's promise in predicting the water content and temperature of arterial wall tissue during the fusion process, but it is limited by its small deformation assumptions. Chapters 5-7 attempt to address this limitation by developing and implementing a large deformation TPM FE model. Chapters 5, 6, and 7 present a thermodynamically consistent, large deformation TPM FE model and its ability to simulate tissue fusion. Ultimately, this work provides several methods of simulating arterial tissue fusion and the thermo-poromechanics of biological tissue. It is the first work, to the author's knowledge, to simulate the fully coupled TPM of biological tissue and the first to present a fully coupled large deformation TPM FE model. In doing so, a stepping stone for more advanced modeling of biological tissue has been laid.

  11. Physical Modeling for Processing Geosynchronous Imaging Fourier Transform Spectrometer-Indian Ocean METOC Imager (GIFTS-IOMI) Hyperspectral Data

    DTIC Science & Technology

    2002-09-30

    Physical Modeling for Processing Geosynchronous Imaging Fourier Transform Spectrometer-Indian Ocean METOC Imager (GIFTS-IOMI) Hyperspectral Data... water quality assessment. OBJECTIVES The objective of this DoD research effort is to develop and demonstrate a fully functional GIFTS-IOMI... environment once GIFTS-IOMI is stationed over the Indian Ocean. The system will provide specialized methods for the characterization of the atmospheric...

  12. Defects Associated with Solidification of Melt Processed Superalloys for the Aerospace Industry

    DTIC Science & Technology

    2008-07-23

    resulting computational model will be in a form that is usable in their efforts to design new alloys and processing routes. Given the broad research... thermodynamics modeling by Asta and Woodward. The permeability of dendritic arrays in superalloys has been determined using three-dimensional reconstructions of... the solid-liquid mush and finite-element fluid simulations by Pollock and Spowart. Close interaction with industry ensured that computational...

  13. Stochastic Models of Human Errors

    NASA Technical Reports Server (NTRS)

    Elshamy, Maged; Elliott, Dawn M. (Technical Monitor)

    2002-01-01

    Humans play an important role in the overall reliability of engineering systems, and accidents and system failures are more often than not traced to human error. Therefore, in order to have a meaningful system risk analysis, the reliability of the human element must be taken into consideration. Describing the human error process with mathematical models is a key to analyzing contributing factors. The objective of this research effort, therefore, is to establish stochastic models, substantiated by a sound theoretical foundation, to address the occurrence of human errors in the processing of the Space Shuttle.
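
    As the simplest possible instance of such a model, the sketch below treats error occurrence as a homogeneous Poisson process. The actual models developed in the effort are not specified in the abstract, and the rate used here is illustrative.

    ```python
    # Homogeneous Poisson model of human error occurrence.
    import math

    def p_at_least_one_error(lam: float, tasks: float) -> float:
        """P(N >= 1) when the error count N ~ Poisson(lam * tasks)."""
        return 1.0 - math.exp(-lam * tasks)

    # e.g., 0.002 errors per task over 500 processing tasks
    print(f"{p_at_least_one_error(0.002, 500):.3f}")  # ~0.632
    ```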

  14. Development of Probabilistic Structural Analysis Integrated with Manufacturing Processes

    NASA Technical Reports Server (NTRS)

    Pai, Shantaram S.; Nagpal, Vinod K.

    2007-01-01

    An effort has been initiated to integrate manufacturing process simulations with probabilistic structural analyses in order to capture the important impacts of manufacturing uncertainties on component stress levels and life. Two physics-based manufacturing process models (one for powdered metal forging and the other for annular deformation resistance welding) have been linked to the NESSUS structural analysis code. This paper describes the methodology developed to perform this integration including several examples. Although this effort is still underway, particularly for full integration of a probabilistic analysis, the progress to date has been encouraging and a software interface that implements the methodology has been developed. The purpose of this paper is to report this preliminary development.
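
    The integration concept can be illustrated with a toy Monte Carlo loop: sample a manufacturing-process output, push it through a structural response, and read off a probability of exceeding an allowable. The response function and distributions below are invented stand-ins for the linked process models and the NESSUS analysis.

    ```python
    # Toy Monte Carlo propagation of manufacturing variability into stress.
    import numpy as np

    rng = np.random.default_rng(42)
    thickness = rng.normal(5.0, 0.15, 100_000)   # mm, process variability (invented)
    load = 1200.0                                 # N, fixed design load
    stress = load / (10.0 * thickness)            # MPa, toy structural response

    allowable = 26.0                              # MPa, illustrative allowable
    print("P(stress > allowable) =", float(np.mean(stress > allowable)))
    ```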

  15. Modeling nuclear processes by Simulink

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rashid, Nahrul Khair Alang Md, E-mail: nahrul@iium.edu.my

    2015-04-29

    Modelling and simulation are essential parts in the study of dynamic systems behaviours. In nuclear engineering, modelling and simulation are important to assess the expected results of an experiment before the actual experiment is conducted, or in the design of nuclear facilities. In education, modelling can give insight into the dynamics of systems and processes. Most nuclear processes can be described by ordinary or partial differential equations. Efforts expended to solve the equations using analytical or numerical solutions consume time and distract attention from the objectives of modelling itself. This paper presents the use of Simulink, a MATLAB toolbox that is widely used in control engineering, as a modelling platform for the study of nuclear processes including nuclear reactor behaviours. Starting from the describing equations, Simulink models for heat transfer, radionuclide decay processes, delayed neutron effects, reactor point kinetics equations with delayed neutron groups, and the effect of temperature feedback are used as examples.
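
    As an example of the describing equations mentioned above, the point kinetics model with one effective delayed-neutron group is sketched below in Python (SciPy standing in for Simulink). The parameter values are typical textbook numbers, not taken from the paper.

    ```python
    # Point kinetics with one effective delayed-neutron group:
    #   dn/dt = ((rho - beta)/Lam) n + lam c
    #   dc/dt = (beta/Lam) n - lam c
    import numpy as np
    from scipy.integrate import solve_ivp

    beta, lam, Lam = 0.0065, 0.08, 1e-4   # delayed fraction, decay const (1/s), generation time (s)
    rho = 0.001                            # step reactivity insertion

    def kinetics(t, y):
        n, c = y                           # neutron density, precursor concentration
        dn = (rho - beta) / Lam * n + lam * c
        dc = beta / Lam * n - lam * c
        return [dn, dc]

    # Start from equilibrium: c0 = beta * n0 / (lam * Lam)
    n0 = 1.0
    sol = solve_ivp(kinetics, (0.0, 5.0), [n0, beta * n0 / (lam * Lam)], max_step=0.01)
    print("relative power after 5 s:", sol.y[0, -1])
    ```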

  16. A Developmental Perspective on Peer Rejection, Deviant Peer Affiliation, and Conduct Problems Among Youth.

    PubMed

    Chen, Diane; Drabick, Deborah A G; Burgers, Darcy E

    2015-12-01

    Peer rejection and deviant peer affiliation are linked consistently to the development and maintenance of conduct problems. Two proposed models may account for longitudinal relations among these peer processes and conduct problems: the (a) sequential mediation model, in which peer rejection in childhood and deviant peer affiliation in adolescence mediate the link between early externalizing behaviors and more serious adolescent conduct problems; and (b) parallel process model, in which peer rejection and deviant peer affiliation are considered independent processes that operate simultaneously to increment risk for conduct problems. In this review, we evaluate theoretical models and evidence for associations among conduct problems and (a) peer rejection and (b) deviant peer affiliation. We then consider support for the sequential mediation and parallel process models. Next, we propose an integrated model incorporating both the sequential mediation and parallel process models. Future research directions and implications for prevention and intervention efforts are discussed.

  17. Towards Automatic Processing of Virtual City Models for Simulations

    NASA Astrophysics Data System (ADS)

    Piepereit, R.; Schilling, A.; Alam, N.; Wewetzer, M.; Pries, M.; Coors, V.

    2016-10-01

    Especially in the field of numerical simulations, such as flow and acoustic simulations, interest in using virtual 3D models to optimize urban systems is increasing. In the few instances in which simulations have already been carried out in practice, the processing of models has required an extremely high, and therefore uneconomical, manual effort. The different ways of capturing models in Geographic Information Systems (GIS) and Computer Aided Engineering (CAE) further increase the already very high complexity of the processing. To obtain virtual 3D models suitable for simulation, we developed a tool for automatic processing with the goal of establishing ties between the world of GIS and that of CAE. In this paper we introduce a way to use Coons surfaces for the automatic processing of building models in LoD2, and investigate ways to simplify LoD3 models in order to remove information unnecessary for a numerical simulation.
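
    For reference, a bilinearly blended Coons patch interpolates four boundary curves as sketched below; the boundary curves used here are simple straight lines chosen for illustration.

    ```python
    # Bilinearly blended Coons patch: two ruled surfaces minus the bilinear
    # interpolant of the four corners.
    import numpy as np

    def coons(u, v, c0, c1, d0, d1):
        """Patch point; c0(u), c1(u): bottom/top curves, d0(v), d1(v): left/right."""
        ruled_u = (1 - v) * c0(u) + v * c1(u)
        ruled_v = (1 - u) * d0(v) + u * d1(v)
        bilinear = ((1 - u) * (1 - v) * c0(0) + u * (1 - v) * c0(1)
                    + (1 - u) * v * c1(0) + u * v * c1(1))
        return ruled_u + ruled_v - bilinear

    # Unit-square boundaries, with the top edge raised to z = 0.5 (a "roof" line)
    c0 = lambda u: np.array([u, 0.0, 0.0])
    c1 = lambda u: np.array([u, 1.0, 0.5])
    d0 = lambda v: np.array([0.0, v, 0.5 * v])
    d1 = lambda v: np.array([1.0, v, 0.5 * v])

    print(coons(0.5, 0.5, c0, c1, d0, d1))   # point at the patch centre
    ```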

  18. A Developmental Perspective on Peer Rejection, Deviant Peer Affiliation, and Conduct Problems among Youth

    PubMed Central

    Chen, Diane; Drabick, Deborah A. G.; Burgers, Darcy E.

    2015-01-01

    Peer rejection and deviant peer affiliation are linked consistently to the development and maintenance of conduct problems. Two proposed models may account for longitudinal relations among these peer processes and conduct problems: the (a) sequential mediation model, in which peer rejection in childhood and deviant peer affiliation in adolescence mediate the link between early externalizing behaviors and more serious adolescent conduct problems; and (b) parallel process model, in which peer rejection and deviant peer affiliation are considered independent processes that operate simultaneously to increment risk for conduct problems. In this review, we evaluate theoretical models and evidence for associations among conduct problems and (a) peer rejection and (b) deviant peer affiliation. We then consider support for the sequential mediation and parallel process models. Next, we propose an integrated model incorporating both the sequential mediation and parallel process models. Future research directions and implications for prevention and intervention efforts are discussed. PMID:25410430

  19. Finite Element Models for Electron Beam Freeform Fabrication Process

    NASA Technical Reports Server (NTRS)

    Chandra, Umesh

    2012-01-01

    Electron beam freeform fabrication (EBF3) is a member of an emerging class of direct manufacturing processes known as solid freeform fabrication (SFF); another member of the class is the laser deposition process. Successful application of the EBF3 process requires precise control of a number of process parameters such as the EB power, speed, and metal feed rate in order to ensure thermal management; good fusion between the substrate and the first layer and between successive layers; minimize part distortion and residual stresses; and control the microstructure of the finished product. This is the only effort thus far that has addressed computer simulation of the EBF3 process. The models developed in this effort can assist in reducing the number of trials in the laboratory or on the shop floor while making high-quality parts. With some modifications, their use can be further extended to the simulation of laser, TIG (tungsten inert gas), and other deposition processes. A solid mechanics-based finite element code, ABAQUS, was chosen as the primary engine in developing these models whereas a computational fluid dynamics (CFD) code, Fluent, was used in a support role. Several innovative concepts were developed, some of which are highlighted below. These concepts were implemented in a number of new computer models either in the form of stand-alone programs or as user subroutines for ABAQUS and Fluent codes. A database of thermo-physical, mechanical, fluid, and metallurgical properties of stainless steel 304 was developed. Computing models for Gaussian and raster modes of the electron beam heat input were developed. Also, new schemes were devised to account for the heat sink effect during the deposition process. These innovations, and others, lead to improved models for thermal management and prediction of transient/residual stresses and distortions. Two approaches for the prediction of microstructure were pursued. The first was an empirical approach involving the computation of thermal gradient, solidification rate, and velocity (G,R,V) coupled with the use of a solidification map that should be known a priori. The second approach relies completely on computer simulation. For this purpose a criterion for the prediction of morphology was proposed, which was combined with three alternative models for the prediction of microstructure; one based on solidification kinetics, the second on phase diagram, and the third on differential scanning calorimetry data. The last was found to be the simplest and the most versatile; it can be used with multicomponent alloys and rapid solidification without any additional difficulty. For the purpose of (limited) experimental validation, finite element models developed in this effort were applied to three different shapes made of stainless steel 304 material, designed expressly for this effort with an increasing level of complexity. These finite element models require large computation time, especially when applied to deposits with multiple adjacent beads and layers. This problem can be overcome, to some extent, by the use of fast, multi-core computers. Also, due to their numerical nature coupled with the fact that solid mechanics- based models are being used to represent the material behavior in liquid and vapor phases as well, the models have some inherent approximations that become more pronounced when dealing with multi-bead and multi-layer deposits.
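
    The Gaussian mode of beam heat input mentioned above is conventionally modeled as an axisymmetric surface flux; a sketch follows, with illustrative power and radius values rather than ones from this effort.

    ```python
    # Standard axisymmetric Gaussian surface-flux model of a beam heat source.
    import math

    def gaussian_flux(r: float, power: float, r_beam: float, eta: float = 0.9) -> float:
        """Surface heat flux (W/m^2) at radius r from the beam axis.

        q(r) = 2 * eta * P / (pi * rb^2) * exp(-2 r^2 / rb^2); integrating this
        over the whole plane recovers the absorbed power eta * P.
        """
        return (2.0 * eta * power) / (math.pi * r_beam ** 2) * math.exp(-2.0 * r ** 2 / r_beam ** 2)

    print(f"{gaussian_flux(r=0.0, power=2000.0, r_beam=0.5e-3):.3e} W/m^2 at the beam centre")
    ```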

  20. Modeling of Powder Bed Manufacturing Defects

    NASA Astrophysics Data System (ADS)

    Mindt, H.-W.; Desmaison, O.; Megahed, M.; Peralta, A.; Neumann, J.

    2018-01-01

    Powder bed additive manufacturing offers unmatched capabilities. The deposition resolution achieved is extremely high enabling the production of innovative functional products and materials. Achieving the desired final quality is, however, hampered by many potential defects that have to be managed in due course of the manufacturing process. Defects observed in products manufactured via powder bed fusion have been studied experimentally. In this effort we have relied on experiments reported in the literature and—when experimental data were not sufficient—we have performed additional experiments providing an extended foundation for defect analysis. There is large interest in reducing the effort and cost of additive manufacturing process qualification and certification using integrated computational material engineering. A prerequisite is, however, that numerical methods can indeed capture defects. A multiscale multiphysics platform is developed and applied to predict and explain the origin of several defects that have been observed experimentally during laser-based powder bed fusion processes. The models utilized are briefly introduced. The ability of the models to capture the observed defects is verified. The root cause of the defects is explained by analyzing the numerical results thus confirming the ability of numerical methods to provide a foundation for rapid process qualification.

  1. Formal Analysis of BPMN Models Using Event-B

    NASA Astrophysics Data System (ADS)

    Bryans, Jeremy W.; Wei, Wei

    The use of business process models has gone far beyond documentation purposes. In the development of business applications, they can play the role of an artifact on which high-level properties can be verified and design errors can be revealed, in an effort to reduce overhead at later software development and diagnosis stages. This paper demonstrates how formal verification may add value to the specification, design, and development of business process models in an industrial setting. The analysis of these models is achieved via an algorithmic translation from the de facto standard business process modeling language, BPMN, to Event-B, a widely used formal language supported by the Rodin platform, which offers a range of simulation and verification technologies.

  2. The study on knowledge transferring incentive for information system requirement development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Yang

    2015-03-10

    Information system requirement development is a process of sharing and transferring users' knowledge. Developing tacit requirements is a central problem in this process, because such requirements are difficult to encode, express, and communicate. Knowledge fusion and cooperative effort are needed to uncover tacit requirements. Against this background, this paper seeks the rule governing the dynamic evolution of effort by software developers and users, by building an evolutionary game model under an incentive system, and closes with an in-depth discussion.
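
    The abstract does not give the model's payoff structure, so the following is only a generic sketch of the kind of two-population evolutionary game it describes: replicator dynamics over developers' and users' choice between high and low knowledge-transfer effort. All payoff values are invented for illustration.

        import numpy as np

        # Hypothetical payoffs: A[i][j] = developer payoff when the developer plays i
        # (0 = high effort, 1 = low effort) against user strategy j; B is the user's.
        A = np.array([[3.0, 1.0], [2.0, 1.5]])
        B = np.array([[3.0, 2.0], [1.0, 1.5]])

        def step(x, y, dt=0.01):
            """One Euler step of two-population replicator dynamics.
            x, y -- probabilities that developer/user choose high effort."""
            fx = A @ np.array([y, 1 - y])        # developer payoff per strategy
            fy = B.T @ np.array([x, 1 - x])      # user payoff per strategy
            x += dt * x * (1 - x) * (fx[0] - fx[1])
            y += dt * y * (1 - y) * (fy[0] - fy[1])
            return x, y

        x, y = 0.3, 0.4                          # invented initial effort shares
        for _ in range(10000):
            x, y = step(x, y)
        print(f"long-run share of high effort: developer={x:.2f}, user={y:.2f}")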

  3. Studies of Solar Wind Interaction and Ionospheric Processes at Venus and Mars

    NASA Technical Reports Server (NTRS)

    Bogan, Denis (Technical Monitor); Nagy, Andrew F.

    2003-01-01

    This is the final report summarizing the work done during the last three years under NASA Grant NAG5-8946. Our efforts centered on a systematic development of a new generation of three dimensional magneto-hydrodynamic (MHD) numerical code, which models the interaction processes of the solar wind or fast flowing magnetospheric plasma with 'non-magnetic' solar system bodies (e.g. Venus, Mars, Europa, Titan). We have also worked on a number of different, more specific and discrete studies, as various opportunities arose. In the next few pages we briefly summarize these efforts.

  4. Terrestrial implications of mathematical modeling developed for space biomedical research

    NASA Technical Reports Server (NTRS)

    Lujan, Barbara F.; White, Ronald J.; Leonard, Joel I.; Srinivasan, R. Srini

    1988-01-01

    This paper summarizes several related research projects supported by NASA which seek to apply computer models to space medicine and physiology. These efforts span a wide range of activities, including mathematical models used for computer simulations of physiological control systems; power spectral analysis of physiological signals; pattern recognition models for detection of disease processes; and computer-aided diagnosis programs.
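
    Of the activities listed, power spectral analysis is the one most directly illustrated in code. Below is a minimal, hypothetical example (synthetic signal, assumed sampling rate; not NASA's actual analysis pipeline) using Welch's method:

        import numpy as np
        from scipy import signal

        fs = 250.0                                   # sampling rate in Hz (assumed)
        t = np.arange(0, 60, 1 / fs)
        # Synthetic "physiological" signal: a 1.2 Hz cardiac-like rhythm plus noise
        x = np.sin(2 * np.pi * 1.2 * t) + 0.5 * np.random.randn(t.size)

        f, pxx = signal.welch(x, fs=fs, nperseg=1024)  # Welch power spectral density
        print(f"dominant frequency: {f[np.argmax(pxx)]:.2f} Hz")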

  5. Improved Traceability of a Small Satellite Mission Concept to Requirements Using Model Based System Engineering

    NASA Technical Reports Server (NTRS)

    Reil, Robin L.

    2014-01-01

    Model Based Systems Engineering (MBSE) has recently been gaining significant support as a means to improve the "traditional" document-based systems engineering (DBSE) approach to engineering complex systems. In the spacecraft design domain, there are many perceived and proposed benefits of an MBSE approach, but little analysis has been presented to determine the tangible benefits of such an approach (e.g., time and cost saved, increased product quality). This paper presents direct examples of how developing a small satellite system model can improve traceability of the mission concept to its requirements. A comparison of the processes and approaches for MBSE and DBSE is made using the NASA Ames Research Center SporeSat CubeSat mission as a case study. A model of the SporeSat mission is built using the Systems Modeling Language standard and No Magic's MagicDraw modeling tool. The model incorporates mission concept and requirement information from the mission's original DBSE design efforts. Active dependency relationships are modeled to demonstrate the completeness and consistency of the requirements to the mission concept. Anecdotal information and process-duration metrics are presented for both the MBSE and original DBSE design efforts of SporeSat.

  6. Competition-Based Learning: A Model for the Integration of Competitions with Project-Based Learning Using Open Source LMS

    ERIC Educational Resources Information Center

    Issa, Ghassan; Hussain, Shakir M.; Al-Bahadili, Hussein

    2014-01-01

    In an effort to enhance the learning process in higher education, a new model for Competition-Based Learning (CBL) is presented. The new model utilizes two well-known learning models, namely, the Project-Based Learning (PBL) and competitions. The new model is also applied in a networked environment with emphasis on collective learning as well as…

  7. A Cognitive Ecological Model of Women’s Response to Male Sexual Coercion in Dating

    PubMed Central

    Nurius, Paula S.; Norris, Jeanette

    2015-01-01

    We offer a theoretical model that consolidates background, environmental, and intrapersonal variables related to women’s experience of sexual coercion in dating into a coherent ecological framework and present for the first time a cognitive analysis of the processes women use to formulate responses to sexual coercion. An underlying premise for this model is that a woman’s coping response to sexual coercion by an acquaintance is mediated through cognitive processing of background and situational influences. Because women encounter this form of sexual coercion in the context of relationships and situations that they presume will follow normative expectations (e.g., about making friends, socializing and dating), it is essential to consider normative processes of learning, cognitive mediation, and coping guiding their efforts to interpret and respond to this form of personal threat. Although acts of coercion unquestionably remain the responsibility of the perpetrator, a more complete understanding of the multilevel factors shaping women’s perception of and response to threats can strengthen future inquiry and prevention efforts. PMID:25729157

  8. Simulating Mercury And Methyl Mercury Stream Concentrations At Multiple Scales in a Wetland Influenced Coastal Plain Watershed (McTier Creek, SC, USA)

    EPA Science Inventory

    Use of mechanistic models to improve understanding: differential, mass-balance, process-based; spatial and temporal resolution; necessary simplifications of system complexity; combining field monitoring and modeling efforts; balance between capturing complexity and maintaining...

  9. Predicting field weed emergence with empirical models and soft computing techniques

    USDA-ARS?s Scientific Manuscript database

    Seedling emergence is the most important phenological process that influences the success of weed species; therefore, predicting weed emergence timing plays a critical role in scheduling weed management measures. Important efforts have been made in the attempt to develop models to predict seedling e...

  10. Experimental comparison of residual stresses for a thermomechanical model for the simulation of selective laser melting

    DOE PAGES

    Hodge, N. E.; Ferencz, R. M.; Vignes, R. M.

    2016-05-30

    Selective laser melting (SLM) is an additive manufacturing process in which multiple, successive layers of metal powders are heated via laser in order to build a part. Modeling of SLM requires consideration of the complex interaction between heat transfer and solid mechanics. The present work describes the authors' initial efforts to validate their first-generation model. In particular, a comparison of model-generated solid mechanics results, including both deformation and stresses, is presented. Additionally, results of various perturbations of the process parameters and modeling strategies are discussed.

  11. Tracking moving identities: after attending the right location, the identity does not come for free.

    PubMed

    Pinto, Yaïr; Scholte, H Steven; Lamme, V A F

    2012-01-01

    Although tracking identical moving objects has been studied since the 1980s, the study of tracking moving objects with distinct identities (referred to as Multiple Identity Tracking, MIT) has begun only recently. So far, only behavioral studies of MIT have been undertaken. These studies have left a fundamental question about MIT unanswered: is MIT a one-stage or a two-stage process? According to the one-stage model, after a location has been attended, the identity is released without effort. According to the two-stage model, however, there are two effortful stages in MIT: attending to a location, and attending to the identity of the object at that location. In the current study we investigated this question by measuring brain activity in response to tracking familiar and unfamiliar targets. Familiarity is known to automate effortful processes, so if attention is needed to identify the object, identification should become easier with familiar targets. If no such attention is needed, familiarity can only affect other processes (such as memory for the target set). Our results revealed that on unfamiliar trials neural activity was higher in both attentional networks and visual identification networks. These results suggest that familiarity in MIT automates attentional identification processes, and thus that attentional identification is needed in MIT. This in turn implies that MIT is essentially a two-stage process, since after attending the location, the identity does not seem to come for free.

  12. Discrimination of correlated and entangling quantum channels with selective process tomography

    DOE PAGES

    Dumitrescu, Eugene; Humble, Travis S.

    2016-10-10

    The accurate and reliable characterization of quantum dynamical processes underlies efforts to validate quantum technologies, where discrimination between competing models of observed behaviors informs efforts to fabricate and operate qubit devices. We present a protocol for quantum channel discrimination that leverages advances in direct characterization of quantum dynamics (DCQD) codes. We demonstrate that DCQD codes enable selective process tomography to improve discrimination between entangling and correlated quantum dynamics. Numerical simulations show that selective process tomography requires only a few measurement configurations to achieve a low false alarm rate and that the DCQD encoding improves the resilience of the protocol to hidden sources of noise. Lastly, our results show that selective process tomography with DCQD codes is useful for efficiently distinguishing sources of correlated crosstalk from uncorrelated noise in current and future experimental platforms.

  13. Process Correlation Analysis Model for Process Improvement Identification

    PubMed Central

    Park, Sooyong

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by a process assessment identifying strengths and weaknesses, and improvement plans are developed based on the findings. In general, a process reference model (e.g., CMMI) is used as the base throughout the software process improvement effort. CMMI defines a set of process areas involved in software development and what is to be carried out in those areas in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of the software development process. In current practice, however, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to the significant effort and expertise required. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data. PMID:24977170

  14. Process correlation analysis model for process improvement identification.

    PubMed

    Choi, Su-jin; Kim, Dae-Kyoo; Park, Sooyong

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by a process assessment identifying strengths and weaknesses, and improvement plans are developed based on the findings. In general, a process reference model (e.g., CMMI) is used as the base throughout the software process improvement effort. CMMI defines a set of process areas involved in software development and what is to be carried out in those areas in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of the software development process. In current practice, however, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to the significant effort and expertise required. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.
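
    The papers define their correlation model on CMMI goals and practices plus empirical improvement data, details that are not recoverable from the abstracts. The sketch below only illustrates the generic core idea, mining pairwise correlations of process-element scores from assessment results, on invented data:

        import numpy as np

        # Hypothetical assessment scores (rows: projects, columns: process
        # elements, e.g. CMMI practices) on a 0-5 capability scale.
        scores = np.array([
            [3.0, 2.5, 4.0, 1.5],
            [2.0, 2.0, 3.5, 1.0],
            [4.0, 3.5, 4.5, 2.5],
            [1.5, 1.0, 2.0, 1.0],
        ])

        corr = np.corrcoef(scores, rowvar=False)   # element-by-element correlations
        # Flag strongly correlated pairs so an improvement plan can treat them together
        for i in range(corr.shape[0]):
            for j in range(i + 1, corr.shape[1]):
                if abs(corr[i, j]) > 0.8:
                    print(f"elements {i} and {j} correlate at {corr[i, j]:+.2f}")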

  15. Creating a stage-based deterministic PVA model - the western prairie fringed orchid [Exercise 12

    Treesearch

    Carolyn Hull Sieg; Rudy M. King; Fred Van Dyke

    2003-01-01

    Contemporary efforts to conserve populations and species often employ population viability analysis (PVA), a specific application of population modeling that estimates the effects of environmental and demographic processes on population growth rates. These models can also be used to estimate probabilities that a population will fall below a certain level. This...
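
    A stage-based deterministic PVA of the kind this exercise describes is typically built on a Lefkovitch projection matrix whose dominant eigenvalue gives the asymptotic population growth rate. Here is a minimal sketch with invented stage classes and vital rates (not the orchid data from the exercise):

        import numpy as np

        # Hypothetical stage-based projection matrix for a perennial plant:
        # stages = seed, vegetative, flowering. Entry [i, j] is the per-year
        # contribution of stage j to stage i (transition or fecundity); all
        # values are invented for illustration.
        A = np.array([
            [0.10, 0.00, 5.00],
            [0.05, 0.40, 0.20],
            [0.00, 0.30, 0.50],
        ])

        eigvals = np.linalg.eigvals(A)
        lam = eigvals[np.argmax(eigvals.real)].real   # dominant eigenvalue = growth rate
        print(f"lambda = {lam:.3f} ({'growing' if lam > 1 else 'declining'} population)")

        # Deterministic projection of an initial stage vector over 20 years
        n = np.array([100.0, 10.0, 5.0])
        for _ in range(20):
            n = A @ n
        print("stage vector after 20 years:", np.round(n, 1))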

  16. Comparing Data Input Requirements of Statistical vs. Process-based Watershed Models Applied for Prediction of Fecal Indicator and Pathogen Levels in Recreational Beaches

    EPA Science Inventory

    Same day prediction of fecal indicator bacteria (FIB) concentrations and bather protection from the risk of exposure to pathogens are two important goals of implementing a modeling program at recreational beaches. Sampling efforts for modelling applications can be expensive and t...

  18. Software Framework for Advanced Power Plant Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    John Widmann; Sorin Munteanu; Aseem Jain

    2010-08-01

    This report summarizes the work accomplished during the Phase II development effort of the Advanced Process Engineering Co-Simulator (APECS). The objective of the project is to develop the tools to efficiently combine high-fidelity computational fluid dynamics (CFD) models with process modeling software. During the course of the project, a robust integration controller was developed that can be used in any CAPE-OPEN compliant process modeling environment. The controller mediates the exchange of information between the process modeling software and the CFD software. Several approaches to reducing the time disparity between CFD simulations and process modeling have been investigated and implemented. These include enabling the CFD models to be run on a remote cluster and enabling multiple CFD models to be run simultaneously. Furthermore, computationally fast reduced-order models (ROMs) have been developed that can be 'trained' using the results from CFD simulations and then used directly within flowsheets. Unit operation models (both CFD and ROMs) can be uploaded to a model database and shared between multiple users.

  19. ProcessGene-Connect: SOA Integration between Business Process Models and Enactment Transactions of Enterprise Software Systems

    NASA Astrophysics Data System (ADS)

    Wasser, Avi; Lincoln, Maya

    In recent years, both practitioners and applied researchers have become increasingly interested in methods for integrating business process models and enterprise software systems through the deployment of enabling middleware. Integrative BPM research has mainly focused on the conversion of workflow notations into enacted application procedures, and less effort has been invested in enhancing the connectivity between design-level, non-workflow business process models and related enactment systems such as ERP, SCM, and CRM. This type of integration is useful at several stages of an IT system lifecycle, from design and implementation through change management, upgrades, and rollout. The paper presents an integration method that utilizes SOA for connecting business process models with corresponding enterprise software systems. The method is then demonstrated through an Oracle E-Business Suite procurement process and its ERP transactions.

  20. Engineering the Business of Defense Acquisition: An Analysis of Program Office Processes

    DTIC Science & Technology

    2015-05-01

    [Fragmentary record] Cites "Information Technology and Business Process Redesign," MIT Sloan Management Review (retrieved from http://sloanreview.mit.edu...); links systems management to process execution; describes a three-phase, multi-year effort whose current phase includes a literature review and formal model development.

  1. Systematic errors in Monsoon simulation: importance of the equatorial Indian Ocean processes

    NASA Astrophysics Data System (ADS)

    Annamalai, H.; Taguchi, B.; McCreary, J. P., Jr.; Nagura, M.; Miyama, T.

    2015-12-01

    In climate models, simulating the monsoon precipitation climatology remains a grand challenge. Compared to CMIP3, the multi-model-mean (MMM) errors for the Asian-Australian monsoon (AAM) precipitation climatology in CMIP5, relative to GPCP observations, have shown little improvement. One implication is that uncertainties in the future projections of time-mean changes to AAM rainfall may not have been reduced from CMIP3 to CMIP5. Despite dedicated efforts by the modeling community, progress in monsoon modeling is rather slow. This leads us to wonder: has the scientific community reached a "plateau" in modeling mean monsoon precipitation? Our focus here is to better understand the coupled air-sea interactions and moist processes that govern the precipitation characteristics over the tropical Indian Ocean, where large-scale errors persist. A series of idealized coupled model experiments is performed to test the hypothesis that errors in the coupled processes along the equatorial Indian Ocean during inter-monsoon seasons could potentially influence systematic errors during the monsoon season. Moist static energy budget diagnostics have been performed to identify the leading moist and radiative processes that account for the large-scale errors in the simulated precipitation. As a way forward, we propose three coordinated efforts: (i) idealized coupled model experiments; (ii) process-based diagnostics; and (iii) direct observations to constrain model physics. We will argue that a systematic and coordinated approach to identifying the various interactive processes that shape the precipitation basic state needs to be carried out, and that high-quality observations over the data-sparse monsoon region are needed to validate models and further improve model physics.
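
    For reference, the moist static energy whose column budget such diagnostics evaluate, and the usual vertically integrated form of that budget, are conventionally written as follows; these are standard textbook expressions, not equations quoted from the record.

        \[
          m = c_p T + g z + L_v q ,
        \]
        \[
          \frac{\partial \langle m \rangle}{\partial t}
            = - \langle \mathbf{v} \cdot \nabla m \rangle
              - \langle \omega \, \partial_p m \rangle
              + F_{\mathrm{net}} ,
        \]
        % c_p: specific heat at constant pressure; T: temperature; g: gravity;
        % z: height; L_v: latent heat of vaporization; q: specific humidity;
        % angle brackets: mass-weighted vertical integral over the column;
        % F_net: net energy flux into the column (surface fluxes plus radiation).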

  2. Interoperability of Neuroscience Modeling Software

    PubMed Central

    Cannon, Robert C.; Gewaltig, Marc-Oliver; Gleeson, Padraig; Bhalla, Upinder S.; Cornelis, Hugo; Hines, Michael L.; Howell, Fredrick W.; Muller, Eilif; Stiles, Joel R.; Wils, Stefan; De Schutter, Erik

    2009-01-01

    Neuroscience increasingly uses computational models to assist in the exploration and interpretation of complex phenomena. As a result, considerable effort is invested in the development of software tools and technologies for numerical simulations and for the creation and publication of models. The diversity of related tools leads to the duplication of effort and hinders model reuse. Development practices and technologies that support interoperability between software systems therefore play an important role in making the modeling process more efficient and in ensuring that published models can be reliably and easily reused. Various forms of interoperability are possible including the development of portable model description standards, the adoption of common simulation languages or the use of standardized middleware. Each of these approaches finds applications within the broad range of current modeling activity. However more effort is required in many areas to enable new scientific questions to be addressed. Here we present the conclusions of the “Neuro-IT Interoperability of Simulators” workshop, held at the 11th computational neuroscience meeting in Edinburgh (July 19-20 2006; http://www.cnsorg.org). We assess the current state of interoperability of neural simulation software and explore the future directions that will enable the field to advance. PMID:17873374

  3. NAME Modeling and Climate Process Team

    NASA Astrophysics Data System (ADS)

    Schemm, J. E.; Williams, L. N.; Gutzler, D. S.

    2007-05-01

    The NAME Climate Process and Modeling Team (CPT) has been established to address the need to link climate process research to model development and testing activities for warm season climate prediction. The project builds on two existing NAME-related modeling efforts. One major component of this project is the organization and implementation of a second phase of NAMAP, based on the 2004 season. NAMAP2 will re-examine the metrics proposed by NAMAP, extend the NAMAP analysis to transient variability, exploit the extensive observational database provided by NAME 2004 to analyze simulation targets of special interest, and expand participation. Vertical column analysis will bring local NAME observations and model outputs together in a context where key physical processes in the models can be evaluated and improved. The second component builds on the current NAME-related modeling effort focused on the diurnal cycle of precipitation in several global models, including those implemented at NCEP, NASA, and GFDL. Our activities will focus on the ability of the operational NCEP Global Forecast System (GFS) to simulate the diurnal and seasonal evolution of warm season precipitation during the NAME 2004 EOP, and on changes to the treatment of deep convection in the complicated terrain of the NAMS domain that are necessary to improve the simulations, and ultimately the predictions, of warm season precipitation. These activities will be strongly tied to NAMAP2 to ensure technology transfer from research to operations. Results based on experiments conducted with the NCEP CFS GCM will be reported at the conference, with emphasis on the impact of horizontal resolution in predicting warm season precipitation over North America.

  4. Activational and effort-related aspects of motivation: neural mechanisms and implications for psychopathology.

    PubMed

    Salamone, John D; Yohn, Samantha E; López-Cruz, Laura; San Miguel, Noemí; Correa, Mercè

    2016-05-01

    Motivation has been defined as the process that allows organisms to regulate their internal and external environment, and control the probability, proximity and availability of stimuli. As such, motivation is a complex process that is critical for survival, which involves multiple behavioural functions mediated by a number of interacting neural circuits. Classical theories of motivation suggest that there are both directional and activational aspects of motivation, and activational aspects (i.e. speed and vigour of both the instigation and persistence of behaviour) are critical for enabling organisms to overcome work-related obstacles or constraints that separate them from significant stimuli. The present review discusses the role of brain dopamine and related circuits in behavioural activation, exertion of effort in instrumental behaviour, and effort-related decision-making, based upon both animal and human studies. Impairments in behavioural activation and effort-related aspects of motivation are associated with psychiatric symptoms such as anergia, fatigue, lassitude and psychomotor retardation, which cross multiple pathologies, including depression, schizophrenia, and Parkinson's disease. Therefore, this review also attempts to provide an interdisciplinary approach that integrates findings from basic behavioural neuroscience, behavioural economics, clinical neuropsychology, psychiatry, and neurology, to provide a coherent framework for future research and theory in this critical field. Although dopamine systems are a critical part of the brain circuitry regulating behavioural activation, exertion of effort, and effort-related decision-making, mesolimbic dopamine is only one part of a distributed circuitry that includes multiple neurotransmitters and brain areas. Overall, there is a striking similarity between the brain areas involved in behavioural activation and effort-related processes in rodents and in humans. Animal models of effort-related decision-making are highly translatable to humans, and an emerging body of evidence indicates that alterations in effort-based decision-making are evident in several psychiatric and neurological disorders. People with major depression, schizophrenia, and Parkinson's disease show evidence of decision-making biases towards a lower exertion of effort. Translational studies linking research with animal models, human volunteers, and clinical populations are greatly expanding our knowledge about the neural basis of effort-related motivational dysfunction, and it is hoped that this research will ultimately lead to improved treatment for motivational and psychomotor symptoms in psychiatry and neurology.

  5. ExMC Work Prioritization Process

    NASA Technical Reports Server (NTRS)

    Simon, Matthew

    2015-01-01

    Last year, NASA's Human Research Program (HRP) introduced the concept of a "Path to Risk Reduction" (PRR), which will provide a roadmap that shows how the work being done within each HRP element can be mapped to reducing or closing exploration risks. Efforts are currently underway within the Exploration Medical Capability (ExMC) Element to develop a structured, repeatable process for prioritizing work utilizing decision analysis techniques and risk estimation tools. The goal of this effort is to ensure that the work done within the element maximizes risk reduction for future exploration missions in a quantifiable way and better aligns with the intent and content of the Path to Risk Reduction. The Integrated Medical Model (IMM) will be used to identify those conditions that are major contributors of medical risk for a given design reference mission. For each of these conditions, potential prevention, screening, diagnosis, and treatment methods will be identified. ExMC will then aim to prioritize its potential investments in these mitigation methods based upon their potential for risk reduction and other factors such as vehicle performance impacts, near term schedule needs, duplication with external efforts, and cost. This presentation will describe the process developed to perform this prioritization and inform investment discussions in future element planning efforts. It will also provide an overview of the required input information, types of process participants, figures of merit, and the expected outputs of the process.
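
    The abstract names figures of merit (risk reduction, vehicle performance impacts, schedule needs, duplication, cost) without quantifying them. As a hypothetical sketch of the weighted multi-criteria scoring such a prioritization process typically reduces to (weights, candidates, and scores all invented, not ExMC's actual data):

        # Hypothetical multi-criteria scoring of candidate medical-capability
        # investments; every number below is invented for illustration.
        weights = {"risk_reduction": 0.5, "vehicle_impact": 0.2,
                   "schedule_need": 0.2, "cost": 0.1}

        candidates = {
            "ultrasound upgrade": {"risk_reduction": 8, "vehicle_impact": 6,
                                   "schedule_need": 5, "cost": 4},
            "new med-kit drug":   {"risk_reduction": 5, "vehicle_impact": 9,
                                   "schedule_need": 7, "cost": 8},
        }

        def score(attrs):
            """Weighted-sum figure of merit (higher is better)."""
            return sum(weights[k] * v for k, v in attrs.items())

        for name, attrs in sorted(candidates.items(), key=lambda kv: -score(kv[1])):
            print(f"{name}: {score(attrs):.1f}")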

  6. Diagnostic Evaluation of Ozone Production and Horizontal Transport in a Regional Photochemical Air Quality Modeling System

    EPA Science Inventory

    A diagnostic model evaluation effort has been performed to focus on photochemical ozone formation and the horizontal transport process since they strongly impact the temporal evolution and spatial distribution of ozone (O3) within the lower troposphere. Results from th...

  7. DEVELOPMENT AND APPLICATION OF POPULATION MODELS TO SUPPORT EPA'S ECOLOGICAL RISK ASSESSMENT PROCESSES FOR PESTICIDES

    EPA Science Inventory

    As part of a broader exploratory effort to develop ecological risk assessment approaches to estimate potential chemical effects on non-target populations, we describe an approach for developing simple population models to estimate the extent to which acute effects on individual...

  8. The Strategic Assessment Model.

    ERIC Educational Resources Information Center

    Glazner, Steve, Ed.

    This book presents six papers focusing on the application of the strategic assessment model (SAM) to the management of higher education facilities. The papers are part of an ongoing effort by the Association of Higher Education Facilities Officers to provide comparative cost and staffing information and to develop a benchmarking process. The…

  9. Tying Resource Allocation and TQM into Planning and Assessment Efforts.

    ERIC Educational Resources Information Center

    Mullendore, Richard H.; Wang, Li-Shing

    1996-01-01

    Describes the evolution of a model, developed by student affairs officials, which outlines a planning process for implementing Total Quality Management. Presents step-by-step instructions for the model's deployment and discusses such issues as transitions, planning forms, goals, and professional and personal growth needs. (RJM)

  10. Development of assessment tools to measure organizational support for employee health.

    PubMed

    Golaszewski, Thomas; Barr, Donald; Pronk, Nico

    2003-01-01

    To develop systems that measure and effect organizational support for employee health. Multiple studies and developmental projects were reviewed that show the process of instrument development, metric quality testing, utilization within intervention studies, and prediction modeling efforts. Demographic patterns indicate high support levels and relationships of subsections to various employee health risks. Successes with the initial version have given rise to 2 additional evaluation tools. The availability of these systems illustrates how ecological models can be practically applied. Such efforts contribute to the paradigm shift in worksite health promotion that focuses on the organization as the target of intervention.

  11. Surface interactions relevant to space station contamination problems

    NASA Technical Reports Server (NTRS)

    Dickinson, J. T.

    1988-01-01

    The physical and chemical processes at solid surfaces which can contribute to Space Station contamination problems are reviewed. Suggested areas for experimental studies to provide data to improve contamination modeling efforts are presented.

  12. Strategic Project Management at the NASA Kennedy Space Center

    NASA Technical Reports Server (NTRS)

    Lavelle, Jerome P.

    2000-01-01

    This paper describes project management at NASA's Kennedy Space Center (KSC) from a strategic perspective. It develops the historical context of the agency's and center's strategic planning process and argues that now is the time for KSC to become a center of excellence in project management. The author describes project management activities at the center and details observations on those efforts. Finally, the author presents the Strategic Project Management Process Model, a conceptual model that could assist KSC in defining an appropriate project management process system at the center.

  13. On the use of fractional order PK-PD models

    NASA Astrophysics Data System (ADS)

    Ionescu, Clara; Copot, Dana

    2017-01-01

    Quantifying and controlling depth of anesthesia is a challenging process due to the lack of measurement technology for the direct effects of drug supply into the body. Efforts are being made to develop new sensor techniques, and new horizons are being explored for modeling this intricate process. This paper introduces emerging tools available on the ‘engineering market’, imported from the area of fractional calculus. A novel interpretation of the classical drug-effect curve is given, enabling linear control. This broadens the horizon of signal processing and control techniques and suggests future research lines.
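
    The record does not state its model equations. For orientation only, the operator most commonly used in fractional-order pharmacokinetics is the Caputo derivative, and a one-compartment elimination model then takes the fractional form below; these are standard fractional-calculus results, not equations quoted from the paper.

        \[
          {}^{C}\!D_t^{\alpha} f(t)
            = \frac{1}{\Gamma(1-\alpha)} \int_0^t \frac{f'(\tau)}{(t-\tau)^{\alpha}} \, d\tau ,
          \qquad 0 < \alpha < 1 .
        \]
        % One-compartment fractional elimination: dC/dt = -kC generalizes to
        \[
          {}^{C}\!D_t^{\alpha} C(t) = -k \, C(t) ,
          \qquad
          C(t) = C_0 \, E_{\alpha}(-k t^{\alpha}) ,
        \]
        % where E_alpha is the one-parameter Mittag-Leffler function; alpha = 1
        % recovers the classical exponential solution C_0 e^{-kt}.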

  14. The pharmacology of effort-related choice behavior: Dopamine, depression, and individual differences.

    PubMed

    Salamone, John D; Correa, Merce; Yohn, Samantha; Lopez Cruz, Laura; San Miguel, Noemi; Alatorre, Luisa

    2016-06-01

    This review paper is focused upon the involvement of mesolimbic dopamine (DA) and related brain systems in effort-based processes. Interference with DA transmission affects instrumental behavior in a manner that interacts with the response requirements of the task, such that rats with impaired DA transmission show a heightened sensitivity to ratio requirements. Impaired DA transmission also affects effort-related choice behavior, which is assessed by tasks that offer a choice between a preferred reinforcer that has a high work requirement vs. a less preferred reinforcer that can be obtained with minimal effort. Rats and mice with impaired DA transmission reallocate instrumental behavior away from food-reinforced tasks with high response costs, and show increased selection of low reinforcement/low cost options. Tests of effort-related choice have been developed into models of pathological symptoms of motivation that are seen in disorders such as depression and schizophrenia. These models are being employed to explore the effects of conditions associated with various psychopathologies, and to assess drugs for their potential utility as treatments for effort-related symptoms. Studies of the pharmacology of effort-based choice may contribute to the development of treatments for symptoms such as psychomotor slowing, fatigue or anergia, which are seen in depression and other disorders.

  15. Error-associated behaviors and error rates for robotic geology

    NASA Technical Reports Server (NTRS)

    Anderson, Robert C.; Thomas, Geb; Wagner, Jacob; Glasgow, Justin

    2004-01-01

    This study explores human error as a function of the decision-making process. One of many models for human decision-making is Rasmussen's decision ladder [9]. The decision ladder identifies the multiple tasks and states of knowledge involved in decision-making. The tasks and states of knowledge can be classified by the level of cognitive effort required to make the decision, leading to the skill, rule, and knowledge taxonomy (Rasmussen, 1987). Skill based decisions require the least cognitive effort and knowledge based decisions require the greatest cognitive effort. Errors can occur at any of the cognitive levels.

  16. Strategic and Market Analysis | Bioenergy | NREL

    Science.gov Websites

    [Website fragments] Recent efforts in comparative techno-economic analysis; the analysis considers a wide range of conversion intermediates; NREL has developed first-of-its-kind process models and economic assessments of co-processing; the work strives to understand the economic incentives, technical risks, and key data gaps that need to be addressed.

  17. Implementation of School Instructional Improvement and Student Growth in Math: Testing a Multilevel Longitudinal Model

    ERIC Educational Resources Information Center

    Takanishi, Stacey M.

    2012-01-01

    NCLB policies in the United States focus schools' efforts on implementing effective instructional processes to improve student outcomes. This study looks more specifically at how schools are perceived to be implementing state required curricula and benchmarks and developing teaching and learning processes that support the teaching of state…

  18. The Development from Effortful to Automatic Processing in Mathematical Cognition.

    ERIC Educational Resources Information Center

    Kaye, Daniel B.; And Others

    This investigation capitalizes upon information processing models that depend upon measurement of the latency of response to a mathematical problem and the decomposition of reaction time (RT). Simple two-term addition problems were presented with possible solutions for true-false verification, and accuracy and RT were recorded. Total…

  19. The multiple resource inventory decision-making process

    Treesearch

    Victor A. Rudis

    1993-01-01

    A model of the multiple resource inventory decision-making process is presented that identifies steps in conducting inventories, describes the infrastructure, and points out knowledge gaps that are common to many interdisciplinary studies. Successful efforts to date suggest the need to bridge the gaps by sharing elements and maintaining dialogue among stakeholders in multiple...

  20. Model-based engineering for laser weapons systems

    NASA Astrophysics Data System (ADS)

    Panthaki, Malcolm; Coy, Steve

    2011-10-01

    The Comet Performance Engineering Workspace is an environment that enables integrated, multidisciplinary modeling and design/simulation process automation. One of the many multi-disciplinary applications of the Comet Workspace is for the integrated Structural, Thermal, Optical Performance (STOP) analysis of complex, multi-disciplinary space systems containing Electro-Optical (EO) sensors such as those which are designed and developed by and for NASA and the Department of Defense. The Comet™ software is currently able to integrate performance simulation data and processes from a wide range of 3-D CAD and analysis software programs, including CODE V™ from Optical Research Associates and SigFit™ from Sigmadyne Inc., which are used to simulate the optics performance of EO sensor systems in space-borne applications. Over the past year, Comet Solutions has been working with MZA Associates of Albuquerque, NM, under a contract with the Air Force Research Laboratories. This funded effort is a "risk reduction effort", to help determine whether the combination of Comet and WaveTrain™, a wave optics systems engineering analysis environment developed and maintained by MZA Associates and used by the Air Force Research Laboratory, will result in an effective Model-Based Engineering (MBE) environment for the analysis and design of laser weapons systems. This paper will review the results of this effort and future steps.

  1. State of the Art Assessment of Simulation in Advanced Materials Development

    NASA Technical Reports Server (NTRS)

    Wise, Kristopher E.

    2008-01-01

    Advances in both the underlying theory and in the practical implementation of molecular modeling techniques have increased their value in the advanced materials development process. The objective is to accelerate the maturation of emerging materials by tightly integrating modeling with the other critical processes: synthesis, processing, and characterization. The aims of this report are to summarize the state of the art of existing modeling tools and to highlight a number of areas in which additional development is required. In an effort to maintain focus and limit length, this survey is restricted to classical simulation techniques including molecular dynamics and Monte Carlo simulations.
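
    As a reminder of what the surveyed classical techniques reduce to, here is a minimal molecular-dynamics integrator, velocity-Verlet on a single Lennard-Jones pair in reduced units. It is a toy sketch with invented values, not any of the production codes the report assesses:

        def lj_force(r):
            """Force between a 1-D Lennard-Jones pair at separation r
            (reduced units): F = 24 (2 r^-13 - r^-7)."""
            return 24.0 * (2.0 / r ** 13 - 1.0 / r ** 7)

        # Velocity-Verlet integration of the pair separation (all values invented)
        dt, x, v, m = 0.001, 1.5, 0.0, 1.0       # time step, separation, velocity, mass
        f = lj_force(x)
        for _ in range(5000):
            x += v * dt + 0.5 * f / m * dt ** 2  # position update
            f_new = lj_force(x)
            v += 0.5 * (f + f_new) / m * dt      # velocity update with averaged force
            f = f_new
        print(f"final separation: {x:.3f} (LJ minimum is at 2^(1/6) ~ 1.122)")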

  2. Hydrologic Source Term Processes and Models for the Clearwater and Wineskin Tests, Rainier Mesa, Nevada National Security Site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carle, Steven F.

    2011-05-04

    This report describes the development, processes, and results of a hydrologic source term (HST) model for the CLEARWATER (U12q) and WINESKIN (U12r) tests located on Rainier Mesa, Nevada National Security Site, Nevada (Figure 1.1). Of the 61 underground tests (involving 62 unique detonations) conducted on Rainier Mesa (Area 12) between 1957 and 1992 (USDOE, 2015), the CLEARWATER and WINESKIN tests present many unique features that warrant a separate HST modeling effort from other Rainier Mesa tests.

  3. ROAD MAP FOR DEVELOPMENT OF CRYSTAL-TOLERANT HIGH LEVEL WASTE GLASSES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fox, K.; Peeler, D.; Herman, C.

    The U.S. Department of Energy (DOE) is building a Tank Waste Treatment and Immobilization Plant (WTP) at the Hanford Site in Washington to remediate 55 million gallons of radioactive waste that is being temporarily stored in 177 underground tanks. Efforts are being made to increase the loading of Hanford tank wastes in glass while meeting melter lifetime expectancies and process, regulatory, and product quality requirements. This road map guides the research and development for formulation and processing of crystal-tolerant glasses, identifying near- and long-term activities that need to be completed over the period from 2014 to 2019. The primary objective is to maximize waste loading for Hanford waste glasses without jeopardizing melter operation by crystal accumulation in the melter or melter discharge riser. The potential applicability to the Savannah River Site (SRS) Defense Waste Processing Facility (DWPF) will also be addressed in this road map. The planned research described in this road map is motivated by the potential for substantial economic benefits (significant reductions in glass volumes) that will be realized if the current constraints (T1% for WTP and TL for DWPF) are approached in an appropriate and technically defensible manner for defense waste and current melter designs. The basis of this alternative approach is an empirical model predicting the crystal accumulation in the WTP glass discharge riser and melter bottom as a function of glass composition, time, and temperature. When coupled with an associated operating limit (e.g., the maximum tolerable thickness of an accumulated layer of crystals), this model could then be integrated into the process control algorithms to formulate crystal-tolerant high-level waste (HLW) glasses targeting high waste loadings while still meeting process-related limits and melter lifetime expectancies. The modeling effort will be an iterative process, where the model form and a broader range of conditions, e.g., glass composition and temperature, will evolve as additional data on crystal accumulation are gathered. Model validation steps will be included to guide the development process and ensure the value of the effort (i.e., increased waste loading and waste throughput). A summary of the stages of the road map for developing the crystal-tolerant glass approach, their estimated durations, and deliverables is provided.

  4. First-principles modeling of laser-matter interaction and plasma dynamics in nanosecond pulsed laser shock processing

    NASA Astrophysics Data System (ADS)

    Zhang, Zhongyang; Nian, Qiong; Doumanidis, Charalabos C.; Liao, Yiliang

    2018-02-01

    Nanosecond pulsed laser shock processing (LSP) techniques, including laser shock peening, laser peen forming, and laser shock imprinting, have been employed for widespread industrial applications. In these processes, the main beneficial characteristic is the laser-induced shockwave with a high pressure (on the order of GPa), which leads to plastic deformation at an ultrahigh strain rate (10^5-10^6 /s) on the surface of target materials. Although LSP processes have been extensively studied by experiments, little effort has been devoted to elucidating the underlying process mechanisms through the development of a physics-based process model. In particular, development of a first-principles model is critical for process optimization and novel process design. This work aims at introducing such a theoretical model for a fundamental understanding of process mechanisms in LSP. Emphasis is placed on the laser-matter interaction and plasma dynamics. This model is found to offer capabilities in predicting key parameters including electron and ion temperatures, plasma state variables (temperature, density, and pressure), and the propagation of the laser shockwave. The modeling results were validated by experimental data.
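
    The record's first-principles model cannot be reconstructed from the abstract. For scale, a widely used analytical estimate of the confined-plasma peak pressure in LSP is the Fabbro et al. (1990) relation, sketched below; the impedance values and the thermal-energy fraction are rough, typical literature numbers, used here as assumptions:

        import math

        def fabbro_peak_pressure(intensity_GW_cm2, z_target, z_confine, alpha=0.25):
            """Analytical peak-pressure estimate for confined laser shock
            processing (Fabbro et al., 1990); not the first-principles model
            of this record.

            intensity_GW_cm2     -- absorbed laser intensity in GW/cm^2
            z_target, z_confine  -- acoustic impedances in g/(cm^2 s)
            alpha                -- assumed thermal fraction of plasma energy
            Returns peak pressure in GPa.
            """
            z = 2.0 / (1.0 / z_target + 1.0 / z_confine)   # combined impedance
            return (0.01 * math.sqrt(alpha / (2 * alpha + 3))
                    * math.sqrt(z) * math.sqrt(intensity_GW_cm2))

        # Example: steel target (~3.6e6) confined by water (~1.65e5) at 5 GW/cm^2
        print(f"{fabbro_peak_pressure(5.0, 3.6e6, 1.65e5):.1f} GPa")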

  5. From QSAR to QSIIR: Searching for Enhanced Computational Toxicology Models

    PubMed Central

    Zhu, Hao

    2017-01-01

    Quantitative Structure Activity Relationship (QSAR) is the most frequently used modeling approach for exploring the dependency of biological, toxicological, or other types of activities/properties of chemicals on their molecular features. In the past two decades, QSAR modeling has been used extensively in the drug discovery process. However, the predictive models resulting from QSAR studies have limited use for chemical risk assessment, especially for animal and human toxicity evaluations, due to their low predictivity for new compounds. To develop enhanced toxicity models with independently validated external prediction power, novel modeling protocols have been pursued by computational toxicologists, building on the rapid growth of toxicity testing data in recent years. This chapter reviews the recent effort in our laboratory to incorporate biological testing results as descriptors in the toxicity modeling process. This effort extended the concept of QSAR to Quantitative Structure In vitro-In vivo Relationship (QSIIR). The QSIIR study examples provided in this chapter indicate that QSIIR models based on hybrid (biological and chemical) descriptors are indeed superior to conventional QSAR models based only on chemical descriptors for several animal toxicity endpoints. We believe that the applications introduced in this review will be of interest and value to researchers working in the field of computational drug discovery and environmental chemical risk assessment. PMID:23086837
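
    The essence of the QSAR-to-QSIIR step is augmenting chemical descriptors with in vitro assay results as additional features. Below is a self-contained sketch on synthetic data (invented descriptors and labels; not the chapter's datasets or endpoints) comparing the two feature sets:

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        n = 200
        chem = rng.normal(size=(n, 8))      # chemical descriptors (synthetic)
        invitro = rng.normal(size=(n, 3))   # in vitro assay results as descriptors
        # Synthetic toxicity label driven by one chemical and one biological feature
        y = (chem[:, 0] + invitro[:, 0] + 0.5 * rng.normal(size=n) > 0).astype(int)

        for name, X in [("QSAR (chemical only)", chem),
                        ("QSIIR (chemical + in vitro)", np.hstack([chem, invitro]))]:
            acc = cross_val_score(RandomForestClassifier(random_state=0), X, y, cv=5)
            print(f"{name}: {acc.mean():.2f} cross-validated accuracy")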

  6. Toward a New Predictive Model of Student Retention in Higher Education: An Application of Classical Sociological Theory

    ERIC Educational Resources Information Center

    Kerby, Molly B.

    2015-01-01

    Theoretical models designed to predict whether students will persist or not have been valuable tools for retention efforts relative to the creation of services in academic and student affairs. Some of the early models attempted to explain and measure factors in the "college dropout process." For example, in his seminal work, Tinto…

  7. A Modeling Approach to Fiber Fracture in Melt Impregnation

    NASA Astrophysics Data System (ADS)

    Ren, Feng; Zhang, Cong; Yu, Yang; Xin, Chunling; Tang, Ke; He, Yadong

    2017-02-01

    The effect of process variables such as roving pulling speed, melt temperature, and number of pins on fiber fracture during the processing of thermoplastic-based composites was investigated in this study. Melt impregnation was used in this process of continuous glass fiber reinforced thermoplastic composites. Previous investigators have suggested a variety of models for melt impregnation, while comparatively little effort has been spent on modeling the fiber fracture caused by the viscous resin. Herein, a mathematical model was developed for the impregnation process to predict the fiber fracture rate and describe the experimental results with the Weibull intensity distribution function. The optimal parameters of the process were obtained by orthogonal experiment. The results suggest that fiber fracture is caused by viscous shear stress on the fiber bundle in the melt impregnation mold as the bundle is pulled.
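
    The abstract names the Weibull intensity distribution as the statistical backbone of the fracture model. Here is a minimal sketch of that idea, with a hypothetical linear stress-versus-pulling-speed assumption and invented constants (the paper's fitted parameters are not given in the record):

        import numpy as np

        def weibull_survival(stress, sigma0, m):
            """Probability that a fiber survives a given viscous shear stress,
            per the two-parameter Weibull form P_s = exp[-(stress/sigma0)^m].
            sigma0 (scale) and m (modulus) are hypothetical fitted constants."""
            return np.exp(-(stress / sigma0) ** m)

        # Fracture fraction vs. pulling speed, assuming shear stress grows
        # linearly with speed (tau = c * v, c invented) as melt drags the bundle.
        c, sigma0, m = 2.0e4, 1.5e5, 4.0
        for v in [0.5, 1.0, 2.0, 4.0]:           # pulling speeds, m/min (invented)
            frac = 1 - weibull_survival(c * v, sigma0, m)
            print(f"v = {v:.1f} m/min -> fracture fraction = {frac:.3f}")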

  8. Vegetation Demographics in Earth System Models: a review of progress and priorities

    DOE PAGES

    Fisher, Rosie A.; Koven, Charles D.; Anderegg, William R. L.; ...

    2017-09-18

    Numerous current efforts seek to improve the representation of ecosystem ecology and vegetation demographic processes within Earth System Models (ESMs). Furthermore, these developments are widely viewed as an important step in developing greater realism in predictions of future ecosystem states and fluxes. Increased realism, however, leads to increased model complexity, with new features raising a suite of ecological questions that require empirical constraints. We review the developments that permit the representation of plant demographics in ESMs, and identify issues raised by these developments that highlight important gaps in ecological understanding. These issues inevitably translate into uncertainty in model projections but also allow models to be applied to new processes and questions concerning the dynamics of real-world ecosystems. We also argue that stronger and more innovative connections to data, across the range of scales considered, are required to address these gaps in understanding. The development of first-generation land surface models as a unifying framework for ecophysiological understanding stimulated much research into plant physiological traits and gas exchange. Constraining predictions at ecologically relevant spatial and temporal scales will require a similar investment of effort and intensified inter-disciplinary communication.

  9. Vegetation Demographics in Earth System Models: a review of progress and priorities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fisher, Rosie A.; Koven, Charles D.; Anderegg, William R. L.

    Numerous current efforts seek to improve the representation of ecosystem ecology and vegetation demographic processes within Earth System Models (ESMs). Furthermore, these developments are widely viewed as an important step in developing greater realism in predictions of future ecosystem states and fluxes. Increased realism, however, leads to increased model complexity, with new features raising a suite of ecological questions that require empirical constraints. We review the developments that permit the representation of plant demographics in ESMs, and identify issues raised by these developments that highlight important gaps in ecological understanding. These issues inevitably translate into uncertainty in model projections but also allow models to be applied to new processes and questions concerning the dynamics of real-world ecosystems. We also argue that stronger and more innovative connections to data, across the range of scales considered, are required to address these gaps in understanding. The development of first-generation land surface models as a unifying framework for ecophysiological understanding stimulated much research into plant physiological traits and gas exchange. Constraining predictions at ecologically relevant spatial and temporal scales will require a similar investment of effort and intensified inter-disciplinary communication.

  10. Computational Modeling in Structural Materials Processing

    NASA Technical Reports Server (NTRS)

    Meyyappan, Meyya; Arnold, James O. (Technical Monitor)

    1997-01-01

    High temperature materials such as silicon carbide, a variety of nitrides, and ceramic matrix composites find use in the aerospace, automotive, and machine tool industries and in high speed civil transport applications. Chemical vapor deposition (CVD) is widely used in processing such structural materials. Variations of CVD include deposition on substrates, coating of fibers, deposition inside cavities and on complex objects, and infiltration within preforms, called chemical vapor infiltration (CVI). Our current knowledge of the process mechanisms, ability to optimize processes, and capacity to scale up for large scale manufacturing is limited. In this regard, computational modeling of the processes is valuable, since a validated model can be used as a design tool. The effort is similar to traditional chemically reacting flow modeling, with emphasis on multicomponent diffusion, thermal diffusion, large sets of homogeneous reactions, and surface chemistry. In the case of CVI, models for pore infiltration are needed. In the present talk, examples of SiC, nitride, and boron deposition from the author's past work will be used to illustrate the utility of computational process modeling.

  11. A process improvement model for software verification and validation

    NASA Technical Reports Server (NTRS)

    Callahan, John; Sabolish, George

    1994-01-01

    We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and space station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.

  12. A process improvement model for software verification and validation

    NASA Technical Reports Server (NTRS)

    Callahan, John; Sabolish, George

    1994-01-01

    We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and Space Station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.

  13. Arctic Research NASA's Cryospheric Sciences Program

    NASA Technical Reports Server (NTRS)

    Waleed, Abdalati; Zukor, Dorothy J. (Technical Monitor)

    2001-01-01

    Much of NASA's Arctic research is run through its Cryospheric Sciences Program. Arctic research efforts to date have focused primarily on investigations of the mass balance of the largest Arctic land-ice masses and the mechanisms that control it; interactions among sea ice, polar oceans, and the polar atmosphere; atmospheric processes in the polar regions; and energy exchanges in the Arctic. All of these efforts have been focused on characterizing, understanding, and predicting changes in the Arctic. NASA's unique vantage from space provides an important perspective for the study of these large scale processes, while detailed process information is obtained through targeted in situ field and airborne campaigns and models. An overview of NASA investigations in the Arctic will be presented, demonstrating how the synthesis of space-based technology and these complementary components has advanced our understanding of physical processes in the Arctic.

  14. Accuracy Quantification of the Loci-CHEM Code for Chamber Wall Heat Transfer in a GO2/GH2 Single Element Injector Model Problem

    NASA Technical Reports Server (NTRS)

    West, Jeff; Westra, Doug; Lin, Jeff; Tucker, Kevin

    2006-01-01

    A robust rocket engine combustor design and development process must include tools which can accurately predict the multi-dimensional thermal environments imposed on solid surfaces by the hot combustion products. Currently, empirical methods used in the design process are typically one dimensional and do not adequately account for the heat flux rise rate in the near-injector region of the chamber. Computational Fluid Dynamics holds promise to meet the design tool requirement, but requires accuracy quantification, or validation, before it can be confidently applied in the design process. This effort presents the beginning of such a validation process for the Loci-CHEM CFD code. The model problem examined here is a gaseous oxygen (GO2)/gaseous hydrogen (GH2) shear coaxial single element injector operating at a chamber pressure of 5.42 MPa. The GO2/GH2 propellant combination in this geometry represents one of the simplest rocket model problems and is thus foundational to subsequent validation efforts for more complex injectors. Multiple steady state solutions have been produced with Loci-CHEM employing different hybrid grids and two-equation turbulence models. Iterative convergence for each solution is demonstrated via mass conservation, flow variable monitoring at discrete flow field locations as a function of solution iteration, and overall residual performance. A baseline hybrid grid was used and then locally refined to demonstrate grid convergence. Solutions were obtained with three variations of the k-omega turbulence model.
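
    Grid-refinement studies of this kind are commonly quantified with Richardson extrapolation and Roache's grid convergence index (GCI). The sketch below shows that standard calculation for a quantity computed on three systematically refined grids; the heat-flux values are hypothetical, not results from the paper.

```python
import math

def grid_convergence(f_coarse, f_medium, f_fine, r=2.0, Fs=1.25):
    """Observed order of accuracy, Richardson-extrapolated value, and
    grid convergence index (GCI) from one solution quantity computed on
    three grids with refinement ratio r (after Roache)."""
    # Observed order of accuracy from the three solutions
    p = math.log(abs((f_coarse - f_medium) / (f_medium - f_fine))) / math.log(r)
    # Extrapolated (approximately grid-independent) estimate
    f_exact = f_fine + (f_fine - f_medium) / (r**p - 1.0)
    # Relative error band on the fine grid, with safety factor Fs
    gci_fine = Fs * abs((f_fine - f_medium) / f_fine) / (r**p - 1.0)
    return p, f_exact, gci_fine

# Hypothetical peak wall heat fluxes (MW/m^2) on coarse/medium/fine grids
p, f_ex, gci = grid_convergence(10.8, 11.6, 11.9)
print(f"order ~ {p:.2f}, extrapolated ~ {f_ex:.2f} MW/m^2, GCI ~ {100*gci:.1f}%")
```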

  15. Accuracy Quantification of the Loci-CHEM Code for Chamber Wall Heat Fluxes in a GO2/GH2 Single Element Injector Model Problem

    NASA Technical Reports Server (NTRS)

    West, Jeff; Westra, Doug; Lin, Jeff; Tucker, Kevin

    2006-01-01

    A robust rocket engine combustor design and development process must include tools which can accurately predict the multi-dimensional thermal environments imposed on solid surfaces by the hot combustion products. Currently, empirical methods used in the design process are typically one dimensional and do not adequately account for the heat flux rise rate in the near-injector region of the chamber. Computational Fluid Dynamics holds promise to meet the design tool requirement, but requires accuracy quantification, or validation, before it can be confidently applied in the design process. This effort presents the beginning of such a validation process for the Loci-CHEM CFD code. The model problem examined here is a gaseous oxygen (GO2)/gaseous hydrogen (GH2) shear coaxial single element injector operating at a chamber pressure of 5.42 MPa. The GO2/GH2 propellant combination in this geometry represents one of the simplest rocket model problems and is thus foundational to subsequent validation efforts for more complex injectors. Multiple steady state solutions have been produced with Loci-CHEM employing different hybrid grids and two-equation turbulence models. Iterative convergence for each solution is demonstrated via mass conservation, flow variable monitoring at discrete flow field locations as a function of solution iteration, and overall residual performance. A baseline hybrid grid was used and then locally refined to demonstrate grid convergence. Solutions were also obtained with three variations of the k-omega turbulence model.

  16. Strategies for developing competency models.

    PubMed

    Marrelli, Anne F; Tondora, Janis; Hoge, Michael A

    2005-01-01

    There is an emerging trend within healthcare to introduce competency-based approaches in the training, assessment, and development of the workforce. The trend is evident in various disciplines and specialty areas within the field of behavioral health. This article is designed to inform those efforts by presenting a step-by-step process for developing a competency model. An introductory overview of competencies, competency models, and the legal implications of competency development is followed by a description of the seven steps involved in creating a competency model for a specific function, role, or position. This modeling process is drawn from advanced work on competencies in business and industry.

  17. The Earth Microbiome Project and modeling the planet's microbial potential (Invited)

    NASA Astrophysics Data System (ADS)

    Gilbert, J. A.

    2013-12-01

    The understanding of Earth's climate and ecology requires multiscale observations of the biosphere, of which microbial life is a major component. However, acquiring and processing physical samples of soil, water, and air at the spatial and temporal resolution needed to capture the immense variation in microbial dynamics would require a herculean effort and immense financial resources, dwarfing even the most ambitious projects to date. To overcome this hurdle we created the Earth Microbiome Project (EMP), a crowd-sourced effort to acquire physical samples from researchers around the world that are, importantly, contextualized with physical, chemical, and biological data detailing the environmental properties of each sample at the location and time it was acquired. The EMP leverages these existing efforts to target a systematic analysis of microbial taxonomic and functional dynamics across a vast array of environmental parameter gradients. The EMP captures the environmental gradients, location, time, and sampling protocol information for every sample donated by our valued collaborators. Physical samples are then processed using a standardized DNA extraction, PCR, and shotgun sequencing protocol to generate comparable data on the microbial community structure and function in each sample. To date we have processed >17,000 samples from 40 different biomes. One of the key goals of the EMP is to map the spatiotemporal variability of microbial communities to capture the changes in important functional processes that need to be appropriately expressed in models to provide reliable forecasts of ecosystem phenotype across our changing planet. This is essential if we are to develop economically sound strategies to be good stewards of our Earth. The EMP recognizes that environments are composed of complex sets of interdependent parameters and that the development of useful predictive computational models of both terrestrial and atmospheric systems requires recognition and accommodation of sources of uncertainty.

  18. A Developmental Model for Educational Planning: Democratic Rationalities and Dispositions

    ERIC Educational Resources Information Center

    Hess, Michael; Johnson, Jerry; Reynolds, Sharon

    2014-01-01

    The Developmental Democratic Planning (DDP) model frames educational planning as a process that extends beyond the immediate focus of a particular planning effort to acknowledge and cultivate the potential of all members of the organization to fulfill their roles as active participants in the democratic life of the organization. The DDP model…

  19. Implementing the Indiana Model. Indiana Leadership Consortium: Equity through Change.

    ERIC Educational Resources Information Center

    Indiana Leadership Consortium.

    This guide, which was developed as a part of a multi-year, statewide effort to institutionalize gender equity in various educational settings throughout Indiana, presents a step-by-step process model for achieving gender equity in the state's secondary- and postsecondary-level vocational programs through coalition building and implementation of a…

  20. Evaluating Attitudes, Skill, and Performance in a Learning-Enhanced Quantitative Methods Course: A Structural Modeling Approach.

    ERIC Educational Resources Information Center

    Harlow, Lisa L.; Burkholder, Gary J.; Morrow, Jennifer A.

    2002-01-01

    Used a structural modeling approach to evaluate relations among attitudes, initial skills, and performance in a Quantitative Methods course that involved students in active learning. Results largely confirmed hypotheses offering support for educational reform efforts that propose actively involving students in the learning process, especially in…

  1. REMOTE SENSING AND SPATIALLY EXPLICIT LANDSCAPE-BASED NITROGEN MODELING METHODS DEVELOPMENT IN THE NEUSE RIVER BASIN, NC

    EPA Science Inventory

    The objective of this research was to model and map the spatial patterns of excess nitrogen (N) sources across the landscape within the Neuse River Basin (NRB) of North Carolina. The process included an initial land cover characterization effort to map landscape "patches" at ...

  2. Stevensville West Central Study

    Treesearch

    J. G. Jones; J. D. Chew; N. K. Christianson; D. J. Silvieus; C. A. Stewart

    2000-01-01

    This paper reports on an application of two modeling systems in the assessment and planning effort for a 58,038-acre area on the Bitterroot National Forest: SIMulating Vegetative Patterns and Processes at Landscape ScaLEs (SIMPPLLE), and Multi-resource Analysis and Geographic Information System (MAGIS). SIMPPLLE was a useful model for tracking and analyzing an...

  3. Space biology initiative program definition review. Trade study 1: Automation costs versus crew utilization

    NASA Technical Reports Server (NTRS)

    Jackson, L. Neal; Crenshaw, John, Sr.; Hambright, R. N.; Nedungadi, A.; Mcfayden, G. M.; Tsuchida, M. S.

    1989-01-01

    A significant emphasis upon automation within the Space Biology Initiative hardware appears justified in order to conserve crew labor and crew training effort. Two generic forms of automation were identified: automation of data and information handling and decision making, and automation of material handling, transfer, and processing. The use of automatic data acquisition, expert systems, robots, and machine vision will increase the volume of experiments and quality of results. The automation described may also influence efforts to miniaturize and modularize the large array of SBI hardware identified to date. The cost and benefit model developed appears to be a useful guideline for SBI equipment specifiers and designers. Additional refinements would enhance the validity of the model. Two NASA automation pilot programs, 'The Principal Investigator in a Box' and 'Rack Mounted Robots', were investigated and found to be quite appropriate for adaptation to the SBI program. There are other in-house NASA efforts that provide technology that may be appropriate for the SBI program. Important data are believed to exist in advanced medical labs throughout the U.S., Japan, and Europe. The information and data processing in medical analysis equipment is highly automated, and future trends reveal continued progress in this area. However, automation of material handling and processing has progressed in a limited manner because medical labs are not affected by the power and space constraints that Space Station medical equipment faces. Therefore, NASA's major emphasis in automation will require a lead effort in the automation of material handling to achieve optimal crew utilization.

  4. COBRApy: COnstraints-Based Reconstruction and Analysis for Python.

    PubMed

    Ebrahim, Ali; Lerman, Joshua A; Palsson, Bernhard O; Hyduke, Daniel R

    2013-08-08

    COnstraint-Based Reconstruction and Analysis (COBRA) methods are widely used for genome-scale modeling of metabolic networks in both prokaryotes and eukaryotes. Due to the successes with metabolism, there is an increasing effort to apply COBRA methods to reconstruct and analyze integrated models of cellular processes. The COBRA Toolbox for MATLAB is a leading software package for genome-scale analysis of metabolism; however, it was not designed to elegantly capture the complexity inherent in integrated biological networks and lacks an integration framework for the multiomics data used in systems biology. The openCOBRA Project is a community effort to promote constraints-based research through the distribution of freely available software. Here, we describe COBRA for Python (COBRApy), a Python package that provides support for basic COBRA methods. COBRApy is designed in an object-oriented fashion that facilitates the representation of the complex biological processes of metabolism and gene expression. COBRApy does not require MATLAB to function; however, it includes an interface to the COBRA Toolbox for MATLAB to facilitate use of legacy codes. For improved performance, COBRApy includes parallel processing support for computationally intensive processes. COBRApy is an object-oriented framework designed to meet the computational challenges associated with the next generation of stoichiometric constraint-based models and high-density omics data sets. COBRApy is freely available at http://opencobra.sourceforge.net/.
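
    As the record centers on COBRApy's API, a minimal flux balance analysis example may help. This is a sketch against the commonly documented cobrapy interface; the toy network and all identifiers are invented.

```python
from cobra import Model, Reaction, Metabolite

# Toy two-metabolite network: take up A, convert A -> B, export B.
model = Model("toy")
a = Metabolite("A_e", compartment="e")
b = Metabolite("B_e", compartment="e")

conv = Reaction("CONV")                 # A -> B, irreversible by default
conv.add_metabolites({a: -1.0, b: 1.0})
model.add_reactions([conv])

# Boundary (exchange) reactions let flux enter and leave the system
ex_a = model.add_boundary(a, type="exchange")
ex_b = model.add_boundary(b, type="exchange")
ex_a.lower_bound = -10.0                # allow uptake of at most 10 units of A

# Flux balance analysis: maximize export of B
model.objective = ex_b
solution = model.optimize()
print(solution.objective_value)         # 10.0 -- all available A is converted
```

    The exchange bound plays the role of a growth-medium constraint; genome-scale models are normally loaded from SBML files rather than built by hand like this.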

  5. A Large-Scale, High-Resolution Hydrological Model Parameter Data Set for Climate Change Impact Assessment for the Conterminous US

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oubeidillah, Abdoul A; Kao, Shih-Chieh; Ashfaq, Moetasim

    2014-01-01

    To extend geographical coverage, refine spatial resolution, and improve modeling efficiency, a computation- and data-intensive effort was conducted to organize a comprehensive hydrologic dataset with post-calibrated model parameters for hydro-climate impact assessment. Several key inputs for hydrologic simulation, including meteorologic forcings, soil, land class, vegetation, and elevation, were collected from multiple best-available data sources and organized for 2107 hydrologic subbasins (8-digit hydrologic units, HUC8s) in the conterminous United States at a refined 1/24-degree (~4 km) spatial resolution. Using high-performance computing for intensive model calibration, a high-resolution parameter dataset was prepared for the macro-scale Variable Infiltration Capacity (VIC) hydrologic model. The VIC simulation was driven by DAYMET daily meteorological forcing and was calibrated against USGS WaterWatch monthly runoff observations for each HUC8. The results showed that this new parameter dataset may help reasonably simulate runoff at most US HUC8 subbasins. Based on this exhaustive calibration effort, it is now possible to accurately estimate the resources required for further model improvement across the entire conterminous United States. We anticipate that through this hydrologic parameter dataset, the repeated effort of fundamental data processing can be lessened, so that research efforts can emphasize the more challenging task of assessing climate change impacts. The pre-organized model parameter dataset will be provided to interested parties to support further hydro-climate impact assessment.
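
    The calibration loop described (many VIC runs scored against USGS monthly runoff for each HUC8) has a simple skeleton, sketched below with the Nash-Sutcliffe efficiency as the objective. The `run_model` argument is a hypothetical stand-in for one VIC simulation; the demo "model" is a one-parameter toy, not VIC.

```python
import numpy as np

def nse(simulated, observed):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit; values below 0
    are worse than simply predicting the observed mean."""
    simulated, observed = np.asarray(simulated), np.asarray(observed)
    return 1.0 - np.sum((simulated - observed) ** 2) / np.sum(
        (observed - observed.mean()) ** 2)

def calibrate(run_model, parameter_sets, observed):
    """Embarrassingly parallel calibration: score every candidate
    parameter set and keep the best.  run_model(params) stands in for
    one hydrologic simulation returning monthly runoff."""
    scores = [(nse(run_model(p), observed), p) for p in parameter_sets]
    return max(scores, key=lambda s: s[0])

# Demo with a synthetic "model": runoff = coefficient * precipitation
rng = np.random.default_rng(0)
precip = rng.gamma(2.0, 30.0, size=120)          # 10 years of monthly forcing
observed = 0.4 * precip + rng.normal(0, 5, 120)  # truth has coefficient 0.4
best_score, best_p = calibrate(lambda c: c * precip,
                               np.linspace(0.1, 0.9, 81), observed)
print(f"best NSE = {best_score:.3f} at coefficient = {best_p:.2f}")
```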

  6. Modeling Hydrologic Processes after Vegetation Restoration in an Urban Watershed with HEC-HMS

    NASA Astrophysics Data System (ADS)

    Stevenson, K.; Kinoshita, A. M.

    2017-12-01

    The San Diego River Watershed in California (USA) is highly urbanized, and stream channel geomorphology is directly affected by anthropogenic disturbances. Flooding and water quality concerns have led to an increased interest in improving the condition of urban waterways. Alvarado Creek, a 1200-meter section of a tributary to the San Diego River, will be used as a case study to understand the degree to which restoration efforts reduce the impacts of climate change and anthropogenic activities on hydrologic processes and water quality in urban stream ecosystems. In 2016, non-native vegetation (i.e., Washingtonia spp. (fan palm) and Phoenix canariensis (Canary Island palm)) and approximately 7257 kilograms of refuse were removed from the study reach. This research develops the United States Army Corps of Engineers Hydrologic Engineering Center's Hydrologic Modeling System (USACE HEC-HMS) using field-based data to model and predict the short- and long-term impacts of restoration on geomorphic and hydrologic processes. Observations include cross-sectional area, grain-size distributions, water quality, and continuous measurements of streamflow, temperature, and precipitation. Baseline and design storms are simulated before and after restoration. The model will be calibrated and validated using field observations. The design storms represent statistical likelihoods of storm occurrences, and the pre- and post-restoration hydrologic responses will be compared to evaluate the impact of vegetation and waste removal on runoff processes. Ultimately, model parameters will be transferred to other urban creeks in San Diego that may potentially undergo restoration. Modeling will be used to learn about the response trajectory of rainfall-runoff processes following restoration efforts in urban streams and to guide future management and restoration activities.
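
    For design-storm simulation of the kind described, HEC-HMS users commonly pair a rainfall hyetograph with a loss model such as the SCS curve-number method. The sketch below shows that runoff equation; the curve numbers and storm depth are illustrative, not Alvarado Creek parameters.

```python
def scs_runoff_depth(precip_in, curve_number):
    """Direct runoff depth (inches) from storm rainfall (inches) by the
    SCS curve-number method, one of the loss models available in HEC-HMS."""
    s = 1000.0 / curve_number - 10.0      # potential maximum retention (in)
    ia = 0.2 * s                          # initial abstraction
    if precip_in <= ia:
        return 0.0                        # all rainfall absorbed, no runoff
    return (precip_in - ia) ** 2 / (precip_in + 0.8 * s)

# Compare a heavily urbanized reach (high CN) with a restored, more
# pervious reach (lower CN) under the same hypothetical design storm.
for cn in (90, 75):
    q = scs_runoff_depth(precip_in=2.5, curve_number=cn)
    print(f"CN={cn}: {q:.2f} in of runoff from a 2.5 in design storm")
```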

  7. Systems thinking: what business modeling can do for public health.

    PubMed

    Williams, Warren; Lyalin, David; Wingo, Phyllis A

    2005-01-01

    Today's public health programs are complex business systems with multiple levels of collaborating federal, state, and local entities. The use of proven systems engineering modeling techniques to analyze, align, and streamline public health operations is in the beginning stages. The authors review the initial business modeling efforts in immunization and cancer registries and present a case to broadly apply business modeling approaches to analyze and improve public health processes.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adam, J. C.; Stephens, J. C.; Chung, Serena

    As managers of agricultural and natural resources are confronted with uncertainties in global change impacts, the complexities associated with the interconnected cycling of nitrogen, carbon, and water present daunting management challenges. Existing models provide detailed information on specific sub-systems (land, air, water, economics, etc.). An increasing awareness of the unintended consequences of management decisions resulting from the interconnectedness of these sub-systems, however, necessitates coupled regional earth system models (EaSMs). Decision makers' needs and priorities can be integrated into the model design and development processes to enhance decision-making relevance and "usability" of EaSMs. BioEarth is a current research initiative with a focus on the U.S. Pacific Northwest region that explores the coupling of multiple stand-alone EaSMs to generate usable information for resource decision-making. Direct engagement between model developers and non-academic stakeholders involved in resource and environmental management decisions throughout the model development process is a critical component of this effort. BioEarth utilizes a "bottom-up" approach, upscaling a catchment-scale model to basin and regional scales, as opposed to the "top-down" approach of downscaling global models utilized by most other EaSM efforts. This paper describes the BioEarth initiative and highlights opportunities and challenges associated with coupling multiple stand-alone models to generate usable information for agricultural and natural resource decision-making.

  9. Toward a Rational and Mechanistic Account of Mental Effort.

    PubMed

    Shenhav, Amitai; Musslick, Sebastian; Lieder, Falk; Kool, Wouter; Griffiths, Thomas L; Cohen, Jonathan D; Botvinick, Matthew M

    2017-07-25

    In spite of its familiar phenomenology, the mechanistic basis for mental effort remains poorly understood. Although most researchers agree that mental effort is aversive and stems from limitations in our capacity to exercise cognitive control, it is unclear what gives rise to those limitations and why they result in an experience of control as costly. The presence of these control costs also raises further questions regarding how best to allocate mental effort to minimize those costs and maximize the attendant benefits. This review explores recent advances in computational modeling and empirical research aimed at addressing these questions at the level of psychological process and neural mechanism, examining both the limitations to mental effort exertion and how we manage those limited cognitive resources. We conclude by identifying remaining challenges for theoretical accounts of mental effort as well as possible applications of the available findings to understanding the causes of and potential solutions for apparent failures to exert the mental effort required of us.

  10. The Impact of Attention on Judgments of Frequency and Duration

    PubMed Central

    Winkler, Isabell; Glauer, Madlen; Betsch, Tilmann; Sedlmeier, Peter

    2015-01-01

    Previous studies that examined human judgments of frequency and duration found an asymmetrical relationship: While frequency judgments were quite accurate and independent of stimulus duration, duration judgments were highly dependent upon stimulus frequency. A potential explanation for these findings is that the asymmetry is moderated by the amount of attention directed to the stimuli. In the current experiment, participants' attention was manipulated in two ways: (a) intrinsically, by varying the type and arousal potential of the stimuli (names, low-arousal and high-arousal pictures), and (b) extrinsically, by varying the physical effort participants expended during the stimulus presentation (by lifting a dumbbell vs. relaxing the arm). Participants processed stimuli with varying presentation frequencies and durations and were subsequently asked to estimate the frequency and duration of each stimulus. Sensitivity to duration increased for pictures in general, especially when processed under physical effort. A large effect of stimulus frequency on duration judgments was obtained for all experimental conditions, but a similar large effect of presentation duration on frequency judgments emerged only in the conditions that could be expected to draw high amounts of attention to the stimuli: when pictures were judged under high physical effort. Almost no difference in the mutual impact of frequency and duration was obtained for low-arousal or high-arousal pictures. The mechanisms underlying the simultaneous processing of frequency and duration are discussed with respect to existing models derived from animal research. Options for the extension of such models to human processing of frequency and duration are suggested. PMID:26000712

  11. The impact of attention on judgments of frequency and duration.

    PubMed

    Winkler, Isabell; Glauer, Madlen; Betsch, Tilmann; Sedlmeier, Peter

    2015-01-01

    Previous studies that examined human judgments of frequency and duration found an asymmetrical relationship: While frequency judgments were quite accurate and independent of stimulus duration, duration judgments were highly dependent upon stimulus frequency. A potential explanation for these findings is that the asymmetry is moderated by the amount of attention directed to the stimuli. In the current experiment, participants' attention was manipulated in two ways: (a) intrinsically, by varying the type and arousal potential of the stimuli (names, low-arousal and high-arousal pictures), and (b) extrinsically, by varying the physical effort participants expended during the stimulus presentation (by lifting a dumbbell vs. relaxing the arm). Participants processed stimuli with varying presentation frequencies and durations and were subsequently asked to estimate the frequency and duration of each stimulus. Sensitivity to duration increased for pictures in general, especially when processed under physical effort. A large effect of stimulus frequency on duration judgments was obtained for all experimental conditions, but a similar large effect of presentation duration on frequency judgments emerged only in the conditions that could be expected to draw high amounts of attention to the stimuli: when pictures were judged under high physical effort. Almost no difference in the mutual impact of frequency and duration was obtained for low-arousal or high-arousal pictures. The mechanisms underlying the simultaneous processing of frequency and duration are discussed with respect to existing models derived from animal research. Options for the extension of such models to human processing of frequency and duration are suggested.

  12. Advancing HIV/AIDS prevention among American Indians through capacity building and the community readiness model.

    PubMed

    Thurman, Pamela Jumper; Vernon, Irene S; Plested, Barbara

    2007-01-01

    Although HIV/AIDS prevention has presented challenges over the past 25 years, prevention does work! To be most effective, however, prevention must be specific to the culture and the nature of the community. Building the capacity of a community for prevention efforts is not an easy process. If capacity is to be sustained, it must be practical and utilize the resources that already exist in the community. Attitudes vary across communities; resources vary; political climates are varied and constantly changing. Communities are fluid: always changing, adapting, growing. They are "ready" for different things at different times. Readiness is a key issue! This article presents a model that has experienced a high level of success in building community capacity for effective prevention/intervention for HIV/AIDS and offers case studies for review. The Community Readiness Model provides both quantitative and qualitative information in a user-friendly structure that guides a community through the process of understanding the importance of the measure of readiness. The model identifies readiness-appropriate strategies, provides readiness scores for evaluation, and, most important, involves community stakeholders in the process. The article will demonstrate the importance of developing strategies consistent with readiness levels for more cost-effective and successful prevention efforts.

  13. Fusing modeling techniques to support domain analysis for reuse opportunities identification

    NASA Technical Reports Server (NTRS)

    Hall, Susan Main; Mcguire, Eileen

    1993-01-01

    Which are more useful to someone trying to understand the general design or high-level requirements of a system: functional modeling techniques or object-oriented graphical representations? For a recent domain analysis effort, the answer was a fusion of popular modeling techniques of both types. By using both functional and object-oriented techniques, the analysts involved were able to lean on their experience in function-oriented software development while taking advantage of the descriptive power available in object-oriented models. In addition, a base of familiar modeling methods permitted the group of mostly new domain analysts to learn the details of the domain analysis process while producing a quality product. This paper describes the background of this project and then provides a high-level definition of domain analysis. The majority of this paper focuses on the modeling method developed and utilized during this analysis effort.

  14. Verification of the Skorohod-Olevsky Viscous Sintering (SOVS) Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lester, Brian T.

    2017-11-16

    Sintering refers to a manufacturing process through which mechanically pressed bodies of ceramic (and sometimes metal) powders are heated to drive densification, thereby removing the inherent porosity of green bodies. As the body densifies through the sintering process, the ensuing material flow leads to macroscopic deformations of the specimen, and as such the final configuration differs from the initial. Therefore, as with any manufacturing step, there is substantial interest in understanding and being able to model the sintering process to predict deformation and residual stress. Efforts in this regard have been pursued for face seals, gear wheels, and consumer products like wash-basins. To understand the sintering process, a variety of modeling approaches have been pursued at different scales.

  15. Employee Participation in Non-Mandatory Professional Development--The Role of Core Proactive Motivation Processes

    ERIC Educational Resources Information Center

    Sankey, Kim S.; Machin, M. Anthony

    2014-01-01

    With a focus on the self-initiated efforts of employees, this study examined a model of core proactive motivation processes for participation in non-mandatory professional development (PD) within a proactive motivation framework using the Self-Determination Theory perspective. A multi-group SEM analysis conducted across 439 academic and general…

  16. Building a Market Simulation to Teach Business Process Analysis: Effects of Realism on Engaged Learning

    ERIC Educational Resources Information Center

    Peng, Jacob; Abdullah, Ira

    2018-01-01

    The emphases of student involvement and meaningful engagement in the learner-centered education model have created a new paradigm in an effort to generate a more engaging learning environment. This study examines the success of using different simulation platforms in creating a market simulation to teach business processes in the accounting…

  17. The Role of Acquired Shared Mental Models in Improving the Process of Team-Based Learning

    ERIC Educational Resources Information Center

    Johnson, Tristan E.; Khalil, Mohammed K.; Spector, J. Michael, Ed.

    2008-01-01

    Working in teams is an important aspect of learning in various educational settings. Although education has embraced instructional strategies that use multiple learners to facilitate learning, the benefits of team-based learning need to be substantiated. There are limited efforts to evaluate the efficacy of learning processes associated with…

  18. Enhancing the Scientific Process with Artificial Intelligence: Forest Science Applications

    Treesearch

    Ronald E. McRoberts; Daniel L. Schmoldt; H. Michael Rauscher

    1991-01-01

    Forestry, as a science, is a process for investigating nature. It consists of repeatedly cycling through a number of steps, including identifying knowledge gaps, creating knowledge to fill them, and organizing, evaluating, and delivering this knowledge. Much of this effort is directed toward creating abstract models of natural phenomena. The cognitive techniques of AI...

  19. Landscape pattern and ecological process in the Sierra Nevada

    Treesearch

    Dean L. Urban

    2004-01-01

    The Sierran Global Change Program in Sequoia-Kings Canyon and Yosemite National Parks includes a nearly decade-long integrated study of the interactions between climate, forest processes, and fire. This study is characterized by three recurring themes: (1) the use of systems-level models as a framework for integration and synthesis, (2) an effort to extrapolate an...

  20. Ni-Co laterite deposits

    USGS Publications Warehouse

    Marsh, Erin E.; Anderson, Eric D.

    2011-01-01

    Nickel-cobalt (Ni-Co) laterite deposits are an important source of nickel (Ni). Currently, there is a decline in magmatic Ni-bearing sulfide lode deposit resources. New efforts to develop an alternative source of Ni, particularly with improved metallurgy processes, make the Ni-Co laterites an important exploration target in anticipation of the future demand for Ni. This deposit model provides a general description of the geology and mineralogy of Ni-Co laterite deposits, and contains discussion of the influences of climate, geomorphology (relief), drainage, tectonism, structure, and protolith on the development of favorable weathering profiles. This model of Ni-Co laterite deposits represents part of the U.S. Geological Survey Mineral Resources Program's effort to update the existing models to be used for an upcoming national mineral resource assessment.

  1. Improving the forecast for biodiversity under climate change.

    PubMed

    Urban, M C; Bocedi, G; Hendry, A P; Mihoub, J-B; Pe'er, G; Singer, A; Bridle, J R; Crozier, L G; De Meester, L; Godsoe, W; Gonzalez, A; Hellmann, J J; Holt, R D; Huth, A; Johst, K; Krug, C B; Leadley, P W; Palmer, S C F; Pantel, J H; Schmitz, A; Zollner, P A; Travis, J M J

    2016-09-09

    New biological models are incorporating the realistic processes underlying biological responses to climate change and other human-caused disturbances. However, these more realistic models require detailed information, which is lacking for most species on Earth. Current monitoring efforts mainly document changes in biodiversity, rather than collecting the mechanistic data needed to predict future changes. We describe and prioritize the biological information needed to inform more realistic projections of species' responses to climate change. We also highlight how trait-based approaches and adaptive modeling can leverage sparse data to make broader predictions. We outline a global effort to collect the data necessary to better understand, anticipate, and reduce the damaging effects of climate change on biodiversity. Copyright © 2016, American Association for the Advancement of Science.

  2. An adaptable architecture for patient cohort identification from diverse data sources.

    PubMed

    Bache, Richard; Miles, Simon; Taweel, Adel

    2013-12-01

    We define and validate an architecture for systems that identify patient cohorts for clinical trials from multiple heterogeneous data sources. This architecture has an explicit query model capable of supporting temporal reasoning and expressing eligibility criteria independently of the representation of the data used to evaluate them. A key feature of the architecture is that queries defined according to the query model are both pre- and post-processed, and this is used to address both structural and semantic heterogeneity. The process of extracting the relevant clinical facts is separated from the process of reasoning about them. A specific instance of the query model is then defined and implemented. We show that this specific instance of the query model has wide applicability. We then describe how it is used to access three diverse data warehouses to determine patient counts. Although the proposed architecture requires greater effort to implement the query model than using just SQL and accessing a database management system directly, this effort is justified because the architecture supports both temporal reasoning and heterogeneous data sources. The query model only needs to be implemented once, no matter how many data sources are accessed. Each additional source requires only the implementation of a lightweight adaptor. The architecture has been used to implement a specific query model that can express complex eligibility criteria and access three diverse data warehouses, thus demonstrating the feasibility of this approach in dealing with temporal reasoning and data heterogeneity.
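
    The separation the architecture relies on (a query model implemented once, plus a lightweight adaptor per source) can be sketched as follows. All class names, the fact schema, and the toy criterion are invented for illustration, and the temporal handling here is far simpler than the paper's.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass

@dataclass
class Criterion:
    """One eligibility criterion in the source-independent query model,
    e.g. concept='HbA1c', operator='>', value=7.5, within_days=90."""
    concept: str
    operator: str
    value: float
    within_days: int          # temporal constraint on the clinical fact

class SourceAdaptor(ABC):
    """Per-warehouse adaptor: translates local data into a common shape
    so the query logic itself is written only once."""
    @abstractmethod
    def count_matching(self, criteria: list) -> int: ...

class InMemoryAdaptor(SourceAdaptor):
    """Toy source: a list of dicts {patient, concept, value, age_days}."""
    def __init__(self, facts):
        self.facts = facts

    def count_matching(self, criteria):
        eligible = None
        for c in criteria:
            hits = {f["patient"] for f in self.facts
                    if f["concept"] == c.concept
                    and f["age_days"] <= c.within_days
                    and (f["value"] > c.value if c.operator == ">"
                         else f["value"] < c.value)}
            eligible = hits if eligible is None else eligible & hits
        return len(eligible or set())

facts = [{"patient": 1, "concept": "HbA1c", "value": 8.1, "age_days": 30},
         {"patient": 2, "concept": "HbA1c", "value": 6.9, "age_days": 10}]
query = [Criterion("HbA1c", ">", 7.5, within_days=90)]
print(InMemoryAdaptor(facts).count_matching(query))   # -> 1
```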

  3. Software Transition Project Retrospectives and the Application of SEL Effort Estimation Model and Boehm's COCOMO to Complex Software Transition Projects

    NASA Technical Reports Server (NTRS)

    McNeill, Justin

    1995-01-01

    The Multimission Image Processing Subsystem (MIPS) at the Jet Propulsion Laboratory (JPL) has managed transitions of application software sets from one operating system and hardware platform to multiple operating systems and hardware platforms. As part of these transitions, cost estimates were generated from the personal experience of in-house developers and managers to calculate the total effort required for such projects. Productivity measures have been collected for two such transitions, one very large and the other relatively small in terms of source lines of code. These estimates used a cost estimation model similar to the Software Engineering Laboratory (SEL) Effort Estimation Model. Experience in transitioning software within JPL MIPS has uncovered a high incidence of interface complexity. Interfaces, both internal and external to individual software applications, have contributed to software transition project complexity, and thus to scheduling difficulties and larger-than-anticipated design work on software to be ported.
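
    Both models named take the familiar power-law form: effort = a * KSLOC^b. The sketch below uses the published Basic COCOMO coefficients (Boehm, 1981); the record's point is precisely that interface-heavy transition projects can exceed such nominal estimates, so treat the output as a baseline rather than a prediction.

```python
# Basic COCOMO coefficients: effort (person-months) = a * KSLOC**b
COCOMO_MODES = {
    "organic":      (2.4, 1.05),
    "semidetached": (3.0, 1.12),
    "embedded":     (3.6, 1.20),
}

def cocomo_effort(ksloc, mode="semidetached"):
    """Nominal development effort in person-months for `ksloc` thousand
    source lines of code.  Intermediate COCOMO would further multiply
    this by cost drivers, e.g. for interface complexity."""
    a, b = COCOMO_MODES[mode]
    return a * ksloc ** b

# Hypothetical 50 KSLOC transition project
print(f"{cocomo_effort(50):.0f} person-months nominal effort")
```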

  4. Participatory System Dynamics Modeling: Increasing Stakeholder Engagement and Precision to Improve Implementation Planning in Systems.

    PubMed

    Zimmerman, Lindsey; Lounsbury, David W; Rosen, Craig S; Kimerling, Rachel; Trafton, Jodie A; Lindley, Steven E

    2016-11-01

    Implementation planning typically incorporates stakeholder input. Quality improvement efforts provide data-based feedback regarding progress. Participatory system dynamics modeling (PSD) triangulates stakeholder expertise, data and simulation of implementation plans prior to attempting change. Frontline staff in one VA outpatient mental health system used PSD to examine policy and procedural "mechanisms" they believe underlie local capacity to implement evidence-based psychotherapies (EBPs) for PTSD and depression. We piloted the PSD process, simulating implementation plans to improve EBP reach. Findings indicate PSD is a feasible, useful strategy for building stakeholder consensus, and may save time and effort as compared to trial-and-error EBP implementation planning.
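
    At its core, a participatory system dynamics model of EBP reach is a small set of stocks and flows whose parameters stakeholders negotiate. A minimal Euler-integrated sketch follows; the one-stock structure and every number are invented for illustration, not drawn from the VA system studied.

```python
def simulate_reach(months=24, staff_hours=400.0, ebp_share=0.25,
                   hours_per_course=10.0, dropout_rate=0.08):
    """Minimal system-dynamics sketch (Euler integration, dt = 1 month):
    a stock of patients in an evidence-based psychotherapy (EBP), fed by
    newly started courses of care and drained by dropout."""
    in_ebp = 0.0
    history = []
    for _ in range(months):
        starts = staff_hours * ebp_share / hours_per_course  # inflow
        dropouts = dropout_rate * in_ebp                     # outflow
        in_ebp += starts - dropouts
        history.append(in_ebp)
    return history

# Simulate a policy experiment before attempting it in practice:
# double the share of clinic hours allocated to EBPs and compare reach.
base = simulate_reach()[-1]
plan = simulate_reach(ebp_share=0.5)[-1]
print(f"reach after 2 years: {base:.0f} -> {plan:.0f} patients")
```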

  5. Dynamics of the job search process: developing and testing a mediated moderation model.

    PubMed

    Sun, Shuhua; Song, Zhaoli; Lim, Vivien K G

    2013-09-01

    Taking a self-regulatory perspective, we develop a mediated moderation model explaining how within-person changes in job search efficacy and chronic regulatory focus interactively affect the number of job interview offers, and whether job search effort mediates the cross-level interactive effects. A sample of 184 graduating college students provided monthly reports of their job search activities over a period of 8 months. Findings supported the hypothesized relationships. Specifically, at the within-person level, job search efficacy was positively related to the number of interview offers for job seekers with strong prevention focus and negatively related to the number of interview offers for job seekers with strong promotion focus. Results show that job search effort mediated the moderated relationships. Findings enhance understanding of the complex self-regulatory processes underlying job search. PsycINFO Database Record (c) 2013 APA, all rights reserved
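
    The cross-level interaction at the heart of the model (a within-person predictor moderated by a person-level trait) is typically estimated with a multilevel model. The sketch below simulates a toy monthly panel and fits a random-intercept model with statsmodels; the data are fabricated and only one moderator is shown, so this illustrates the model form, not the authors' analysis.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Fabricated monthly panel: 100 job seekers x 8 months
rng = np.random.default_rng(1)
n, t = 100, 8
person = np.repeat(np.arange(n), t)
prevention = np.repeat(rng.normal(size=n), t)      # person-level moderator
efficacy = rng.normal(size=n * t)                  # within-person predictor
interviews = (0.2 * efficacy * prevention          # cross-level interaction
              + rng.normal(size=n * t))
data = pd.DataFrame(dict(person=person, prevention=prevention,
                         efficacy=efficacy, interviews=interviews))

# Random-intercept mixed model with the cross-level interaction term
fit = smf.mixedlm("interviews ~ efficacy * prevention",
                  data, groups=data["person"]).fit()
print(fit.summary())
```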

  6. Increasing Minority Research Participation Through Community Organization Outreach

    PubMed Central

    Alvarez, Roger A.; Vasquez, Elias; Mayorga, Carla C.; Feaster, Daniel J.; Mitrani, Victoria B.

    2008-01-01

    Recruitment is one of the most significant challenges in conducting research with ethnic minority populations. Establishing relationships with organizations that serve ethnic minority communities can facilitate recruitment. To create a successful recruitment process, a strategic plan of action is necessary prior to implementing community outreach efforts. For this study population of women who were HIV+ and recovering from substance abuse disorder, the authors found that establishing trust with community organizations that serve these women allows for a productive referral relationship. Although the majority of women in this study are African American, the authors were particularly challenged in recruiting Hispanic women. This article presents a recruitment process model that has facilitated our recruitment efforts and has helped the authors to organize, document, and evaluate their community outreach strategies. This model can be adopted and adapted by nurses and other health researchers to enhance engagement of minority populations. PMID:16829637

  7. Increasing minority research participation through community organization outreach.

    PubMed

    Alvarez, Roger A; Vasquez, Elias; Mayorga, Carla C; Feaster, Daniel J; Mitrani, Victoria B

    2006-08-01

    Recruitment is one of the most significant challenges in conducting research with ethnic minority populations. Establishing relationships with organizations that serve ethnic minority communities can facilitate recruitment. To create a successful recruitment process, a strategic plan of action is necessary prior to implementing community outreach efforts. For this study population of women who were HIV+ and recovering from substance abuse disorder, the authors found that establishing trust with community organizations that serve these women allows for a productive referral relationship. Although the majority of women in this study are African American, the authors were particularly challenged in recruiting Hispanic women. This article presents a recruitment process model that has facilitated our recruitment efforts and has helped the authors to organize, document, and evaluate their community outreach strategies. This model can be adopted and adapted by nurses and other health researchers to enhance engagement of minority populations.

  8. Experience Transitioning Models and Data at the NOAA Space Weather Prediction Center

    NASA Astrophysics Data System (ADS)

    Berger, Thomas

    2016-07-01

    The NOAA Space Weather Prediction Center has a long history of transitioning research data and models into operations, along with the validation activities this requires. The first stage in this process involves demonstrating that the capability has sufficient value to customers to justify the cost needed to transition it and to run it continuously and reliably in operations. Once the overall value is demonstrated, a substantial effort is then required to develop the operational software from the research codes. The next stage is to implement and test the software and product generation on the operational computers. Finally, effort must be devoted to establishing long-term measures of performance, maintaining the software, and working with forecasters, customers, and researchers to improve the operational capabilities over time. This multi-stage process of identifying, transitioning, and improving operational space weather capabilities will be discussed using recent examples. Plans for future activities will also be described.

  9. Memory mechanisms supporting syntactic comprehension.

    PubMed

    Caplan, David; Waters, Gloria

    2013-04-01

    Efforts to characterize the memory system that supports sentence comprehension have historically drawn extensively on short-term memory as a source of mechanisms that might apply to sentences. The focus of these efforts has changed significantly in the past decade. As a result of changes in models of short-term working memory (ST-WM) and developments in models of sentence comprehension, the effort to relate entire components of an ST-WM system, such as those in the model developed by Baddeley (Nature Reviews Neuroscience 4: 829-839, 2003), to sentence comprehension has largely been replaced by an effort to relate more specific mechanisms found in modern models of ST-WM to memory processes that support one aspect of sentence comprehension: the assignment of syntactic structure (parsing) and its use in determining sentence meaning (interpretation) during sentence comprehension. In this article, we present the historical background to recent studies of the memory mechanisms that support parsing and interpretation and review recent research into this relation. We argue that the results of this research do not converge on a set of mechanisms derived from ST-WM that apply to parsing and interpretation. We argue that the memory mechanisms supporting parsing and interpretation have features that characterize another memory system that has been postulated to account for skilled performance: long-term working memory. We propose a model of the relation of different aspects of parsing and interpretation to ST-WM and long-term working memory.

  10. AISI/DOE Advanced Process Control Program Vol. 3 of 6 Microstructure Engineering in Hot Strip Mills, Part 1 of 2: Integrated Mathematical Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    J.K. Brimacombe; I.V. Samarasekera; E.B. Hawbolt

    1999-07-31

    This report describes the work of developing an integrated model used to predict the thermal history, deformation, roll forces, microstructural evolution, and mechanical properties of steel strip in a hot-strip mill. This achievement results from a joint research effort that is part of the American Iron and Steel Institute's (AISI) Advanced Process Control Program, a collaboration between the U.S. DOE and fifteen North American steelmakers.

  11. Use of Forest Inventory and Analysis information in wildlife habitat modeling: a process for linking multiple scales

    Treesearch

    Thomas C. Edwards; Gretchen G. Moisen; Tracey S. Frescino; Joshua L. Lawler

    2002-01-01

    We describe our collective efforts to develop and apply methods for using FIA data to model forest resources and wildlife habitat. Our work demonstrates how flexible regression techniques, such as generalized additive models, can be linked with spatially explicit environmental information for the mapping of forest type and structure. We illustrate how these maps of...

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gregory Reaman

    The initiative will enable the COG Biopathology Center (Biospecimen Repository), the Molecular Genetics Laboratory, and other participating reference laboratories to upload large data sets to the eRDES. The capability streamlines data currency and accuracy, allowing the centers to export data from local systems and import the defined data to the eRDES. The process will aid in following the best practices defined by the Office of Biorepository and Biospecimen Research (OBBR) and the Group Banking Committee (GBC). The initiative allows for batch import and export, a data validation process and reporting mechanism, and a model for other labs to incorporate. All objectives are complete. The solutions provided and the defined process eliminate dual data entry, resulting in data consistency. The audit trail capabilities allow for complete tracking of the data exchange between laboratories and the Statistical Data Center (SDC). The impact is directly on time and effort. In return, the process will save money and improve the data utilized by the COG. Ongoing efforts include implementing new technologies to further enhance the solutions and process currently in place. Web Services and Reporting Services are technologies that have become industry standards and will allow for further harmonization with caBIG (cancer Biomedical Informatics Grid). Additional testing and implementation of the model for other laboratories is in process.

  13. On the use of Empirical Data to Downscale Non-scientific Scepticism About Results From Complex Physical Based Models

    NASA Astrophysics Data System (ADS)

    Germer, S.; Bens, O.; Hüttl, R. F.

    2008-12-01

    The scepticism of non-scientific local stakeholders about results from complex physically based models is a major problem for the development and implementation of local climate change adaptation measures. This scepticism originates from the high complexity of such models: local stakeholders perceive complex models as black boxes, as it is impossible to grasp all the underlying assumptions and mathematically formulated processes at a glance. The use of physically based models is, however, indispensable for studying complex underlying processes and predicting future environmental changes. The increase in climate change adaptation efforts following the release of the latest IPCC report indicates that communicating facts about what has already changed is an appropriate tool to trigger climate change adaptation. Therefore we suggest increasing the practice of empirical data analysis in addition to modelling efforts. The analysis of time series can generate results that are easier for non-scientific stakeholders to comprehend. Temporal trends and seasonal patterns of selected hydrological parameters (precipitation, evapotranspiration, groundwater levels, and river discharge) can be identified, and the dependence of trends and seasonal patterns on land use, topography, and soil type can be highlighted. A discussion of lag times between the hydrological parameters can increase the awareness of local stakeholders of delayed environmental responses.
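
    A standard nonparametric choice for the kind of trend analysis advocated here (though not named in the abstract) is the Mann-Kendall test. A minimal sketch, without the tie correction a production implementation would need:

```python
import numpy as np
from scipy.stats import norm

def mann_kendall(series):
    """Mann-Kendall trend test (no tie correction): returns the S
    statistic and a two-sided p-value.  Positive S suggests an
    increasing trend, negative S a decreasing one."""
    x = np.asarray(series)
    n = len(x)
    s = sum(np.sign(x[j] - x[i])
            for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
    p = 2.0 * (1.0 - norm.cdf(abs(z)))
    return s, p

# Synthetic example: 30 years of groundwater levels with a weak decline
rng = np.random.default_rng(42)
levels = 30.0 - 0.05 * np.arange(30) + rng.normal(0, 0.3, 30)
s, p = mann_kendall(levels)
print(f"S = {s:.0f}, p = {p:.3f}")
```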

  14. Modeling of materials supply, demand and prices

    NASA Technical Reports Server (NTRS)

    1982-01-01

    The societal, economic, and policy tradeoffs associated with materials processing and utilization are discussed. The materials system provides the materials engineer with the system analysis required to formulate sound materials processing, utilization, and resource development policies and strategies. The materials system simulation and modeling research program, including assessments of materials substitution dynamics, public policy implications, and materials process economics, was expanded. This effort includes several collaborative programs with materials engineers, economists, and policy analysts. The technical and socioeconomic issues of materials recycling, input-output analysis, and technological change and productivity are examined. The major thrust areas in materials systems research are outlined.

  15. Research accomplished at the Knowledge Based Systems Lab: IDEF3, version 1.0

    NASA Technical Reports Server (NTRS)

    Mayer, Richard J.; Menzel, Christopher P.; Mayer, Paula S. D.

    1991-01-01

    An overview is presented of the foundations and content of the evolving IDEF3 process flow and object state description capture method. This method is currently in beta test. Ongoing efforts in the formulation of formal semantics models for descriptions captured in the outlined form, and in the actual application of this method, can be expected to cause an evolution in the method language. A language is described for the representation of process- and object-state-centered system descriptions. IDEF3 is a scenario-driven process flow modeling methodology created specifically for these types of descriptive activities.

  16. Enabling model checking for collaborative process analysis: from BPMN to `Network of Timed Automata'

    NASA Astrophysics Data System (ADS)

    Mallek, Sihem; Daclin, Nicolas; Chapurlat, Vincent; Vallespir, Bruno

    2015-04-01

    Interoperability is a prerequisite for partners involved in performing collaboration. As a consequence, the lack of interoperability is now considered a major obstacle. The research work presented in this paper aims to develop an approach for specifying and verifying a set of interoperability requirements to be satisfied by each partner in the collaborative process, prior to process implementation. To enable the verification of these interoperability requirements, it is necessary first and foremost to generate a model of the targeted collaborative process; for this research effort, the standardised language BPMN 2.0 is used. Afterwards, a verification technique must be introduced, and model checking is the preferred option herein. This paper focuses on the application of the model checker UPPAAL to verify interoperability requirements for the given collaborative process model. First, this step entails translating the collaborative process model from BPMN into the UPPAAL modelling language, a 'Network of Timed Automata'. Second, it becomes necessary to formalise the interoperability requirements into properties in the dedicated UPPAAL query language, i.e. the temporal logic TCTL.
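
    To give a flavour of the translation step, the sketch below maps a BPMN task with duration bounds onto an UPPAAL-style timed-automaton template, represented as a plain Python dict, and notes the kind of TCTL queries the interoperability requirements become. The encoding is a toy of our own; a real BPMN 2.0 translation must also handle gateways, events, and message flows.

```python
# Toy translation sketch: each BPMN task becomes a three-location timed
# automaton (Idle -> Busy -> Done) with a clock invariant bounding its
# duration and channel synchronisations ordering the tasks.

def task_to_automaton(task, min_d, max_d):
    """Return an UPPAAL-like template as a plain dict (illustrative)."""
    return {
        "name": task,
        "clock": "x",
        "locations": {
            "Idle": {},
            "Busy": {"invariant": f"x <= {max_d}"},   # must finish in time
            "Done": {},
        },
        "edges": [
            ("Idle", "Busy", {"reset": "x = 0", "sync": f"start_{task}?"}),
            ("Busy", "Done", {"guard": f"x >= {min_d}", "sync": f"end_{task}!"}),
        ],
    }

network = [task_to_automaton("Quote", 1, 3), task_to_automaton("Ship", 2, 5)]
print(network[0]["locations"]["Busy"])

# Interoperability requirements would then be checked as TCTL queries
# over the network, for example:
#   A[] not deadlock        -- the collaboration can always progress
#   E<> Ship.Done           -- shipping is reachable
```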

  17. Investigating the Role of Mesoscale Processes and Ice Dynamics in Carbon and Iron Fluxes in a Changing Amundsen Sea (INSPIRE)

    NASA Astrophysics Data System (ADS)

    Mu, L.; Yager, P. L.; St-Laurent, P.; Dinniman, M.; Oliver, H.; Stammerjohn, S. E.; Sherrell, R. M.; Hofmann, E. E.

    2016-02-01

    The Amundsen Sea, in the remote South Pacific sector of the Southern Ocean, is one of the least studied Antarctic continental shelf regions. It shares key processes with other West Antarctic shelf regions, such as formation of a recurring polynya, important ice shelf-ocean linkages, and high biological production, but has unique characteristics as well. The Amundsen Sea Polynya (ASP) features 1) large intrusions of modified Circumpolar Deep Water (mCDW) onto the continental shelf, 2) the fastest melting ice sheets in Antarctica, 3) the most productive coastal polynya and a large atmospheric CO2 sink, and 4) very rapid declines in seasonal sea ice. Here we report on a new effort for this region that unites independent, state-of-the-art modeling and field data synthesis efforts to address important unanswered questions about carbon fluxes, iron supply, and climate sensitivity in this key region of the coastal Antarctic. Following on the heels of a highly successful oceanographic field program, the Amundsen Sea Polynya International Research Expedition (ASPIRE), which sampled the ASP with high spatial resolution during the onset of the enormous phytoplankton bloom of 2011, the INSPIRE project is a collaboration between ASPIRE senior scientists and an experienced team of physical and biogeochemical modelers who can use ASPIRE field data to both validate and extend the capabilities of an existing Regional Ocean Modeling System (ROMS) for the Amundsen Sea. This new effort will add biology and biogeochemistry (including features potentially unique to the ASP region) to an existing physical model, allowing us to address key questions about bloom mechanisms and climate sensitivity that could not be answered by field campaigns or modeling alone. This project is expected to generate new insights and hypotheses that will ultimately guide sampling strategies of future field efforts investigating how present and future climate change impacts this important region of the world.

  18. Predictive and Prognostic Models: Implications for Healthcare Decision-Making in a Modern Recession

    PubMed Central

    Vogenberg, F. Randy

    2009-01-01

    Various modeling tools have been developed to address the lack of standardized processes that incorporate the perspectives of all healthcare stakeholders. Such models can assist in the decision-making process aimed at achieving specific clinical outcomes, as well as guide the allocation of healthcare resources and reduce costs. The current efforts in Congress to change the way healthcare is financed, reimbursed, and delivered have made the incorporation of modeling tools into clinical decision-making all the more important. Prognostic and predictive models are particularly relevant to clinical decision-making, with implications for payers, patients, and providers. The use of these models is likely to increase as providers and patients seek to improve the clinical decision process to achieve better outcomes while reducing overall healthcare costs. PMID:25126292

  19. Updates on Modeling the Water Cycle with the NASA Ames Mars Global Climate Model

    NASA Technical Reports Server (NTRS)

    Kahre, M. A.; Haberle, R. M.; Hollingsworth, J. L.; Montmessin, F.; Brecht, A. S.; Urata, R.; Klassen, D. R.; Wolff, M. J.

    2017-01-01

    Global Circulation Models (GCMs) have made steady progress in simulating the current Mars water cycle. It is now widely recognized that clouds are a critical component that can significantly affect the nature of the simulated water cycle. Two processes in particular are key to implementing clouds in a GCM: the microphysical processes of formation and dissipation, and their radiative effects on heating/cooling rates. Together, these processes alter the thermal structure, change the dynamics, and regulate inter-hemispheric transport. We have made considerable progress representing these processes in the NASA Ames GCM, particularly in the presence of radiatively active water ice clouds. We present the current state of our group's water cycle modeling efforts, show results from selected simulations, highlight some of the issues, and discuss avenues for further investigation.

  20. An Evaluation of the "Your Style of Learning and Thinking" Inventory.

    ERIC Educational Resources Information Center

    Fitzgerald, D.; Hattie, J. A.

    1983-01-01

    Assesses an effort by E.P. Torrance and co-workers to introduce a test they claim is based on a model of cognitive style that relates creative processes to hemispheric dominance. Challenges Torrance's theories and research methods and asserts that a model relating left and right brain dominance to creativity has little value. (GC)

  1. Implications of Bandura's Observational Learning Theory for a Competency Based Teacher Education Model.

    ERIC Educational Resources Information Center

    Hartjen, Raymond H.

    Albert Bandura of Stanford University has proposed four component processes to his theory of observational learning: a) attention, b) retention, c) motor reproduction, and d) reinforcement and motivation. This study represents one phase of an effort to relate modeling and observational learning theory to teacher training. The problem of this study…

  2. The Lunar Phases Project: A Mental Model-Based Observational Project for Undergraduate Nonscience Majors

    ERIC Educational Resources Information Center

    Meyer, Angela Osterman; Mon, Manuel J.; Hibbard, Susan T.

    2011-01-01

    We present our Lunar Phases Project, an ongoing effort utilizing students' actual observations within a mental model building framework to improve student understanding of the causes and process of the lunar phases. We implement this project with a sample of undergraduate, nonscience major students enrolled in a midsized public university located…

  3. CrossTalk. The Journal of Defense Software Engineering. Volume 23, Number 6, Nov/Dec 2010

    DTIC Science & Technology

    2010-11-01

    Model of architectural design. It guides developers to apply effort to their software architecture commensurate with the risks faced by...Driven Model is the promotion of risk to prominence. It is possible to apply the Risk-Driven Model to essentially any software development process...succeed without any planned architecture work, while many high-risk projects would fail without it. The Risk-Driven Model walks a middle path

  4. How the Brain Decides When to Work and When to Rest: Dissociation of Implicit-Reactive from Explicit-Predictive Computational Processes

    PubMed Central

    Meyniel, Florent; Safra, Lou; Pessiglione, Mathias

    2014-01-01

    A pervasive cost-benefit problem is how to allocate effort over time, i.e., deciding when to work and when to rest. An economic decision perspective would suggest that duration of effort is determined beforehand, depending on expected costs and benefits. However, the literature on exercise performance emphasizes that decisions are made on the fly, depending on physiological variables. Here, we propose and validate a general model of effort allocation that integrates these two views. In this model, a single variable, termed cost evidence, accumulates during effort and dissipates during rest, triggering effort cessation and resumption when reaching bounds. We assumed that such a basic mechanism could explain implicit adaptation, whereas the latent parameters (slopes and bounds) could be amenable to explicit anticipation. A series of behavioral experiments manipulating effort duration and difficulty was conducted in a total of 121 healthy humans to dissociate implicit-reactive from explicit-predictive computations. Results show 1) that effort and rest durations are adapted on the fly to variations in cost-evidence level, 2) that the cost-evidence fluctuations driving the behavior do not match explicit ratings of exhaustion, and 3) that actual difficulty impacts effort duration whereas expected difficulty impacts rest duration. Taken together, our findings suggest that cost evidence is implicitly monitored online, with an accumulation rate proportional to actual task difficulty. In contrast, cost-evidence bounds and dissipation rate might be adjusted in anticipation, depending on explicit task difficulty. PMID:24743711
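
    The accumulate-to-bound mechanism described above is easy to make concrete. The sketch below is ours, not the authors' code; the slope and bound values are arbitrary assumptions, and it reproduces the qualitative prediction that a steeper accumulation rate (a harder task) shortens effort bouts:

      # Cost evidence rises during effort and falls during rest; bounds
      # trigger switching. Slopes and bounds are arbitrary assumptions.
      def simulate(t_max=300.0, dt=0.1, accum=1.0, dissip=0.5,
                   upper=10.0, lower=1.0):
          t, evidence, working = 0.0, 0.0, True
          bouts, bout_start = [], 0.0
          while t < t_max:
              evidence += (accum if working else -dissip) * dt
              if working and evidence >= upper:         # exhaustion bound
                  bouts.append((round(bout_start, 1), round(t, 1)))
                  working = False
              elif not working and evidence <= lower:   # recovered: resume
                  working, bout_start = True, t
              t += dt
          return bouts

      # A steeper accumulation slope (harder task) shortens effort bouts.
      print(simulate(accum=2.0)[:2])
      print(simulate(accum=1.0)[:2])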

  5. A Multimodal Approach to Counselor Supervision.

    ERIC Educational Resources Information Center

    Ponterotto, Joseph G.; Zander, Toni A.

    1984-01-01

    Represents an initial effort to apply Lazarus's multimodal approach to a model of counselor supervision. Includes continuously monitoring the trainee's behavior, affect, sensations, images, cognitions, interpersonal functioning, and when appropriate, biological functioning (diet and drugs) in the supervisory process. (LLL)

  6. Building a Unified Information Network.

    ERIC Educational Resources Information Center

    Avram, Henriette D.

    1988-01-01

    Discusses cooperative efforts between research organizations and libraries to create a national information network. Topics discussed include the Linked System Project (LSP); technical processing versus reference and research functions; Open Systems Interconnection (OSI) Reference Model; the National Science Foundation Network (NSFNET); and…

  7. Defining Scenarios: Linking Integrated Models, Regional Concerns, and Stakeholders

    NASA Astrophysics Data System (ADS)

    Hartmann, H. C.; Stewart, S.; Liu, Y.; Mahmoud, M.

    2007-05-01

    Scenarios are important tools for long-term planning, and there is great interest in using integrated models in scenario studies. However, scenario definition and assessment are creative, as well as scientific, efforts. Using facilitated creative processes, we have worked with stakeholders to define regionally significant scenarios that encompass a broad range of hydroclimatic, socioeconomic, and institutional dimensions. The regional scenarios subsequently inform the definition of local scenarios that work with context-specific integrated models that, individually, can address only a subset of overall regional complexity. Based on concerns of stakeholders in the semi-arid US Southwest, we prioritized three dimensions that are especially important, yet highly uncertain, for long-term planning: hydroclimatic conditions (increased variability, persistent drought), development patterns (urban consolidation, distributed rural development), and the nature of public institutions (stressed, proactive). Linking across real-world decision contexts and integrated modeling efforts poses challenges of creatively connecting the conceptual models held by both the research and stakeholder communities.

  8. Modeling and Analysis of Power Processing Systems (MAPPS), initial phase 2

    NASA Technical Reports Server (NTRS)

    Yu, Y.; Lee, F. C.; Wangenheim, H.; Warren, D.

    1977-01-01

    The overall objective of the program is to provide the engineering tools to reduce the analysis, design, and development effort, and thus the cost, in achieving the required performances for switching regulators and dc-dc converter systems. The program was both tutorial and application oriented. Various analytical methods were described in detail and supplemented with examples, and those with standardization appeals were reduced into computer-based subprograms. Major program efforts included those concerning small and large signal control-dependent performance analysis and simulation, control circuit design, power circuit design and optimization, system configuration study, and system performance simulation. Techniques including discrete time domain, conventional frequency domain, Lagrange multiplier, nonlinear programming, and control design synthesis were employed in these efforts. To enhance interactive conversation between the modeling and analysis subprograms and the user, a working prototype of the Data Management Program was also developed to facilitate expansion as future subprogram capabilities increase.

  9. Results of the Workshop on Impact Cratering: Bridging the Gap Between Modeling and Observations

    NASA Technical Reports Server (NTRS)

    Herrick, Robert (Editor); Pierazzo, Elisabetta (Editor)

    2003-01-01

    On February 7-9, 2003, approximately 60 scientists gathered at the Lunar and Planetary Institute in Houston, Texas, for a workshop devoted to improving knowledge of the impact cratering process. We (co-conveners Elisabetta Pierazzo and Robert Herrick) both focus research efforts on studying the impact cratering process, but the former specializes in numerical modeling while the latter draws inferences from observations of planetary craters. Significant work has been done in several key areas of impact studies over the past several years, but in many respects there seems to be a disconnect between the groups employing different approaches, in particular modeling versus observations. The goal in convening this workshop was to bring together these disparate groups to have an open dialogue for the purposes of answering outstanding questions about the impact process and setting future research directions. We were successful in getting participation from most of the major research groups studying the impact process. Participants gathered from five continents with research specialties ranging from numerical modeling to field geology, and from small-scale experimentation and geochemical sample analysis to seismology and remote sensing. With the assistance of the scientific advisory committee (Bevan French, Kevin Housen, Bill McKinnon, Jay Melosh, and Mike Zolensky), the workshop was divided into a series of sessions devoted to different aspects of the cratering process. Each session was opened by two invited talks, one given by a specialist in numerical or experimental modeling approaches, and the other by a specialist in geological, geophysical, or geochemical observations. Shorter invited and contributed talks filled out the sessions, which were then concluded with an open discussion time. All modelers were requested to address the question of what observations would better constrain their models, and all observationists were requested to discuss how their observations can constrain modeling efforts.

  10. Quantitative computational models of molecular self-assembly in systems biology

    PubMed Central

    Thomas, Marcus; Schwartz, Russell

    2017-01-01

    Molecular self-assembly is the dominant form of chemical reaction in living systems, yet efforts at systems biology modeling are only beginning to appreciate the need for and challenges to accurate quantitative modeling of self-assembly. Self-assembly reactions are essential to nearly every important process in cell and molecular biology and handling them is thus a necessary step in building comprehensive models of complex cellular systems. They present exceptional challenges, however, to standard methods for simulating complex systems. While the general systems biology world is just beginning to deal with these challenges, there is an extensive literature dealing with them for more specialized self-assembly modeling. This review will examine the challenges of self-assembly modeling, nascent efforts to deal with these challenges in the systems modeling community, and some of the solutions offered in prior work on self-assembly specifically. The review concludes with some consideration of the likely role of self-assembly in the future of complex biological system models more generally. PMID:28535149

  11. Quantitative computational models of molecular self-assembly in systems biology.

    PubMed

    Thomas, Marcus; Schwartz, Russell

    2017-05-23

    Molecular self-assembly is the dominant form of chemical reaction in living systems, yet efforts at systems biology modeling are only beginning to appreciate the need for and challenges to accurate quantitative modeling of self-assembly. Self-assembly reactions are essential to nearly every important process in cell and molecular biology and handling them is thus a necessary step in building comprehensive models of complex cellular systems. They present exceptional challenges, however, to standard methods for simulating complex systems. While the general systems biology world is just beginning to deal with these challenges, there is an extensive literature dealing with them for more specialized self-assembly modeling. This review will examine the challenges of self-assembly modeling, nascent efforts to deal with these challenges in the systems modeling community, and some of the solutions offered in prior work on self-assembly specifically. The review concludes with some consideration of the likely role of self-assembly in the future of complex biological system models more generally.

  12. Proposal for constructing an advanced software tool for planetary atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Sims, Michael H.; Podolak, Esther; Mckay, Christopher P.; Thompson, David E.

    1990-01-01

    Scientific model building can be a time intensive and painstaking process, often involving the development of large and complex computer programs. Despite the effort involved, scientific models cannot easily be distributed and shared with other scientists. In general, implemented scientific models are complex, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We believe that advanced software techniques can facilitate both the model building and model sharing process. We propose to construct a scientific modeling software tool that serves as an aid to the scientist in developing and using models. The proposed tool will include an interactive intelligent graphical interface and a high level, domain specific, modeling language. As a testbed for this research, we propose development of a software prototype in the domain of planetary atmospheric modeling.

  13. Review and Synopsis of Natural and Human Controls on Fluvial Channel Processes in the Arid West

    DTIC Science & Technology

    2007-09-01

    Parallel to ongoing efforts to revise the U.S. Army Corps of Engineers wetland delineation manual for support of Section 404 under the Clean Water Act, the Corps has initiated an effort to develop an Ordinary High Water (OHW) delineation manual. The Arid West region is dominated by watersheds with...features. With a better understanding of the stream dynamics associated with regulated ordinary events, the Corps is now developing OHW functional models

  14. The Case of Middle and High School Chemistry Teachers Implementing Technology: Using the Concerns-Based Adoption Model to Assess Change Processes

    ERIC Educational Resources Information Center

    Gabby, Shwartz; Avargil, Shirly; Herscovitz, Orit; Dori, Yehudit Judy

    2017-01-01

    An ongoing process of reforming chemical education in middle and high schools in our country introduced the technology-enhanced learning environment (TELE) to chemistry classes. Teachers are encouraged to integrate technology into pedagogical practices in meaningful ways to promote 21st century skills; however, this effort is often hindered by…

  15. Microdesigning of Lightweight/High Strength Ceramic Materials

    DTIC Science & Technology

    1989-07-31

    Keywords: Ceramics, Composite Materials, Colloidal Processing. ...to identify key processing parameters that affect the microstructure of the composite material. The second section describes experimental results in...results of the significant theoretical effort made in our group. Theoretical models of particle-particle interaction, particle-polymer interaction

  16. Land Processes Distributed Active Archive Center (LP DAAC) 25th Anniversary Recognition "A Model for Government Partnerships". LP DAAC "History and a Look Forward"

    NASA Technical Reports Server (NTRS)

    Behnke, Jeanne; Doescher, Chris

    2015-01-01

    This presentation discusses 25 years of interactions between NASA and the USGS to manage a Land Processes Distributed Active Archive Center (LPDAAC) for the purpose of providing users access to NASA's rich collection of Earth Science data. The presentation addresses challenges, efforts, and performance metrics.

  17. Studies for the Loss of Atomic and Molecular Species from Io

    NASA Technical Reports Server (NTRS)

    Smyth, William H.

    1998-01-01

    Continued effort is reported to improve the emission rates of various emission lines for atomic oxygen and sulfur. Atomic hydrogen has been included as a new species in the neutral cloud model. The pertinent lifetime processes for hydrogen in the plasma torus and the relevant excitation processes for H Lyman-alpha emission in Io's atmosphere are discussed.

  18. Teaching adaptive leadership to family medicine residents: what? why? how?

    PubMed

    Eubank, Daniel; Geffken, Dominic; Orzano, John; Ricci, Rocco

    2012-09-01

    Health care reform calls for patient-centered medical homes built around whole person care and healing relationships. Efforts to transform primary care practices and deliver these qualities have been challenging. This study describes one Family Medicine residency's efforts to develop an adaptive leadership curriculum and use coaching as a teaching method to address this challenge. We review literature that describes a parallel between the skills underlying such care and those required for adaptive leadership. We address two questions: What is leadership? Why focus on adaptive leadership? We then present a synthesis of leadership theories as a set of process skills that lead to organization learning through effective work relationships and adaptive leadership. Four models of the learning process needed to acquire such skills are explored. Coaching is proposed as a teaching method useful for going beyond information transfer to create the experiential learning necessary to acquire the process skills. Evaluations of our efforts to date are summarized. We discuss key challenges to implementing such a curriculum and propose that teaching adaptive leadership is feasible but difficult in the current medical education and practice contexts.

  19. Detailed Modeling of Distillation Technologies for Closed-Loop Water Recovery Systems

    NASA Technical Reports Server (NTRS)

    Allada, Rama Kumar; Lange, Kevin E.; Anderson, Molly S.

    2011-01-01

    Detailed chemical process simulations are a useful tool in designing and optimizing complex systems and architectures for human life support. Dynamic and steady-state models of these systems help contrast the interactions of various operating parameters and hardware designs, which become extremely useful in trade-study analyses. NASA's Exploration Life Support technology development project recently made use of such models to complement a series of tests on different waste water distillation systems. This paper presents efforts to develop chemical process simulations for three technologies: the Cascade Distillation System (CDS), the Vapor Compression Distillation (VCD) system and the Wiped-Film Rotating Disk (WFRD) using the Aspen Custom Modeler and Aspen Plus process simulation tools. The paper discusses system design, modeling details, and modeling results for each technology and presents some comparisons between the model results and recent test data. Following these initial comparisons, some general conclusions and forward work are discussed.

  20. Dynamic Modeling of Yield and Particle Size Distribution in Continuous Bayer Precipitation

    NASA Astrophysics Data System (ADS)

    Stephenson, Jerry L.; Kapraun, Chris

    Process engineers at Alcoa's Point Comfort refinery are using a dynamic model of the Bayer precipitation area to evaluate options in operating strategies. The dynamic model, a joint development effort between Point Comfort and the Alcoa Technical Center, predicts process yields, particle size distributions and occluded soda levels for various flowsheet configurations of the precipitation and classification circuit. In addition to rigorous heat, material and particle population balances, the model includes mechanistic kinetic expressions for particle growth and agglomeration and semi-empirical kinetics for nucleation and attrition. The kinetic parameters have been tuned to Point Comfort's operating data, with excellent matches between the model results and plant data. The model is written for the ACSL dynamic simulation program with specifically developed input/output graphical user interfaces to provide a user-friendly tool. Features such as a seed charge controller enhance the model's usefulness for evaluating operating conditions and process control approaches.
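
    The population-balance core of such a model can be illustrated with a deliberately stripped-down sketch. This is not Alcoa's ACSL model: it keeps only size-independent growth and nucleation (agglomeration and attrition omitted), and all parameter values are assumptions:

      import numpy as np

      # Discretized population balance with size-independent growth G and
      # nucleation B0 into the smallest bin (agglomeration and attrition
      # omitted); first-order upwind in size, CFL-stable time step.
      n_bins, L_max = 100, 200e-6           # bins, max crystal size [m]
      dL = L_max / n_bins
      n = np.zeros(n_bins)                  # number density [#/m^4]
      G, B0 = 1e-9, 1e8                     # growth [m/s], nucleation [#/m^3/s]
      dt = 0.2 * dL / G                     # time step [s]

      for _ in range(5000):
          flux = G * n                      # growth flux across bin faces
          n[1:] -= dt * (flux[1:] - flux[:-1]) / dL
          n[0] += dt * (B0 - flux[0]) / dL  # nucleation source, growth sink

      mids = np.linspace(dL / 2, L_max - dL / 2, n_bins)
      print(f"number-mean size ~ {(n * mids).sum() / n.sum() * 1e6:.0f} um")

    The full model adds heat and material balances plus agglomeration and attrition kernels on top of this same bin structure.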

  1. System-level modeling of acetone-butanol-ethanol fermentation.

    PubMed

    Liao, Chen; Seo, Seung-Oh; Lu, Ting

    2016-05-01

    Acetone-butanol-ethanol (ABE) fermentation is a metabolic process of clostridia that produces bio-based solvents including butanol. It is enabled by an underlying metabolic reaction network and modulated by cellular gene regulation and environmental cues. Mathematical modeling has served as a valuable strategy to facilitate the understanding, characterization and optimization of this process. In this review, we highlight recent advances in system-level, quantitative modeling of ABE fermentation. We begin with an overview of integrative processes underlying the fermentation. Next we survey modeling efforts including early simple models, models with a systematic metabolic description, and those incorporating metabolism through simple gene regulation. Particular focus is given to a recent system-level model that integrates the metabolic reactions, gene regulation and environmental cues. We conclude by discussing the remaining challenges and future directions towards predictive understanding of ABE fermentation. © FEMS 2016. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  2. “Using Statistical Comparisons between SPartICus Cirrus Microphysical Measurements, Detailed Cloud Models, and GCM Cloud Parameterizations to Understand Physical Processes Controlling Cirrus Properties and to Improve the Cloud Parameterizations”

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Woods, Sarah

    2015-12-01

    The dual objectives of this project were to improve our basic understanding of the processes that control cirrus microphysical properties and to improve the representation of these processes in parameterizations. A major effort in the proposed research was to integrate, calibrate, and better understand the uncertainties in all of these measurements.

  3. Modeling and analysis of chill and fill processes for the cryogenic storage and transfer engineering development unit tank

    NASA Astrophysics Data System (ADS)

    Hedayat, A.; Cartagena, W.; Majumdar, A. K.; LeClair, A. C.

    2016-03-01

    NASA's future missions may require long-term storage and transfer of cryogenic propellants. The Engineering Development Unit (EDU), a NASA in-house effort supported by both Marshall Space Flight Center (MSFC) and Glenn Research Center, is a cryogenic fluid management (CFM) test article that primarily serves as a manufacturing pathfinder and a risk reduction task for a future CFM payload. The EDU test article comprises a flight-like tank, internal components, insulation, and attachment struts. The EDU is designed to perform integrated passive thermal control performance testing with liquid hydrogen (LH2) in a space-like vacuum environment. A series of tests, with LH2 as a testing fluid, was conducted at Test Stand 300 at MSFC during the summer of 2014. The objective of this effort was to develop a thermal/fluid model for evaluating the thermodynamic behavior of the EDU tank during the chill and fill processes. The Generalized Fluid System Simulation Program, an MSFC in-house general-purpose computer program for flow network analysis, was utilized to model and simulate the chill and fill portion of the testing. The model contained the LH2 supply source, feed system, EDU tank, and vent system. The test setup, modeling description, and comparison of model predictions with the test data are presented.
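
    For orientation only, the thermal side of a chilldown calculation can be reduced to a single lumped wall node. The sketch below is not GFSSP; the wall mass, film-boiling coefficient, and target temperature are illustrative assumptions:

      # Lumped-capacitance chilldown sketch: one wall node cooled by LH2
      # through an assumed constant film-boiling coefficient.
      m, cp = 150.0, 500.0      # wall mass [kg], specific heat [J/kg/K]
      h, A = 120.0, 12.0        # film coefficient [W/m^2/K], wetted area [m^2]
      T, T_lh2 = 295.0, 20.3    # initial wall and LH2 saturation temps [K]

      t, dt = 0.0, 1.0
      while T > 30.0:           # assumed "chilled-in" target temperature
          T -= h * A * (T - T_lh2) / (m * cp) * dt   # explicit Euler step
          t += dt
      print(f"chilldown reached in ~{t / 60:.1f} min")

    A flow-network code like GFSSP replaces the constant coefficient with regime-dependent boiling correlations and tracks the vented hydrogen as well.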

  4. Modeling and Analysis of Chill and Fill Processes for the EDU Tank

    NASA Technical Reports Server (NTRS)

    Hedayat, A.; Cartagena, W.; Majumdar, A. K.; Leclair, A. C.

    2015-01-01

    NASA's future missions may require long-term storage and transfer of cryogenic propellants. The Engineering Development Unit (EDU), a NASA in-house effort supported by both Marshall Space Flight Center (MSFC) and Glenn Research Center (GRC), is a Cryogenic Fluid Management (CFM) test article that primarily serves as a manufacturing pathfinder and a risk reduction task for a future CFM payload. The EDU test article comprises a flight-like tank, internal components, insulation, and attachment struts. The EDU is designed to perform integrated passive thermal control performance testing with liquid hydrogen in a space-like vacuum environment. A series of tests, with liquid hydrogen as a testing fluid, was conducted at Test Stand 300 at MSFC during the summer of 2014. The objective of this effort was to develop a thermal/fluid model for evaluating the thermodynamic behavior of the EDU tank during the chill and fill processes. Generalized Fluid System Simulation Program (GFSSP), an MSFC in-house general-purpose computer program for flow network analysis, was utilized to model and simulate the chill and fill portion of the testing. The model contained the liquid hydrogen supply source, feed system, EDU tank, and vent system. The modeling description and comparison of model predictions with the test data will be presented in the final paper.

  5. Basin-scale hydrogeologic modeling

    NASA Astrophysics Data System (ADS)

    Person, Mark; Raffensperger, Jeff P.; Ge, Shemin; Garven, Grant

    1996-02-01

    Mathematical modeling of coupled groundwater flow, heat transfer, and chemical mass transport at the sedimentary basin scale has been increasingly used by Earth scientists studying a wide range of geologic processes including the formation of excess pore pressures, infiltration-driven metamorphism, heat flow anomalies, nuclear waste isolation, hydrothermal ore genesis, sediment diagenesis, basin tectonics, and petroleum generation and migration. These models have provided important insights into the rates and pathways of groundwater migration through basins, the relative importance of different driving mechanisms for fluid flow, and the nature of coupling between the hydraulic, thermal, chemical, and stress regimes. The mathematical descriptions of basin transport processes, the analytical and numerical solution methods employed, and the application of modeling to sedimentary basins around the world are the subject of this review paper. The special considerations made to represent coupled transport processes at the basin scale are emphasized. Future modeling efforts will probably utilize three-dimensional descriptions of transport processes, incorporate greater information regarding natural geological heterogeneity, further explore coupled processes, and involve greater field applications.
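
    A minimal illustration of one such coupling, with all geometry and property values assumed: solve steady 1-D Darcy flow through a two-conductivity basin, then ask whether the resulting discharge is large enough for advection to dominate conductive heat transport (the thermal Peclet number):

      import numpy as np

      # Steady 1-D Darcy flow across a two-conductivity basin; the
      # resulting discharge sets a thermal Peclet number that weighs
      # advective against conductive heat transport. Values are assumed.
      nx, L = 101, 1e4                      # nodes, basin length [m]
      x = np.linspace(0.0, L, nx)
      dx = x[1] - x[0]
      K = np.where(x < L / 2, 1e-5, 1e-6)   # hydraulic conductivity [m/s]
      h0, h1 = 100.0, 0.0                   # boundary heads [m]

      # q is uniform in steady 1-D flow: total head drop = q * sum(dx/K).
      q = (h0 - h1) / np.sum(dx / K[:-1])
      h = h0 - np.cumsum(np.r_[0.0, q * dx / K[:-1]])   # head profile [m]

      Pe = 4.18e6 * q * L / 2.5             # rho*c of water 4.18e6, lambda 2.5
      print(f"h(L) = {h[-1]:.1f} m, q = {q:.2e} m/s, Pe = {Pe:.0f}")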

  6. Crops In Silico: Generating Virtual Crops Using an Integrative and Multi-scale Modeling Platform.

    PubMed

    Marshall-Colon, Amy; Long, Stephen P; Allen, Douglas K; Allen, Gabrielle; Beard, Daniel A; Benes, Bedrich; von Caemmerer, Susanne; Christensen, A J; Cox, Donna J; Hart, John C; Hirst, Peter M; Kannan, Kavya; Katz, Daniel S; Lynch, Jonathan P; Millar, Andrew J; Panneerselvam, Balaji; Price, Nathan D; Prusinkiewicz, Przemyslaw; Raila, David; Shekar, Rachel G; Shrivastava, Stuti; Shukla, Diwakar; Srinivasan, Venkatraman; Stitt, Mark; Turk, Matthew J; Voit, Eberhard O; Wang, Yu; Yin, Xinyou; Zhu, Xin-Guang

    2017-01-01

    Multi-scale models can facilitate whole plant simulations by linking gene networks, protein synthesis, metabolic pathways, physiology, and growth. Whole plant models can be further integrated with ecosystem, weather, and climate models to predict how various interactions respond to environmental perturbations. These models have the potential to fill in missing mechanistic details and generate new hypotheses to prioritize directed engineering efforts. Outcomes will potentially accelerate improvements in crop yield and sustainability and increase future food security. It is time for a paradigm shift in plant modeling, from largely isolated efforts to a connected community that takes advantage of advances in high performance computing and mechanistic understanding of plant processes. Tools for guiding future crop breeding and engineering, understanding the implications of discoveries at the molecular level for whole plant behavior, and improved prediction of plant and ecosystem responses to the environment are urgently needed. The purpose of this perspective is to introduce Crops in silico (cropsinsilico.org), an integrative and multi-scale modeling platform, as one solution that combines isolated modeling efforts toward the generation of virtual crops, which is open and accessible to the entire plant biology community. The major challenges involved both in the development and deployment of a shared, multi-scale modeling platform, which are summarized in this prospectus, were recently identified during the first Crops in silico Symposium and Workshop.

  7. Crops In Silico: Generating Virtual Crops Using an Integrative and Multi-scale Modeling Platform

    PubMed Central

    Marshall-Colon, Amy; Long, Stephen P.; Allen, Douglas K.; Allen, Gabrielle; Beard, Daniel A.; Benes, Bedrich; von Caemmerer, Susanne; Christensen, A. J.; Cox, Donna J.; Hart, John C.; Hirst, Peter M.; Kannan, Kavya; Katz, Daniel S.; Lynch, Jonathan P.; Millar, Andrew J.; Panneerselvam, Balaji; Price, Nathan D.; Prusinkiewicz, Przemyslaw; Raila, David; Shekar, Rachel G.; Shrivastava, Stuti; Shukla, Diwakar; Srinivasan, Venkatraman; Stitt, Mark; Turk, Matthew J.; Voit, Eberhard O.; Wang, Yu; Yin, Xinyou; Zhu, Xin-Guang

    2017-01-01

    Multi-scale models can facilitate whole plant simulations by linking gene networks, protein synthesis, metabolic pathways, physiology, and growth. Whole plant models can be further integrated with ecosystem, weather, and climate models to predict how various interactions respond to environmental perturbations. These models have the potential to fill in missing mechanistic details and generate new hypotheses to prioritize directed engineering efforts. Outcomes will potentially accelerate improvements in crop yield and sustainability and increase future food security. It is time for a paradigm shift in plant modeling, from largely isolated efforts to a connected community that takes advantage of advances in high performance computing and mechanistic understanding of plant processes. Tools for guiding future crop breeding and engineering, understanding the implications of discoveries at the molecular level for whole plant behavior, and improved prediction of plant and ecosystem responses to the environment are urgently needed. The purpose of this perspective is to introduce Crops in silico (cropsinsilico.org), an integrative and multi-scale modeling platform, as one solution that combines isolated modeling efforts toward the generation of virtual crops, which is open and accessible to the entire plant biology community. The major challenges involved both in the development and deployment of a shared, multi-scale modeling platform, which are summarized in this prospectus, were recently identified during the first Crops in silico Symposium and Workshop. PMID:28555150

  8. LANDSCAPE ASSESSMENT TOOLS FOR WATERSHED CHARACTERIZATION

    EPA Science Inventory

    A combination of process-based, empirical and statistical models has been developed to assist states in their efforts to assess water quality, locate impairments over large areas, and calculate TMDL allocations. By synthesizing outputs from a number of these tools, LIPS demonstr...

  9. Self Evaluation of Organizations.

    ERIC Educational Resources Information Center

    Pooley, Richard C.

    Evaluation within human service organizations is defined in terms of accepted evaluation criteria, with reasonable expectations shown and structured into a model of systematic evaluation practice. The evaluation criteria of program effort, performance, adequacy, efficiency and process mechanisms are discussed, along with measurement information…

  10. Policy Change and the National Essential Medicines List Development Process in Brazil between 2000 and 2014: Has the Essential Medicine Concept been Abandoned?

    PubMed

    Osorio-de-Castro, Claudia G S; Azeredo, Thiago B; Pepe, Vera L E; Lopes, Luciane C; Yamauti, Sueli; Godman, Brian; Gustafsson, Lars L

    2018-04-01

    Brazil has had a National Essential Medicines List (EML) since 1964. From 2000 to 2010, five consecutive evidence-based editions were produced, building on the essential medicine concept. In 2012, the government changed course to establish a new paradigm, introducing adoption of new medicines as the main aim within the recommendation process. The objective of this article is to report on efforts to develop Brazil's national EML and on policy changes from 2000 to 2014, discussing results, challenges, and perspectives. The Brazilian EML's history and development process were traced through legislation, minutes, reports, and legal ordinances from 2000 to 2014. The Brazilian EML and the WHO Model Lists were compared using the Anatomical Therapeutic Chemical system. Overlap between lists was verified, and linear trends were produced. Type of membership, inclusion criteria, procedures, flow, and listed medicines varied greatly between the selection committees acting before and after 2012. Paradigm-changing legislation of 2012 aimed at linking list compliance to public financing produced (i) greater importance given to political and administrative stakeholders, (ii) increasing trends in the number of medicines over the years, (iii) decreased use of the WHO Model List as a reference, and (iv) substitution of the essential medicines list review-and-update process by an adoption decision output. Other issues remained unchanged. Insufficient efforts at list implementation, such as a lack of physician education, had consequences for the health system. Substantial efforts were made to produce and update the list from 2000 to 2014. However, continuous and intense health litigation calls the effectiveness of the process's outcomes into question. © 2017 Nordic Association for the Publication of BCPT (former Nordic Pharmacological Society).

  11. Sensitivity to cognitive effort mediates psychostimulant effects on a novel rodent cost/benefit decision-making task.

    PubMed

    Cocker, Paul J; Hosking, Jay G; Benoit, James; Winstanley, Catharine A

    2012-07-01

    Amotivational states and insufficient recruitment of mental effort have been observed in a variety of clinical populations, including depression, traumatic brain injury, post-traumatic stress disorder, and attention deficit hyperactivity disorder. Previous rodent models of effort-based decision making have utilized physical costs whereas human studies of effort are primarily cognitive in nature, and it is unclear whether the two types of effortful decision making are underpinned by the same neurobiological processes. We therefore designed a novel rat cognitive effort task (rCET) based on the 5-choice serial reaction time task, a well-validated measure of attention and impulsivity. Within each trial of the rCET, rats are given the choice between an easy or hard visuospatial discrimination, and successful hard trials are rewarded with double the number of sugar pellets. Similar to previous human studies, stable individual variation in choice behavior was observed, with 'workers' choosing hard trials significantly more than their 'slacker' counterparts. Whereas workers 'slacked off' in response to administration of amphetamine and caffeine, slackers 'worked harder' under amphetamine, but not caffeine. Conversely, these stimulants increased motor impulsivity in all animals. Ethanol did not affect animals' choice but invigorated behavior. In sum, we have shown for the first time that rats are differentially sensitive to cognitive effort when making decisions, independent of other processes such as impulsivity, and these baseline differences can influence the cognitive response to psychostimulants. Such findings could inform our understanding of impairments in effort-based decision making and contribute to treatment development.

  12. Sensitivity to Cognitive Effort Mediates Psychostimulant Effects on a Novel Rodent Cost/Benefit Decision-Making Task

    PubMed Central

    Cocker, Paul J; Hosking, Jay G; Benoit, James; Winstanley, Catharine A

    2012-01-01

    Amotivational states and insufficient recruitment of mental effort have been observed in a variety of clinical populations, including depression, traumatic brain injury, post-traumatic stress disorder, and attention deficit hyperactivity disorder. Previous rodent models of effort-based decision making have utilized physical costs whereas human studies of effort are primarily cognitive in nature, and it is unclear whether the two types of effortful decision making are underpinned by the same neurobiological processes. We therefore designed a novel rat cognitive effort task (rCET) based on the 5-choice serial reaction time task, a well-validated measure of attention and impulsivity. Within each trial of the rCET, rats are given the choice between an easy or hard visuospatial discrimination, and successful hard trials are rewarded with double the number of sugar pellets. Similar to previous human studies, stable individual variation in choice behavior was observed, with ‘workers' choosing hard trials significantly more than their ‘slacker' counterparts. Whereas workers ‘slacked off' in response to administration of amphetamine and caffeine, slackers ‘worked harder' under amphetamine, but not caffeine. Conversely, these stimulants increased motor impulsivity in all animals. Ethanol did not affect animals' choice but invigorated behavior. In sum, we have shown for the first time that rats are differentially sensitive to cognitive effort when making decisions, independent of other processes such as impulsivity, and these baseline differences can influence the cognitive response to psychostimulants. Such findings could inform our understanding of impairments in effort-based decision making and contribute to treatment development. PMID:22453140

  13. Online low-field NMR spectroscopy for process control of an industrial lithiation reaction-automated data analysis.

    PubMed

    Kern, Simon; Meyer, Klas; Guhl, Svetlana; Gräßer, Patrick; Paul, Andrea; King, Rudibert; Maiwald, Michael

    2018-05-01

    Monitoring specific chemical properties is the key to chemical process control. Today, mainly optical online methods are applied, which require time- and cost-intensive calibration effort. NMR spectroscopy, being a direct comparison method that requires no calibration, has a high potential for enabling closed-loop process control while exhibiting short set-up times. Compact NMR instruments make NMR spectroscopy accessible in industrial and rough environments for process monitoring and advanced process control strategies. We present a fully automated data analysis approach which is completely based on physically motivated spectral models as first-principles information (indirect hard modeling, IHM) and applied it to a given pharmaceutical lithiation reaction in the framework of the European Union's Horizon 2020 project CONSENS. Online low-field NMR (LF NMR) data was analyzed by IHM with low calibration effort, compared to a multivariate PLS-R (partial least squares regression) approach, and both validated using online high-field NMR (HF NMR) spectroscopy. Graphical abstract: NMR sensor module for monitoring of the aromatic coupling of 1-fluoro-2-nitrobenzene (FNB) with aniline to 2-nitrodiphenylamine (NDPA) using lithium-bis(trimethylsilyl)amide (Li-HMDS) in continuous operation. Online 43.5 MHz low-field NMR (LF) was compared to 500 MHz high-field NMR spectroscopy (HF) as reference method.
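
    The PLS-R side of the comparison is straightforward to sketch. The example below is not the CONSENS pipeline: it builds synthetic two-analyte spectra from Lorentzian peaks (positions, widths, and noise level are assumptions) and calibrates a scikit-learn PLS regression on them:

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression

      # PLS-R calibration on synthetic two-analyte spectra built from
      # Lorentzian peaks; peak positions, widths, and noise are assumed.
      rng = np.random.default_rng(0)
      ppm = np.linspace(0.0, 10.0, 500)

      def lorentz(center, width=0.15):
          return width**2 / ((ppm - center)**2 + width**2)

      conc = rng.uniform(0.1, 1.0, size=(60, 2))           # two analytes
      X = conc @ np.vstack([lorentz(3.2), lorentz(7.1)])   # mixture spectra
      X += rng.normal(0.0, 0.01, X.shape)                  # instrument noise

      pls = PLSRegression(n_components=2).fit(X[:40], conc[:40])
      pred = pls.predict(X[40:])                           # held-out samples
      rmse = float(np.sqrt(((pred - conc[40:])**2).mean()))
      print(f"held-out RMSE: {rmse:.3f}")

    The contrast with IHM is that the PLS-R model must be retrained from calibration mixtures, whereas IHM fits physically parameterized peak models directly to each spectrum.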

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Kandler A; Santhanagopalan, Shriram; Yang, Chuanbo

    Computer models are helping to accelerate the design and validation of next generation batteries and provide valuable insights not possible through experimental testing alone. Validated 3-D physics-based models exist for predicting electrochemical performance, thermal and mechanical response of cells and packs under normal and abuse scenarios. The talk describes present efforts to make the models better suited for engineering design, including improving their computation speed, developing faster processes for model parameter identification including under aging, and predicting the performance of a proposed electrode material recipe a priori using microstructure models.
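
    As a toy version of the parameter-identification step mentioned above (not the speakers' toolchain), one can fit the resistances and time constant of a one-RC equivalent-circuit cell model to a synthetic voltage response; all cell values below are assumptions:

      import numpy as np
      from scipy.optimize import least_squares

      # Fit R0, R1, tau1 of a one-RC equivalent-circuit cell model to a
      # synthetic constant-current voltage trace; all values are assumed.
      t = np.linspace(0.0, 100.0, 200)       # time [s]
      I, ocv = 5.0, 3.7                      # discharge current [A], OCV [V]
      true = (0.02, 0.015, 12.0)             # R0 [ohm], R1 [ohm], tau1 [s]

      def v_model(p):
          R0, R1, tau1 = p
          return ocv - I * R0 - I * R1 * (1.0 - np.exp(-t / tau1))

      rng = np.random.default_rng(3)
      v_meas = v_model(true) + rng.normal(0.0, 1e-3, t.size)

      fit = least_squares(lambda p: v_model(p) - v_meas, x0=[0.01, 0.01, 5.0])
      print("identified [R0, R1, tau1]:", np.round(fit.x, 4))

    Identification under aging repeats this fit over a cell's life so the parameter trajectories themselves become inputs to the design models.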

  15. Systems Engineering Metrics: Organizational Complexity and Product Quality Modeling

    NASA Technical Reports Server (NTRS)

    Mog, Robert A.

    1997-01-01

    Innovative organizational complexity and product quality models applicable to performance metrics for NASA-MSFC's Systems Analysis and Integration Laboratory (SAIL) missions and objectives are presented. An intensive research effort focuses on the synergistic combination of stochastic process modeling, nodal and spatial decomposition techniques, organizational and computational complexity, systems science and metrics, chaos, and proprietary statistical tools for accelerated risk assessment. This is followed by the development of a preliminary model, which is uniquely applicable and robust for quantitative purposes. Exercise of the preliminary model using a generic system hierarchy and the AXAF-I architectural hierarchy is provided. The Kendall test for positive dependence provides an initial verification and validation of the model. Finally, the research and development of the innovation is revisited, prior to peer review. This research and development effort results in near-term, measurable SAIL organizational and product quality methodologies, enhanced organizational risk assessment and evolutionary modeling results, and improved statistical quantification of SAIL productivity interests.

  16. Boolean Dynamic Modeling Approaches to Study Plant Gene Regulatory Networks: Integration, Validation, and Prediction.

    PubMed

    Velderraín, José Dávila; Martínez-García, Juan Carlos; Álvarez-Buylla, Elena R

    2017-01-01

    Mathematical models based on dynamical systems theory are well-suited tools for the integration of available molecular experimental data into coherent frameworks in order to propose hypotheses about the cooperative regulatory mechanisms driving developmental processes. Computational analysis of the proposed models using well-established methods enables testing the hypotheses by contrasting predictions with observations. Within such framework, Boolean gene regulatory network dynamical models have been extensively used in modeling plant development. Boolean models are simple and intuitively appealing, ideal tools for collaborative efforts between theorists and experimentalists. In this chapter we present protocols used in our group for the study of diverse plant developmental processes. We focus on conceptual clarity and practical implementation, providing directions to the corresponding technical literature.
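
    The flavor of these protocols can be conveyed with a minimal synchronous Boolean network; the three genes and update rules below are hypothetical, and an exhaustive sweep of the (finite) state space recovers the attractors:

      from itertools import product

      # Hypothetical three-gene network with synchronous Boolean updates;
      # attractors are found by iterating every initial state to a cycle.
      def update(state):
          a, b, c = state
          return (not c,           # A is repressed by C
                  a and not c,     # B requires A and is repressed by C
                  b)               # C is activated by B

      def trajectory_cycle(state, max_steps=64):
          seen = []
          for _ in range(max_steps):
              if state in seen:                    # deterministic: cycle found
                  return tuple(seen[seen.index(state):])
              seen.append(state)
              state = update(state)
          raise RuntimeError("no cycle found")

      def canonical(cycle):                        # rotation-invariant form
          i = cycle.index(min(cycle))
          return cycle[i:] + cycle[:i]

      attractors = {canonical(trajectory_cycle(s))
                    for s in product([False, True], repeat=3)}
      for cyc in sorted(attractors):
          print(f"{len(cyc)}-state attractor:",
                [tuple(map(int, s)) for s in cyc])

    In practice the update rules are derived from published regulatory interactions, and the recovered attractors are compared against observed cell types or expression profiles.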

  17. Scenario Development Process at the Vertical Motion Simulator

    NASA Technical Reports Server (NTRS)

    Reardon, Scott E.; Beard, Steven D.; Lewis, Emily

    2017-01-01

    There has been a significant effort within the simulation community to standardize many aspects of flight simulation. More recently, an effort has begun to develop a formal scenario definition language for aviation. A working group within the AIAA Modeling and Simulation Technical Committee has been created to develop a standard aviation scenario definition language, though much of the initial effort has been tailored to training simulators. Research and development (R&D) simulators, like the Vertical Motion Simulator (VMS), and training simulators have different missions and thus have different scenario requirements. The purpose of this paper is to highlight some of the unique tasks and scenario elements used at the VMS so they may be captured by scenario standardization efforts. The VMS most often performs handling qualities studies and transfer of training studies. Three representative handling qualities simulation studies and two transfer of training simulation studies are described in this paper. Unique scenario elements discussed in this paper included special out-the-window (OTW) targets and environmental conditions, motion system parameters, active inceptor parameters, and configurable vehicle math model parameters.

  18. Understanding HIV disclosure: A review and application of the Disclosure Processes Model

    PubMed Central

    Chaudoir, Stephenie R.; Fisher, Jeffrey D.; Simoni, Jane M.

    2014-01-01

    HIV disclosure is a critical component of HIV/AIDS prevention and treatment efforts, yet the field lacks a comprehensive theoretical framework with which to study how HIV-positive individuals make decisions about disclosing their serostatus and how these decisions affect them. Recent theorizing in the context of the Disclosure Processes Model has suggested that the disclosure process consists of antecedent goals, the disclosure event itself, mediating processes and outcomes, and a feedback loop. In this paper, we apply this new theoretical framework to HIV disclosure in order to review the current state of the literature, identify gaps in existing research, and highlight the implications of the framework for future work in this area. PMID:21514708

  19. Biological-Physical Coupling in the Gulf of Maine: Satellite and Model Studies of Phytoplankton Variability

    NASA Technical Reports Server (NTRS)

    Thomas, Andrew C.; Chai, F.; Townsend, D. W.; Xue, H.

    2002-01-01

    The goals of this project were to acquire, process, QC, archive and analyze SeaWiFS chlorophyll fields over the Gulf of Maine and Scotia Shelf region. The focus of the analysis effort was to calculate and quantify seasonality and interannual variability of SeaWiFS-measured phytoplankton biomass in the study area and compare these to physical forcing and hydrography. An additional focus within this effort was on regional differences within the heterogeneous biophysical regions of the Gulf of Maine / Scotia Shelf. Overall goals were approached through the combined use of SeaWiFS and AVHRR data and the development of a coupled biology-physical numerical model.
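
    The seasonality/interannual split used in such an analysis reduces to a monthly climatology and its anomalies. A sketch on synthetic data (the sinusoidal seasonal cycle and noise level are assumptions, not SeaWiFS values):

      import numpy as np

      # Monthly climatology + anomaly split for a synthetic chlorophyll
      # series; seasonal amplitude, phase, and noise are assumed.
      rng = np.random.default_rng(2)
      years = 5
      month = np.tile(np.arange(12), years)
      chl = (1.0 + 0.8 * np.sin(2 * np.pi * (month - 3) / 12)
             + rng.normal(0, 0.1, 12 * years))      # [mg/m^3], synthetic

      climatology = np.array([chl[month == m].mean() for m in range(12)])
      anomaly = chl - climatology[month]            # interannual component

      print("peak climatology month (0=Jan):", int(np.argmax(climatology)))
      print("anomaly std [mg/m^3]:", round(float(anomaly.std()), 3))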

  20. The single-zone numerical model of homogeneous charge compression ignition engine performance

    NASA Astrophysics Data System (ADS)

    Fedyanov, E. A.; Itkis, E. M.; Kuzmin, V. N.; Shumskiy, S. N.

    2017-02-01

    A single-zone model of methane-air mixture combustion in a Homogeneous Charge Compression Ignition (HCCI) engine was developed. First modeling efforts resulted in the selection of the detailed kinetic reaction mechanism most appropriate for the conditions of the HCCI process. Then the model was extended to simulate the performance of a four-stroke engine and supplemented with physically reasonable adjustment functions. Validation of calculations against experimental data showed acceptable agreement.
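
    A constant-volume surrogate for the end-of-compression state conveys the single-zone idea. The sketch below is not the authors' model: it assumes Cantera with its bundled GRI-Mech 3.0 methane mechanism, a fixed volume rather than piston motion, and illustrative initial conditions:

      import cantera as ct

      # Single-zone, constant-volume autoignition of a methane-air charge
      # with detailed kinetics, standing in for the state near top dead
      # center of an HCCI cycle. Initial T and P are assumed values.
      gas = ct.Solution('gri30.yaml')     # GRI-Mech 3.0, ships with Cantera
      gas.TPX = 1100.0, 40e5, 'CH4:1, O2:2, N2:7.52'

      reactor = ct.IdealGasReactor(gas)
      sim = ct.ReactorNet([reactor])

      t = 0.0
      while t < 0.05:                     # integrate up to 50 ms
          t += 1e-5
          sim.advance(t)
          if reactor.T > 1500.0:          # crude ignition criterion
              print(f"ignition at ~{t * 1e3:.2f} ms, T = {reactor.T:.0f} K")
              break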

  1. Multi-Model Comparison of Lateral Boundary Contributions to ...

    EPA Pesticide Factsheets

    As the National Ambient Air Quality Standards (NAAQS) for ozone become more stringent, there has been growing attention to characterizing the contributions of ozone from outside the US to ozone concentrations within the US, and the associated uncertainties. The third phase of the Air Quality Model Evaluation International Initiative (AQMEII3) provides an opportunity to investigate this issue through the combined efforts of multiple research groups in the US and Europe. The model results cover a range of representations of chemical and physical processes, vertical and horizontal resolutions, and meteorological fields to drive the regional chemical transport models (CTMs), all of which are important components of model uncertainty (Solazzo and Galmarini, 2016). In AQMEII3, all groups were asked to track the contribution of ozone from the lateral boundary through the use of chemically inert tracers. Though the inert tracer method tends to overestimate the impact of ozone boundary conditions compared with other methods such as chemically reactive tracers and source apportionment (Baker et al., 2015), the method takes the least effort to implement in different models, and is thus useful in highlighting and understanding the process-level differences amongst the models. In this study, results from four models were included (CMAQ driven by WRF, CAMx driven by WRF, CMAQ driven by CCLM, DEHM driven by WRF). At each site, the distribution of daily maximum 8-hour ozone, and the corre
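
    The inert-tracer bookkeeping this approach relies on can be shown in one dimension: advect an ozone-like species and an inert tracer with identical winds and boundary inflow, and the tracer's share of the burden estimates the boundary contribution. Everything below (winds, production rate, concentrations) is an illustrative assumption:

      import numpy as np

      # 1-D upwind advection of an ozone-like species and an inert
      # boundary tracer on identical winds; the tracer's share of the
      # burden estimates the lateral-boundary contribution.
      nx, u, dx, dt = 100, 10.0, 1000.0, 50.0   # cells, wind [m/s], [m], [s]
      o3 = np.full(nx, 30.0)                    # initial ozone [ppb]
      trc = np.zeros(nx)                        # inert boundary tracer [ppb]
      bc = 40.0                                 # inflow concentration [ppb]

      for _ in range(400):
          o3[1:] -= u * dt / dx * (o3[1:] - o3[:-1])
          trc[1:] -= u * dt / dx * (trc[1:] - trc[:-1])
          o3[0], trc[0] = bc, bc                # lateral boundary inflow
          o3 += 0.5 * dt / 3600.0               # crude local production [ppb/h]

      print(f"estimated boundary share: {trc.mean() / o3.mean():.0%}")

    Because the tracer is chemically inert, it misses destruction of imported ozone, which is why the method tends to overestimate the boundary impact.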

  2. Nuclear Explosion Monitoring Advances and Challenges

    NASA Astrophysics Data System (ADS)

    Baker, G. E.

    2015-12-01

    We address the state-of-the-art in areas important to monitoring, current challenges, specific efforts that illustrate approaches addressing shortcomings in capabilities, and additional approaches that might be helpful. The exponential increase in the number of events that must be screened as magnitude thresholds decrease presents one of the greatest challenges. Ongoing efforts to exploit repeat seismic events using waveform correlation, subspace methods, and empirical matched field processing hold as much "game-changing" promise as anything being done, and further efforts to develop and apply such methods efficiently are critical. Greater accuracy of travel time, signal loss, and full waveform predictions is still needed to better locate and discriminate seismic events. Important developments include methods to model velocities using multiple types of data; to model attenuation with better separation of source, path, and site effects; and to model focusing and defocusing of surface waves. Current efforts to model higher frequency full waveforms are likely to improve source characterization while more effective estimation of attenuation from ambient noise holds promise for filling in gaps. Censoring in attenuation modeling is a critical problem to address. Quantifying uncertainty of discriminants is key to their operational use. Efforts to do so for moment tensor (MT) inversion are particularly important, and fundamental progress on the statistics of MT distributions is the most important advance needed in the near term in this area. Source physics is seeing great progress through theoretical, experimental, and simulation studies. The biggest need is to accurately predict the effects of source conditions on seismic generation. Uniqueness is the challenge here. Progress will depend on studies that probe what distinguishes mechanisms, rather than whether one of many possible mechanisms is consistent with some set of observations.
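
    Of the screening approaches mentioned, waveform correlation is the simplest to sketch: slide a master-event template along continuous data and flag windows whose normalized correlation exceeds a threshold. The data here are synthetic, and the noise level and threshold are assumptions:

      import numpy as np

      # Template-correlation detector: flag windows whose Pearson
      # correlation with a master-event waveform exceeds a threshold.
      rng = np.random.default_rng(1)
      fs = 100                                       # samples per second
      template = (np.sin(2 * np.pi * 5 * np.arange(0, 1, 1 / fs))
                  * np.hanning(fs))                  # 1 s master waveform

      stream = rng.normal(0.0, 0.2, 60 * fs)         # one minute of noise
      stream[30 * fs:31 * fs] += template            # buried repeat event

      def corr_detect(data, tmpl, thresh=0.6):
          m = len(tmpl)
          tz = (tmpl - tmpl.mean()) / tmpl.std()
          hits = []
          for i in range(len(data) - m):
              w = data[i:i + m]
              cc = np.dot(tz, (w - w.mean()) / w.std()) / m   # Pearson r
              if cc > thresh:
                  hits.append((i / fs, round(float(cc), 2)))
          return hits

      print(corr_detect(stream, template)[:3])       # (time [s], correlation)

    Subspace and matched-field methods generalize this idea from a single template to a family of master events.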

  3. Class Model Development Using Business Rules

    NASA Astrophysics Data System (ADS)

    Skersys, Tomas; Gudas, Saulius

    New developments in the area of computer-aided system engineering (CASE) greatly improve processes of the information systems development life cycle (ISDLC). Much effort is put into quality improvement issues, but IS development projects still suffer from the poor quality of models during the system analysis and design cycles. To some degree, the quality of models that are developed using CASE tools can be assured using various automated model comparison and syntax checking procedures. It is also reasonable to check these models against the business domain knowledge, but the domain knowledge stored in the repository of a CASE tool (enterprise model) is insufficient (Gudas et al. 2004). Involvement of business domain experts in these processes is complicated because non-IT people often find it difficult to understand models that were developed by IT professionals using some specific modeling language.

  4. Granularity as a Cognitive Factor in the Effectiveness of Business Process Model Reuse

    NASA Astrophysics Data System (ADS)

    Holschke, Oliver; Rake, Jannis; Levina, Olga

    Reusing design models is an attractive approach in business process modeling, as modeling efficiency and the quality of design outcomes may be significantly improved. However, reusing conceptual models is not a cost-free effort and has to be carefully designed. While factors such as psychological anchoring and task adequacy in reuse-based modeling tasks have been investigated, information granularity as a cognitive concept has not yet been at the center of empirical research. We hypothesize that business process granularity as a factor in design tasks under reuse has a significant impact on the effectiveness of the resulting business process models. We test our hypothesis in a comparative study employing high and low granularities. The reusable processes provided were taken from widely accessible reference models for the telecommunication industry (enhanced Telecom Operations Map). First experimental results show that recall in tasks involving coarser granularity is lower than in tasks involving finer granularity. These findings suggest that decision makers in business process management should carefully consider the granularity of the reuse mechanisms they implement. Due to our small sample size the results are not statistically significant, but this preliminary run shows that the experiment is ready to be run at a larger scale.

  5. An adaptable architecture for patient cohort identification from diverse data sources

    PubMed Central

    Bache, Richard; Miles, Simon; Taweel, Adel

    2013-01-01

    Objective We define and validate an architecture for systems that identify patient cohorts for clinical trials from multiple heterogeneous data sources. This architecture has an explicit query model capable of supporting temporal reasoning and expressing eligibility criteria independently of the representation of the data used to evaluate them. Method The architecture has the key feature that queries defined according to the query model are both pre- and post-processed and this is used to address both structural and semantic heterogeneity. The process of extracting the relevant clinical facts is separated from the process of reasoning about them. A specific instance of the query model is then defined and implemented. Results We show that the specific instance of the query model has wide applicability. We then describe how it is used to access three diverse data warehouses to determine patient counts. Discussion Although the proposed architecture requires greater effort to implement the query model than would be the case for using just SQL and accessing a database management system directly, this effort is justified because it supports both temporal reasoning and heterogeneous data sources. The query model only needs to be implemented once no matter how many data sources are accessed. Each additional source requires only the implementation of a lightweight adaptor. Conclusions The architecture has been used to implement a specific query model that can express complex eligibility criteria and access three diverse data warehouses thus demonstrating the feasibility of this approach in dealing with temporal reasoning and data heterogeneity. PMID:24064442
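
    A minimal sketch of the adaptor idea (ours, not the paper's implementation): criteria are evaluated against an abstract fact interface, so each new warehouse supplies only a lightweight adaptor while the eligibility logic, including temporal checks, is written once. All class, method, and code names below are hypothetical:

      from abc import ABC, abstractmethod
      from datetime import date

      # Each source contributes only a fact-extraction adaptor; criterion
      # logic (including the temporal condition) stays source-independent.
      class SourceAdaptor(ABC):
          @abstractmethod
          def diagnoses(self, patient_id):     # -> [(code, date), ...]
              ...

      class WarehouseA(SourceAdaptor):         # hypothetical warehouse
          def diagnoses(self, patient_id):
              return [("E11", date(2020, 3, 1))]   # stub extracted facts

      def eligible(adaptor, patient_id, code, after):
          """Criterion: diagnosis `code` recorded after date `after`."""
          return any(c == code and d > after
                     for c, d in adaptor.diagnoses(patient_id))

      print(eligible(WarehouseA(), "p1", "E11", date(2019, 1, 1)))  # True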

  6. Applying modeling Results in designing a global tropospheric experiment

    NASA Technical Reports Server (NTRS)

    1982-01-01

    A set of field experiments and advanced modeling studies which provide a strategy for a program of global tropospheric experiments was identified. An expanded effort to develop space applications for tropospheric air quality monitoring and studies was recommended. The tropospheric ozone, carbon, nitrogen, and sulfur cycles are addressed. Stratospheric-tropospheric exchange is discussed. Fast photochemical processes in the free troposphere are considered.

  7. Mathematical Creative Process Wallas Model in Students Problem Posing with Lesson Study Approach

    ERIC Educational Resources Information Center

    Nuha, Muhammad 'Azmi; Waluya, S. B.; Junaedi, Iwan

    2018-01-01

    Creative thinking is very important in the modern era, so it should be improved through efforts such as designing lessons that train students to pose their own problems. The purposes of this research are (1) to give an initial description of students' mathematical creative thinking levels in the Problem Posing Model with Lesson Study approach…

  8. Analysis of Geometric Thinking Students’ and Process-Guided Inquiry Learning Model

    NASA Astrophysics Data System (ADS)

    Hardianti, D.; Priatna, N.; Priatna, B. A.

    2017-09-01

    This research aims to analyze students' geometric thinking ability and to theoretically examine the process-oriented guided inquiry learning (POGIL) model. This study uses a qualitative approach with a descriptive method, because the research was done without any treatment on subjects and data were collected naturally. The study was conducted in a state junior high school in Bandung; the population was second-grade students and the sample was 32 students. Data on students' geometric thinking ability were collected through a geometric thinking test, with questions constructed from the characteristics of geometric thinking in van Hiele's theory. Based on the results of the analysis and discussion, students' geometric thinking ability is still low, so it needs to be improved. Therefore, an effort is needed to overcome the problems related to students' geometric thinking ability. One such effort is learning that facilitates students in constructing their own geometry concepts, especially the concepts of quadrilaterals, so that their geometric thinking ability can be enhanced maximally. Based on a study of the theory, one learning model that can enhance students' geometric thinking ability is the POGIL model.

  9. Memory mechanisms supporting syntactic comprehension

    PubMed Central

    Waters, Gloria

    2013-01-01

    Efforts to characterize the memory system that supports sentence comprehension have historically drawn extensively on short-term memory as a source of mechanisms that might apply to sentences. The focus of these efforts has changed significantly in the past decade. As a result of changes in models of short-term working memory (ST-WM) and developments in models of sentence comprehension, the effort to relate entire components of an ST-WM system, such as those in the model developed by Baddeley (Nature Reviews Neuroscience 4: 829–839, 2003) to sentence comprehension has largely been replaced by an effort to relate more specific mechanisms found in modern models of ST-WM to memory processes that support one aspect of sentence comprehension—the assignment of syntactic structure (parsing) and its use in determining sentence meaning (interpretation) during sentence comprehension. In this article, we present the historical background to recent studies of the memory mechanisms that support parsing and interpretation and review recent research into this relation. We argue that the results of this research do not converge on a set of mechanisms derived from ST-WM that apply to parsing and interpretation. We argue that the memory mechanisms supporting parsing and interpretation have features that characterize another memory system that has been postulated to account for skilled performance—long-term working memory. We propose a model of the relation of different aspects of parsing and interpretation to ST-WM and long-term working memory. PMID:23319178

  10. The experience of freedom in decisions - Questioning philosophical beliefs in favor of psychological determinants.

    PubMed

    Lau, Stephan; Hiemisch, Anette; Baumeister, Roy F

    2015-05-01

    Six experiments tested two competing models of subjective freedom during decision-making. The process model is mainly based on philosophical conceptions of free will and assumes that features of the process of choosing affect subjective feelings of freedom. In contrast, the outcome model predicts that subjective freedom is due to positive outcomes that can be expected or are achieved by a decision. Results heavily favored the outcome model over the process model. For example, participants felt freer when choosing between two equally good than two equally bad options. Process features including number of options, complexity of decision, uncertainty, having the option to defer the decision, conflict among reasons, and investing high effort in choosing generally had no or even negative effects on subjective freedom. In contrast, participants reported high freedom with good outcomes and low freedom with bad outcomes, and ease of deciding increased subjective freedom, consistent with the outcome model. Copyright © 2014 Elsevier Inc. All rights reserved.

  11. Practical Use of Computationally Frugal Model Analysis Methods

    DOE PAGES

    Hill, Mary C.; Kavetski, Dmitri; Clark, Martyn; ...

    2015-03-21

    Computationally frugal methods of model analysis can provide substantial benefits when developing models of groundwater and other environmental systems. Model analysis includes ways to evaluate model adequacy and to perform sensitivity and uncertainty analysis. Frugal methods typically require 10s of parallelizable model runs; their convenience allows for other uses of the computational effort. We suggest that model analysis be posed as a set of questions used to organize methods that range from frugal to expensive (requiring 10,000 model runs or more). This encourages focus on method utility, even when methods have starkly different theoretical backgrounds. We note that many frugal methods are more useful when unrealistic process-model nonlinearities are reduced. Inexpensive diagnostics are identified for determining when frugal methods are advantageous. Examples from the literature are used to demonstrate local methods and the diagnostics. We suggest that the greater use of computationally frugal model analysis methods would allow questions such as those posed in this work to be addressed more routinely, allowing the environmental sciences community to obtain greater scientific insight from the many ongoing and future modeling efforts.
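    To make the scale concrete: a local, one-at-a-time sensitivity analysis of the kind frugal methods rely on needs only about two parallelizable runs per parameter. A minimal Python sketch, with a toy stand-in for the expensive process model:

        from concurrent.futures import ProcessPoolExecutor

        def model(params):                 # toy stand-in for an expensive model run
            a, b = params
            return a ** 2 + 3.0 * b

        def local_sensitivities(params, rel_step=0.01):
            runs = []
            for i, p in enumerate(params):         # 2 * n_params runs in total
                lo = list(params); lo[i] = p * (1 - rel_step)
                hi = list(params); hi[i] = p * (1 + rel_step)
                runs += [lo, hi]
            with ProcessPoolExecutor() as pool:    # the runs are independent
                out = list(pool.map(model, runs))
            return [(out[2 * i + 1] - out[2 * i]) / (2 * rel_step * p)
                    for i, p in enumerate(params)]

        if __name__ == "__main__":
            print(local_sensitivities([2.0, 5.0]))   # ~[4.0, 3.0]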

  12. Streamline three-dimensional thermal model of a lithium titanate pouch cell battery in extreme temperature conditions with module simulation

    NASA Astrophysics Data System (ADS)

    Jaguemont, Joris; Omar, Noshin; Martel, François; Van den Bossche, Peter; Van Mierlo, Joeri

    2017-11-01

    In this paper, the development of a three-dimensional (3D) model of a lithium titanium oxide (LTO) pouch cell is presented, first to better comprehend its thermal behavior within electrified vehicle applications, but also to provide a strong modeling base for a future thermal management system. Current 3D thermal models are based on electrochemical reactions, which require elaborate meshing effort and long computation times; a fast electro-thermal model that can capture voltage, current, and thermal distribution variation during the whole process has been lacking. The proposed thermal model is a reduced-effort temperature simulation approach in which a 0D electrical model feeds a 3D thermal model, thereby excluding electrochemical processes. The thermal model is based on heat-transfer theory, and its temperature distribution prediction incorporates internal conduction and heat generation effects as well as convection. In addition, experimental tests are conducted to validate the model. Results show that both the heat dissipation rate and surface temperature uniformity data are in agreement with simulation results, which satisfies the application requirements for electrified vehicles. Additionally, an LTO battery pack is sized and modeled, and the simulation displays non-uniformity of the cells under driving operation. Ultimately, the model will serve as a basis for the future development of a thermal strategy for LTO cells that operate in a large temperature range, which is a strong contribution to the existing body of scientific literature.
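    A minimal sketch of the reduced-effort idea (illustrative Python with made-up parameter values, not the paper's LTO cell data): a 0D electrical source term, here simple Joule heating, feeds a lumped thermal balance with convective losses.

        def cell_temperature(I=50.0, R=1.5e-3, m=0.35, cp=900.0, h=8.0,
                             A=0.02, T_amb=25.0, dt=1.0, steps=3600):
            """Lumped (0D) electro-thermal balance; a real model would
            distribute these terms over a 3D mesh of the pouch cell."""
            T = T_amb
            for _ in range(steps):
                q_gen = I ** 2 * R               # 0D electrical model: I^2 R losses
                q_conv = h * A * (T - T_amb)     # convective heat loss
                T += dt * (q_gen - q_conv) / (m * cp)
            return T

        print(f"cell temperature after 1 h: {cell_temperature():.1f} degC")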

  13. Mechanistic ecohydrological modeling with Tethys-Chloris: an attempt to unravel complexity

    NASA Astrophysics Data System (ADS)

    Fatichi, S.; Ivanov, V. Y.; Caporali, E.

    2010-12-01

    The role of vegetation in controlling and mediating hydrological states and fluxes at the level of individual processes has been largely explored, which has led to improvements in our understanding of mechanisms and patterns in ecohydrological systems. Nonetheless, relatively few efforts have been directed toward the development of continuous, complex, mechanistic ecohydrological models operating at the watershed scale. This study presents a novel ecohydrological model, Tethys-Chloris (T&C), and aims to discuss current limitations and perspectives of the mechanistic approach in ecohydrology. The model attempts to synthesize state-of-the-art knowledge on individual processes and mechanisms drawn from various disciplines such as hydrology, plant physiology, ecology, and biogeochemistry. The model reproduces all essential components of the hydrological cycle, resolving the mass and energy budgets at the hourly scale; it includes energy and mass exchanges in the atmospheric boundary layer; a module of saturated and unsaturated soil water dynamics; two layers of vegetation; and a module of snowpack evolution. The vegetation component parsimoniously parameterizes essential plant life-cycle processes, including photosynthesis, phenology, carbon allocation, tissue turnover, and soil biogeochemistry. Quantitative metrics of model performance are discussed and highlight the capabilities of T&C in reproducing ecohydrological dynamics. The simulated patterns mimic the outcome of hydrological dynamics with high realism, given the uncertainty of imposed boundary conditions and limited data availability. Furthermore, highly satisfactory results are obtained without significant (e.g., automated) calibration efforts, despite the large phase-space dimensionality of the model. A significant investment into model design and development leads to such desirable behavior. This suggests that while using the presented tool for high-precision predictions can still be problematic, the mechanistic nature of the model can be extremely valuable for designing virtual experiments, testing hypotheses, and focusing questions of scientific inquiry.

  14. Ultraviolet Communication for Medical Applications

    DTIC Science & Technology

    2015-06-01

    In the previous Phase I effort, Directed Energy Inc.'s (DEI) parent company Imaging Systems Technology (IST) demonstrated the feasibility of several key...accurately model high path loss. Custom photon-scatter code was rewritten for parallel execution on a graphics processing unit (GPU). The NVidia CUDA

  15. Development of safety performance functions for North Carolina.

    DOT National Transportation Integrated Search

    2011-12-06

    "The objective of this effort is to develop safety performance functions (SPFs) for different types of facilities in North Carolina : and illustrate how they can be used to improve the decision making process. The prediction models in Part C of the H...

  16. Would You Recommend Your Institution's Effort-Reporting Process to Others? Determining Best Practices in Effort-Reporting Compliance

    ERIC Educational Resources Information Center

    Whitaker, Ashley E.

    2015-01-01

    Effort-reporting compliance at higher education institutions was examined to discern best practices from institutions that would recommend their effort-reporting process. Data were derived from a survey of effort administrators--the research administrators responsible for the effort-reporting compliance program at their respective higher education…

  17. A CMMI-based approach for medical software project life cycle study.

    PubMed

    Chen, Jui-Jen; Su, Wu-Chen; Wang, Pei-Wen; Yen, Hung-Chi

    2013-01-01

    In terms of medical techniques, Taiwan has gained international recognition in recent years. However, the medical information system industry in Taiwan is still at a developing stage compared with the software industries in other nations. In addition, systematic development processes are indispensable elements of software development; they can help developers increase their productivity and efficiency and also avoid unnecessary risks arising during the development process. Thus, this paper presents an application of Light-Weight Capability Maturity Model Integration (LW-CMMI) to the Chang Gung Medical Research Project (CMRP) in the nuclear medicine field. This application was intended to integrate the user requirements, system design, and testing of software development processes into a three-layer (Domain, Concept, and Instance) model, expressed in structural Systems Modeling Language (SysML) diagrams, and to convert part of the manual effort necessary for project-management maintenance into computational effort, for example, (semi-)automatic delivery of traceability management. In this application, it supports establishing the artifacts "requirement specification document", "project execution plan document", "system design document", and "system test document", and can deliver a prototype of a lightweight project-management tool for the nuclear medicine software project. The results of this application can be a reference for other medical institutions in developing medical information systems and supporting project management to achieve the aim of patient safety.

  18. Formal Analysis of the Remote Agent Before and After Flight

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Lowry, Mike; Park, SeungJoon; Pecheur, Charles; Penix, John; Visser, Willem; White, Jon L.

    2000-01-01

    This paper describes two separate efforts that used the SPIN model checker to verify deep space autonomy flight software. The first effort occurred at the beginning of a spiral development process and found five concurrency errors early in the design cycle that the developers acknowledge would not have been found through testing. This effort required a substantial manual modeling effort involving both abstraction and translation from the prototype LISP code to the PROMELA language used by SPIN. This experience and others led to research to address the gap between formal-methods tools and the development cycle used by software developers. The Java PathFinder tool, which directly translates from Java to PROMELA, was developed as part of this research, as well as automatic abstraction tools. In 1999 the flight software flew on a space mission, and a deadlock occurred in a sibling subsystem to the one that was the focus of the first verification effort. A second, quick-response "cleanroom" verification effort found the concurrency error in a short amount of time. The error was isomorphic to one of the concurrency errors found during the first verification effort. The paper first demonstrates that formal-methods tools can find concurrency errors that indeed lead to loss of spacecraft functions, even for the complex software required for autonomy. Second, it describes progress in automatic translation and abstraction that eventually will enable formal-methods tools to be inserted directly into the aerospace software development cycle.
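    The class of error involved can be illustrated with a toy state-space exploration in Python (a hand-rolled miniature of what SPIN does, not SPIN itself): two tasks acquire two locks in opposite order, and an exhaustive search over all interleavings finds the reachable deadlock that testing can easily miss.

        from collections import deque

        locks = ("A", "B")
        order = {0: ("A", "B"), 1: ("B", "A")}   # opposite acquisition order: the bug

        def successors(state):
            pc, owner = state                    # program counters, lock owners
            out = []
            for t in (0, 1):
                if pc[t] < 2:                    # try to acquire the next lock
                    i = locks.index(order[t][pc[t]])
                    if owner[i] is None:
                        o = list(owner); o[i] = t
                        p = list(pc); p[t] += 1
                        out.append((tuple(p), tuple(o)))
                elif pc[t] == 2:                 # release both locks and finish
                    o = tuple(None if x == t else x for x in owner)
                    p = list(pc); p[t] = 3
                    out.append((tuple(p), o))
            return out

        seen, queue = set(), deque([((0, 0), (None, None))])
        while queue:
            s = queue.popleft()
            if s in seen:
                continue
            seen.add(s)
            nxt = successors(s)
            if not nxt and s[0] != (3, 3):       # stuck before both tasks finished
                print("deadlock reachable:", s)  # pc=(1,1), each task holds one lock
            queue.extend(nxt)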

  19. Planning Study to Establish DoD Manufacturing Technology Information Analysis Center.

    DTIC Science & Technology

    1981-01-01

    model for an MTIAC...types of information inputs from potential MTIAC sources...processing functions required to produce MTIAC outputs...short supply * Energy conservation and concerns of energy intensiveness of various manufacturing processes and systems required for production of DOD...not play a major role in the process of MT invention, innovation, or diffusion. MT productivity efforts for private industry are carried out by

  20. Developing interpretable models with optimized set reduction for identifying high risk software components

    NASA Technical Reports Server (NTRS)

    Briand, Lionel C.; Basili, Victor R.; Hetmanski, Christopher J.

    1993-01-01

    Applying equal testing and verification effort to all parts of a software system is not very efficient, especially when resources are limited and scheduling is tight. Therefore, one needs to be able to differentiate low from high fault-frequency components so that testing/verification effort can be concentrated where needed. Such a strategy is expected to detect more faults and thus improve the resulting reliability of the overall system. This paper presents the Optimized Set Reduction approach for constructing such models, intended to fulfill specific software engineering needs. Our approach to classification is to measure the software system and build multivariate stochastic models for predicting high-risk system components. We present experimental results obtained by classifying Ada components into two classes: likely or not likely to generate faults during system and acceptance testing. Also, we evaluate the accuracy of the model and the insights it provides into the error-making process.

  1. Ontology-based tools to expedite predictive model construction.

    PubMed

    Haug, Peter; Holmen, John; Wu, Xinzi; Mynam, Kumar; Ebert, Matthew; Ferraro, Jeffrey

    2014-01-01

    Large amounts of medical data are collected electronically during the course of caring for patients using modern medical information systems. This data presents an opportunity to develop clinically useful tools through data mining and observational research studies. However, the work necessary to make sense of this data and to integrate it into a research initiative can require substantial effort from medical experts as well as from experts in medical terminology, data extraction, and data analysis. This slows the process of medical research. To reduce the effort required for the construction of computable, diagnostic predictive models, we have developed a system that hybridizes a medical ontology with a large clinical data warehouse. Here we describe components of this system designed to automate the development of preliminary diagnostic models and to provide visual clues that can assist the researcher in planning for further analysis of the data behind these models.

  2. Four simple ocean carbon models

    NASA Technical Reports Server (NTRS)

    Moore, Berrien, III

    1992-01-01

    This paper briefly reviews the key processes that determine oceanic CO2 uptake and sets this description within the context of four simple ocean carbon models. These models capture, in varying degrees, these key processes and establish a clear foundation for more realistic models that incorporate more directly the underlying physics and biology of the ocean rather than relying on simple parametric schemes. The purpose of this paper is more pedagogical than purely scientific. The problems encountered by current attempts to understand the global carbon cycle not only require our efforts but set a demand for a new generation of scientists, and it is hoped that this paper and the text in which it appears will help in this development.

  3. Mammalian cell culture process for monoclonal antibody production: nonlinear modelling and parameter estimation.

    PubMed

    Selişteanu, Dan; Șendrescu, Dorin; Georgeanu, Vlad; Roman, Monica

    2015-01-01

    Monoclonal antibodies (mAbs) are at present one of the fastest growing products of the pharmaceutical industry, with widespread applications in biochemistry, biology, and medicine. The operation of mAb production processes is predominantly based on empirical knowledge, with improvements achieved through trial-and-error experiments and precedent practices. The nonlinearity of these processes and the absence of suitable instrumentation require an enhanced modelling effort and modern kinetic parameter estimation strategies. The present work is dedicated to nonlinear dynamic modelling and parameter estimation for a mammalian cell culture process used for mAb production. Using a dynamical model of this kind of process, an optimization-based technique for estimating the kinetic parameters of the model is developed. The estimation is achieved by minimizing an error function with a particle swarm optimization (PSO) algorithm. The proposed estimation approach is analyzed here using a particular model of mammalian cell culture as a case study, but it is generic for this class of bioprocesses. The presented case study shows that the proposed parameter estimation technique provides a more accurate simulation of the experimentally observed process behaviour than reported in previous studies.
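    A minimal PSO sketch of this estimation scheme (illustrative Python; a toy first-order kinetic model stands in for the full cell-culture dynamics, and the synthetic "measurements" are generated from known parameters):

        import numpy as np

        rng = np.random.default_rng(0)

        def model(k, t):                  # toy stand-in for the culture dynamics
            return k[0] * (1.0 - np.exp(-k[1] * t))

        t_obs = np.linspace(0.0, 10.0, 20)
        y_obs = model((2.0, 0.7), t_obs) + rng.normal(0.0, 0.05, t_obs.size)

        def cost(k):                      # squared-error criterion to minimize
            return np.sum((model(k, t_obs) - y_obs) ** 2)

        n, w, c1, c2 = 30, 0.7, 1.5, 1.5  # swarm size, inertia, PSO coefficients
        x = rng.uniform(0.1, 5.0, (n, 2)); v = np.zeros((n, 2))
        pbest, pcost = x.copy(), np.array([cost(p) for p in x])
        gbest = pbest[pcost.argmin()]
        for _ in range(200):
            r1, r2 = rng.random((n, 2)), rng.random((n, 2))
            v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
            x = np.clip(x + v, 0.01, 10.0)        # keep parameters in range
            c = np.array([cost(p) for p in x])
            improved = c < pcost
            pbest[improved], pcost[improved] = x[improved], c[improved]
            gbest = pbest[pcost.argmin()]
        print("estimated kinetic parameters:", gbest)   # close to (2.0, 0.7)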

  5. Logistical constraints lead to an intermediate optimum in outbreak response vaccination

    PubMed Central

    Shea, Katriona; Ferrari, Matthew

    2018-01-01

    Dynamic models in disease ecology have historically evaluated vaccination strategies under the assumption that they are implemented homogeneously in space and time. However, this approach fails to formally account for operational and logistical constraints inherent in the distribution of vaccination to the population at risk. Thus, feedback between the dynamic processes of vaccine distribution and transmission might be overlooked. Here, we present a spatially explicit, stochastic Susceptible-Infected-Recovered-Vaccinated model that highlights the density-dependence and spatial constraints of various diffusive strategies of vaccination during an outbreak. The model integrates an agent-based process of disease spread with a partial differential process of vaccination deployment. We characterize the vaccination response in terms of a diffusion rate that describes the distribution of vaccination to the population at risk from a central location. This generates an explicit trade-off between slow diffusion, which concentrates effort near the central location, and fast diffusion, which spreads a fixed vaccination effort thinly over a large area. We use stochastic simulation to identify the optimum vaccination diffusion rate as a function of population density, interaction scale, transmissibility, and vaccine intensity. Our results show that, conditional on a timely response, the optimal strategy for minimizing outbreak size is to distribute vaccination resource at an intermediate rate: fast enough to outpace the epidemic, but slow enough to achieve local herd immunity. If the response is delayed, however, the optimal strategy for minimizing outbreak size changes to a rapidly diffusive distribution of vaccination effort. The latter may also result in significantly larger outbreaks, thus suggesting a benefit of allocating resources to timely outbreak detection and response. PMID:29791432
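    A toy one-dimensional re-creation of the trade-off (hypothetical parameters, far simpler than the paper's coupled agent-based/PDE model): a fixed vaccine stock diffuses out from a central depot while infection spreads stochastically to neighbors, so a small diffusion rate concentrates effort near the depot and a large one spreads it thinly.

        import numpy as np

        rng = np.random.default_rng(1)

        def outbreak_size(D, n=201, beta=0.6, delivery=0.2, steps=200):
            S = np.ones(n); I = np.zeros(n)
            I[n // 2] = 1.0; S[n // 2] = 0.0
            V = np.zeros(n); V[n // 2] = 30.0    # vaccine stock at the depot
            for _ in range(steps):
                V += D * (np.roll(V, 1) - 2 * V + np.roll(V, -1))  # diffuse effort
                used = np.minimum(V, S) * delivery    # vaccinate locally
                S -= used; V -= used
                pressure = beta * (np.roll(I, 1) + np.roll(I, -1)) / 2.0
                new = (rng.random(n) < pressure) * S  # stochastic local spread
                I += new; S -= new
            return I.sum()

        for D in (0.02, 0.15, 0.45):             # slow / intermediate / fast
            print(f"diffusion rate {D}: outbreak size ~ {outbreak_size(D):.1f}")

    Whether the intermediate rate wins in this toy depends on the chosen parameters, which mirrors the dependence on density, interaction scale, and transmissibility that the paper's simulations explore.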

  6. NASA Iced Aerodynamics and Controls Current Research

    NASA Technical Reports Server (NTRS)

    Addy, Gene

    2009-01-01

    This slide presentation reviews the state of current research in the area of aerodynamics and aircraft control in icing conditions by the Aviation Safety Program, part of the Integrated Resilient Aircraft Controls Project (IRAC). Included in the presentation is an overview of the modeling efforts. The objective of the modeling is to develop experimental and computational methods to model and predict aircraft response during adverse flight conditions, including icing. The aircraft icing modeling effort includes Ice-Contaminated Aerodynamics Modeling, which examines the effects of ice contamination on aircraft aerodynamics through CFD modeling of ice-contaminated aircraft aerodynamics, and Advanced Ice Accretion Process Modeling, which examines the physics of ice accretion and works on computational modeling of ice accretions. The IRAC testbed, a Generic Transport Model (GTM), and its use in the investigation of the effects of icing on its aerodynamics is also reviewed. This has led to a more thorough understanding and models, both theoretical and empirical, of icing physics and ice accretion for airframes; advanced 3D ice accretion prediction codes; CFD methods for iced aerodynamics; and a better understanding of aircraft iced aerodynamics and its effects on control surface effectiveness.

  7. A study on nonlinear estimation of submaximal effort tolerance based on the generalized MET concept and the 6MWT in pulmonary rehabilitation

    PubMed Central

    Szczegielniak, Jan; Łuniewski, Jacek; Stanisławski, Rafał; Bogacz, Katarzyna; Krajczy, Marcin; Rydel, Marek

    2018-01-01

    Background The six-minute walk test (6MWT) is considered to be a simple and inexpensive tool for the assessment of functional tolerance of submaximal effort. The aims of this work were 1) to provide background on the nonlinear nature of the energy expenditure process due to physical activity, 2) to compare the results/scores of the submaximal treadmill exercise test with those of the 6MWT in pulmonary patients, and 3) to develop nonlinear mathematical models relating the two. Methods The study group included patients with COPD. All patients were subjected to a submaximal exercise test and a 6MWT. To develop an optimal mathematical solution and compare the results of the exercise test and the 6MWT, least squares and genetic algorithms were employed to estimate the parameters of polynomial expansion and piecewise linear models. Results Mathematical analysis enabled the construction of nonlinear models for estimating the MET result of the submaximal exercise test based on average walk velocity (or distance) in the 6MWT. Conclusions Submaximal effort tolerance in COPD patients can be effectively estimated from new, rehabilitation-oriented, nonlinear models based on the generalized MET concept and the 6MWT. PMID:29425213
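    A minimal sketch of the polynomial-expansion variant of such a model (illustrative Python with hypothetical data points, not the study's patient data): least squares fits a low-order polynomial predicting the treadmill-test MET score from average 6MWT walking velocity.

        import numpy as np

        v_walk = np.array([2.4, 2.9, 3.3, 3.8, 4.2, 4.7, 5.1])  # km/h, hypothetical
        mets = np.array([2.1, 2.5, 3.0, 3.8, 4.4, 5.3, 6.1])    # hypothetical scores

        coeffs = np.polyfit(v_walk, mets, deg=2)   # 2nd-order polynomial expansion
        predict = np.poly1d(coeffs)
        print(f"predicted METs at 4.0 km/h: {predict(4.0):.2f}")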

  8. HIV/AIDS interventions in an aging U.S. population.

    PubMed

    Jacobson, Stephanie A

    2011-05-01

    According to the Centers for Disease Control and Prevention (CDC), 25 percent of people living with HIV in the United States in 2006 were age 50 and older. HIV prevention for people over 50 is an important health concern, especially as the U.S. population grows older. Scholarly research has identified the need for HIV/AIDS interventions in the population of people over age 50, but few interventions have been established. The ecological perspective, which integrates intrapersonal, interpersonal, organizational, community, and policy factors, was used to review the current interventions and propose possible new HIV/AIDS prevention efforts for older adults. Intrapersonal interventions are often based on the health belief model. The precaution adoption process model was explored as an alternative intrapersonal theory for modeling prevention efforts. Community interventions using diffusion of innovations theory are fully explored, and new interventions are proposed as an option for preventing HIV/AIDS in older adults. An agenda for future research and interventions is proposed. Social workers will be at the forefront of the effort to prevent HIV/AIDS in older adults. They must accept this responsibility, propose interventions, and evaluate their effectiveness.

  9. Four decades of modeling methane cycling in terrestrial ecosystems: Where we are heading?

    NASA Astrophysics Data System (ADS)

    Xu, X.; Yuan, F.; Hanson, P. J.; Wullschleger, S. D.; Thornton, P. E.; Tian, H.; Riley, W. J.; Song, X.; Graham, D. E.; Song, C.

    2015-12-01

    A modeling approach to methane (CH4) cycling is widely used to quantify the budget, investigate spatial and temporal variabilities, and understand the mechanistic processes and environmental controls on CH4 fluxes across spatial and temporal scales. Moreover, CH4 models are an important tool for integrating CH4 data from multiple sources, such as laboratory-based incubation and molecular analysis, field observational experiments, remote sensing, and aircraft-based measurements, across a variety of terrestrial ecosystems. We reviewed 39 terrestrial CH4 models to characterize their strengths and weaknesses and to design a roadmap for future model improvement and application. We found that: (1) the focus of CH4 models has shifted from theoretical to site- and regional-level application over the past four decades, expressed as a dramatic increase in CH4 model development for regional budget quantification; (2) large discrepancies exist among models in terms of representing CH4 processes and their environmental controls; and (3) significant data-model and model-model mismatches are partially attributed to different representations of wetland characterization and inundation dynamics. Three efforts deserve special attention in the future improvement and application of fully mechanistic CH4 models: (1) CH4 models should be improved to represent the mechanisms underlying land-atmosphere CH4 exchange, with emphasis on improving and validating individual CH4 processes over depth and horizontal space; (2) models should be developed that are capable of simulating CH4 fluxes across space and time (particularly hot moments and hot spots); and (3) effort should be invested to develop model benchmarking frameworks that can easily be used for model improvement, evaluation, and integration with data from molecular to global scales. A newly developed microbial functional group-based CH4 model (CLM-Microbe) is further used to demonstrate the features of mechanistic representation and integration with multiple sources of observational datasets.

  10. Students' motivational processes and their relationship to teacher ratings in school physical education: a self-determination theory approach.

    PubMed

    Standage, Martyn; Duda, Joan L; Ntoumanis, Nikos

    2006-03-01

    In the present study, we used a model of motivation grounded in self-determination theory (Deci & Ryan, 1985, 1991; Ryan & Deci, 2000a, 2000b, 2002) to examine the relationship between physical education (PE) students' motivational processes and ratings of their effort and persistence as provided by their PE teacher. Data were obtained from 394 British secondary school students (204 boys, 189 girls, 1 gender not specified; M age = 11.97 years; SD = .89; range = 11-14 years) who responded to a multisection inventory (tapping autonomy-support, autonomy, competence, relatedness, and self-determined motivation). The students' respective PE teachers subsequently provided ratings reflecting the effort and persistence each student exhibited in their PE classes. The hypothesized relationships among the study variables were examined via structural equation modeling analysis using latent factors. Results of maximum likelihood analysis using the bootstrapping method revealed the proposed model demonstrated a good fit to the data, chi-squared (292) = 632.68, p < .001; comparative fit index = .95; incremental fit index = .95; standardized root mean square residual = .077; root mean square error of approximation (RMSEA) = .054 (90% confidence interval of RMSEA = .049-.060). Specifically, the model showed that students who perceived an autonomy supportive environment experienced greater levels of autonomy, competence, and relatedness and had higher scores on an index of self-determination. Student-reported levels of self-determined motivation positively predicted teacher ratings of effort and persistence in PE. The findings are discussed with regard to enhancing student motivation in PE settings.
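    The reported RMSEA follows from the reported chi-square by the standard formula RMSEA = sqrt(max(chi2 - df, 0) / (df * (N - 1))); a quick check in Python using the values above (chi-square = 632.68, df = 292, N = 394):

        from math import sqrt

        def rmsea(chi2, df, n):
            return sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

        print(f"RMSEA = {rmsea(632.68, 292, 394):.3f}")   # ~0.054, as reported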

  11. Formally verifying human–automation interaction as part of a system model: limitations and tradeoffs

    PubMed Central

    Bass, Ellen J.

    2011-01-01

    Both the human factors engineering (HFE) and formal methods communities are concerned with improving the design of safety-critical systems. This work discusses a modeling effort that leveraged methods from both fields to perform formal verification of human–automation interaction with a programmable device. This effort utilizes a system architecture composed of independent models of the human mission, human task behavior, human-device interface, device automation, and operational environment. The goals of this architecture were to allow HFE practitioners to perform formal verifications of realistic systems that depend on human–automation interaction in a reasonable amount of time using representative models, intuitive modeling constructs, and decoupled models of system components that could be easily changed to support multiple analyses. This framework was instantiated using a patient controlled analgesia pump in a two phased process where models in each phase were verified using a common set of specifications. The first phase focused on the mission, human-device interface, and device automation; and included a simple, unconstrained human task behavior model. The second phase replaced the unconstrained task model with one representing normative pump programming behavior. Because models produced in the first phase were too large for the model checker to verify, a number of model revisions were undertaken that affected the goals of the effort. While the use of human task behavior models in the second phase helped mitigate model complexity, verification time increased. Additional modeling tools and technological developments are necessary for model checking to become a more usable technique for HFE. PMID:21572930

  12. Low cost silicon solar array project large area silicon sheet task: Silicon web process development

    NASA Technical Reports Server (NTRS)

    Duncan, C. S.; Seidensticker, R. G.; Mchugh, J. P.; Blais, P. D.; Davis, J. R., Jr.

    1977-01-01

    Growth configurations were developed which produced crystals having low residual stress levels. The properties of a 106 mm diameter round crucible were evaluated and it was found that this design had greatly enhanced temperature fluctuations arising from convection in the melt. Thermal modeling efforts were directed to developing finite element models of the 106 mm round crucible and an elongated susceptor/crucible configuration. Also, the thermal model for the heat loss modes from the dendritic web was examined for guidance in reducing the thermal stress in the web. An economic analysis was prepared to evaluate the silicon web process in relation to price goals.

  13. Multiscale Materials Modeling in an Industrial Environment.

    PubMed

    Weiß, Horst; Deglmann, Peter; In 't Veld, Pieter J; Cetinkaya, Murat; Schreiner, Eduard

    2016-06-07

    In this review, we sketch the materials modeling process in industry. We show that predictive and fast modeling is a prerequisite for successful participation in research and development processes in the chemical industry. Stable and highly automated workflows suitable for handling complex systems are a must. In particular, we review approaches to build and parameterize soft matter systems. By satisfying these prerequisites, efficiency for the development of new materials can be significantly improved, as exemplified here for formulation polymer development. This is in fact in line with recent Materials Genome Initiative efforts sponsored by the US government. Valuable contributions to product development are possible today by combining existing modeling techniques in an intelligent fashion, provided modeling and experiment work hand in hand.

  14. A Petri-net coordination model for an intelligent mobile robot

    NASA Technical Reports Server (NTRS)

    Wang, F.-Y.; Kyriakopoulos, K. J.; Tsolkas, A.; Saridis, G. N.

    1990-01-01

    The authors present a Petri net model of the coordination level of an intelligent mobile robot system (IMRS). The purpose of this model is to specify the integration of the individual efforts on path planning, supervisory motion control, and vision systems that are necessary for the autonomous operation of the mobile robot in a structured dynamic environment. This is achieved by analytically modeling the various units of the system as Petri net transducers and explicitly representing the task precedence and information dependence among them. The model can also be used to simulate the task processing and to evaluate the efficiency of operations and the responsibility of decisions in the coordination level of the IMRS. Some simulation results on the task processing and learning are presented.
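    A minimal Petri-net sketch in Python (hypothetical places and transitions, not the IMRS net itself) showing the token-firing rule by which such a model encodes task precedence, here between a path-planning unit and a motion-control unit:

        marking = {"goal_set": 1, "path_planned": 0, "motion_done": 0}
        transitions = {
            "plan_path": ({"goal_set": 1}, {"path_planned": 1}),
            "execute":   ({"path_planned": 1}, {"motion_done": 1}),
        }

        def fire(name):
            pre, post = transitions[name]
            if all(marking[p] >= k for p, k in pre.items()):   # transition enabled?
                for p, k in pre.items():
                    marking[p] -= k                            # consume input tokens
                for p, k in post.items():
                    marking[p] += k                            # produce output tokens
                return True
            return False

        fire("plan_path"); fire("execute")
        print(marking)   # {'goal_set': 0, 'path_planned': 0, 'motion_done': 1}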

  15. Growing up and role modeling: a theory in Iranian nursing students' education.

    PubMed

    Mokhtari Nouri, Jamileh; Ebadi, Abbas; Alhani, Fatemeh; Rejeh, Nahid

    2014-11-16

    One of the key strategies in students' learning is being affected by models. Understanding the role-modeling process in education will help to make greater use of this training strategy. The aim of this grounded theory study was to explore Iranian nursing students' and instructors' experiences of the role-modeling process. Data were analyzed using Glaserian grounded theory methodology, drawing on semi-structured interviews with 7 faculty members and 2 nursing students, plus three focus-group discussions with 20 nursing students; purposive and theoretical sampling from four nursing faculties in Tehran was used to explain the role-modeling process. Through basic coding, an effort toward comprehensive growth and excellence emerged as the basic social process constituting the core category, and through selective coding three phases were identified: realizing and being exposed to inadequate human and professional growth, facilitating human and professional growth, and evolution. The role-modeling process takes place unconsciously, involuntarily, and dynamically, in a positive progressive process that facilitates overall growth in the nursing student. Accordingly, the designed model can be implemented to make this unconscious, involuntary process conscious, active, and voluntary, to help the education administrators of nursing colleges and supra-organizations prevent threats to the human and professional dimensions of nursing students' education and to promote nursing students' growth.

  16. Modelling of evaporation of a dispersed liquid component in a chemically active gas flow

    NASA Astrophysics Data System (ADS)

    Kryukov, V. G.; Naumov, V. I.; Kotov, V. Yu.

    1994-01-01

    A model has been developed to investigate evaporation of dispersed liquids in chemically active gas flow. Major efforts have been directed at the development of algorithms for implementing this model. The numerical experiments demonstrate that, in the boundary layer, significant changes in the composition and temperature of combustion products take place. This gives the opportunity to more correctly model energy release processes in combustion chambers of liquid-propellant rocket engines, gas-turbine engines, and other power devices.

  17. On the Need to Establish an International Soil Modeling Consortium

    NASA Astrophysics Data System (ADS)

    Vereecken, H.; Vanderborght, J.; Schnepf, A.

    2014-12-01

    Soil is one of the most critical life-supporting compartments of the Biosphere. Soil provides numerous ecosystem services such as a habitat for biodiversity, water and nutrients, as well as producing food, feed, fiber and energy. To feed the rapidly growing world population in 2050, agricultural food production must be doubled using the same land resources footprint. At the same time, soil resources are threatened due to improper management and climate change. Despite the many important functions of soil, many fundamental knowledge gaps remain, regarding the role of soil biota and biodiversity on ecosystem services, the structure and dynamics of soil communities, the interplay between hydrologic and biotic processes, the quantification of soil biogeochemical processes and soil structural processes, the resilience and recovery of soils from stress, as well as the prediction of soil development and the evolution of soils in the landscape, to name a few. Soil models have long played an important role in quantifying and predicting soil processes and related ecosystem services. However, a new generation of soil models based on a whole systems approach comprising all physical, mechanical, chemical and biological processes is now required to address these critical knowledge gaps and thus contribute to the preservation of ecosystem services, improve our understanding of climate-change-feedback processes, bridge basic soil science research and management, and facilitate the communication between science and society. To meet these challenges an international community effort is required, similar to initiatives in systems biology, hydrology, and climate and crop research. Our consortium will bring together modelers and experimental soil scientists at the forefront of new technologies and approaches to characterize soils. By addressing these aims, the consortium will contribute to improve the role of soil modeling as a knowledge dissemination instrument in addressing key global issues and stimulate the development of translational research activities. This presentation will provide a compelling case for this much-needed effort, with a focus on tangible benefits to the scientific and food security communities.

  18. Activities in support of the wax-impregnated wallboard concept

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kedl, R.J.; Stovall, T.K.

    1989-01-01

    The concept of octadecane wax-impregnated wallboard for the passive solar application is a major thrust of the Oak Ridge National Laboratory (ORNL) Thermal Energy Storage (TES) program. Thus, ORNL has initiated a number of internal efforts in support of this concept. The results of these efforts are: the immersion process for filling wallboard with wax has been successfully scaled up from small samples to full-size sheets; analysis shows that the immersion process has the potential for achieving higher storage capacity than adding wax-filled pellets to wallboard during its manufacture; analysis indicates that 75°F is close to an optimum phase change temperature for the non-passive solar application; and the thermal conductivity of wallboard without wax has been measured and will be measured for wax-impregnated wallboard. In addition, efforts are underway to confirm an analytical model that handles phase change wallboard for the passive solar application. 4 refs., 10 figs.

  19. Modeling non-point source pollutants in the vadose zone: Back to the basics

    NASA Astrophysics Data System (ADS)

    Corwin, Dennis L.; Letey, John, Jr.; Carrillo, Marcia L. K.

    More than ever before in the history of scientific investigation, modeling is viewed as a fundamental component of the scientific method because of the relatively recent development of the computer. No longer must the scientific investigator be confined to artificially isolated studies of individual processes that can lead to oversimplified and sometimes erroneous conceptions of larger phenomena. Computer models now enable scientists to attack problems related to open systems such as climatic change, and the assessment of environmental impacts, where the whole of the interactive processes is greater than the sum of their isolated components. Environmental assessment involves the determination of change of some constituent over time. This change can be measured in real time or predicted with a model. The advantage of prediction, like preventative medicine, is that it can be used to alter the occurrence of potentially detrimental conditions before they are manifest. The much greater efficiency of preventative, rather than remedial, efforts strongly justifies the need for an ability to accurately model environmental contaminants such as non-point source (NPS) pollutants. However, the environmental modeling advances that have accompanied computer technological development are a mixed blessing. Where once we had a plethora of discordant data without a holistic theory, now the pendulum has swung so that we suffer from a growing stockpile of models, a significant number of which have never been confirmed, or even had attempts made to confirm them. Modeling has become an end in itself rather than a means because of limited research funding, the high cost of field studies, limitations in time and patience, difficulty in cooperative research, and pressure to publish papers as quickly as possible. Modeling and experimentation should be ongoing processes that reciprocally enhance one another, with sound, comprehensive experiments serving as the building blocks of models and models serving to economize experimental designs and direct objectives. The responsibility lies in the hands of modelers to adhere to the modeling process and to seek out experimentalists who can evaluate their models. Even though this warning is nothing new, the effort by modelers to heed it is still as much the exception as the rule.

  20. Disentangling sampling and ecological explanations underlying species-area relationships

    USGS Publications Warehouse

    Cam, E.; Nichols, J.D.; Hines, J.E.; Sauer, J.R.; Alpizar-Jara, R.; Flather, C.H.

    2002-01-01

    We used a probabilistic approach to address the influence of sampling artifacts on the form of species-area relationships (SARs). We developed a model in which the increase in observed species richness is a function of sampling effort exclusively. We assumed that effort depends on area sampled, and we generated species-area curves under that model. These curves can be realistic looking. We then generated SARs from avian data, comparing SARs based on counts with those based on richness estimates. We used an approach to estimation of species richness that accounts for species detection probability and, hence, for variation in sampling effort. The slopes of SARs based on counts are steeper than those of curves based on estimates of richness, indicating that the former partly reflect failure to account for species detection probability. SARs based on estimates reflect ecological processes exclusively, not sampling processes. This approach permits investigation of ecologically relevant hypotheses. The slope of SARs is not influenced by the slope of the relationship between habitat diversity and area. In situations in which not all of the species are detected during sampling sessions, approaches to estimation of species richness integrating species detection probability should be used to investigate the rate of increase in species richness with area.
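    A sketch of the core comparison with illustrative numbers (Python; the detection probabilities are hypothetical): fit the power-law SAR, S = c * A**z, on raw counts and on detection-corrected richness estimates, and compare the slopes z.

        import numpy as np

        area = np.array([1.0, 10.0, 100.0, 1000.0, 10000.0])
        counts = np.array([12.0, 25.0, 48.0, 95.0, 180.0])    # observed species
        p_detect = np.array([0.55, 0.65, 0.75, 0.85, 0.90])   # detection probability
        estimates = counts / p_detect      # simple detection-corrected richness

        for label, s in (("counts", counts), ("estimates", estimates)):
            z, log_c = np.polyfit(np.log(area), np.log(s), 1)  # log S = z log A + log c
            print(f"z fitted from {label}: {z:.3f}")           # counts give steeper z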

  1. Identifying factors relevant in the assessment of return-to-work efforts in employees on long-term sickness absence due to chronic low back pain: a focus group study

    PubMed Central

    2012-01-01

    Background Efforts undertaken during the return to work (RTW) process need to be sufficient to prevent unnecessary applications for disability benefits. The purpose of this study was to identify factors relevant to RTW Effort Sufficiency (RTW-ES) in cases of sick-listed employees with chronic low back pain (CLBP). Methods Using focus groups consisting of Labor Experts (LEs) working at the Dutch Social Insurance Institute, arguments and underlying grounds relevant to the assessment of RTW-ES were investigated. Factors were collected and categorized using the International Classification of Functioning, Disability and Health (ICF) model. Results Two focus groups yielded 19 factors, of which 12 are categorized in the ICF model under activities (e.g., functional capacity) and in the personal (e.g., age, tenure) and environmental (e.g., employer-employee relationship) domains. The remaining 7 factors are categorized under intervention, job accommodation, and measures. Conclusions This focus group study shows that 19 factors may be relevant to RTW-ES in sick-listed employees with CLBP. Providing these results to professionals assessing RTW-ES might contribute to a more transparent and systematic approach. Considering the importance of the quality of the RTW process, optimizing the RTW-ES assessment is essential. PMID:22272831

  2. Need for recovery from work and sleep-related complaints among nursing professionals.

    PubMed

    Silva-Costa, Aline; Griep, Rosane Harter; Fischer, Frida Marina; Rotenberg, Lúcia

    2012-01-01

    The concept of need for recovery from work (NFR) was deduced from the effort-recuperation model. In this model, work produces costs in terms of effort during the working day. When there is enough time and there are possibilities to recuperate, a worker will arrive at the next working day with no residual symptoms of previous effort. NFR evaluates work characteristics such as psychosocial demands and professional work hours or schedules. However, sleep may be an important part of the recovery process. The aim of the study was to test the association between sleep-related complaints and NFR. A cross-sectional study was carried out at three hospitals. All female nursing professionals engaged in patient care were invited to participate (N = 1,307). Participants answered a questionnaire that included four sleep-related complaints (insomnia, unsatisfactory sleep, sleepiness during work hours, and insufficient sleep), work characteristics, and the NFR scale. Binomial logistic regression analysis showed that all sleep-related complaints are associated with a high need for recovery from work. Those who reported insufficient sleep showed a greater chance of high need for recovery; OR = 2.730 (95% CI 2.074-3.593). These results corroborate the hypothesis that sleep is an important aspect of the recovery process and, therefore, should be thoroughly investigated.

  3. ECUT: Energy Conversion and Utilization Technologies program. Chemical Processes project report, FY 1982

    NASA Technical Reports Server (NTRS)

    Wilcox, R. E. (Compiler)

    1983-01-01

    Planned research efforts and reorganization of the Project as the Biocatalysis Research Activity are described, including the following topics: electrocatalysts, fluid extraction, ammonia synthesis, biocatalysis, membrane fouling, energy and economic analysis, decarboxylation, microscopic reaction models, plasmid monitoring, and reaction kinetics.

  4. An accurate binding interaction model in de novo computational protein design of interactions: if you build it, they will bind.

    PubMed

    London, Nir; Ambroggio, Xavier

    2014-02-01

    Computational protein design efforts aim to create novel proteins and functions in an automated manner and, in the process, these efforts shed light on the factors shaping natural proteins. The focus of these efforts has progressed from the interior of proteins to their surface and the design of functions, such as binding or catalysis. Here we examine progress in the development of robust methods for the computational design of non-natural interactions between proteins and molecular targets such as other proteins or small molecules. This problem is referred to as the de novo computational design of interactions. Recent successful efforts in de novo enzyme design and the de novo design of protein-protein interactions open a path towards solving this problem. We examine the common themes in these efforts, and review recent studies aimed at understanding the nature of successes and failures in the de novo computational design of interactions. While several approaches culminated in success, the use of a well-defined structural model for a specific binding interaction in particular has emerged as a key strategy for a successful design, and is therefore reviewed with special consideration. Copyright © 2013 Elsevier Inc. All rights reserved.

  5. Quantifying control effort of biological and technical movements: an information-entropy-based approach.

    PubMed

    Haeufle, D F B; Günther, M; Wunner, G; Schmitt, S

    2014-01-01

    In biomechanics and biorobotics, muscles are often associated with reduced movement control effort and simplified control compared to technical actuators. This is based on evidence that the nonlinear muscle properties positively influence movement control. It is, however, open how to quantify the simplicity aspect of control effort and compare it between systems. Physical measures, such as energy consumption, stability, or jerk, have already been applied to compare biological and technical systems. Here a physical measure of control effort based on information entropy is presented. The idea is that control is simpler if a specific movement is generated with less processed sensor information, depending on the control scheme and the physical properties of the systems being compared. By calculating the Shannon information entropy of all sensor signals required for control, an information cost function can be formulated that allows the comparison of models of biological and technical control systems. Applied, as an example, to (bio-)mechanical models of hopping, the method reveals that the information required for generating hopping with a muscle driven by a simple reflex control scheme is only I = 32 bits, versus I = 660 bits with a DC motor and a proportional-differential controller. This approach to quantifying control effort captures the simplicity of a control scheme and can be used to compare completely different actuators and control approaches.
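    One plausible minimal reading of such a cost function in Python (illustrative signals, not the paper's hopping models): discretize each sensor signal that a control scheme reads and sum the Shannon entropies, so a one-sensor reflex scheme is charged fewer bits than a two-sensor proportional-differential scheme.

        import numpy as np

        rng = np.random.default_rng(2)

        def signal_entropy_bits(x, bins=16):
            counts, _ = np.histogram(x, bins=bins)
            p = counts[counts > 0] / counts.sum()
            return float(-(p * np.log2(p)).sum())   # bits per sample

        # hypothetical sensor traces recorded over one hop
        muscle_length = rng.normal(0.0, 1.0, 1000)             # reflex: 1 sensor
        position, velocity = rng.normal(0.0, 1.0, (2, 1000))   # PD: 2 sensors

        pd_bits = signal_entropy_bits(position) + signal_entropy_bits(velocity)
        print(f"reflex scheme: {signal_entropy_bits(muscle_length):.1f} bits/sample")
        print(f"PD scheme:     {pd_bits:.1f} bits/sample")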

  6. Influence of Microstructure Representation on Flow Stress and Grain Size Prediction in Through-Process Modeling of AA5182 Sheet Production

    NASA Astrophysics Data System (ADS)

    Lohmar, Johannes; Bambach, Markus; Karhausen, Kai F.

    2013-01-01

    Integrated computational materials engineering is an up-to-date method for developing new materials and optimizing complete process chains. In the simulation of a process chain, material models play a central role, as they capture the response of the material to external process conditions. While much effort is put into their development and improvement, less attention is paid to their implementation, which is problematic because the representation of microstructure in the model has a decisive influence on modeling accuracy and calculation speed. The aim of this article is to analyze the influence of different microstructure representation concepts on the prediction of flow stress and microstructure evolution when using the same set of material equations. Scalar, tree-based, and cluster-based concepts are compared for a multi-stage rolling process of an AA5182 alloy. It was found that the implementation influences the predicted flow stress and grain size, in particular in the regime of coupled hardening and softening.

  7. Sizing the science data processing requirements for EOS

    NASA Technical Reports Server (NTRS)

    Wharton, Stephen W.; Chang, Hyo D.; Krupp, Brian; Lu, Yun-Chi

    1991-01-01

    The methodology used in the compilation and synthesis of baseline science requirements associated with the 30+ EOS (Earth Observing System) instruments and over 2,400 EOS data products (both output and required input) proposed by EOS investigators is discussed. A brief background on EOS and the EOS Data and Information System (EOSDIS) is presented, and the approach is outlined in terms of a multilayer model. The methodology used to compile, synthesize, and tabulate requirements within the model is described. The principal benefit of this approach is the reduction of effort needed to update the analysis and maintain the accuracy of the science data processing requirements in response to changes in EOS platforms, instruments, data products, processing center allocations, or other model input parameters. The spreadsheets used in the model provide a compact representation, thereby facilitating review and presentation of the information content.

  8. System Engineering Concept Demonstration, Effort Summary. Volume 1

    DTIC Science & Technology

    1992-12-01

    involve only the system software, user frameworks, and user tools ... analysis, synthesis, optimization, conceptual design of Catalyst. The paper discusses the definition, design, test, and evaluation; operational concept ... This approach will allow system engineering practitioners to recognize and tailor the model. The conceptual requirements for the Process Model ...

  9. DoD Acquisitions Reform: Embracing and Implementing Agile

    DTIC Science & Technology

    2015-12-01

    required in the traditional waterfall approach. Specific models for enterprise-level efforts include the Scaled Agile Framework, Disciplined Agile ... and Acquisition Concerns. Pittsburgh: Carnegie Mellon. Leffingwell, D. (2007). Why the Waterfall Model Doesn't Work. In D. Leffingwell, Scaling ... A serious issue might be the acquisitions process itself. For the last twenty-plus years, the Air Force has used the waterfall approach for software

  10. Validation and Improvement of Reliability Methods for Air Force Building Systems

    DTIC Science & Technology

    focusing primarily on HVAC systems. This research used contingency analysis to assess the performance of each model for HVAC systems at six Air Force ... probabilistic model produced inflated reliability calculations for HVAC systems. In light of these findings, this research employed a stochastic method, a ... Nonhomogeneous Poisson Process (NHPP), in an attempt to produce accurate HVAC system reliability calculations. This effort ultimately concluded that

  11. Modeling the effects of argument length and validity on inductive and deductive reasoning.

    PubMed

    Rotello, Caren M; Heit, Evan

    2009-09-01

    In an effort to assess models of inductive reasoning and deductive reasoning, the authors, in 3 experiments, examined the effects of argument length and logical validity on evaluation of arguments. In Experiments 1a and 1b, participants were given either induction or deduction instructions for a common set of stimuli. Two distinct effects were observed: Induction judgments were more affected by argument length, and deduction judgments were more affected by validity. In Experiment 2, fluency was manipulated by displaying the materials in a low-contrast font, leading to increased sensitivity to logical validity. Several variants of 1-process and 2-process models of reasoning were assessed against the results. A 1-process model that assumed the same scale of argument strength underlies induction and deduction was not successful. A 2-process model that assumed separate, continuous informational dimensions of apparent deductive validity and associative strength gave the more successful account. (c) 2009 APA, all rights reserved.

  12. The Influence of Chronic Ego Depletion on Goal Adherence: An Experience Sampling Study.

    PubMed

    Wang, Ligang; Tao, Ting; Fan, Chunlei; Gao, Wenbin; Wei, Chuguang

    2015-01-01

    Although ego depletion effects have been widely observed in experiments in which participants perform consecutive self-control tasks, the process of ego depletion remains poorly understood. Using the strength model of self-control, we hypothesized that chronic ego depletion adversely affects goal adherence and that mental effort and motivation are involved in the process of ego depletion. In this study, 203 students reported their daily performance, mental effort, and motivation with respect to goal directed behavior across a 3-week time period. People with high levels of chronic ego depletion were less successful in goal adherence than those with less chronic ego depletion. Although daily effort devoted to goal adherence increased with chronic ego depletion, motivation to adhere to goals was not affected. Participants with high levels of chronic ego depletion showed a stronger positive association between mental effort and performance, but chronic ego depletion did not play a regulatory role in the effect of motivation on performance. Chronic ego depletion increased the likelihood of behavior regulation failure, suggesting that it is difficult for people in an ego-depletion state to adhere to goals. We integrate our results with the findings of previous studies and discuss possible theoretical implications.

  14. Planning intensive care unit design using computer simulation modeling: optimizing integration of clinical, operational, and architectural requirements.

    PubMed

    OʼHara, Susan

    2014-01-01

    Nurses have increasingly been regarded as critical members of the planning team as architects recognize their knowledge and value. But the nurses' role as knowledge experts can be expanded to leading efforts to integrate clinical, operational, and architectural expertise through simulation modeling. Simulation modeling allows for the optimal merging of multifactorial data to understand the current state of the intensive care unit and predict future states. Nurses can champion the simulation modeling process and reap the benefits of a cost-effective way to test new designs, processes, staffing models, and future programming trends prior to implementation. Simulation modeling is an evidence-based planning approach and a standard for integrating the sciences with real client data to offer solutions for improving patient care.

  15. TriBITS lifecycle model. Version 1.0, a lean/agile software lifecycle model for research-based computational science and engineering and applied mathematical software.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Willenbring, James M.; Bartlett, Roscoe Ainsworth; Heroux, Michael Allen

    2012-01-01

    Software lifecycles are becoming an increasingly important issue for computational science and engineering (CSE) software. The process by which a piece of CSE software begins life as a set of research requirements and then matures into a trusted high-quality capability is both commonplace and extremely challenging. Although an implicit lifecycle is obviously being used in any effort, the challenges of this process - respecting the competing needs of research vs. production - cannot be overstated. Here we describe a proposal for a well-defined software lifecycle process based on modern Lean/Agile software engineering principles. What we propose is appropriate for many CSE software projects that are initially heavily focused on research but also are expected to eventually produce usable high-quality capabilities. The model is related to TriBITS, a build, integration and testing system, which serves as a strong foundation for this lifecycle model, and aspects of this lifecycle model are ingrained in the TriBITS system. Here, we advocate three to four phases or maturity levels that address the appropriate handling of many issues associated with the transition from research to production software. The goals of this lifecycle model are to better communicate maturity levels with customers and to help to identify and promote Software Engineering (SE) practices that will help to improve productivity and produce better software. An important collection of software in this domain is Trilinos, which is used as the motivation and the initial target for this lifecycle model. However, many other related and similar CSE (and non-CSE) software projects can also make good use of this lifecycle model, especially those that use the TriBITS system. Indeed this lifecycle process, if followed, will enable large-scale sustainable integration of many complex CSE software efforts across several institutions.

  16. Device research task (processing and high-efficiency solar cells)

    NASA Technical Reports Server (NTRS)

    1986-01-01

    This task has been expanded since the 25th Project Integration Meeting (PIM) to include process research in addition to device research. The objective of this task is to assist the Flat-plate Solar Array (FSA) Project in meeting its near- and long-term goals by identifying and implementing research in the areas of device physics, device structures, measurement techniques, material-device interactions, and cell processing. The research efforts of this task are described and reflect the diversity of device research being conducted. All of the contracts being reported are either completed or near completion and culminate the device research efforts of the FSA Project. Optimization methods and silicon solar cell numerical models, carrier transport and recombination parameters in heavily doped silicon, development and analysis of silicon solar cells of near 20% efficiency, and SiNx passivation of silicon surfaces are discussed.

  17. Model Calibration Efforts for the International Space Station's Solar Array Mast

    NASA Technical Reports Server (NTRS)

    Elliott, Kenny B.; Horta, Lucas G.; Templeton, Justin D.; Knight, Norman F., Jr.

    2012-01-01

    The International Space Station (ISS) relies on sixteen solar-voltaic blankets to provide electrical power to the station. Each pair of blankets is supported by a deployable boom called the Folding Articulated Square Truss Mast (FAST Mast). At certain ISS attitudes, the solar arrays can be positioned in such a way that shadowing of either one or three longerons causes an unexpected asymmetric thermal loading that, if unchecked, can exceed the operational stability limits of the mast. Work in this paper documents part of an independent NASA Engineering and Safety Center effort to assess the existing operational limits. Because of the complexity of the system, the problem is being worked using a building-block progression from components (longerons), to units (single or multiple bays), to assembly (full mast). The paper presents results from efforts to calibrate the longeron components. The work includes experimental testing of two types of longerons (straight and tapered), development of Finite Element (FE) models, development of parameter uncertainty models, and the establishment of a calibration and validation process to demonstrate adequacy of the models. Models, in the context of this paper, refer to both FE models and probabilistic parameter models. Results from model calibration of the straight longerons show that the model is capable of predicting the mean load, axial strain, and bending strain. For validation, parameter values obtained from calibration of straight longerons are used to validate experimental results for the tapered longerons.

  18. Development of pulsed processes for the manufacture of solar cells

    NASA Technical Reports Server (NTRS)

    Minnucci, J. A.

    1978-01-01

    The results of a 1-year program to develop the processes required for low-energy ion implantation for the automated production of silicon solar cells are described. The program included: (1) demonstrating state-of-the-art ion implantation equipment and designing an automated ion implanter, (2) making efforts to improve the performance of ion-implanted solar cells to 16.5 percent AM1, (3) developing a model of the pulse annealing process used in solar cell production, and (4) preparing an economic analysis of the process costs of ion implantation.

  19. Oxidation reactions of solid carbonaceous and resinous substances in supercritical water

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koda, S.

    Recent kinetic studies, particularly those by means of shadowgraphy and X-ray radiography, of supercritical water oxidation of solid carbonaceous and resinous substances have revealed the importance of the O2 mass transfer process over the intrinsic surface reaction at higher temperatures. The mass transfer processes, internal and external, should be incorporated in designing SCWO processes for solid substances and related processes such as catalytic SCWO. Some recent model calculation efforts are briefly described. Finally, fundamental information required for future development is itemized.
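
    As a rough illustration of external mass-transfer control, the sketch below integrates a shrinking-particle model in which the surface recession rate is set by O2 transport to the particle surface; all parameter values are assumptions for illustration, not data from the studies cited above.

        # Shrinking carbon particle, rate limited by external O2 mass transfer.
        M_C = 12.0e-3   # kg/mol, molar mass of carbon
        rho = 1500.0    # kg/m3, particle density (assumed)
        k_g = 1.0e-4    # m/s, external mass-transfer coefficient (assumed)
        c_O2 = 50.0     # mol/m3, bulk O2 concentration (assumed)

        r, dt, t = 1.0e-4, 0.1, 0.0  # initial radius (m), time step (s), clock
        while r > 1.0e-6:
            # Surface recession: dr/dt = -k_g * c_O2 * M_C / rho
            r -= k_g * c_O2 * M_C / rho * dt
            t += dt
        print(f"particle consumed after ~{t:.0f} s")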

  20. The process audit.

    PubMed

    Hammer, Michael

    2007-04-01

    Few executives question the idea that by redesigning business processes--work that runs from end to end across an enterprise--they can achieve extraordinary improvements in cost, quality, speed, profitability, and other key areas. Yet in spite of their intentions and investments, many executives flounder, unsure about what exactly needs to be changed, by how much, and when. As a result, many organizations make little progress--if any at all--in their attempts to transform business processes. Michael Hammer has spent the past five years working with a group of leading companies to develop the Process and Enterprise Maturity Model (PEMM), a new framework that helps executives comprehend, formulate, and assess process-based transformation efforts. He has identified two distinct groups of characteristics that are needed for business processes to perform exceptionally well over a long period of time. Process enablers, which affect individual processes, determine how well a process is able to function. They are mutually interdependent--if any are missing, the others will be ineffective. However, enablers are not enough to develop high-performance processes; they only provide the potential to deliver high performance. A company must also possess or establish organizational capabilities that allow the business to offer a supportive environment. Together, the enablers and the capabilities provide an effective way for companies to plan and evaluate process-based transformations. PEMM is different from other frameworks, such as Capability Maturity Model Integration (CMMI), because it applies to all industries and all processes. The author describes how several companies--including Michelin, CSAA, Tetra Pak, Shell, Clorox, and Schneider National--have successfully used PEMM in various ways and at different stages to evaluate the progress of their process-based transformation efforts.

  1. Long-term care information systems: an overview of the selection process.

    PubMed

    Nahm, Eun-Shim; Mills, Mary Etta; Feege, Barbara

    2006-06-01

    Under the current Medicare Prospective Payment System method and the ever-changing managed care environment, the long-term care information system is vital to providing quality care and to surviving in business. The system selection process should be an interdisciplinary effort involving all necessary stakeholders for the proposed system. The system selection process can be modeled following the Systems Development Life Cycle: identifying problems, opportunities, and objectives; determining information requirements; analyzing system needs; designing the recommended system; and developing and documenting software.

  2. Interactive Computing and Processing of NASA Land Surface Observations Using Google Earth Engine

    NASA Technical Reports Server (NTRS)

    Molthan, Andrew; Burks, Jason; Bell, Jordan

    2016-01-01

    Google's Earth Engine offers a "big data" approach to processing large volumes of NASA and other remote sensing products (https://earthengine.google.com/). Interfaces include JavaScript- and Python-based APIs, useful for accessing and processing long periods of record for Landsat and MODIS observations. Other data sets are frequently added, including weather and climate model data sets. Demonstrations here focus on exploratory efforts to perform land surface change detection related to severe weather and other disaster events.
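
    A minimal sketch of the kind of change-detection workflow described above, using the Earth Engine Python API; the collection ID, band names, dates, and coordinates are illustrative assumptions, and an authenticated Earth Engine account is required.

        import ee

        ee.Initialize()  # assumes `earthengine authenticate` has been run

        # Hypothetical area of interest around a storm-damaged region.
        aoi = ee.Geometry.Point(-86.6, 34.7).buffer(10000)

        def mean_ndvi(start, end):
            # Mean NDVI composite over the AOI for the given date range.
            col = (ee.ImageCollection('LANDSAT/LC08/C02/T1_L2')
                   .filterBounds(aoi)
                   .filterDate(start, end))
            return col.map(
                lambda img: img.normalizedDifference(['SR_B5', 'SR_B4'])).mean()

        # Difference image: strongly negative values suggest vegetation loss.
        change = mean_ndvi('2014-05-01', '2014-05-15') \
            .subtract(mean_ndvi('2014-04-01', '2014-04-15'))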

  3. An XML-Based Manipulation and Query Language for Rule-Based Information

    NASA Astrophysics Data System (ADS)

    Mansour, Essam; Höpfner, Hagen

    Rules are utilized to assist in the monitoring process that is required in activities such as disease management and customer relationship management. These rules are specified according to application best practices. Most research efforts emphasize the specification and execution of these rules; few focus on managing these rules as one object that has a management life-cycle. This paper presents our manipulation and query language, which was developed to facilitate the maintenance of this object during its life-cycle and to query the information contained in it. The language is based on an XML-based model. Furthermore, we evaluate the model and language using a prototype system applied to a clinical case study.

  4. Multi-Dimensional Calibration of Impact Dynamic Models

    NASA Technical Reports Server (NTRS)

    Horta, Lucas G.; Reaves, Mercedes C.; Annett, Martin S.; Jackson, Karen E.

    2011-01-01

    NASA Langley, under the Subsonic Rotary Wing Program, recently completed two helicopter tests in support of an in-house effort to study crashworthiness. As part of this effort, work is ongoing to investigate model calibration approaches and calibration metrics for impact dynamics models. Model calibration of impact dynamics problems has traditionally assessed model adequacy by comparing time histories from analytical predictions to test data at only a few critical locations. Although this approach provides a direct measure of the model's predictive capability, overall system behavior is only qualitatively assessed using full vehicle animations. In order to understand the spatial and temporal relationships of impact loads as they migrate throughout the structure, a more quantitative approach is needed. In this work, impact shapes derived from simulated time history data are used to recommend sensor placement and to assess model adequacy using time-based metrics and multi-dimensional orthogonality metrics. An approach for model calibration is presented that includes metric definitions, uncertainty bounds, parameter sensitivity, and numerical optimization to estimate parameters that reconcile test with analysis. The process is illustrated using simulated experiment data.
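
    The reconciliation step described above can be sketched as a small optimization problem: choose model parameters that minimize an error metric between test and analysis time histories. The surrogate model, metric, and values below are illustrative assumptions, not the paper's finite element models or metrics.

        import numpy as np
        from scipy.optimize import minimize

        def model_response(t, params):
            # Hypothetical impact-response surrogate: a decaying oscillation.
            amp, freq, decay = params
            return amp * np.exp(-decay * t) * np.sin(2 * np.pi * freq * t)

        t = np.linspace(0.0, 0.1, 200)  # s
        truth = model_response(t, (1.0, 50.0, 20.0))
        test = truth + 0.05 * np.random.default_rng(1).normal(size=t.size)

        def metric(params):
            # RMS error over the full time history (one possible time-based metric).
            return np.sqrt(np.mean((model_response(t, params) - test) ** 2))

        fit = minimize(metric, x0=[0.5, 40.0, 10.0], method='Nelder-Mead')
        print(fit.x)  # parameters reconciling analysis with test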

  5. Ultrafast Spectroscopy of Mid-Infrared Semiconductors Using the Signal and Idler Beams of a Synchronous Optical Parametric Oscillator

    DTIC Science & Technology

    2008-03-01

    then used to fit theoretical models describing radiative and non-radiative relaxation processes. 3.2 Experimental Setup: This thesis uses a mode ... Russian Efforts. Master's thesis, Naval Postgraduate School, 2005. 5. Christo, Farid C. "Thermochemistry and Kinetics Models for Magnesium-Teflon/Viton ... Coherent Mira Model 900-F Laser. 7. Cooley, William T. Measurement of Ultrafast Carrier Recombination Dynamics in Mid-Infrared Semiconductor Laser Material

  6. Integrated Modeling and Analysis of Physical Oceanographic and Acoustic Processes

    DTIC Science & Technology

    2015-09-30

    goal is to improve ocean physical state and acoustic state predictive capabilities. The goal fitting the scope of this project is the creation of ... Project-scale objectives are to complete targeted studies of oceanographic processes in a few regimes, accompanied by studies of acoustic propagation ... by the basic research efforts of this project. An additional objective is to develop improved computational tools for acoustics and for the

  7. Improving waterfowl production estimates: Results of a test in the prairie pothole region

    USGS Publications Warehouse

    Arnold, P.M.; Cowardin, L.M.

    1985-01-01

    The U.S. Fish and Wildlife Service, in an effort to improve and standardize methods for estimating waterfowl production, tested a new technique in the four-county Arrowwood Wetland Management District (WMD) for three years (1982-1984). On 14 randomly selected 10.36-km2 plots, upland and wetland habitat was mapped, classified, and digitized. Waterfowl breeding pairs were counted twice each year, and the proportion of wetland basins containing water was determined. Pair numbers and habitat conditions were entered into a computer model developed by the Northern Prairie Wildlife Research Center. That model estimates production on small federally owned wildlife tracts, federal wetland easements, and private land. Results indicate that production estimates were most accurate for mallards (Anas platyrhynchos), the species for which the computer model and data base were originally designed. Predictions for the pintail (Anas acuta), gadwall (A. strepera), blue-winged teal (A. discors), and northern shoveler (A. clypeata) were believed to be less accurate. Modeling breeding period dynamics of a waterfowl species and making credible production estimates for a geographic area are possible if the data used in the model are adequate. The process of modeling the breeding period of a species aids in locating areas of insufficient biological knowledge. This process will help direct future research efforts and permit more efficient gathering of field data.

  8. Miss-distance indicator for tank main guns

    NASA Astrophysics Data System (ADS)

    Bornstein, Jonathan A.; Hillis, David B.

    1996-06-01

    Tank main gun systems must possess extremely high levels of accuracy to perform successfully in battle. Under some circumstances, the first round fired in an engagement may miss the intended target, and it becomes necessary to rapidly correct fire. A breadboard automatic miss-distance indicator system was previously developed to assist in this process. The system, which would be mounted on a 'wingman' tank, consists of a charge-coupled device (CCD) camera and computer-based image-processing system, coupled with a separate infrared sensor to detect muzzle flash. For the system to be successfully employed with current generation tanks, it must be reliable, be relatively low cost, and respond rapidly while maintaining current firing rates. Recently, the original indicator system was developed further in an effort to achieve these goals. Efforts have focused primarily upon enhanced image-processing algorithms, both to improve system reliability and to reduce processing requirements. Intelligent application of newly refined trajectory models has permitted examination of reduced areas of interest and enhanced rejection of false alarms, significantly improving system performance.

  9. One NASA: Sharing Knowledge Through an Agency-wide Process Asset Library (PAL)

    NASA Technical Reports Server (NTRS)

    Truss, Baraka J.

    2006-01-01

    This poster session will cover the key purpose and components behind implementing the NASA PAL website. The session will present current results, describing the process used to create the website and current usage measures, and will demonstrate how NASA is truly becoming ONE. The target audience for the poster session includes those currently implementing the CMMI model and looking for PAL adoption techniques. To continue to be the leader in space, science, and technology, NASA is using this agency-wide PAL to share knowledge, work products, and lessons learned through the website. Many organizations have failed to recognize how process improvement efforts fit into the overall organizational effort. NASA as an agency, however, has embraced the benefits of process improvement by creating this website to foster communication among its ten centers. The poster session will cover the topics outlined below: 1) Website purpose; 2) Characteristics of the website; 3) User accounts status; 4) Website content size; and 5) Usage percentages.

  10. Climate Change Modeling Needs and Efforts for Hydroelectric System Operations in the Pacific Northwest

    NASA Astrophysics Data System (ADS)

    Pytlak, E.

    2014-12-01

    This presentation will outline ongoing, multi-year hydroclimate change research among the Columbia River Management Joint Operating Committee (RMJOC), the University of Washington, Portland State University, and their many regional research partners and stakeholders. Climate change in the Columbia River Basin is of particular concern to the Bonneville Power Administration (BPA) and many Federal, Tribal, and regional stakeholders. BPA, the U.S. Army Corps of Engineers, and the U.S. Bureau of Reclamation, which comprise the RMJOC, conducted an extensive study in 2009-11 using climate change streamflows produced by the University of Washington Climate Impacts Group (CIG). The study reconfirmed that as more winter precipitation in the Columbia Basin falls as rain rather than snow by mid-century, particularly on the U.S. portion of the basin, increased winter runoff is likely, followed by an earlier spring snowmelt peak and reduced summer flows as seasonal snowmelt diminishes earlier in the water year. Since that initial effort, both global and regional climate change modeling has advanced. To take advantage of the new outputs from the Fifth Coupled Model Intercomparison Project (CMIP-5), the RMJOC, through BPA support, is sponsoring new hydroclimate research that considers not only the most recent information from the GCMs but also the uncertainties introduced by the hydroclimate modeling process itself. Historical streamflows, which are used to calibrate hydrologic models and ascertain their reliability, are subject to both measurement and modeling uncertainties. Downscaling GCMs to a hydrologically useful spatial and temporal resolution introduces uncertainty, depending on the downscaling methods. Hydrologic modeling introduces uncertainties from calibration and geophysical states, some of which, like land surface characteristics, are likely to also change with time. In the upper Columbia Basin, glacier processes introduce yet another source of uncertainty. The latest joint effort attempts to ascertain the relative contributions of these uncertainties in comparison to the uncertainties brought by the changing climate itself.

  11. The Utility of Behavioral Economics in Expanding the Free-Feed Model of Obesity

    PubMed Central

    Rasmussen, Erin B.; Robertson, Stephen H.; Rodriguez, Luis R.

    2016-01-01

    Animal models of obesity are numerous and diverse in terms of identifying specific neural and peripheral mechanisms related to obesity; however, they are limited when it comes to behavior. The standard behavioral measure of food intake in most animal models occurs in a free-feeding environment. While easy and cost-effective for the researcher, the free-feeding environment omits some of the most important features of obesity-related food consumption—namely, properties of food availability, such as effort and delay to obtaining food. Behavioral economics expands the behavioral measures of obesity animal models by identifying such behavioral mechanisms. First, economic demand analysis allows researchers to understand the role of effort in food procurement, and how physiological and neural mechanisms are related. Second, studies on delay discounting contribute to a growing literature showing that sensitivity to delayed food- and food-related outcomes is likely a fundamental process of obesity. Together, these data expand the animal model in a manner that better characterizes how environmental factors influence food consumption. PMID:26923097

  12. Safety modelling and testing of lithium-ion batteries in electrified vehicles

    NASA Astrophysics Data System (ADS)

    Deng, Jie; Bae, Chulheung; Marcicki, James; Masias, Alvaro; Miller, Theodore

    2018-04-01

    To optimize the safety of batteries, it is important to understand their behaviours when subjected to abuse conditions. Most early efforts in battery safety modelling focused on either one battery cell or a single field of interest such as mechanical or thermal failure. These efforts may not completely reflect the failure of batteries in automotive applications, where various physical processes can take place in a large number of cells simultaneously. In this Perspective, we review modelling and testing approaches for battery safety under abuse conditions. We then propose a general framework for large-scale multi-physics modelling and experimental work to address safety issues of automotive batteries in real-world applications. In particular, we consider modelling coupled mechanical, electrical, electrochemical and thermal behaviours of batteries, and explore strategies to extend simulations to the battery module and pack level. Moreover, we evaluate safety test approaches for an entire range of automotive hardware sets from cell to pack. We also discuss challenges in building this framework and directions for its future development.

  13. Improving Recruiting of the 6th Recruiting Brigade Through Statistical Analysis and Efficiency Measures

    DTIC Science & Technology

    2014-12-01

    example of maximizing or minimizing decision variables within a model. Carol Stoker and Stephen Mehay present a comparative analysis of marketing and advertising strategies ... strategy development process; documenting various recruiting, marketing, and advertising initiatives in each nation; and examining efforts to

  14. Evaluate Yourself. Evaluation: Research-Based Decision Making Series, Number 9304.

    ERIC Educational Resources Information Center

    Fetterman, David M.

    This document considers both self-examination and external evaluation of gifted and talented education programs. Principles of the self-examination process are offered, noting similarities to external evaluation models. Principles of self-evaluation efforts include the importance of maintaining a nonjudgmental orientation, soliciting views from…

  15. A New Biogeochemical Computational Framework Integrated within the Community Land Model

    NASA Astrophysics Data System (ADS)

    Fang, Y.; Li, H.; Liu, C.; Huang, M.; Leung, L.

    2012-12-01

    Terrestrial biogeochemical processes, particularly carbon cycle dynamics, have been shown to significantly influence regional and global climate change. Modeling terrestrial biogeochemical processes within the land component of Earth System Models such as the Community Land Model (CLM), however, faces three major challenges: 1) extensive effort is required to modify model structures and rewrite computer programs to incorporate biogeochemical processes of increasing complexity; 2) solving the governing equations is computationally expensive due to numerical stiffness inherited from large variations in the rates of biogeochemical processes; and 3) an efficient framework is lacking for systematically evaluating various mathematical representations of biogeochemical processes. To address these challenges, we introduce a new computational framework to incorporate biogeochemical processes into CLM, which consists of a new biogeochemical module with a generic algorithm and reaction database. New and updated biogeochemical processes can be incorporated into CLM without significant code modification. To address the stiffness issue, algorithms and criteria will be developed to identify fast processes, which will be replaced with algebraic equations and decoupled from slow processes. This framework can serve as a generic and user-friendly platform to test different mechanistic process representations and datasets and to gain new insight into the behavior of terrestrial ecosystems in response to climate change in a systematic way.
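
    The fast/slow decoupling mentioned above can be sketched as follows: a fast reaction is replaced by an algebraic equilibrium relation, and the remaining slow dynamics are integrated with an implicit (stiff) solver. The two-pool system and rate constants below are hypothetical, not CLM code.

        import numpy as np
        from scipy.integrate import solve_ivp

        K_eq = 5.0       # fast sorption equilibrium constant (assumed)
        k_slow = 1.0e-3  # 1/day, slow decomposition rate (assumed)

        def rhs(t, y):
            total = y[0]                      # carbon in the coupled pools
            dissolved = total / (1.0 + K_eq)  # algebraic relation replaces fast ODE
            return [-k_slow * dissolved]      # only the slow process remains

        sol = solve_ivp(rhs, (0.0, 3650.0), [100.0], method='BDF')  # implicit solver
        print(sol.y[0, -1])  # carbon remaining after 10 years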

  16. Modeling the Gas Nitriding Process of Low Alloy Steels

    NASA Astrophysics Data System (ADS)

    Yang, M.; Zimmerman, C.; Donahue, D.; Sisson, R. D.

    2013-07-01

    The effort to simulate the nitriding process has been ongoing for the last 20 years. Most of the work has been done to simulate the nitriding of pure iron. In the present work, a series of experiments was performed to understand the effects of nitriding process parameters such as the nitriding potential, temperature, and time, as well as surface condition, on the gas nitriding of steels. A compound layer growth model has been developed to simulate the nitriding of AISI 4140 steel. In this paper, the fundamentals of the model are presented and discussed, including the kinetics of compound layer growth and the determination of the nitrogen diffusivity in the diffusion zone. Excellent agreement between the experimental data and simulation results has been achieved for both as-washed and pre-oxidized nitrided AISI 4140. The nitrogen diffusivity in the diffusion zone is found to be constant, depending only on the nitriding temperature; it is ~5 x 10^-9 cm^2/s at 548 °C. This proves the concept of utilizing the compound layer growth model for other steels. The nitriding of various steels can thus be modeled and predicted in the future.
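
    For diffusion-controlled growth of the kind modeled above, layer thickness typically follows a parabolic law, x(t) = sqrt(2kt). The sketch below uses the diffusivity order of magnitude quoted in the abstract, but the driving-force term and the resulting thicknesses are illustrative assumptions rather than the paper's fitted model.

        import numpy as np

        D = 5.0e-9 * 1e-4  # m2/s, nitrogen diffusivity (~5e-9 cm2/s at 548 C)
        dc_ratio = 0.1     # normalized concentration driving force (assumed)
        k = D * dc_ratio   # effective parabolic rate constant, m2/s

        t = np.linspace(0.0, 10.0 * 3600.0, 6)  # up to 10 h of nitriding
        x = np.sqrt(2.0 * k * t)                # compound layer thickness, m
        for ti, xi in zip(t, x):
            print(f"t = {ti / 3600.0:4.1f} h -> layer = {xi * 1e6:6.2f} um")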

  17. Incorporating discrete event simulation into quality improvement efforts in health care systems.

    PubMed

    Rutberg, Matthew Harris; Wenczel, Sharon; Devaney, John; Goldlust, Eric Jonathan; Day, Theodore Eugene

    2015-01-01

    Quality improvement (QI) efforts are an indispensable aspect of health care delivery, particularly in an environment of increasing financial and regulatory pressures. The ability to test predictions of proposed changes to flow, policy, staffing, and other process-level changes using discrete event simulation (DES) has shown significant promise and is well reported in the literature. This article describes how to incorporate DES into QI departments and programs in order to support QI efforts, develop high-fidelity simulation models, conduct experiments, make recommendations, and support adoption of results. The authors describe how DES-enabled QI teams can partner with clinical services and administration to plan, conduct, and sustain QI investigations. © 2013 by the American College of Medical Quality.
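
    A minimal sketch of the DES idea, not the authors' tool: patients arrive at a single hypothetical triage station, and mean waiting time is compared before and after a proposed process change, all with assumed rates.

        import heapq
        import random

        def simulate(n_patients=1000, mean_arrival=10.0, mean_service=8.0):
            # Single-server FIFO queue with exponential interarrival and
            # service times (minutes).
            random.seed(0)
            events, t = [], 0.0
            for i in range(n_patients):
                t += random.expovariate(1.0 / mean_arrival)
                heapq.heappush(events, (t, i))  # arrival events, time-ordered
            server_free_at, waits = 0.0, []
            while events:
                arrival, _ = heapq.heappop(events)
                start = max(arrival, server_free_at)  # wait if server is busy
                waits.append(start - arrival)
                server_free_at = start + random.expovariate(1.0 / mean_service)
            return sum(waits) / len(waits)

        print(f"mean wait: {simulate():.1f} min")
        print(f"with faster service: {simulate(mean_service=6.0):.1f} min")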

  18. Land cover characterization and land surface parameterization research

    USGS Publications Warehouse

    Steyaert, Louis T.; Loveland, Thomas R.; Parton, William J.

    1997-01-01

    The understanding of land surface processes and their parameterization in atmospheric, hydrologic, and ecosystem models has been a dominant research theme over the past decade. For example, many studies have demonstrated the key role of land cover characteristics as controlling factors in determining land surface processes, such as the exchange of water, energy, carbon, and trace gases between the land surface and the lower atmosphere. The requirements for multiresolution land cover characteristics data to support coupled-systems modeling have also been well documented, including the need for data on land cover type, land use, and many seasonally variable land cover characteristics, such as albedo, leaf area index, canopy conductance, surface roughness, and net primary productivity. Recently, the developers of land data have worked more closely with the land surface process modelers in these efforts.

  19. Modeling Atmospheric CO2 Processes to Constrain the Missing Sink

    NASA Technical Reports Server (NTRS)

    Kawa, S. R.; Denning, A. S.; Erickson, D. J.; Collatz, J. C.; Pawson, S.

    2005-01-01

    We report on a NASA-supported modeling effort to reduce uncertainty in the carbon cycle processes that create the so-called missing sink of atmospheric CO2. Our overall objective is to improve the characterization of CO2 source/sink processes globally with improved formulations for atmospheric transport, terrestrial uptake and release, biomass and fossil fuel burning, and observational data analysis. The motivation for this study follows from the perspective that progress in determining CO2 sources and sinks beyond the current state of the art will rely on utilization of more extensive and intensive CO2 and related observations, including those from satellite remote sensing. The major components of this effort are: 1) continued development of the chemistry and transport model using analyzed meteorological fields from the Goddard Global Modeling and Assimilation Office, with comparison to real-time data in both forward and inverse modes; 2) an advanced biosphere model, constrained by remote sensing data, coupled to the global transport model to produce distributions of CO2 fluxes and concentrations that are consistent with actual meteorological variability; 3) improved remote sensing estimates for biomass burning emission fluxes to better characterize interannual variability in the atmospheric CO2 budget and to better constrain the land use change source; 4) evaluation of the impact of temporally resolved fossil fuel emission distributions on atmospheric CO2 gradients and variability; and 5) testing the impact of existing and planned remote sensing data sources (e.g., AIRS, MODIS, OCO) on inference of CO2 sources and sinks, using the model to help establish measurement requirements for future remote sensing instruments. The results will help to prepare for the use of OCO and other satellite data in a multi-disciplinary carbon data assimilation system for analysis and prediction of carbon cycle changes and carbon/climate interactions.

  20. Common modeling system for digital simulation

    NASA Technical Reports Server (NTRS)

    Painter, Rick

    1994-01-01

    The Joint Modeling and Simulation System (J-MASS) is a tri-service investigation into a common modeling framework for the development of digital models. The basis for the success of this framework is an X-window-based, open-systems architecture and an object-based/oriented, standard-interface approach to digital model construction, configuration, execution, and post-processing. For years, Department of Defense (DOD) agencies have produced various weapon systems/technologies and, typically, digital representations of those systems/technologies. These digital representations (models) have also been developed for other reasons, such as studies and analysis, Cost and Operational Effectiveness Analysis (COEA) tradeoffs, etc. Unfortunately, there have been no Modeling and Simulation (M&S) standards, guidelines, or efforts toward commonality in DOD M&S. The typical scenario is that an organization hires a contractor to build hardware, and in doing so a digital model may be constructed. Until recently, this model was not even obtained by the organization. Even if it was procured, it was on a unique platform, in a unique language, with unique interfaces, with the result being unique maintenance requirements. Additionally, the constructors of the model expended more effort in writing the 'infrastructure' of the model/simulation (e.g., user interface, database/database management system, data journaling/archiving, graphical presentations, environment characteristics, other components in the simulation, etc.) than in producing the model of the desired system. Other side effects include duplication of effort, varying assumptions, lack of credibility/validation, and decentralization in policy and execution. J-MASS provides the infrastructure, standards, toolset, and architecture that permit M&S developers and analysts to concentrate on their area of interest.

  1. MRMAide: a mixed resolution modeling aide

    NASA Astrophysics Data System (ADS)

    Treshansky, Allyn; McGraw, Robert M.

    2002-07-01

    The Mixed Resolution Modeling Aide (MRMAide) technology is an effort to semi-automate the implementation of Mixed Resolution Modeling (MRM). MRMAide suggests ways of resolving differences in fidelity and resolution across diverse modeling paradigms. The goal of MRMAide is to provide a technology that will allow developers to incorporate model components into scenarios other than those for which they were designed. Currently, MRM is implemented by hand. This is a tedious, error-prone, and non-portable process. MRMAide, in contrast, will automatically suggest to a developer where and how to connect different components and/or simulations. MRMAide has three phases of operation: pre-processing, data abstraction, and validation. During pre-processing the components to be linked together are evaluated in order to identify appropriate mapping points. During data abstraction those mapping points are linked via data abstraction algorithms. During validation developers receive feedback regarding their newly created models relative to existing baselined models. The current work presents an overview of the various problems encountered during MRM and the various technologies utilized by MRMAide to overcome those problems.

  2. Systems Analysis Initiated for All-Electric Aircraft Propulsion

    NASA Technical Reports Server (NTRS)

    Kohout, Lisa L.

    2003-01-01

    A multidisciplinary effort is underway at the NASA Glenn Research Center to develop concepts for revolutionary, nontraditional fuel cell power and propulsion systems for aircraft applications. There is a growing interest in the use of fuel cells as a power source for electric propulsion as well as an auxiliary power unit to substantially reduce or eliminate environmentally harmful emissions. A systems analysis effort was initiated to assess potential concepts in an effort to identify those configurations with the highest payoff potential. Among the technologies under consideration are advanced proton exchange membrane (PEM) and solid oxide fuel cells, alternative fuels and fuel processing, and fuel storage. Prior to this effort, the majority of fuel cell analysis done at Glenn was done for space applications. Because of this, a new suite of models was developed. These models include the hydrogen-air PEM fuel cell; internal reforming solid oxide fuel cell; balance-of-plant components (compressor, humidifier, separator, and heat exchangers); compressed gas, cryogenic, and liquid fuel storage tanks; and gas turbine/generator models for hybrid system applications. Initial mass, volume, and performance estimates of a variety of PEM systems operating on hydrogen and reformate have been completed for a baseline general aviation aircraft. Solid oxide/turbine hybrid systems are being analyzed. In conjunction with the analysis efforts, a joint effort has been initiated with Glenn's Computer Services Division to integrate fuel cell stack and component models with the visualization environment that supports the GRUVE lab, Glenn's virtual reality facility. The objective of this work is to provide an environment to assist engineers in the integration of fuel cell propulsion systems into aircraft and provide a better understanding of the interaction between system components and the resulting effect on the overall design and performance of the aircraft. Initially, three-dimensional computer-aided design (CAD) models of representative PEM fuel cell stack and components were developed and integrated into the virtual reality environment along with an Excel-based model used to calculate fuel cell electrical performance on the basis of cell dimensions (see the figure). CAD models of a representative general aviation aircraft were also developed and added to the environment. With the use of special headgear, users will be able to virtually manipulate the fuel cell's physical characteristics and its placement within the aircraft while receiving information on the resultant fuel cell output power and performance. As the systems analysis effort progresses, we will add more component models to the GRUVE environment to help us more fully understand the effect of various system configurations on the aircraft.
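
    The cell-performance calculation mentioned above (an Excel model in the original effort) can be sketched with a textbook polarization-curve form; the equation structure is standard, but every parameter value below is an assumption, not Glenn's model.

        import numpy as np

        E0 = 1.20    # V, ideal open-circuit voltage
        b = 0.06     # V/decade, Tafel slope (assumed)
        R = 0.25     # ohm*cm2, area-specific resistance (assumed)
        i0 = 1.0e-3  # A/cm2, exchange current density (assumed)
        i_L = 1.6    # A/cm2, limiting current density (assumed)

        def cell_voltage(i):
            # Voltage = ideal - activation - ohmic - concentration losses.
            act = b * np.log10(i / i0)
            conc = -0.05 * np.log(1.0 - i / i_L)
            return E0 - act - R * i - conc

        area = 300.0  # cm2, assumed active area per cell
        for i in (0.2, 0.6, 1.0):
            v = cell_voltage(i)
            print(f"{i:.1f} A/cm2: {v:.3f} V/cell, {v * i * area:.0f} W/cell")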

  3. Achieving performance breakthroughs in an HMO business process through quality planning.

    PubMed

    Hanan, K B

    1993-01-01

    Kaiser Permanente's Georgia Region commissioned a quality planning team to design a new process to improve payments to its suppliers and vendors. The result of the team's effort was a 73 percent reduction in cycle time. This team's experiences point to the advantages of process redesign as a quality planning model, as well as some general guidelines for its most effective use in teams. If quality planning project teams are carefully configured, sufficiently expert in the existing process, and properly supported by management, organizations can achieve potentially dramatic improvements in process performance using this approach.

  4. Model of dissolution in the framework of tissue engineering and drug delivery.

    PubMed

    Sanz-Herrera, J A; Soria, L; Reina-Romo, E; Torres, Y; Boccaccini, A R

    2018-05-22

    Dissolution phenomena are ubiquitous in biomaterials across many different fields. Despite the advantages of simulation-based design of biomaterials in medical applications, additional efforts are needed to derive reliable models that describe the process of dissolution. A phenomenologically based model, available for simulation of dissolution in biomaterials, is introduced in this paper. The model reduces to a set of reaction-diffusion equations implemented in a finite element numerical framework. First, a parametric analysis is conducted in order to explore the role of model parameters in the overall dissolution process. Then, the model is calibrated and validated against a straightforward but rigorous experimental setup. Results show that the mathematical model macroscopically reproduces the main physicochemical phenomena that take place in the tests, corroborating its usefulness for the design of biomaterials in the tissue engineering and drug delivery research areas.
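
    A generic reaction-diffusion form consistent with the description above, written in LaTeX; the symbols and the first-order dissolution term are illustrative assumptions, not the paper's exact equations.

        % c: dissolved species concentration; phi: remaining solid fraction;
        % D: diffusivity; k: dissolution rate constant; c_s: solubility limit;
        % rho: molar density of the solid.
        \begin{align}
          \frac{\partial c}{\partial t}    &= \nabla \cdot (D \nabla c) + k\,\phi\,(c_s - c), \\
          \frac{\partial \phi}{\partial t} &= -\frac{k\,\phi\,(c_s - c)}{\rho}.
        \end{align}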

  5. Rapid Prototyping of Hydrologic Model Interfaces with IPython

    NASA Astrophysics Data System (ADS)

    Farthing, M. W.; Winters, K. D.; Ahmadia, A. J.; Hesser, T.; Howington, S. E.; Johnson, B. D.; Tate, J.; Kees, C. E.

    2014-12-01

    A significant gulf still exists between the state of practice and state of the art in hydrologic modeling. Part of this gulf is due to the lack of adequate pre- and post-processing tools for newly developed computational models. The development of user interfaces has traditionally lagged several years behind the development of a particular computational model or suite of models. As a result, models with mature interfaces often lack key advancements in model formulation, solution methods, and/or software design and technology. Part of the problem has been a focus on developing monolithic tools to provide comprehensive interfaces for the entire suite of model capabilities. Such efforts require expertise in software libraries and frameworks for creating user interfaces (e.g., Tcl/Tk, Qt, and MFC). These tools are complex and require significant investment in project resources (time and/or money) to use. Moreover, providing the required features for the entire range of possible applications and analyses creates a cumbersome interface. For a particular site or application, the modeling requirements may be simplified or at least narrowed, which can greatly reduce the number and complexity of options that need to be accessible to the user. However, monolithic tools usually are not adept at dynamically exposing specific workflows. Our approach is to deliver highly tailored interfaces to users. These interfaces may be site and/or process specific. As a result, we end up with many, customized interfaces rather than a single, general-use tool. For this approach to be successful, it must be efficient to create these tailored interfaces. We need technology for creating quality user interfaces that is accessible and has a low barrier for integration into model development efforts. Here, we present efforts to leverage IPython notebooks as tools for rapid prototyping of site and application-specific user interfaces. We provide specific examples from applications in near-shore environments as well as levee analysis. We discuss our design decisions and methodology for developing customized interfaces, strategies for delivery of the interfaces to users in various computing environments, as well as implications for the design/implementation of simulation models.
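
    A minimal sketch of such a tailored notebook interface using ipywidgets (the widget library that grew out of IPython); the model function, parameter names, and ranges are hypothetical placeholders for a site-specific model call.

        import numpy as np
        import matplotlib.pyplot as plt
        from ipywidgets import interact, FloatSlider

        def run_model(roughness=0.03, slope=0.001):
            # Stand-in for a site-specific hydrologic model run and plot.
            x = np.linspace(0.0, 10.0, 200)
            depth = (roughness / slope) ** 0.3 * np.exp(-0.1 * x)  # illustrative
            plt.plot(x, depth)
            plt.xlabel('distance (km)')
            plt.ylabel('depth (m)')
            plt.show()

        # One notebook cell like this exposes only the two options this
        # hypothetical site needs, rather than a monolithic interface.
        interact(run_model,
                 roughness=FloatSlider(min=0.01, max=0.10, step=0.005, value=0.03),
                 slope=FloatSlider(min=1e-4, max=1e-2, step=1e-4, value=1e-3))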

  6. Stochastic simulation by image quilting of process-based geological models

    NASA Astrophysics Data System (ADS)

    Hoffimann, Júlio; Scheidt, Céline; Barfod, Adrian; Caers, Jef

    2017-09-01

    Process-based modeling offers a way to represent realistic geological heterogeneity in subsurface models. The main limitation lies in conditioning such models to data. Multiple-point geostatistics can use these process-based models as training images and address the data conditioning problem. In this work, we further develop image quilting as a method for 3D stochastic simulation capable of mimicking the realism of process-based geological models with minimal modeling effort (i.e. parameter tuning) and at the same time conditioning them to a variety of data. In particular, we develop a new probabilistic data aggregation method for image quilting that bypasses traditional ad-hoc weighting of auxiliary variables. In addition, we propose a novel criterion for template design in image quilting that generalizes the entropy plot for continuous training images. The criterion is based on the new concept of voxel reuse, a stochastic and quilting-aware function of the training image. We compare our proposed method with other established simulation methods on a set of process-based training images of varying complexity, including a real-case example of stochastic simulation of the buried-valley groundwater system in Denmark.
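
    The core quilting step can be sketched in 2D as below: each new patch is chosen to minimize the sum of squared differences against the overlap with its neighbor, then placed with the overlapped columns dropped. This unconditional toy omits the minimum-error boundary cut, the probabilistic data aggregation, and the 3D extension described above; the training image and sizes are arbitrary.

        import numpy as np

        rng = np.random.default_rng(0)
        ti = rng.random((100, 100))  # stand-in for a process-based training image
        patch, overlap = 20, 4       # template size and overlap width

        def best_patch(left_neighbor):
            # Scan the training image for the patch whose left edge best
            # matches the right edge of the previously placed patch (SSD).
            target = left_neighbor[:, -overlap:]
            best, best_err = None, np.inf
            for i in range(ti.shape[0] - patch):
                for j in range(ti.shape[1] - patch):
                    cand = ti[i:i + patch, j:j + patch]
                    err = np.sum((cand[:, :overlap] - target) ** 2)
                    if err < best_err:
                        best, best_err = cand, err
            return best

        row = [ti[:patch, :patch]]  # seed with an arbitrary patch
        for _ in range(4):
            row.append(best_patch(row[-1]))
        # Stitch, dropping the overlapped columns of each appended patch.
        realization = np.hstack([row[0]] + [p[:, overlap:] for p in row[1:]])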

  7. A neuronal model of a global workspace in effortful cognitive tasks.

    PubMed

    Dehaene, S; Kerszberg, M; Changeux, J P

    1998-11-24

    A minimal hypothesis is proposed concerning the brain processes underlying effortful tasks. It distinguishes two main computational spaces: a unique global workspace composed of distributed and heavily interconnected neurons with long-range axons, and a set of specialized and modular perceptual, motor, memory, evaluative, and attentional processors. Workspace neurons are mobilized in effortful tasks for which the specialized processors do not suffice. They selectively mobilize or suppress, through descending connections, the contribution of specific processor neurons. In the course of task performance, workspace neurons become spontaneously coactivated, forming discrete though variable spatio-temporal patterns subject to modulation by vigilance signals and to selection by reward signals. A computer simulation of the Stroop task shows workspace activation to increase during acquisition of a novel task, effortful execution, and after errors. We outline predictions for spatio-temporal activation patterns during brain imaging, particularly about the contribution of dorsolateral prefrontal cortex and anterior cingulate to the workspace.

  8. Toxico-Cheminformatics: New and Expanding Public ...

    EPA Pesticide Factsheets

    High-throughput screening (HTS) technologies, along with efforts to improve public access to chemical toxicity information resources and to systematize older toxicity studies, have the potential to significantly improve information gathering efforts for chemical assessments and predictive capabilities in toxicology. Important developments include: 1) large and growing public resources that link chemical structures to biological activity and toxicity data in searchable format, and that offer more nuanced and varied representations of activity; 2) standardized relational data models that capture relevant details of chemical treatment and effects of published in vivo experiments; and 3) the generation of large amounts of new data from public efforts that are employing HTS technologies to probe a wide range of bioactivity and cellular processes across large swaths of chemical space. By annotating toxicity data with associated chemical structure information, these efforts link data across diverse study domains (e.g., ‘omics’, HTS, traditional toxicity studies), toxicity domains (carcinogenicity, developmental toxicity, neurotoxicity, immunotoxicity, etc) and database sources (EPA, FDA, NCI, DSSTox, PubChem, GEO, ArrayExpress, etc.). Public initiatives are developing systematized data models of toxicity study areas and introducing standardized templates, controlled vocabularies, hierarchical organization, and powerful relational searching capability across capt

  9. Tools for Local and Distributed Climate Data Access

    NASA Astrophysics Data System (ADS)

    Schweitzer, R.; O'Brien, K.; Burger, E. F.; Smith, K. M.; Manke, A. B.; Radhakrishnan, A.; Balaji, V.

    2017-12-01

    Last year we reported on our efforts to adapt existing tools to facilitate model development. During the lifecycle of a Climate Model Intercomparison Project (CMIP), data must be quality controlled before it can be published and studied. Like previous efforts, the upcoming CMIP6 will produce an unprecedented volume of data. For an institution, modelling group, or modeller, the volume of data is unmanageable without tools that organize and automate as many processes as possible. Even if a modelling group has tools for data and metadata management, it often falls on individuals to do the initial quality assessment for a model run with bespoke tools. Using individually crafted tools can lead to interruptions when project personnel change and may result in inconsistencies and duplication of effort across groups. This talk will expand on our experiences using available tools (Ferret/PyFerret, the Live Access Server, the GFDL Curator, the GFDL Model Development Database Interface and the THREDDS Data Server) to seamlessly automate the data assembly process to give users "one-click" access to a rich suite of Web-based analysis and comparison tools. On the surface, it appears that this collection of tools is well suited to the task, but our experience of the last year taught us that the data volume and distributed storage add a number of challenges in adapting the tools for this task. Quality control and initial evaluation add their own set of challenges. We will discuss how we addressed the needs of QC researchers by expanding standard tools to include specialized plots, and how we leveraged the configurability of the tools to add specific user-defined analysis operations so they are available to everyone using the system. We also report on our efforts to overcome some of the technical barriers to wide adoption of the tools by providing pre-built containers that are easily deployed in virtual machine and cloud environments. Finally, we will offer some suggestions for added features, configuration options, and improved robustness that can make future implementations of similar systems operate faster and more reliably. Solving these challenges for data sets distributed narrowly across networks and storage systems points the way to solving similar problems associated with sharing data distributed across institutions and continents.

  10. An Uncertainty Structure Matrix for Models and Simulations

    NASA Technical Reports Server (NTRS)

    Green, Lawrence L.; Blattnig, Steve R.; Hemsch, Michael J.; Luckring, James M.; Tripathi, Ram K.

    2008-01-01

    Software that is used for aerospace flight control and to display information to pilots and crew is expected to be correct and credible at all times. This type of software is typically developed under strict management processes, which are intended to reduce defects in the software product. However, modeling and simulation (M&S) software may exhibit varying degrees of correctness and credibility, depending on a large and complex set of factors. These factors include its intended use, the known physics and numerical approximations within the M&S, and the referent data set against which the M&S correctness is compared. The correctness and credibility of an M&S effort is closely correlated to the uncertainty management (UM) practices that are applied to the M&S effort. This paper describes an uncertainty structure matrix for M&S, which provides a set of objective descriptions for the possible states of UM practices within a given M&S effort. The columns in the uncertainty structure matrix contain UM elements or practices that are common across most M&S efforts, and the rows describe the potential levels of achievement in each of the elements. A practitioner can quickly look at the matrix to determine where an M&S effort falls based on a common set of UM practices that are described in absolute terms that can be applied to virtually any M&S effort. The matrix can also be used to plan those steps and resources that would be needed to improve the UM practices for a given M&S effort.

  11. Application of the Ecosystem Diagnosis and Treatment Method to the Grande Ronde Model Watershed project : Final Report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mobrand, Lars Erik; Lestelle, Lawrence C.

    In the spring of 1994 a technical planning support project was initiated by the Grande Ronde Model Watershed Board of Directors (Board) with funding from the Bonneville Power Administration. The project was motivated by a need for a science-based method for prioritizing restoration actions in the basin that would promote effectiveness and accountability. In this section the authors recall the premises for the project. The authors also present a set of recommendations for implementing a watershed planning process that incorporates a science-based framework to help guide decision making. This process is intended to assist the Grande Ronde Model Watershed Board in its effort to plan and implement watershed improvement measures. The process would also assist the Board in coordinating its efforts with other entities in the region. The planning process is based on an approach for developing an ecosystem management strategy referred to as the Ecosystem Diagnosis and Treatment (EDT) method (Lichatowich et al. 1995, Lestelle et al. 1996). The process consists of an on-going planning cycle. Included in this cycle is an assessment of the ability of the watershed to support and sustain natural resources and other economic and societal values. This step in the process, which the authors refer to as the diagnosis, helps guide the development of actions (also referred to as treatments) aimed at improving the conditions of the watershed to achieve long-term objectives. The planning cycle calls for routinely reviewing and updating, as necessary, the basis for the diagnosis and other analyses used by the Board in adopting actions for implementation. The recommendations offered here address this critical need to habitually update the information used in setting priorities for action.

  12. Evaluation of used fuel disposition in clay-bearing rock

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jove-Colon, Carlos F.; Hammond, Glenn Edward; Kuhlman, Kristopher L.

    The R&D program from the DOE Used Fuel Disposition Campaign (UFDC) has documented key advances in coupled Thermal-Hydrological-Mechanical-Chemical (THMC) modeling of clay to simulate its complex dynamic behavior in response to thermal and hydrochemical feedbacks. These efforts have been harnessed to assess the isolation performance of heat-generating nuclear waste in a deep geological repository in clay/shale/argillaceous rock formations. This report describes the ongoing disposal R&D efforts on the advancement and refinement of coupled THMC process models, hydrothermal experiments on barrier clay interactions, used fuel and canister material degradation, thermodynamic database development, and reactive transport modeling of the near-field under non-isothermal conditions. These play an important role in the evaluation of sacrificial zones as part of the EBS exposure to thermally-driven chemical and transport processes. Thermal inducement of chemical interactions at EBS domains enhances mineral dissolution/precipitation but also generates mineralogical changes that result in mineral H2O uptake/removal (hydration/dehydration reactions). These processes can result in volume changes that can affect the interface and bulk-phase porosities and the mechanical (stress) state of the bentonite barrier. Characterization studies on bentonite barrier samples from the FEBEX-DP international activity have provided important insight into clay barrier microstructures (e.g., microcracks) and interactions at EBS interfaces. Enhancements to the used fuel degradation model outline the need to include the effects of canister corrosion, due to the strong influence of H2 generation on the source term.

  13. Sharing Research Models: Using Software Engineering Practices for Facilitation

    PubMed Central

    Bryant, Stephanie P.; Solano, Eric; Cantor, Susanna; Cooley, Philip C.; Wagener, Diane K.

    2011-01-01

    Increasingly, researchers are turning to computational models to understand the interplay of important variables on systems’ behaviors. Although researchers may develop models that meet the needs of their investigation, application limitations—such as nonintuitive user interface features and data input specifications—may limit the sharing of these tools with other research groups. By removing these barriers, other research groups that perform related work can leverage these work products to expedite their own investigations. The use of software engineering practices can enable managed application production and shared research artifacts among multiple research groups by promoting consistent models, reducing redundant effort, encouraging rigorous peer review, and facilitating research collaborations that are supported by a common toolset. This report discusses three established software engineering practices— the iterative software development process, object-oriented methodology, and Unified Modeling Language—and the applicability of these practices to computational model development. Our efforts to modify the MIDAS TranStat application to make it more user-friendly are presented as an example of how computational models that are based on research and developed using software engineering practices can benefit a broader audience of researchers. PMID:21687780

  14. A Conceptual Model of the Cognitive Processing of Environmental Distance Information

    NASA Astrophysics Data System (ADS)

    Montello, Daniel R.

    I review theories and research on the cognitive processing of environmental distance information by humans, particularly that acquired via direct experience in the environment. The cognitive processes I consider for acquiring and thinking about environmental distance information include working-memory, nonmediated, hybrid, and simple-retrieval processes. Based on my review of the research literature, and additional considerations about the sources of distance information and the situations in which it is used, I propose an integrative conceptual model to explain the cognitive processing of distance information that takes account of the plurality of possible processes and information sources, and describes conditions under which particular processes and sources are likely to operate. The mechanism of summing vista distances is identified as widely important in situations with good visual access to the environment. Heuristics based on time, effort, or other information are likely to play their most important role when sensory access is restricted.
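
    To make the vista-summing mechanism concrete, here is a small illustrative sketch (the route data, distortion parameter, and function name are hypothetical): a traveled route is represented as a sequence of vista segments, and the environmental distance estimate is the sum of per-vista distances, optionally scaled by a factor standing in for cognitive over- or under-estimation.

      import math

      # A route as (x, y) turning points; each leg approximates one vista.
      route = [(0, 0), (40, 0), (40, 30), (90, 30)]

      def summed_vista_distance(points, distortion=1.0):
          """Sum per-vista (per-leg) distances; 'distortion' is a toy stand-in
          for systematic over- or under-estimation of each vista."""
          return sum(distortion * math.dist(a, b) for a, b in zip(points, points[1:]))

      print(summed_vista_distance(route))        # veridical: 120.0
      print(summed_vista_distance(route, 1.15))  # overestimated: 138.0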

  15. The Friction Force Determination of Large-Sized Composite Rods in Pultrusion

    NASA Astrophysics Data System (ADS)

    Grigoriev, S. N.; Krasnovskii, A. N.; Kazakov, I. A.

    2014-08-01

    Nowadays, simple pull-force models of the pultrusion process are not suitable for large-sized rods because they do not consider the chemical shrinkage and thermal expansion acting in the cured material inside the die. Yet the pulling force on the resin-impregnated fibers as they travel through the heated die is an essential factor in the pultrusion process. In order to minimize the number of trial-and-error experiments, a new mathematical approach to determine the frictional force is presented. The governing equations of the model are stated in general terms, and various simplifications are implemented in order to obtain solutions without extensive numerical effort. The influence of different pultrusion parameters on the frictional force is investigated. The results obtained by the model can establish a foundation by which process control parameters are selected to achieve an appropriate pull-force, and can be used for optimization of the pultrusion process.
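
    The governing equations are not reproduced in the abstract, but a generic discretized sketch of a die-friction computation, assuming simple Coulomb friction between the cured composite and the die wall, might look like the following (all profiles and parameter values are placeholders, not the authors' model):

      import numpy as np

      # Generic Coulomb-friction sketch: F = integral over the die length of
      # mu(x) * p(x) * perimeter, where p(x) is the contact pressure arising
      # from chemical shrinkage and thermal expansion.
      L = 1.0                                  # die length, m
      d = 0.05                                 # rod diameter, m
      x = np.linspace(0.0, L, 200)
      mu = 0.2 * np.ones_like(x)               # friction coefficient profile
      p = 2.0e5 * np.exp(-3.0 * x / L)         # contact pressure profile, Pa

      friction_force = np.trapz(mu * p * np.pi * d, x)  # N
      print(f"estimated frictional pull-force contribution: {friction_force:.0f} N")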

  16. Development of a Modeling Framework to Support Control Investigations of Sailcraft Missions A First Cut: ABLE Sailcraft Dynamics Model

    NASA Technical Reports Server (NTRS)

    Sarathy, Sriprakash

    2005-01-01

    Solar sailcraft, the stuff of dreams of the H.G. Wells generation, is now a rapidly maturing reality. The promise of unlimited propulsive power by harnessing stellar radiation is close to realization. Currently, efforts are underway to build, prototype, and test two configurations. These sails are designed to meet a 20 m sail requirement, under the guidance of the In-Space Propulsion (ISP) technology program office at MSFC. While these sails will not fly, they are the first steps in improving our understanding of the processes and phenomena at work. As part of the New Millennium Program (NMP), the ST9 technology validation mission hopes to launch and fly a solar sail by 2010 or sooner. Though the solar sail community has been studying and validating various concepts over two decades, it was not until recent breakthroughs in structural and material technology that it became possible to build sails that could be launched. With real sails that can be tested (albeit under earth conditions), the real task of engineering a viable spacecraft has finally commenced. Since it is not possible to accurately or practically recreate the actual operating conditions of the sailcraft (zero-G, vacuum, and extremely low temperatures), much of the work has focused on developing accurate models that can be used to predict behavior in space, and for sails that are 6-10 times the size of currently existing sails. Since these models can be validated only with real test data under "earth" conditions, the process of modeling and the identification of uncertainty due to model assumptions and scope need to be closely considered. Sailcraft models that exist currently are primarily focused on detailed physical representations at the component level; these are intended to support prototyping efforts. System-level models that cut across different sail configurations and control concepts while maintaining a consistent approach are non-existent. Much effort has been focused on the areas of thrust performance, solar radiation prediction, and sail membrane behavior vis-a-vis their reflective geometry, such as wrinkling/folding/furling as it pertains to thrust prediction. A parallel effort has been conducted on developing usable models for attitude control systems (ACS) for different sail configurations in different regimes. There has been very little by way of a system-wide exploration of the impact of the various control schemes and thrust prediction models for the different sail configurations being considered.

  17. Interactive initialization of heat flux parameters for numerical models using satellite temperature measurements

    NASA Technical Reports Server (NTRS)

    Carlson, T. N. (Principal Investigator)

    1981-01-01

    Efforts were made (1) to bring the image processing and boundary layer model operation into a completely interactive mode and (2) to test a method for determining the surface energy budget and surface moisture availability and thermal inertia on a scale appreciably larger than that of the city. A region a few hundred kilometers on a side centered over southern Indiana was examined.
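
    As a hedged illustration of the surface energy budget referred to above, the toy sketch below solves for latent heat flux as the residual of the balance Rn = H + LE + G; the numbers are placeholders, and the actual method coupled a boundary layer model to satellite temperature imagery rather than assuming known fluxes.

      # Toy surface energy balance: net radiation Rn partitioned into sensible
      # heat H, latent heat LE, and ground heat flux G (all in W/m^2).
      def latent_heat_residual(Rn, H, G):
          """Solve Rn = H + LE + G for LE."""
          return Rn - H - G

      LE = latent_heat_residual(Rn=550.0, H=180.0, G=70.0)
      print(f"LE = {LE} W/m^2")  # 300.0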

  18. Combustion Fundamentals Research

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Increased emphasis is placed on fundamental and generic research at Lewis Research Center, with fewer systems development efforts. This is especially true in combustion research, where the study of combustion fundamentals has grown significantly in order to better address the perceived long-term technical needs of the aerospace industry. The main thrusts of this combustion fundamentals program area are as follows: analytical models of combustion processes, model verification experiments, fundamental combustion experiments, and advanced numerical techniques.

  19. Wave-Sediment Interaction in Muddy Environments: A Field Experiment

    DTIC Science & Technology

    2008-01-01

    project includes a field experiment on the Atchafalaya shelf, Louisiana, in Years 1 and 2 (2007-2008) and a data analysis and modeling effort in Year 3...2008), in collaboration with other researchers funded by ONR CG program. The pilot experiment has tested the instrumentation and data analysis ...1993; Foda et al., 1993). With the exception of liquefaction processes, these models assume a single, well­ defined mud phase. However

  20. Engaging partners to initiate evaluation efforts: tactics used and lessons learned from the prevention research centers program.

    PubMed

    Wright, Demia Sundra; Anderson, Lynda A; Brownson, Ross C; Gwaltney, Margaret K; Scherer, Jennifer; Cross, Alan W; Goodman, Robert M; Schwartz, Randy; Sims, Tom; White, Carol R

    2008-01-01

    The Centers for Disease Control and Prevention's (CDC's) Prevention Research Centers (PRC) Program underwent a 2-year evaluation planning project using a participatory process that allowed perspectives from the national community of PRC partners to be expressed and reflected in a national logic model. The PRC Program recognized the challenge in developing a feasible, useable, and relevant evaluation process for a large, diverse program. To address the challenge, participatory and utilization-focused evaluation models were used. Four tactics guided the evaluation planning process: 1) assessing stakeholders' communication needs and existing communication mechanisms and infrastructure; 2) using existing mechanisms and establishing others as needed to inform, educate, and request feedback; 3) listening to and using feedback received; and 4) obtaining adequate resources and building flexibility into the project plan to support multifaceted mechanisms for data collection. Participatory methods resulted in buy-in from stakeholders and the development of a national logic model. Benefits included CDC's use of the logic model for program planning and development of a national evaluation protocol and increased expectations among PRC partners for involvement. Challenges included the time, effort, and investment of program resources required for the participatory approach and the identification of whom to engage and when to engage them for feedback on project decisions. By using a participatory and utilization-focused model, program partners positively influenced how CDC developed an evaluation plan. The tactics we used can guide the involvement of program stakeholders and help with decisions on appropriate methods and approaches for engaging partners.

  1. Perspectives on why digital ecologies matter: combining population genetics and ecologically informed agent-based models with GIS for managing dipteran livestock pests.

    PubMed

    Peck, Steven L

    2014-10-01

    It is becoming clear that handling the inherent complexity found in ecological systems is an essential task for finding ways to control insect pests of tropical livestock such as tsetse flies and Old and New World screwworms. In particular, challenging multivalent management programs, such as Area-Wide Integrated Pest Management (AW-IPM), face daunting problems of complexity at multiple spatial scales, ranging from landscape-level processes to those of smaller scales such as the parasite loads of individual animals. Daunting temporal challenges also await resolution, such as matching management time frames to those found on ecological and even evolutionary temporal scales. How does one deal with representing processes with models that involve multiple spatial and temporal scales? Agent-based models (ABMs), combined with geographic information systems (GIS), may allow for understanding, predicting, and managing pest control efforts in livestock pests. This paper argues that by incorporating digital ecologies in our management efforts, clearer and more informed decisions can be made. I also point out the power of these models in making better predictions in order to anticipate the range of outcomes possible or likely. Copyright © 2014 International Atomic Energy Agency 2014. Published by Elsevier B.V. All rights reserved.
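
    As a toy illustration of coupling an agent-based model to gridded spatial data, the sketch below moves fly agents over a suitability map (a stand-in for a GIS layer) and raises mortality inside a treated control zone; every value and name here is hypothetical.

      import random

      random.seed(1)
      GRID = 20
      suitability = [[random.random() for _ in range(GRID)] for _ in range(GRID)]
      control_zone = {(x, y) for x in range(5) for y in range(5)}  # treated cells
      flies = [(random.randrange(GRID), random.randrange(GRID)) for _ in range(500)]

      def step(agents):
          survivors = []
          for (x, y) in agents:
              # random walk, clipped to the grid
              x = min(GRID - 1, max(0, x + random.choice((-1, 0, 1))))
              y = min(GRID - 1, max(0, y + random.choice((-1, 0, 1))))
              # mortality rises in poor habitat and inside the control zone
              death = 0.4 * (1 - suitability[x][y]) + (0.5 if (x, y) in control_zone else 0.0)
              if random.random() > death:
                  survivors.append((x, y))
          return survivors

      for t in range(10):
          flies = step(flies)
      print(f"flies remaining after 10 steps: {len(flies)}")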

  2. Group Investigation: Structuring an Inquiry-Based Curriculum.

    ERIC Educational Resources Information Center

    Huhtala, Jack

    Group investigation is an organizational approach that allows a class to work actively and collaboratively in small groups and enables students to take an active role in determining their own learning goals and processes. As part of reform and restructuring efforts, Beaverton High School (Oregon) implemented the Group Investigation model with…

  3. Observations on Leadership, Problem Solving, and Preferred Futures of Universities

    ERIC Educational Resources Information Center

    Puncochar, Judith

    2013-01-01

    A focus on enrollments, rankings, uncertain budgets, and branding efforts to operate universities could have serious implications for discussions of sustainable solutions to complex problems and the decision-making processes of leaders. The Authentic Leadership Model for framing ill-defined problems in higher education is posited to improve the…

  4. Observations of bi-directional leader development in a triggered lightning flash

    NASA Technical Reports Server (NTRS)

    Laroche, P.; Idone, V.; Eybert-Berard, A.; Barret, L.

    1991-01-01

    Observations of a modified form of rocket triggered lightning are described. A flash triggered during the summer of 1989 is studied as part of an effort to model bidirectional discharge. It is suggested that the altitude triggering technique provides a realistic means of studying the attachment process.

  5. Processing and display of atmospheric phenomena data

    NASA Technical Reports Server (NTRS)

    Tatom, F. B.; Garst, R. A.; Purser, L. R.

    1984-01-01

    A series of technical efforts dealing with various atmospheric phenomena is described. Refinements to the Potential in an Electrostatic Cloud (PEC) model are discussed. The development of an Apple III graphics program, the NSSL Lightning Data Program and a description of data reduction procedures are examined. Several utility programs are also discussed.

  6. Incremental Inductive Learning in a Constructivist Agent

    NASA Astrophysics Data System (ADS)

    Perotto, Filipo Studzinski; Älvares, Luís Otávio

    The constructivist paradigm in Artificial Intelligence was definitively inaugurated in the early 1990s by Drescher's pioneering work [10]. He faced the challenge of designing an alternative model for machine learning, founded on the human cognitive developmental process described by Piaget [x]. His effort has inspired many other researchers.

  7. Teacher Education: Considerations for a Knowledge Base Framework.

    ERIC Educational Resources Information Center

    Tumposky, Nancy

    Traditionally, the knowledge base has been defined more as product than process and has encompassed definitions, principles, values, and facts. Recent reforms in teaching and teacher education have brought about efforts to redefine the knowledge base. The reconceptualized knowledge base builds upon the earlier model but gives higher priority to…

  8. Collaborative Outcome Measurement: Development of the Nationally Standardized Minimum Data Set

    ERIC Educational Resources Information Center

    Stephens, Barry C.; Kirchner, Corinne; Orr, Alberta L.; Suvino, Dawn; Rogers, Priscilla

    2009-01-01

    This article discusses the challenging process of developing a common data set for independent living programs serving older adults who are visually impaired. The three-year project, which included collaborative efforts among many stakeholders that encompass diverse program models, resulted in the development of the Internet-based Nationally…

  9. Regulation of Motivation: Contextual and Social Aspects

    ERIC Educational Resources Information Center

    Wolters, Christopher A.

    2011-01-01

    Background: Models of self-regulated learning have been used extensively as a way of understanding how students understand, monitor, and manage their own academic functioning. The regulation of motivation is a facet of self-regulated learning that describes students' efforts to control their own motivation or motivational processing. The…

  10. Using multitype branching processes to quantify statistics of disease outbreaks in zoonotic epidemics

    USDA-ARS?s Scientific Manuscript database

    Despite the enormous relevance of zoonotic infections to world-wide public health, and despite much effort in modeling individual zoonoses, a fundamental understanding of the disease dynamics and the nature of outbreaks emanating from such a complex system is still lacking. We introduce a simple sto...
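
    The abstract is truncated, but the technique named in the title can be illustrated. Below is a minimal Monte Carlo sketch of a two-type branching process (the types standing in for, say, animal and human cases) with invented Poisson offspring means, used to collect outbreak-size statistics; nothing here reproduces the paper's actual model.

      import math
      import random

      random.seed(0)
      # Hypothetical mean offspring matrix M[i][j]: expected type-j cases
      # produced by one type-i case (0 = animal, 1 = human); subcritical.
      M = [[0.8, 0.3],
           [0.0, 0.7]]

      def poisson(lam):
          # Knuth's algorithm; adequate for small lam
          L, k, p = math.exp(-lam), 0, 1.0
          while True:
              p *= random.random()
              if p <= L:
                  return k
              k += 1

      def outbreak_size(initial=(1, 0), max_gen=100):
          counts, total = list(initial), sum(initial)
          for _ in range(max_gen):
              nxt = [0, 0]
              for i in (0, 1):
                  for _ in range(counts[i]):
                      for j in (0, 1):
                          nxt[j] += poisson(M[i][j])
              counts, total = nxt, total + sum(nxt)
              if sum(counts) == 0:
                  break
          return total

      sizes = [outbreak_size() for _ in range(2000)]
      print(f"mean outbreak size: {sum(sizes) / len(sizes):.2f}")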

  11. Incorporating historical ecosystem diversity into conservation planning efforts in grass and shrub ecosystems

    Treesearch

    Amy C. Ganguli; Johathan B. Haufler; Carolyn A. Mehl; Jimmie D. Chew

    2011-01-01

    Understanding historical ecosystem diversity and wildlife habitat quality can provide a useful reference for managing and restoring rangeland ecosystems. We characterized historical ecosystem diversity using available empirical data, expert opinion, and the spatially explicit vegetation dynamics model SIMPPLLE (SIMulating Vegetative Patterns and Processes at Landscape...

  12. Listening Effort and Speech Recognition with Frequency Compression Amplification for Children and Adults with Hearing Loss.

    PubMed

    Brennan, Marc A; Lewis, Dawna; McCreery, Ryan; Kopun, Judy; Alexander, Joshua M

    2017-10-01

    Nonlinear frequency compression (NFC) can improve the audibility of high-frequency sounds by lowering them to a frequency where audibility is better; however, this lowering results in spectral distortion. Consequently, performance is a combination of the effects of increased access to high-frequency sounds and the detrimental effects of spectral distortion. Previous work has demonstrated positive benefits of NFC on speech recognition when NFC is set to improve audibility while minimizing distortion. However, the extent to which NFC impacts listening effort is not well understood, especially for children with sensorineural hearing loss (SNHL). To examine the impact of NFC on recognition and listening effort for speech in adults and children with SNHL. Within-subject, quasi-experimental study. Participants listened to amplified nonsense words that were (1) frequency-lowered using NFC, (2) low-pass filtered at 5 kHz to simulate the restricted bandwidth (RBW) of conventional hearing aid processing, or (3) low-pass filtered at 10 kHz to simulate extended bandwidth (EBW) amplification. Fourteen children (8-16 yr) and 14 adults (19-65 yr) with mild-to-severe SNHL. Participants listened to speech processed by a hearing aid simulator that amplified input signals to fit a prescriptive target fitting procedure. Participants were blinded to the type of processing. Participants' responses to each nonsense word were analyzed for accuracy and verbal-response time (VRT; listening effort). A multivariate analysis of variance and linear mixed model were used to determine the effect of hearing-aid signal processing on nonsense word recognition and VRT. Both children and adults identified the nonsense words and initial consonants better with EBW and NFC than with RBW. The type of processing did not affect the identification of the vowels or final consonants. There was no effect of age on recognition of the nonsense words, initial consonants, medial vowels, or final consonants. VRT did not change significantly with the type of processing or age. Both adults and children demonstrated improved speech recognition with access to the high-frequency sounds in speech. Listening effort as measured by VRT was not affected by access to high-frequency sounds. American Academy of Audiology

  13. Aging in the colonial chordate, Botryllus schlosseri.

    PubMed

    Munday, Roma; Rodriguez, Delany; Di Maio, Alessandro; Kassmer, Susannah; Braden, Brian; Taketa, Daryl A; Langenbacher, Adam; De Tomaso, Anthony

    2015-01-30

    What mechanisms underlie aging? One theory, the wear-and-tear model, attributes aging to progressive deterioration in the molecular and cellular machinery which eventually leads to death through the disruption of physiological homeostasis. The second suggests that life span is genetically programmed, and aging may be derived from intrinsic processes which enforce a non-random, terminal time interval for the survivability of the organism. We are studying an organism that demonstrates both properties: the colonial ascidian, Botryllus schlosseri. Botryllus is a member of the Tunicata, the sister group to the vertebrates, and has a number of life history traits which make it an excellent model for studies on aging. First, Botryllus has a colonial life history, and grows by a process of asexual reproduction during which entire bodies, including all somatic and germline lineages, regenerate every week, resulting in a colony of genetically identical individuals. Second, previous studies of lifespan in genetically distinct Botryllus lineages suggest that a direct, heritable basis underlying mortality exists that is unlinked to reproductive effort and other life history traits. Here we will review recent efforts to take advantage of the unique life history traits of B. schlosseri and develop it into a robust model for aging research.

  14. Magnetospheric-ionospheric Poynting flux

    NASA Technical Reports Server (NTRS)

    Thayer, Jeffrey P.

    1994-01-01

    Over the past three years of funding, SRI, in collaboration with the University of Texas at Dallas, has been involved in determining the total electromagnetic energy flux into the upper atmosphere from DE-B electric and magnetic field measurements and in modeling the electromagnetic energy flux at high latitudes, taking into account the coupled magnetosphere-ionosphere system. This effort has been very successful in establishing the DC Poynting flux as a fundamental quantity in describing the coupling of electromagnetic energy between the magnetosphere and ionosphere. The DE-B satellite electric and magnetic field measurements were carefully scrutinized to provide, for the first time, a large data set of DC, field-aligned Poynting flux measurements. Investigations describing the field-aligned Poynting flux observations from DE-B orbits under specific geomagnetic conditions and from many orbits were conducted to provide a statistical average of the Poynting flux distribution over the polar cap. The theoretical modeling effort has provided insight into the observations by formulating the connection between Poynting's theorem and the electromagnetic energy conversion processes that occur in the ionosphere. Modeling and evaluation of these processes has helped interpret the satellite observations of the DC Poynting flux and improved our understanding of the coupling between the ionosphere and magnetosphere.
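
    A hedged sketch of the core quantity follows: the DC Poynting flux computed from perturbation electric and magnetic fields, S = (dE x dB) / mu0, projected onto the background field direction. The sample values are placeholders, not DE-B data.

      import numpy as np

      MU0 = 4e-7 * np.pi  # vacuum permeability, H/m

      def field_aligned_poynting(dE, dB, b_hat):
          """DC Poynting flux S = (dE x dB) / mu0 projected on unit vector
          b_hat; dE in V/m, dB in T, result in W/m^2."""
          S = np.cross(dE, dB) / MU0
          return float(np.dot(S, b_hat))

      # Placeholder perturbation fields in a field-aligned coordinate system
      dE = np.array([0.02, 0.0, 0.0])     # V/m
      dB = np.array([0.0, 200e-9, 0.0])   # T (200 nT)
      print(field_aligned_poynting(dE, dB, b_hat=np.array([0.0, 0.0, 1.0])))
      # ~3.2e-3 W/m^2, i.e., a few mW/m^2 of field-aligned electromagnetic energy flux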

  15. Meter-Scale 3-D Models of the Martian Surface from Combining MOC and MOLA Data

    NASA Technical Reports Server (NTRS)

    Soderblom, Laurence A.; Kirk, Randolph L.

    2003-01-01

    We have extended our previous efforts to derive, through controlled photoclinometry, accurate, calibrated, high-resolution topographic models of the martian surface. The process involves combining MGS MOLA topographic profiles and MGS MOC Narrow Angle images. The earlier work utilized, along with a particular MOC NA image, the MOLA topographic profile that was acquired simultaneously, in order to derive photometric and scattering properties of the surface and atmosphere so as to force the low spatial frequencies of a one-dimensional MOC photoclinometric model to match the MOLA profile. Both that work and the new results reported here depend heavily on successful efforts to: 1) refine the radiometric calibration of the MOC NA; 2) register the MOC and MOLA coordinate systems and refine the pointing; and 3) provide the ability to project simultaneously acquired MOC and MOLA data into a common coordinate system with a single set of SPICE kernels, utilizing the USGS ISIS cartographic image processing tools. The approach described in this paper extends the MOC-MOLA integration and cross-calibration procedures from one-dimensional profiles to full two-dimensional photoclinometry and image simulations. Included are methods to account for low-frequency albedo variations within the scene.

  16. Establishing the Common Community Physics Package by Transitioning the GFS Physics to a Collaborative Software Framework

    NASA Astrophysics Data System (ADS)

    Xue, L.; Firl, G.; Zhang, M.; Jimenez, P. A.; Gill, D.; Carson, L.; Bernardet, L.; Brown, T.; Dudhia, J.; Nance, L. B.; Stark, D. R.

    2017-12-01

    The Global Model Test Bed (GMTB) has been established to support the evolution of atmospheric physical parameterizations in NCEP global modeling applications. To accelerate the transition to the Next Generation Global Prediction System (NGGPS), a collaborative model development framework known as the Common Community Physics Package (CCPP) has been created within the GMTB to facilitate engagement from the broad community on physics experimentation and development. A key component of this Research to Operations (R2O) software framework is the Interoperable Physics Driver (IPD), which connects the physics parameterizations on one end to the dynamical cores on the other with minimal implementation effort. To initiate the CCPP, scientists and engineers from the GMTB separated and refactored the GFS physics. This exercise demonstrated the process of creating IPD-compliant code and can serve as an example for other physics schemes to do the same and be considered for inclusion in the CCPP. Further benefits of this process include run-time physics suite configuration and considerably reduced effort for testing modifications to physics suites through GMTB's physics test harness. The implementation will be described and preliminary results will be presented at the conference.

  17. Monaural and binaural processing of complex waveforms

    NASA Astrophysics Data System (ADS)

    Trahiotis, Constantine; Bernstein, Leslie R.

    1992-01-01

    Our research concerned the manner in which the monaural and binaural auditory systems process information in complex sounds. Substantial progress was made in three areas, consistent with the objectives outlined in the original proposal. (1) New electronic equipment, including a NeXT computer, was purchased, installed, and interfaced with the existing laboratory. Software was developed for generating the necessary complex digital stimuli and for running behavioral experiments utilizing those stimuli. (2) Monaural experiments showed that the CMR is not obtained successively and is reduced or non-existent when the flanking bands are pulsed rather than presented continuously. Binaural investigations revealed that the detectability of a tonal target in a masking level difference paradigm could be degraded by the presence of a spectrally remote interfering tone. (3) In collaboration with Dr. Richard Stern, theoretical efforts included the explication and evaluation of a weighted-image model of binaural hearing, attempts to extend the Stern-Colburn position-variable model to account for many crucial lateralization and localization data gathered over the past 50 years, and the continuation of efforts to incorporate into a general model notions that lateralization and localization of spectrally-rich stimuli depend upon the patterns of neural activity within a plane defined by frequency and interaural delay.

  18. Aging in the colonial chordate, Botryllus schlosseri

    PubMed Central

    Munday, Roma; Rodriguez, Delany; Di Maio, Alessandro; Kassmer, Susannah; Braden, Brian; Taketa, Daryl A.; Langenbacher, Adam; De Tomaso, Anthony

    2015-01-01

    What mechanisms underlie aging? One theory, the wear-and-tear model, attributes aging to progressive deterioration in the molecular and cellular machinery which eventually leads to death through the disruption of physiological homeostasis. The second suggests that life span is genetically programmed, and aging may be derived from intrinsic processes which enforce a non-random, terminal time interval for the survivability of the organism. We are studying an organism that demonstrates both properties: the colonial ascidian, Botryllus schlosseri. Botryllus is a member of the Tunicata, the sister group to the vertebrates, and has a number of life history traits which make it an excellent model for studies on aging. First, Botryllus has a colonial life history, and grows by a process of asexual reproduction during which entire bodies, including all somatic and germline lineages, regenerate every week, resulting in a colony of genetically identical individuals. Second, previous studies of lifespan in genetically distinct Botryllus lineages suggest that a direct, heritable basis underlying mortality exists that is unlinked to reproductive effort and other life history traits. Here we will review recent efforts to take advantage of the unique life history traits of B. schlosseri and develop it into a robust model for aging research. PMID:26136620

  19. An experimental paradigm for team decision processes

    NASA Technical Reports Server (NTRS)

    Serfaty, D.; Kleinman, D. L.

    1986-01-01

    The study of distributed information processing and decision making is presently hampered by two factors: (1) the inherent complexity of the mathematical formulation of decentralized problems has prevented the development of models that could be used to predict performance in a distributed environment; and (2) the lack of comprehensive scientific empirical data on human team decision making has hindered the development of significant descriptive models. As part of a comprehensive effort to find a new framework for multihuman decision making problems, a novel experimental research paradigm was developed involving human teams in decision making tasks. Attempts to construct parts of an integrated model with ideas from queueing networks, team theory, distributed estimation, and decentralized resource management are described.

  20. Capturing Knowledge In Order To Optimize The Cutting Process For Polyethylene Pipes Using Knowledge Models

    NASA Astrophysics Data System (ADS)

    Rotaru, Ionela Magdalena

    2015-09-01

    Knowledge management is a powerful instrument. Areas where knowledge-based modelling can be applied range from business, industry, and government to education. Companies engage in efforts to restructure their databases according to knowledge management principles because they recognize in this a guarantee of models that consist only of relevant and sustainable knowledge that can bring value to the company. The proposed paper presents a theoretical model of what it means to optimize the processing of polyethylene pipes, bringing together two important engineering fields, metal cutting and the gas industry, which meet in optimizing the butt fusion welding process - specifically its polyethylene cutting step - for polyethylene pipes. The whole approach is shaped by the principles of knowledge management. The study was made in collaboration with companies operating in the field.

  1. Growing up and Role Modeling: A Theory in Iranian Nursing Students’ Education

    PubMed Central

    Nouri, Jamileh Mokhtari; Ebadi, Abbas; Alhani, Fatemeh; Rejeh, Nahid

    2015-01-01

    One of the key strategies in students' learning is being affected by models. Understanding the role-modeling process in education will help to make greater use of this training strategy. The aim of this grounded theory study was to explore Iranian nursing students' and instructors' experiences of the role-modeling process. Data were analyzed using Glaserian Grounded Theory methodology; semi-structured interviews were conducted with 7 faculty members and 2 nursing students, and three focus group discussions were held with 20 nursing students, based on purposive and theoretical sampling, to explain the role-modeling process in four nursing faculties in Tehran. Through basic coding, an effort toward comprehensive growth and excellence emerged as the basic social process constituting the core category, and through selective coding three phases were identified: realizing and exposure to inadequate human and professional growth, facilitating human and professional growth, and evolution. The role-modeling process takes place as an unconscious, involuntary, dynamic, and positively progressive process that facilitates overall growth in the nursing student. Accordingly, the designed model can be used to make this unconscious process conscious, active, and voluntary, and to help education administrators of nursing colleges and supra-organizations prevent threats to the human and professional development of nursing students and promote nursing students' growth. PMID:25716391

  2. Kinetics of carbon clustering in detonation of high explosives: Does theory match experiment?

    NASA Astrophysics Data System (ADS)

    Velizhanin, Kirill; Watkins, Erik; Dattelbaum, Dana; Gustavsen, Richard; Aslam, Tariq; Podlesak, David; Firestone, Millicent; Huber, Rachel; Ringstrand, Bryan; Willey, Trevor; Bagge-Hansen, Michael; Hodgin, Ralph; Lauderbach, Lisa; van Buuren, Tony; Sinclair, Nicholas; Rigg, Paulo; Seifert, Soenke; Gog, Thomas

    2017-06-01

    Chemical reactions in the detonation of carbon-rich high explosives yield carbon clusters as major constituents of the products. Efforts to model carbon clustering as a diffusion-limited irreversible coagulation of carbon clusters go back to the seminal paper by Shaw and Johnson. However, the first direct experimental observations of the kinetics of clustering yielded cluster growth one to two orders of magnitude slower than theoretical predictions. Multiple efforts were undertaken to test and revise the basic assumptions of the model in order to achieve better agreement with experiment. We discuss our very recent direct experimental observations of carbon clustering dynamics and demonstrate that these new results are in much better agreement with the modified Shaw-Johnson model. The implications of this much better agreement for our present understanding of detonation carbon clustering processes, and possible ways to increase the agreement between theory and experiment even further, are discussed.
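
    As a generic illustration (not the actual Shaw-Johnson formulation), diffusion-limited irreversible coagulation can be sketched with a mean-field rate equation for the cluster number density, dN/dt = -(1/2) k N^2, whose closed-form solution implies a mean cluster size growing linearly in time; the rate constant and initial density below are placeholders.

      import numpy as np

      # Mean-field coagulation sketch: each collision merges two clusters,
      # so the number density obeys dN/dt = -0.5 * k * N**2.
      k, N0 = 1e-16, 1e24                    # rate constant m^3/s, initial 1/m^3
      t = np.linspace(0.0, 1e-6, 6)          # time, s
      N = N0 / (1.0 + 0.5 * k * N0 * t)      # analytic solution of the rate law
      mean_size = N0 / N                     # mean cluster size relative to t = 0

      for ti, si in zip(t, mean_size):
          print(f"t = {ti:.1e} s  mean cluster size ~ {si:.1f}x initial")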

  3. Three-Dimensional Numerical Modeling of Magnetohydrodynamic Augmented Propulsion Experiment

    NASA Technical Reports Server (NTRS)

    Turner, M. W.; Hawk, C. W.; Litchford, R. J.

    2009-01-01

    Over the past several years, NASA Marshall Space Flight Center has engaged in the design and development of an experimental research facility to investigate the use of diagonalized crossed-field magnetohydrodynamic (MHD) accelerators as a possible thrust augmentation device for thermal propulsion systems. In support of this effort, a three-dimensional numerical MHD model has been developed for the purpose of analyzing and optimizing accelerator performance and to aid in understanding critical underlying physical processes and nonideal effects. This Technical Memorandum fully summarizes model development efforts and presents the results of pretest performance optimization analyses. These results indicate that the MHD accelerator should utilize a 45deg diagonalization angle with the applied current evenly distributed over the first five inlet electrode pairs. When powered at 100 A, this configuration is expected to yield a 50% global efficiency with an 80% increase in axial velocity and a 50% increase in centerline total pressure.

  4. Group-Based Active Learning of Classification Models.

    PubMed

    Luo, Zhipeng; Hauskrecht, Milos

    2017-05-01

    Learning of classification models from real-world data often requires additional human expert effort to annotate the data. However, this process can be rather costly, and finding ways of reducing the human annotation effort is critical for this task. The objective of this paper is to develop and study new ways of providing human feedback for efficient learning of classification models by labeling groups of examples. Briefly, unlike traditional active learning methods that seek feedback on individual examples, we develop a new group-based active learning framework that solicits label information on groups of multiple examples. In order to describe groups in a user-friendly way, conjunctive patterns are used to compactly represent groups. Our empirical study on 12 UCI data sets demonstrates the advantages and superiority of our approach over both classic instance-based active learning and existing group-based active learning methods.
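
    A hedged sketch of the group-based idea (not the authors' exact algorithm): cluster the unlabeled pool into candidate groups, query the group whose members the current model is least certain about, and retrain on the newly labeled group. In the paper the groups are described to the annotator via conjunctive patterns; here plain k-means clusters stand in for them, and all data are synthetic.

      import numpy as np
      from sklearn.cluster import KMeans
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(0)
      X = rng.normal(size=(300, 2))
      y = (X[:, 0] + X[:, 1] > 0).astype(int)   # hidden labels; y acts as oracle

      # seed set with both classes represented
      labeled = list(np.where(y == 1)[0][:5]) + list(np.where(y == 0)[0][:5])
      model = LogisticRegression().fit(X[labeled], y[labeled])

      groups = KMeans(n_clusters=15, n_init=10, random_state=0).fit_predict(X)
      for _ in range(5):
          probs = model.predict_proba(X)[:, 1]
          uncertainty = 1.0 - 2.0 * np.abs(probs - 0.5)
          gid = max(set(groups), key=lambda g: uncertainty[groups == g].mean())
          members = np.where(groups == gid)[0]
          labeled = sorted(set(labeled) | set(members))  # oracle labels the group
          model = LogisticRegression().fit(X[labeled], y[labeled])

      print(f"examples labeled after 5 group queries: {len(labeled)}")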

  5. Active Learning of Classification Models with Likert-Scale Feedback.

    PubMed

    Xue, Yanbing; Hauskrecht, Milos

    2017-01-01

    Annotation of classification data by humans can be a time-consuming and tedious process. Finding ways of reducing the annotation effort is critical for building the classification models in practice and for applying them to a variety of classification tasks. In this paper, we develop a new active learning framework that combines two strategies to reduce the annotation effort. First, it relies on label uncertainty information obtained from the human in terms of the Likert-scale feedback. Second, it uses active learning to annotate examples with the greatest expected change. We propose a Bayesian approach to calculate the expectation and an incremental SVM solver to reduce the time complexity of the solvers. We show the combination of our active learning strategy and the Likert-scale feedback can learn classification models more rapidly and with a smaller number of labeled instances than methods that rely on either Likert-scale labels or active learning alone.

  6. Active Learning of Classification Models with Likert-Scale Feedback

    PubMed Central

    Xue, Yanbing; Hauskrecht, Milos

    2017-01-01

    Annotation of classification data by humans can be a time-consuming and tedious process. Finding ways of reducing the annotation effort is critical for building the classification models in practice and for applying them to a variety of classification tasks. In this paper, we develop a new active learning framework that combines two strategies to reduce the annotation effort. First, it relies on label uncertainty information obtained from the human in terms of the Likert-scale feedback. Second, it uses active learning to annotate examples with the greatest expected change. We propose a Bayesian approach to calculate the expectation and an incremental SVM solver to reduce the time complexity of the solvers. We show the combination of our active learning strategy and the Likert-scale feedback can learn classification models more rapidly and with a smaller number of labeled instances than methods that rely on either Likert-scale labels or active learning alone. PMID:28979827

  7. Consultancy on Large-Scale Submerged Aerobic Cultivation Process Design - Final Technical Report: February 1, 2016 -- June 30, 2016

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crater, Jason; Galleher, Connor; Lievense, Jeff

    NREL is developing an advanced aerobic bubble column model using Aspen Custom Modeler (ACM). The objective of this work is to integrate the new fermentor model with existing techno-economic models in Aspen Plus and Excel to establish a new methodology for guiding process design. To assist this effort, NREL has contracted Genomatica to critique and make recommendations for improving NREL's bioreactor model and large-scale aerobic bioreactor design for biologically producing lipids at commercial scale. Genomatica has highlighted a few areas for improving the functionality and effectiveness of the model. Genomatica recommends using a compartment model approach with an integrated black-box kinetic model of the production microbe. We also suggest including calculations for stirred tank reactors to extend the model's functionality and adaptability for future process designs. Genomatica also suggests making several modifications to NREL's large-scale lipid production process design. The recommended process modifications are based on Genomatica's internal techno-economic assessment experience and are focused primarily on minimizing capital and operating costs. These recommendations include selecting/engineering a thermotolerant yeast strain with lipid excretion; using bubble column fermentors; increasing the size of production fermentors; reducing the number of vessels; employing semi-continuous operation; and recycling cell mass.

  8. The Content Validation and Resource Development For a Course in Materials and Processes of Industry Through the Use of NASA Experts at Norfolk State College. Final Report.

    ERIC Educational Resources Information Center

    Jacobs, James A.

    In an effort to develop a course in materials and processes of industry at Norfolk State College using Barton Herrscher's model of systematic instruction, a group of 12 NASA-Langley Research Center's (NASA-LRC) research engineers and technicians were recruited. The group acted as consultants in validating the content of the course and aided in…

  9. Cognitive Load and Listening Effort: Concepts and Age-Related Considerations.

    PubMed

    Lemke, Ulrike; Besser, Jana

    2016-01-01

    Listening effort has been recognized as an important dimension of everyday listening, especially with regard to the comprehension of spoken language. At constant levels of comprehension performance, the level of effort exerted and perceived during listening can differ considerably across listeners and situations. In this article, listening effort is used as an umbrella term for two different types of effort that can arise during listening. One of these types is processing effort, which is used to denote the utilization of "extra" mental processing resources in listening conditions that are adverse for an individual. A conceptual description is introduced how processing effort could be defined in terms of situational influences, the listener's auditory and cognitive resources, and the listener's personal state. Also, the proposed relationship between processing effort and subjectively perceived listening effort is discussed. Notably, previous research has shown that the availability of mental resources, as well as the ability to use them efficiently, changes over the course of adult aging. These common age-related changes in cognitive abilities and their neurocognitive organization are discussed in the context of the presented concept, especially regarding situations in which listening effort may be increased for older people.

  10. Separate valuation subsystems for delay and effort decision costs.

    PubMed

    Prévost, Charlotte; Pessiglione, Mathias; Météreau, Elise; Cléry-Melin, Marie-Laure; Dreher, Jean-Claude

    2010-10-20

    Decision making consists of choosing among available options on the basis of a valuation of their potential costs and benefits. Most theoretical models of decision making in behavioral economics, psychology, and computer science propose that the desirability of outcomes expected from alternative options can be quantified by utility functions. These utility functions allow a decision maker to assign subjective values to each option under consideration by weighting the likely benefits and costs resulting from an action and to select the one with the highest subjective value. Here, we used model-based neuroimaging to test whether the human brain uses separate valuation systems for rewards (erotic stimuli) associated with different types of costs, namely delay and effort. We show that humans devalue rewards associated with physical effort in a strikingly similar fashion to the way they devalue rewards associated with delays, and that a single computational model derived from economics theory can account for the behavior observed in both delay discounting and effort discounting. However, our neuroimaging data reveal that the human brain uses distinct valuation subsystems for different types of costs, reflecting in opposite fashion delayed reward and future energetic expenses. The ventral striatum and the ventromedial prefrontal cortex represent the increasing subjective value of delayed rewards, whereas a distinct network, composed of the anterior cingulate cortex and the anterior insula, represents the decreasing value of the effortful option, coding the expected expense of energy. Together, these data demonstrate that the valuation processes underlying different types of costs can be fractionated at the cerebral level.
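
    A common single-parameter form consistent with "a single computational model derived from economics theory" is hyperbolic discounting; whether that is the exact model used here is not stated in the abstract, so the sketch below is an assumption for illustration, applying the same functional form to delay and to effort with placeholder parameters.

      def discounted_value(reward, cost, k):
          """Hyperbolic discounting: subjective value = reward / (1 + k * cost),
          where cost is a delay or an effort level and k is fitted per subject
          and per cost type (an assumed functional form, not the paper's fit)."""
          return reward / (1.0 + k * cost)

      k_delay, k_effort = 0.05, 0.30   # placeholder discount rates
      for delay in (0, 10, 60):
          print(f"delay {delay:>2}: value {discounted_value(1.0, delay, k_delay):.2f}")
      for effort in (0, 1, 5):
          print(f"effort {effort}: value {discounted_value(1.0, effort, k_effort):.2f}")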

  11. Integrating automatic and controlled processes into neurocognitive models of social cognition.

    PubMed

    Satpute, Ajay B; Lieberman, Matthew D

    2006-03-24

    Interest in the neural systems underlying social perception has expanded tremendously over the past few decades. However, gaps between behavioral literatures in social perception and neuroscience are still abundant. In this article, we apply the concept of dual-process models to neural systems in an effort to bridge the gap between many of these behavioral studies and neural systems underlying social perception. We describe and provide support for a neural division between reflexive and reflective systems. Reflexive systems correspond to automatic processes and include the amygdala, basal ganglia, ventromedial prefrontal cortex, dorsal anterior cingulate cortex, and lateral temporal cortex. Reflective systems correspond to controlled processes and include lateral prefrontal cortex, posterior parietal cortex, medial prefrontal cortex, rostral anterior cingulate cortex, and the hippocampus and surrounding medial temporal lobe region. This framework is considered to be a working model rather than a finished product. Finally, the utility of this model and its application to other social cognitive domains such as Theory of Mind are discussed.

  12. Flaws in the Flow: The Weakness of Unstructured Business Process Modeling Languages Dealing with Data

    NASA Astrophysics Data System (ADS)

    Combi, Carlo; Gambini, Mauro

    Process-Aware Information Systems (PAISs) need more flexibility to support complex and varying human activities. PAISs usually support business process design by means of graphical, graph-oriented business process modeling languages (BPMLs) in conjunction with textual executable specifications. In this paper we discuss the flexibility of such BPMLs, which are the main interface for users who need to change the behavior of PAISs. In particular, we show how common BPML features that seem good when considered alone have a negative impact on flexibility when they are combined to provide a complete executable specification. A model has to be understood before being changed, and a change is made only when the benefits outweigh the effort. Two main factors have a great impact on comprehensibility and ease of change: concurrency and modularity. We show why BPMLs usually offer a limited concurrency model and lack modularity; finally, we discuss how to overcome these problems.

  13. Extending the cost-benefit model of thermoregulation: high-temperature environments.

    PubMed

    Vickers, Mathew; Manicom, Carryn; Schwarzkopf, Lin

    2011-04-01

    The classic cost-benefit model of ectothermic thermoregulation compares energetic costs and benefits, providing a critical framework for understanding this process (Huey and Slatkin 1976). It considers the case where environmental temperature (T(e)) is less than the selected temperature of the organism (T(sel)), and it predicts that, to minimize increasing energetic costs of thermoregulation as habitat thermal quality declines, thermoregulatory effort should decrease until the lizard thermoconforms. We extended this model to include the case where T(e) exceeds T(sel), and we redefine costs and benefits in terms of fitness to include effects of body temperature (T(b)) on performance and survival. Our extended model predicts that lizards will increase thermoregulatory effort as habitat thermal quality declines, gaining the fitness benefits of optimal T(b) and maximizing the net benefit of activity. Further, to offset the disproportionately high fitness costs of high T(e) compared with low T(e), we predicted that lizards would thermoregulate more effectively at high values of T(e) than at low ones. We tested our predictions on three sympatric skink species (Carlia rostralis, Carlia rubrigularis, and Carlia storri) in hot savanna woodlands and found that thermoregulatory effort increased as thermal quality declined and that lizards thermoregulated most effectively at high values of T(e).

  14. A Framework for Widespread Replication of a Highly Spatially Resolved Childhood Lead Exposure Risk Model

    PubMed Central

    Kim, Dohyeong; Galeano, M. Alicia Overstreet; Hull, Andrew; Miranda, Marie Lynn

    2008-01-01

    Background Preventive approaches to childhood lead poisoning are critical for addressing this longstanding environmental health concern. Moreover, increasing evidence of cognitive effects of blood lead levels < 10 μg/dL highlights the need for improved exposure prevention interventions. Objectives Geographic information system–based childhood lead exposure risk models, especially if executed at highly resolved spatial scales, can help identify children most at risk of lead exposure, as well as prioritize and direct housing and health-protective intervention programs. However, developing highly resolved spatial data requires labor- and time-intensive geocoding and analytical processes. In this study we evaluated the benefit of increased effort spent geocoding in terms of improved performance of lead exposure risk models. Methods We constructed three childhood lead exposure risk models based on established methods but using different levels of geocoded data from blood lead surveillance, county tax assessors, and the 2000 U.S. Census for 18 counties in North Carolina. We used the results to predict lead exposure risk levels mapped at the individual tax parcel unit. Results The models performed well enough to identify high-risk areas for targeted intervention, even with a relatively low level of effort on geocoding. Conclusions This study demonstrates the feasibility of widespread replication of highly spatially resolved childhood lead exposure risk models. The models guide resource-constrained local health and housing departments and community-based organizations on how best to expend their efforts in preventing and mitigating lead exposure risk in their communities. PMID:19079729

  15. Principal process analysis of biological models.

    PubMed

    Casagranda, Stefano; Touzeau, Suzanne; Ropers, Delphine; Gouzé, Jean-Luc

    2018-06-14

    Understanding the dynamical behaviour of biological systems is challenged by their large number of components and interactions. While efforts have been made in this direction to reduce model complexity, they often prove insufficient to grasp which and when model processes play a crucial role. Answering these questions is fundamental to unravel the functioning of living organisms. We design a method for dealing with model complexity, based on the analysis of dynamical models by means of Principal Process Analysis. We apply the method to a well-known model of circadian rhythms in mammals. The knowledge of the system trajectories allows us to decompose the system dynamics into processes that are active or inactive with respect to a certain threshold value. Process activities are graphically represented by Boolean and Dynamical Process Maps. We detect model processes that are always inactive, or inactive on some time interval. Eliminating these processes reduces the complex dynamics of the original model to the much simpler dynamics of the core processes, in a succession of sub-models that are easier to analyse. We quantify by means of global relative errors the extent to which the simplified models reproduce the main features of the original system dynamics and apply global sensitivity analysis to test the influence of model parameters on the errors. The results obtained prove the robustness of the method. The analysis of the sub-model dynamics allows us to identify the source of circadian oscillations. We find that the negative feedback loop involving proteins PER, CRY, CLOCK-BMAL1 is the main oscillator, in agreement with previous modelling and experimental studies. In conclusion, Principal Process Analysis is a simple-to-use method, which constitutes an additional and useful tool for analysing the complex dynamical behaviour of biological systems.
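
    The core operation described, decomposing each right-hand side into named processes and dropping processes whose relative activity stays below a threshold, can be sketched as follows (toy two-process model; the threshold value and process forms are hypothetical):

      # Toy model: dx/dt = production(x) + degradation(x), decomposed by process.
      processes = {
          "production": lambda x: 2.0 / (1.0 + x),
          "degradation": lambda x: -0.5 * x,
      }

      def relative_activities(x):
          """Relative weight |f_i| / sum_j |f_j| of each process at state x."""
          mags = {name: abs(f(x)) for name, f in processes.items()}
          total = sum(mags.values())
          return {name: m / total for name, m in mags.items()}

      THRESHOLD = 0.1  # hypothetical activity threshold
      for x in (0.1, 2.0, 10.0):
          acts = relative_activities(x)
          active = [n for n, a in acts.items() if a >= THRESHOLD]
          detail = ", ".join(f"{n}={a:.2f}" for n, a in acts.items())
          print(f"x = {x:>4}: {detail} -> keep {active}")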

  16. Correlation models for waste tank sludges and slurries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mahoney, L.A.; Trent, D.S.

    This report presents the results of work conducted to support the TEMPEST computer modeling under the Flammable Gas Program (FGP) and to further the comprehension of the physical processes occurring in the Hanford waste tanks. The end products of this task are correlation models (sets of algorithms) that can be added to the TEMPEST computer code to improve the reliability of its simulation of the physical processes that occur in Hanford tanks. The correlation models can be used to augment not only the TEMPEST code but also other computer codes that simulate sludge motion and flammable gas retention. This report presents the correlation models, also termed submodels, that have been developed to date. The submodel-development process is an ongoing effort designed to increase our understanding of sludge behavior and improve our ability to realistically simulate the sludge fluid characteristics that have an impact on safety analysis. The effort has employed both literature searches and data correlation to provide an encyclopedia of tank waste properties in forms that are relatively easy to use in modeling waste behavior. These property submodels will be used in other tasks to simulate waste behavior in the tanks. Density, viscosity, yield strength, surface tension, heat capacity, thermal conductivity, salt solubility, and ammonia and water vapor pressures were compiled for solutions and suspensions of sodium nitrate and other salts (where data were available), and the data were correlated by linear regression. In addition, data for simulated Hanford waste tank supernatant were correlated to provide density, solubility, surface tension, and vapor pressure submodels for multi-component solutions containing sodium hydroxide, sodium nitrate, sodium nitrite, and sodium aluminate.
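
    The property correlations described amount to fitting compiled data by linear regression. The Python sketch below shows the idea on invented density data for a salt solution; the values and the linear functional form are illustrative, not taken from the report.

        import numpy as np

        # Hypothetical correlation submodel in the spirit described above: fit
        # solution density as a linear function of salt mass fraction. All data
        # values are invented for illustration.
        w = np.array([0.00, 0.05, 0.10, 0.20, 0.30])             # mass fraction
        rho = np.array([998.0, 1032.0, 1067.0, 1140.0, 1217.0])  # density, kg/m^3

        slope, intercept = np.polyfit(w, rho, 1)
        print(f"rho(w) = {intercept:.1f} + {slope:.1f} * w")  # usable as a submodel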

  17. Rayleigh-Taylor and Richtmyer-Meshkov instability induced flow, turbulence, and mixing. I

    NASA Astrophysics Data System (ADS)

    Zhou, Ye

    2017-12-01

    Rayleigh-Taylor (RT) and Richtmyer-Meshkov (RM) instabilities play an important role in a wide range of engineering, geophysical, and astrophysical flows. They represent a triggering event that, in many cases, leads to large-scale turbulent mixing. Much effort has been expended over the past 140 years, beginning with the seminal work of Lord Rayleigh, to predict the evolution of the instabilities and of the instability-induced mixing layers. The objective of Part I of this review is to provide the basic properties of the flow, turbulence, and mixing induced by RT, RM, and Kelvin-Helmholtz (KH) instabilities. Historical efforts to study these instabilities are briefly reviewed, and the significance of these instabilities is discussed for a variety of flows, particularly for astrophysical flows and for the case of inertial confinement fusion. Early experimental efforts are described, and analytical attempts to model the linear and nonlinear regimes of these mixing layers are examined. These analytical efforts include models for both single-mode and multi-mode initial conditions, as well as multi-scale models to describe the evolution. Comparisons of these models and theories to experimental and simulation studies are then presented. Next, attention is paid to the influence of stabilizing mechanisms (e.g., viscosity, surface tension, and diffuse interface) on the evolution of these instabilities, as well as to the limitations and successes of numerical methods. Efforts to study these instabilities and mixing layers using group-theoretic ideas, as well as more formal notions of turbulence cascade processes during the later stages of the induced mixing layers, are also examined. A key element of the review is the discussion of the late-time self-similar scaling for the RT and RM growth factors, α and θ. These parameters are influenced by the initial conditions, which account for much of the observed variation. In some cases, these instability-induced flows can transition to turbulence; both the spatial and temporal criteria for achieving this transition have been examined. Finally, a description of the energy-containing scales in the mixing layers, including energy "injection" and cascade processes, is presented in greater detail. Part II of this review is designed to provide a much broader and more in-depth understanding of this critical area of research (Zhou, 2017. Physics Reports, 723-725, 1-160).
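
    For readers unfamiliar with the growth factors α and θ: at late times a Rayleigh-Taylor mixing layer is commonly described by h = αAgt², while a Richtmyer-Meshkov layer grows as a power law h ∝ t^θ. The sketch below evaluates both laws using illustrative parameter values that are of the magnitudes typically reported in the literature, not taken from the review itself.

        import numpy as np

        # Late-time self-similar growth laws in their commonly quoted forms:
        # RT: h = alpha * A * g * t**2; RM: h = h0 * (t / t0)**theta.
        # All parameter values below are illustrative, not from the review.
        alpha, theta = 0.05, 0.25   # growth factors, typical published magnitudes
        A, g = 0.5, 9.81            # Atwood number, acceleration (m/s^2)
        h0, t0 = 0.01, 0.1          # RM reference thickness (m) and time (s)

        t = np.linspace(0.1, 2.0, 5)
        h_rt = alpha * A * g * t**2
        h_rm = h0 * (t / t0)**theta
        print(np.c_[t, h_rt, h_rm])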

  18. Roll-to-Roll Advanced Materials Manufacturing DOE Lab Consortium - FY16 Annual Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Daniel, Claus; Wood, III, David L.; Krumdick, Gregory

    2016-12-01

    A DOE laboratory consortium comprising ORNL, ANL, NREL, and LBNL, coordinating with Kodak’s Eastman Business Park (Kodak) and other selected industry partners, was formed to address battery electrode performance and R2R manufacturing challenges. The objective of the FY 2016 seed project was to develop a materials genome synthesis process amenable to R2R manufacturing and to provide modeling, simulation, processing, and manufacturing techniques that demonstrate the feasibility of process controls and scale-up potential for improved battery electrodes. The research efforts were to predict and measure changes and results in electrode morphology and performance based on process condition changes; to evaluate mixed, active, particle size deposition and drying for novel electrode materials; and to model various process condition changes and the resulting morphology and electrode performance.

  19. Multi-tissue DNA methylation age: Molecular relationships and perspectives for advancing biomarker utility.

    PubMed

    Nwanaji-Enwerem, Jamaji C; Weisskopf, Marc G; Baccarelli, Andrea A

    2018-04-23

    The multi-tissue DNA methylation estimator of chronological age (DNAm-age) has been associated with a wide range of exposures and health outcomes. Still, it is unclear how DNAm-age can have such broad relationships and how it can be best utilized as a biomarker. Understanding DNAm-age's molecular relationships is a promising approach to address this critical knowledge gap. In this review, we discuss the existing literature regarding DNAm-age's molecular relationships in six major categories: animal model systems, cancer processes, cellular aging processes, immune system processes, metabolic processes, and nucleic acid processes. We also present perspectives regarding the future of DNAm-age research, including the need to translate a greater number of ongoing research efforts to experimental and animal model systems. Copyright © 2018 Elsevier B.V. All rights reserved.

  20. The DOE water cycle pilot study.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, N. L.; King, A. W.; Miller, M. A.

    In 1999, the U.S. Global Change Research Program (USGCRP) formed a Water Cycle Study Group (Hornberger et al. 2001) to organize research efforts in regional hydrologic variability, the extent to which this variability is caused by human activity, and the influence of ecosystems. The USGCRP Water Cycle Study Group was followed by a U.S. Department of Energy (DOE) Water Cycle Research Plan (Department of Energy 2002) that outlined an approach toward improving seasonal-to-interannual hydroclimate predictability and closing a regional water budget. The DOE Water Cycle Research Plan identified key research areas, including the development of a comprehensive long-term observational database to support model development and a better understanding of the relationship between the components of local water budgets and large-scale processes. In response to this plan, a multilaboratory DOE Water Cycle Pilot Study (WCPS) demonstration project began with a focus on studying the water budget and its variability at multiple spatial scales. Previous studies have highlighted the need for continued efforts to observationally close a local water budget, develop a numerical model closure scheme, and further quantify the scales at which predictive accuracy is optimal. A concerted effort within the National Oceanic and Atmospheric Administration (NOAA)-funded Global Energy and Water Cycle Experiment (GEWEX) Continental-scale International Project (GCIP) put forth a strategy to understand various hydrometeorological processes and phenomena with an aim toward closing the water and energy budgets of regional watersheds (Lawford 1999, 2001). The GCIP focus on such regional budgets includes the measurement of all components and reduction of the error in the budgets to near zero. To approach this goal, quantification of the uncertainties in both measurements and modeling is required. Model uncertainties within regional climate models continue to be evaluated within the Program to Intercompare Regional Climate Simulations (Takle et al. 1999), and model uncertainties within land surface models are being evaluated within the Program to Intercompare Land Surface Schemes (e.g., Henderson-Sellers 1993; Wood et al. 1998; Lohmann et al. 1998). In the context of understanding the water budget at watershed scales, the following two research questions, which highlight DOE's unique water isotope analysis and high-performance modeling capabilities, were posed as the foci of this pilot study: (1) Can the predictability of the regional water budget be improved using high-resolution model simulations that are constrained and validated with new hydrospheric water measurements? (2) Can water isotopic tracers be used to segregate different pathways through the water cycle and predict a change in regional climate patterns? To address these questions, numerical studies using regional atmospheric-land surface models and multiscale land surface hydrologic models were generated and, to the extent possible, the results were evaluated with observations. While the number of potential processes that may be important in the local water budget is large, several key processes were examined in detail. Most importantly, a concerted effort was made to understand water cycle processes and feedbacks at the land surface-atmosphere interface at spatial scales ranging from 30 m to hundreds of kilometers.
    A simple expression for the land surface water budget at the watershed scale is ΔS = P + G_in − ET − Q − G_out, where ΔS is the change in water storage, P is precipitation, ET is evapotranspiration, Q is streamflow, G_in is groundwater entering the watershed, and G_out is groundwater leaving the watershed, per unit time. The WCPS project identified data gaps and necessary model improvements that will lead to a more accurate representation of the terms in this equation. Table 1 summarizes the components of this water cycle pilot study and the respective participants. The following section provides a description of the surface observation and modeling sites. This is followed by a section on model analyses, and then the summary and concluding remarks.
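
    A minimal closure check of the budget equation above, with invented monthly values: the residual between the budget-implied and an independently observed storage change is exactly the kind of error the study seeks to drive toward zero.

        # Closure check of the watershed budget with invented monthly values (mm):
        # dS = P + G_in - ET - Q - G_out
        P, G_in, ET, Q, G_out = 80.0, 5.0, 45.0, 30.0, 4.0
        dS_budget = P + G_in - ET - Q - G_out   # = 6.0 mm
        dS_observed = 7.5                       # e.g., from soil moisture sensing
        print(f"budget dS = {dS_budget:.1f} mm, residual = {dS_observed - dS_budget:.1f} mm")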

  1. The development and application of composite complexity models and a relative complexity metric in a software maintenance environment

    NASA Technical Reports Server (NTRS)

    Hops, J. M.; Sherif, J. S.

    1994-01-01

    A great deal of effort is now being devoted to the study, analysis, prediction, and minimization of expected software maintenance cost, long before software is delivered to users or customers. It has been estimated that, on average, the effort spent on software maintenance is as costly as the effort spent on all other phases of software development. Software design methods should be the starting point for alleviating the problems of software maintenance complexity and high costs. Two aspects of maintenance deserve attention: (1) protocols for locating and rectifying defects, and for ensuring that no new defects are introduced in the development phase of the software process; and (2) protocols for modification, enhancement, and upgrading. This article focuses primarily on the second aspect: the development of protocols to help increase the quality and reduce the costs associated with modifications, enhancements, and upgrades of existing software. This study developed parsimonious models and a relative complexity metric for complexity measurement of software that were used to rank the modules in the system relative to one another. Some success was achieved in using the models and the relative metric to identify maintenance-prone modules.
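
    A minimal sketch of how a relative complexity metric can rank modules (the paper's exact model is not reproduced here): standardize several raw metrics across modules and sum the z-scores. Metric names and values are invented.

        import numpy as np

        # Hypothetical relative-complexity ranking: standardize each raw metric
        # across modules (z-scores) and sum them, then rank modules against one
        # another. Values and metric choices are invented.
        metrics = np.array([[120.0, 10, 4],   # module A: LOC, cyclomatic, fan-out
                            [450.0, 35, 9],   # module B
                            [80.0, 5, 2]])    # module C
        z = (metrics - metrics.mean(axis=0)) / metrics.std(axis=0)
        relative_complexity = z.sum(axis=1)
        print(np.argsort(relative_complexity)[::-1])  # most maintenance-prone first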

  2. Molluscan cells in culture: primary cell cultures and cell lines

    PubMed Central

    Yoshino, T. P.; Bickham, U.; Bayne, C. J.

    2013-01-01

    In vitro cell culture systems from molluscs have significantly contributed to our basic understanding of complex physiological processes occurring within or between tissue-specific cells, yielding information unattainable using intact animal models. In vitro cultures of neuronal cells from gastropods show how simplified cell models can inform our understanding of complex networks in intact organisms. Primary cell cultures from marine and freshwater bivalve and gastropod species are used as biomonitors for environmental contaminants, as models for gene transfer technologies, and for studies of innate immunity and neoplastic disease. Despite efforts to isolate proliferative cell lines from molluscs, the snail Biomphalaria glabrata Say, 1818 embryonic (Bge) cell line is the only existing cell line originating from any molluscan species. Taking an organ systems approach, this review summarizes efforts to establish molluscan cell cultures and describes the varied applications of primary cell cultures in research. Because of the unique status of the Bge cell line, an account is presented of the establishment of this cell line, and of how these cells have contributed to our understanding of snail host-parasite interactions. Finally, we detail the difficulties commonly encountered in efforts to establish cell lines from molluscs and discuss how these difficulties might be overcome. PMID:24198436

  3. Modelling High-temperature EBPR by Incorporating Glycogen and GAOs: Challenges from a Preliminary Study.

    PubMed

    Liau, Kee Fui; Yeoh, Hak Koon; Shoji, Tadashi; Chua, Adeline Seak May; Ho, Pei Yee

    2017-01-01

    Recently reported kinetic and stoichiometric parameters of the Activated Sludge Model no. 2d (ASM2d) for high-temperature EBPR processes suggested that the absence of glycogen in the model contributed to underestimation of PHA accumulation at 32 °C. Here, two modified ASM2d models were used to further explore the contribution of glycogen to the process. The ASM2d-1G model incorporated glycogen metabolism by PAOs (polyphosphate-accumulating organisms), while the ASM2d-2G model further included processes by GAOs (glycogen-accumulating organisms). These models were calibrated and validated using experimental data at 32 °C. The ASM2d-1G model supported the hypothesis that the excess PHA was attributable to glycogen, but remained inadequate to capture the dynamics of glycogen without considering GAO activities. The ASM2d-2G model performed better, but it was challenging to calibrate as it often led to wash-out of either PAOs or GAOs. Associated hurdles are highlighted and additional efforts to calibrate ASM2d-2G more effectively are proposed.

  4. Thrombosis in Cerebral Aneurysms and the Computational Modeling Thereof: A Review

    PubMed Central

    Ngoepe, Malebogo N.; Frangi, Alejandro F.; Byrne, James V.; Ventikos, Yiannis

    2018-01-01

    Thrombosis is a condition closely related to cerebral aneurysms and controlled thrombosis is the main purpose of endovascular embolization treatment. The mechanisms governing thrombus initiation and evolution in cerebral aneurysms have not been fully elucidated and this presents challenges for interventional planning. Significant effort has been directed towards developing computational methods aimed at streamlining the interventional planning process for unruptured cerebral aneurysm treatment. Included in these methods are computational models of thrombus development following endovascular device placement. The main challenge with developing computational models for thrombosis in disease cases is that there exists a wide body of literature that addresses various aspects of the clotting process, but it may not be obvious what information is of direct consequence for what modeling purpose (e.g., for understanding the effect of endovascular therapies). The aim of this review is to present the information so it will be of benefit to the community attempting to model cerebral aneurysm thrombosis for interventional planning purposes, in a simplified yet appropriate manner. The paper begins by explaining current understanding of physiological coagulation and highlights the documented distinctions between the physiological process and cerebral aneurysm thrombosis. Clinical observations of thrombosis following endovascular device placement are then presented. This is followed by a section detailing the demands placed on computational models developed for interventional planning. Finally, existing computational models of thrombosis are presented. This last section begins with description and discussion of physiological computational clotting models, as they are of immense value in understanding how to construct a general computational model of clotting. This is then followed by a review of computational models of clotting in cerebral aneurysms, specifically. Even though some progress has been made towards computational predictions of thrombosis following device placement in cerebral aneurysms, many gaps still remain. Answering the key questions will require the combined efforts of the clinical, experimental and computational communities. PMID:29670533

  5. Thrombosis in Cerebral Aneurysms and the Computational Modeling Thereof: A Review.

    PubMed

    Ngoepe, Malebogo N; Frangi, Alejandro F; Byrne, James V; Ventikos, Yiannis

    2018-01-01

    Thrombosis is a condition closely related to cerebral aneurysms and controlled thrombosis is the main purpose of endovascular embolization treatment. The mechanisms governing thrombus initiation and evolution in cerebral aneurysms have not been fully elucidated and this presents challenges for interventional planning. Significant effort has been directed towards developing computational methods aimed at streamlining the interventional planning process for unruptured cerebral aneurysm treatment. Included in these methods are computational models of thrombus development following endovascular device placement. The main challenge with developing computational models for thrombosis in disease cases is that there exists a wide body of literature that addresses various aspects of the clotting process, but it may not be obvious what information is of direct consequence for what modeling purpose (e.g., for understanding the effect of endovascular therapies). The aim of this review is to present the information so it will be of benefit to the community attempting to model cerebral aneurysm thrombosis for interventional planning purposes, in a simplified yet appropriate manner. The paper begins by explaining current understanding of physiological coagulation and highlights the documented distinctions between the physiological process and cerebral aneurysm thrombosis. Clinical observations of thrombosis following endovascular device placement are then presented. This is followed by a section detailing the demands placed on computational models developed for interventional planning. Finally, existing computational models of thrombosis are presented. This last section begins with description and discussion of physiological computational clotting models, as they are of immense value in understanding how to construct a general computational model of clotting. This is then followed by a review of computational models of clotting in cerebral aneurysms, specifically. Even though some progress has been made towards computational predictions of thrombosis following device placement in cerebral aneurysms, many gaps still remain. Answering the key questions will require the combined efforts of the clinical, experimental and computational communities.

  6. Experimental and numerical study of two-dimensional heat and mass transfer in unsaturated soil with an application to soil borehole thermal energy storage (SBTES) systems

    NASA Astrophysics Data System (ADS)

    Moradi, A.; Smits, K. M.

    2014-12-01

    A promising energy storage option to compensate for daily and seasonal energy offsets is to inject and store heat generated from renewable energy sources (e.g., solar energy) in the ground, often referred to as soil borehole thermal energy storage (SBTES). In SBTES modeling efforts it is widely recognized that the movement of water vapor is closely coupled to thermal processes, yet their mutual interactions are rarely considered in most soil water modeling efforts or in practical applications. Validating numerical models designed to capture these processes is difficult because experimental data are scarce, limiting the testing and refinement of heat and water transfer theories. A common assumption in most SBTES modeling approaches is to treat the soil as a purely conductive medium with constant hydraulic and thermal properties. This simplified approach can be improved upon by better understanding the coupled processes at play. Consequently, developing new modeling techniques, along with suitable experimental tools that capture these coupled processes, is of critical importance for the efficient design and implementation of SBTES systems. The goal of this work is to better understand heat and mass transfer processes for SBTES. In this study, we implemented a fully coupled numerical model that solves for heat, liquid water, and water vapor flux and allows for non-equilibrium liquid/gas phase change. This model was then used to investigate the influence of different hydraulic and thermal parameterizations on SBTES system efficiency. A two-dimensional tank apparatus was used with a series of soil moisture, temperature, and soil thermal property sensors. Four experiments were performed with different test soils. Experimental results provide evidence of thermally induced moisture flow, which was also confirmed by numerical results. Numerical results showed that, for the test conditions applied here, moisture flow is influenced more by thermal gradients than by hydraulic gradients. The results also demonstrate that convective fluxes are higher than conductive fluxes, indicating that moisture flow contributes more to the overall heat flux than conduction does.

  7. Modular modeling system for building distributed hydrologic models with a user-friendly software package

    NASA Astrophysics Data System (ADS)

    Wi, S.; Ray, P. A.; Brown, C.

    2015-12-01

    A software package developed to facilitate building distributed hydrologic models in a modular modeling system is presented. The software package provides a user-friendly graphical user interface that eases its practical use in water resources-related research and practice. The modular modeling system organizes the options available to users when assembling models according to the stages of the hydrological cycle, such as potential evapotranspiration, soil moisture accounting, and snow/glacier melting processes. The software is intended to be a comprehensive tool that simplifies the task of developing, calibrating, validating, and using hydrologic models through the inclusion of intelligent automation to minimize user effort and reduce opportunities for error. Processes automated so far include the definition of system boundaries (i.e., watershed delineation), climate and geographical input generation, and parameter calibration. Built-in post-processing toolkits greatly improve the functionality of the software as a decision support tool for water resources system management and planning. Example post-processing toolkits enable streamflow simulation at ungauged sites with predefined model parameters, and perform climate change risk assessment by means of the decision scaling approach. The software is validated through application to watersheds representing a variety of hydrologic regimes.
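
    A toy sketch of the modular idea, with hypothetical component names rather than the package's actual API: each hydrologic-cycle stage is a swappable function, and a model is an assembly of such stages.

        # Toy sketch of the modular idea; names are hypothetical, not the
        # package's actual API. Each stage is a swappable function.
        def hamon_pet(temp_c):
            # one of several possible PET options (toy formula, not calibrated)
            return max(0.0, 0.55 * temp_c / 10.0)

        def bucket_soil(storage, precip, pet, capacity=150.0):
            storage += precip
            runoff = max(0.0, storage - capacity)  # overflow becomes runoff
            storage = min(storage, capacity)
            et = min(storage, pet)                 # evaporate what is available
            return storage - et, runoff

        model = {"pet": hamon_pet, "soil": bucket_soil}  # users swap components here
        state, flow = model["soil"](100.0, 20.0, model["pet"](18.0))
        print(state, flow)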

  8. Demonstration of the feasibility of automated silicon solar cell fabrication

    NASA Technical Reports Server (NTRS)

    Taylor, W. E.; Schwartz, F. M.

    1975-01-01

    A study effort was undertaken to determine the process steps and design requirements of an automated silicon solar cell production facility. The key process steps were identified, and a laboratory model was conceptually designed to demonstrate the feasibility of automating the silicon solar cell fabrication process. A detailed laboratory model was designed to demonstrate those functions most critical to the feasibility of automating solar cell fabrication. The study and conceptual design have established the technical feasibility of automating the solar cell manufacturing process to produce low-cost solar cells with improved performance. Estimates predict an automated process throughput of 21,973 kilograms of silicon a year on a three-shift, 49-week basis, producing 4,747,000 hexagonal cells (38 mm/side), a total of 3,373 kilowatts, at an estimated manufacturing cost of $0.866 per cell or $1.22 per watt.
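
    The quoted economics are internally consistent, as a quick arithmetic check shows:

        # Sanity check of the quoted economics: 4,747,000 cells at $0.866 per
        # cell against 3,373 kW of annual output.
        cells, cost_per_cell, kilowatts = 4_747_000, 0.866, 3_373
        total_cost = cells * cost_per_cell                  # about $4.11 million
        print(f"${total_cost / (kilowatts * 1000):.2f}/W")  # -> $1.22/W, as stated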

  9. Integrated Modeling of the Human-Natural System to Improve Local Water Management and Planning

    NASA Astrophysics Data System (ADS)

    Gutowski, W. J., Jr.; Dziubanski, D.; Franz, K.; Goodwin, J.; Rehmann, C. R.; Simpkins, W. W.; Tesfastion, L.; Wanamaker, A. D.; Jie, Y.

    2015-12-01

    Communities across the world are experiencing the effects of unsustainable water management practices. Whether the problem is a lack of water, too much water, or water of degraded quality, finding acceptable solutions requires community-level efforts that integrate sound science with local needs and values. Our project develops both a software technology (agent-based hydrological modeling) and a social technology (a participatory approach to model development) that will allow communities to comprehensively address local water challenges. Using agent-based modeling (ABM), we are building a modeling system that includes a semi-distributed hydrologic process model coupled with agent (stakeholder) models. Information from the hydrologic model is conveyed to the agent models, which, along with economic information, determine appropriate agent actions that subsequently affect hydrology within the model. The iterative participatory modeling (IPM) process will assist with the continual development of the agent models. Further, IPM creates a learning environment in which all participants, including researchers, are co-exploring relevant data, possible scenarios and solutions, and viewpoints through continuous interactions. Our initial work focuses on the impact of flood mitigation and conservation efforts on reducing flooding in an urban area. We are applying all research elements above to the Squaw Creek watershed that flows through parts of four counties in central Iowa. The watershed offers many of the typical tensions encountered in Iowa, such as different perspectives on water management between upstream farmers and downstream urban areas, competition for various types of recreational services, and increasing absentee land ownership that may conflict with community values. Ultimately, climate change scenarios will be incorporated into the model to determine long term patterns that may develop within the social or natural system.
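
    A toy sketch of the agent-hydrology coupling described above; all names, decision rules, and numbers are hypothetical. Farmer agents weigh a conservation payment against lost income, nudged by last year's flood damage, and their aggregate choice feeds back into next year's runoff.

        import random

        # Toy agent-hydrology coupling; all rules and numbers are hypothetical.
        class Farmer:
            def __init__(self):
                self.conserving = False

            def decide(self, flood_damage, payment, lost_income=100.0):
                # enroll when payment plus flood salience beats perceived lost income
                self.conserving = payment + 0.5 * flood_damage > lost_income

        random.seed(1)
        farmers = [Farmer() for _ in range(50)]
        flood_damage = 120.0  # signal from the hydrologic model
        for f in farmers:
            f.decide(flood_damage, payment=random.uniform(30, 80))
        share = sum(f.conserving for f in farmers) / len(farmers)
        print(f"next-year runoff scaling: {1.0 - 0.3 * share:.2f}")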

  10. Climate Science Performance, Data and Productivity on Titan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mayer, Benjamin W; Worley, Patrick H; Gaddis, Abigail L

    2015-01-01

    Climate Science models are flagship codes for the largest high performance computing (HPC) resources, both in visibility, with the newly launched Department of Energy (DOE) Accelerated Climate Model for Energy (ACME) effort, and in terms of significant fractions of system usage. The performance of the DOE ACME model is captured with application-level timers and examined through a sizeable run archive. Performance and variability of compute, queue time, and ancillary services are examined. As climate science advances in its use of HPC resources, the human and data systems required to achieve program goals have grown. A description of current workflow processes (hardware, software, human) and planned automation of the workflow, along with historical and projected usage for data in motion and at rest, is detailed. The combination of these two topics motivates a description of future systems requirements for DOE climate modeling efforts, focusing on the growth of data storage and the network and disk bandwidth required to handle data at an acceptable rate.

  11. Leveraging annotation-based modeling with Jump.

    PubMed

    Bergmayr, Alexander; Grossniklaus, Michael; Wimmer, Manuel; Kappel, Gerti

    2018-01-01

    The capability of UML profiles to serve as an annotation mechanism has been recognized in both research and industry. Today's modeling tools offer profiles specific to platforms, such as Java, as they facilitate model-based engineering approaches. However, considering the large number of possible annotations in Java, manually developing the corresponding profiles would be achievable only with enormous development and maintenance effort. Thus, leveraging annotation-based modeling requires an automated approach capable of generating platform-specific profiles from Java libraries. To address this challenge, we present the fully automated transformation chain realized by Jump, continuing existing mapping efforts between Java and UML with an emphasis on annotations and profiles. The evaluation of Jump shows that it scales to large Java libraries and generates profiles of equal or even improved quality compared to profiles currently used in practice. Furthermore, we demonstrate the practical value of Jump by contributing profiles that facilitate reverse engineering and forward engineering processes for the Java platform, applying it to a modernization scenario.
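
    A schematic of the generation idea only, not Jump's actual API or data model: scan a Java library's annotation types and emit one UML stereotype per annotation, collected into a platform-specific profile. All names are illustrative.

        # Schematic of the generation idea; not Jump's actual API.
        java_annotations = {
            "javax.persistence.Entity": ["Class"],  # annotation -> UML metaclasses
            "javax.persistence.Column": ["Property", "Operation"],
        }

        def to_profile(annotations, profile_name):
            stereotypes = [
                {"name": fqn.rsplit(".", 1)[-1],  # stereotype named after the annotation
                 "extends": metaclasses}          # metaclasses the stereotype extends
                for fqn, metaclasses in annotations.items()
            ]
            return {"profile": profile_name, "stereotypes": stereotypes}

        print(to_profile(java_annotations, "JPAProfile"))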

  12. Geometric-Optical Modeling of Directional Thermal Radiance for Improvement of Land Surface Temperature Retrievals from MODIS, ASTER, and Landsat-7 Instruments

    NASA Technical Reports Server (NTRS)

    Li, Xiaowen; Friedl, Mark; Strahler, Alan

    2002-01-01

    The general objectives of this project were to improve understanding of the directional emittance properties of land surfaces in the thermal infrared (TIR) region of the electro-magnetic spectrum. To accomplish these objectives our research emphasized a combination of theoretical model development and empirical studies designed to improve land surface temperature (LST) retrievals from space-borne remote sensing instruments. Following the proposal, the main tasks for this project were to: (1) Participate in field campaigns; (2) Acquire and process field, aircraft, and ancillary data; (3) Develop and refine models of LST emission; (4) Develop algorithms for LST retrieval; and (5) Explore LST retrieval methods for use in energy balance models. In general all of these objectives were addressed, and for the most part achieved. The main results from this project are described in the publications arising from this effort. We summarize our efforts related to each of the objectives.

  13. Providing an empirical basis for optimizing the verification and testing phases of software development

    NASA Technical Reports Server (NTRS)

    Briand, Lionel C.; Basili, Victor R.; Hetmanski, Christopher J.

    1992-01-01

    Applying equal testing and verification effort to all parts of a software system is not very efficient, especially when resources are limited and scheduling is tight. Therefore, one needs to be able to differentiate low/high fault density components so that the testing/verification effort can be concentrated where needed. Such a strategy is expected to detect more faults and thus improve the resulting reliability of the overall system. This paper presents an alternative approach for constructing such models that is intended to fulfill specific software engineering needs (i.e. dealing with partial/incomplete information and creating models that are easy to interpret). Our approach to classification is as follows: (1) to measure the software system to be considered; and (2) to build multivariate stochastic models for prediction. We present experimental results obtained by classifying FORTRAN components developed at the NASA/GSFC into two fault density classes: low and high. Also we evaluate the accuracy of the model and the insights it provides into the software process.
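
    A minimal sketch of the classification idea using a modern stand-in estimator (logistic regression; the paper built multivariate stochastic models, not this): components described by invented metrics are classified into low/high fault density so that verification effort can be concentrated where needed.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        # Stand-in classifier for the low/high fault-density idea; features
        # (size, coupling) and labels are invented for illustration.
        X = np.array([[120, 3], [900, 14], [200, 5], [1500, 22], [90, 2], [700, 11]])
        y = np.array([0, 1, 0, 1, 0, 1])  # 0 = low fault density, 1 = high

        clf = LogisticRegression().fit(X, y)
        print(clf.predict([[650, 10]]))   # flag components deserving extra verification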

  14. Progress on geoenvironmental models for selected mineral deposit types, edited by R. R. Seal, II and N. K. Foley

    USGS Publications Warehouse

    Seal, Robert R.; Foley, Nora K.

    2002-01-01

    Since the beginning of economic geology as a subdiscipline of the geological sciences, economic geologists have tended to classify mineral deposits on the basis of geological, mineralogical, and geochemical criteria, in efforts to systematize our understanding of mineral deposits as an aid to exploration. These efforts have led to classifications based on commodity, geologic setting (Cox and Singer, 1986), inferred temperatures and pressures of ore formation (Lindgren, 1933), and genetic setting (Park and MacDiarmid, 1975; Jensen and Bateman, 1979). None of these classification schemes is mutually exclusive; instead, there is considerable overlap among all of these classifications. A natural outcome of efforts to classify mineral deposits is the development of “mineral deposit models.” A mineral deposit model is a systematically arranged body of information that describes some or all of the essential characteristics of a selected group of mineral deposits; it presents a concept within which essential attributes may be distinguished and from which extraneous, coincidental features may be recognized and excluded (Barton, 1993). Barton (1993) noted that the grouping of deposits on the basis of common characteristics forms the basis for a classification, but the specification of the characteristics required for belonging to the group is the basis for a model. Models range from purely descriptive to genetic. A genetic model is superior to a descriptive model because it provides a basis to distinguish essential from extraneous attributes, and it has flexibility to accommodate variability in sources, processes, and local controls. In general, a descriptive model is a necessary prerequisite to a genetic model.

  15. Comparative analysis of numerical models of pipe handling equipment used in offshore drilling applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pawlus, Witold, E-mail: witold.p.pawlus@ieee.org; Ebbesen, Morten K.; Hansen, Michael R.

    Design of offshore drilling equipment is a task that involves not only analysis of strict machine specifications and safety requirements but also consideration of changeable weather conditions and a harsh environment. These challenges call for a multidisciplinary approach and make the design process complex. Various modeling software products are currently available to aid design engineers in their effort to test and redesign equipment before it is manufactured. However, given the number of available modeling tools and methods, the choice of the proper modeling methodology is not obvious and, in some cases, troublesome. Therefore, we present a comparative analysis of two popular approaches used in modeling and simulation of mechanical systems: multibody and analytical modeling. A gripper arm of an offshore vertical pipe handling machine is selected as a case study for which both models are created. In contrast to some other works, the current paper verifies both systems by benchmarking their simulation results against each other. Criteria such as modeling effort and accuracy of results are evaluated to assess which modeling strategy is the most suitable given its eventual application.

  16. Transforming data into usable knowledge: the CIRC experience

    NASA Astrophysics Data System (ADS)

    Mote, P.; Lach, D.; Hartmann, H.; Abatzoglou, J. T.; Stevenson, J.

    2017-12-01

    NOAA's northwest RISA, the Climate Impacts Research Consortium, emphasizes the transformation of data into usable knowledge. This effort involves physical scientists (e.g., Abatzoglou) building web-based tools with climate and hydrologic data and model output, a team performing data mining to link crop loss claims to droughts, social scientists (e.g., Lach, Hartmann) evaluating the effectiveness of such tools at communicating with end users, and two-way engagement with a wide variety of audiences who are interested in using and improving the tools. Unusual in this effort are the seamless integration across past, present, and future timescales, the data mining, and the level of effort devoted to evaluating the tools. We provide examples of agriculturally relevant climate variables (e.g., growing degree days, day of first fall freeze) and describe the iterative process of incorporating user feedback.
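
    Growing degree days, one of the variables mentioned, is conventionally computed by accumulating daily mean temperature above a crop-specific base temperature; a minimal sketch (base temperature of 10 °C assumed):

        # Growing degree days: accumulate daily mean temperature above a
        # crop-specific base temperature (10 degrees C assumed here).
        def growing_degree_days(tmax, tmin, t_base=10.0):
            return sum(max(0.0, (hi + lo) / 2.0 - t_base) for hi, lo in zip(tmax, tmin))

        print(growing_degree_days([24, 28, 19], [12, 15, 9]))  # 8.0 + 11.5 + 4.0 = 23.5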

  17. A Generic Modeling Process to Support Functional Fault Model Development

    NASA Technical Reports Server (NTRS)

    Maul, William A.; Hemminger, Joseph A.; Oostdyk, Rebecca; Bis, Rachael A.

    2016-01-01

    Functional fault models (FFMs) are qualitative representations of a system's failure space that are used to provide a diagnostic of the modeled system. An FFM simulates the failure effect propagation paths within a system between failure modes and observation points. These models contain a significant amount of information about the system, including the design, operation, and off-nominal behavior. The development and verification of the models can be costly in both time and resources. In addition, models depicting similar components can be distinct, both in appearance and function, when created individually, because there are numerous ways of representing the failure space within each component. Generic application of FFMs has the advantages of software code reuse: reduction of time and resources in both development and verification, and a standard set of component models from which future system models can be generated with common appearance and diagnostic performance. This paper outlines the motivation to develop a generic modeling process for FFMs at the component level and the effort to implement that process through modeling conventions and a software tool. The implementation of this generic modeling process within a fault isolation demonstration for NASA's Advanced Ground System Maintenance (AGSM) Integrated Health Management (IHM) project is presented and the impact discussed.
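
    A minimal sketch of failure-effect propagation in the spirit of an FFM; the graph and all node names are invented. Reachability from a failure mode to observation points shows what a diagnostic could detect.

        from collections import deque

        # Minimal failure-effect propagation sketch; graph and names invented.
        edges = {
            "pump_fail": ["low_flow"],
            "low_flow": ["low_pressure_sensor", "high_temp"],
            "high_temp": ["temp_sensor"],
        }

        def reachable_observations(failure_mode, observation_points):
            seen, queue = set(), deque([failure_mode])
            while queue:
                for nxt in edges.get(queue.popleft(), []):
                    if nxt not in seen:
                        seen.add(nxt)
                        queue.append(nxt)
            return seen & observation_points

        print(reachable_observations("pump_fail", {"low_pressure_sensor", "temp_sensor"}))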

  18. A Study of Upgraded Phenolic Curing for RSRM Nozzle Rings

    NASA Technical Reports Server (NTRS)

    Smartt, Ziba

    2000-01-01

    A thermochemical cure model for predicting temperature and degree of cure profiles in curing phenolic parts was developed, validated and refined over several years. The model supports optimization of cure cycles and allows input of properties based upon the types of material and the process by which these materials are used to make nozzle components. The model has been refined to use sophisticated computer graphics to demonstrate the changes in temperature and degree of cure during the curing process. The effort discussed in the paper will be the conversion from an outdated solid modeling input program and SINDA analysis code to an integrated solid modeling and analysis package (I-DEAS solid model and TMG). Also discussed will be the incorporation of updated material properties obtained during full scale curing tests into the cure models and the results for all the Reusable Solid Rocket Motor (RSRM) nozzle rings.
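
    An illustrative nth-order cure-kinetics integration (a standard textbook form, not the paper's validated model; all parameter values are invented): the degree of cure advances at an Arrhenius-controlled rate that slows as the resin is consumed.

        import math

        # Illustrative nth-order cure kinetics (standard form, invented values):
        #   d(alpha)/dt = A * exp(-Ea / (R * T)) * (1 - alpha)**n
        A, Ea, R, n = 1.0e5, 7.0e4, 8.314, 1.5   # 1/s, J/mol, J/(mol K), order
        T, dt = 420.0, 1.0                       # isothermal cure at T kelvin, 1 s steps

        alpha = 0.0
        for _ in range(3600):                    # one hour of cure
            rate = A * math.exp(-Ea / (R * T)) * (1.0 - alpha)**n
            alpha = min(1.0, alpha + rate * dt)
        print(f"degree of cure after 1 h at {T} K: {alpha:.3f}")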

  19. Autoplan: A self-processing network model for an extended blocks world planning environment

    NASA Technical Reports Server (NTRS)

    Dautrechy, C. Lynne; Reggia, James A.; Mcfadden, Frank

    1990-01-01

    Self-processing network models (neural/connectionist models, marker passing/message passing networks, etc.) are currently undergoing intense investigation for a variety of information processing applications. These models are potentially very powerful in that they support a large amount of explicit parallel processing, and they cleanly integrate high level and low level information processing. However, they are currently limited by a lack of understanding of how to apply them effectively in many application areas. The formulation of self-processing network methods for dynamic, reactive planning is studied. The long-term goal is to formulate robust, computationally effective information processing methods for the distributed control of semiautonomous exploration systems, e.g., the Mars Rover. The current research effort is focusing on hierarchical plan generation, execution and revision through local operations in an extended blocks world environment. This scenario involves many challenging features that would be encountered in a real planning and control environment: multiple simultaneous goals, parallel as well as sequential action execution, action sequencing determined not only by goals and their interactions but also by limited resources (e.g., three tasks, two acting agents), need to interpret unanticipated events and react appropriately through replanning, etc.

  20. Craving to Quit: psychological models and neurobiological mechanisms of mindfulness training as treatment for addictions

    PubMed Central

    Brewer, Judson A.; Elwafi, Hani M.; Davis, Jake H.

    2012-01-01

    Humans suffer heavily from substance use disorders and other addictions. Despite the considerable effort devoted to understanding the mechanisms of the addictive process, treatment strategies have remained suboptimal over the past several decades. Mindfulness training, which is based on ancient Buddhist models of human suffering, has recently shown preliminary efficacy in treating addictions. Interestingly, these early models show remarkable similarity to current models of the addictive process, especially in their overlap with operant conditioning (positive and negative reinforcement). Further, they may provide explanatory power for the mechanisms of mindfulness training, including its effects on core addictive elements, such as craving, and the underlying neurobiological processes that may be active therein. In this review, using smoking as an example, we will highlight similarities between ancient and modern views of the addictive process, review studies of mindfulness training for addictions and their effects on craving and other components of this process, and discuss recent neuroimaging findings that may inform our understanding of the neural mechanisms of mindfulness training. PMID:22642859

  1. Craving to quit: psychological models and neurobiological mechanisms of mindfulness training as treatment for addictions.

    PubMed

    Brewer, Judson A; Elwafi, Hani M; Davis, Jake H

    2013-06-01

    Humans suffer heavily from substance use disorders and other addictions. Despite the considerable effort devoted to understanding the mechanisms of the addictive process, treatment strategies have remained suboptimal over the past several decades. Mindfulness training, which is based on ancient Buddhist models of human suffering, has recently shown preliminary efficacy in treating addictions. These early models show remarkable similarity to current models of the addictive process, especially in their overlap with operant conditioning (positive and negative reinforcement). Further, they may provide explanatory power for the mechanisms of mindfulness training, including its effects on core addictive elements, such as craving, and the underlying neurobiological processes that may be active therein. In this review, using smoking as an example, we will highlight similarities between ancient and modern views of the addictive process, review studies of mindfulness training for addictions and their effects on craving and other components of this process, and discuss recent neuroimaging findings that may inform our understanding of the neural mechanisms of mindfulness training. 2013 APA, all rights reserved

  2. How Implementation of TQM and the Development of a Process Improvement Model, Within a Forward Support Battalion, Can Improve Preparation of the Material Condition Status Report (DA Form 2406)

    DTIC Science & Technology

    1990-12-01

    studies for the continuing education of managers new to the TQM approach, for informing vendors of their responsibilities under a changed process, and ... Department of Defense (DoD) is adopting a management approach known as Total Quality Management (TQM) in an effort to improve quality and productivity ... individuals selected be highly knowledgeable about the operations in their shop or unit. The main function of PATs is to collect and summarize process data for

  3. Theoretical research program to study chemical reactions in AOTV bow shock tubes

    NASA Technical Reports Server (NTRS)

    Taylor, Peter R.

    1993-01-01

    The main focus was the development, implementation, and calibration of methods for performing molecular electronic structure calculations to high accuracy. These various methods were then applied to a number of chemical reactions and species of interest to NASA, notably in the area of combustion chemistry. Among the development work undertaken was a collaborative effort to develop a program to efficiently predict molecular structures and vibrational frequencies using energy derivatives. Another major development effort involved the design of new atomic basis sets for use in chemical studies: these sets were considerably more accurate than those previously in use. Much effort was also devoted to calibrating methods for computing accurate molecular wave functions, including the first reliable calibrations for realistic molecules using full CI results. A wide variety of application calculations were undertaken. One area of interest was the spectroscopy and thermochemistry of small molecules, including establishing binding energies to an accuracy rivaling, and occasionally surpassing, experiment. Such binding energies are essential input for modeling chemical reaction processes, such as combustion. Studies of large molecules and processes important in both hydrogen and hydrocarbon combustion chemistry were also carried out. Finally, some effort was devoted to the structure and spectroscopy of small metal clusters, with applications to materials science problems.

  4. Microstructure Engineering in Hot Strip Mills, Part 1 of 2: Integrated Mathematical Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    J.K. Brimacombe; I.V. Samaraseker; E.B. Hawbolt

    1998-09-30

    This report describes the work of developing an integrated model used to predict the thermal history, deformation, roll forces, microstructural evolution, and mechanical properties of steel strip in a hot-strip mill. This achievement results from a joint research effort that is part of the American Iron and Steel Institute's (AISI) Advanced Process Control Program, a collaboration between the U.S. DOE and fifteen North American steel makers.

  5. Reduced order model of a blended wing body aircraft configuration

    NASA Astrophysics Data System (ADS)

    Stroscher, F.; Sika, Z.; Petersson, O.

    2013-12-01

    This paper describes the full development process of a numerical simulation model for the ACFA2020 (Active Control for Flexible 2020 Aircraft) blended wing body (BWB) configuration. Its requirements are the prediction of aeroelastic and flight dynamic response in the time domain, with relatively small model order. Further, the model had to be parameterized with regard to multiple fuel filling conditions, as well as flight conditions. Several project partners devoted considerable effort to high-order aerodynamic analysis for the subsonic and transonic regimes. The integration of the unsteady aerodynamic databases was one of the key issues in aeroelastic modeling.
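
    A generic reduced-order-modeling step of the kind such efforts rely on (illustrative, not the ACFA2020 toolchain): proper orthogonal decomposition via the SVD compresses snapshot data to the few modes carrying most of the energy. The snapshots here are synthetic low-rank data plus noise.

        import numpy as np

        # Generic POD step (illustrative, not the ACFA2020 toolchain):
        # compress synthetic snapshot data to a handful of dominant modes.
        rng = np.random.default_rng(0)
        snapshots = rng.standard_normal((1000, 5)) @ rng.standard_normal((5, 60))
        snapshots += 0.01 * rng.standard_normal((1000, 60))  # measurement noise

        U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
        energy = np.cumsum(s**2) / np.sum(s**2)
        r = int(np.searchsorted(energy, 0.99)) + 1   # modes capturing 99% of energy
        basis = U[:, :r]                             # reduced coordinates: basis.T @ x
        print(f"reduced order: {r} of {snapshots.shape[1]} modes")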

  6. Collaborative simulations and experiments for a novel yield model of coal devolatilization in oxy-coal combustion conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Iavarone, Salvatore; Smith, Sean T.; Smith, Philip J.

    Oxy-coal combustion is an emerging low-cost “clean coal” technology for emissions reduction and Carbon Capture and Sequestration (CCS). The use of Computational Fluid Dynamics (CFD) tools is crucial for the development of cost-effective oxy-fuel technologies and the minimization of environmental concerns at industrial scale. The coupling of detailed chemistry models and CFD simulations is still challenging, especially for large-scale plants, because of the high computational efforts required. The development of scale-bridging models is therefore necessary, to find a good compromise between computational effort and physical-chemical modeling precision. This paper presents a procedure for scale-bridging modeling of coal devolatilization, in the presence of experimental error, that puts emphasis on the thermodynamic aspect of devolatilization, namely the final volatile yield of coal, rather than kinetics. The procedure consists of an engineering approach based on dataset consistency and Bayesian methodology including Gaussian-Process Regression (GPR). Experimental data from devolatilization tests carried out in an oxy-coal entrained flow reactor were considered and CFD simulations of the reactor were performed. Jointly evaluating experiments and simulations, a novel yield model was validated against the data via consistency analysis. In parallel, a Gaussian-Process Regression was performed to improve the understanding of the uncertainty associated with devolatilization, based on the experimental measurements. Potential model forms that could predict yield during devolatilization were obtained. The set of model forms obtained via GPR includes the yield model that was proven to be consistent with the data. Finally, the overall procedure has resulted in a novel yield model for coal devolatilization and in a valuable evaluation of uncertainty in the data, in the model form, and in the model parameters.

  7. Collaborative simulations and experiments for a novel yield model of coal devolatilization in oxy-coal combustion conditions

    DOE PAGES

    Iavarone, Salvatore; Smith, Sean T.; Smith, Philip J.; ...

    2017-06-03

    Oxy-coal combustion is an emerging low-cost “clean coal” technology for emissions reduction and Carbon Capture and Sequestration (CCS). The use of Computational Fluid Dynamics (CFD) tools is crucial for the development of cost-effective oxy-fuel technologies and the minimization of environmental concerns at industrial scale. The coupling of detailed chemistry models and CFD simulations is still challenging, especially for large-scale plants, because of the high computational efforts required. The development of scale-bridging models is therefore necessary, to find a good compromise between computational effort and physical-chemical modeling precision. This paper presents a procedure for scale-bridging modeling of coal devolatilization, in the presence of experimental error, that puts emphasis on the thermodynamic aspect of devolatilization, namely the final volatile yield of coal, rather than kinetics. The procedure consists of an engineering approach based on dataset consistency and Bayesian methodology including Gaussian-Process Regression (GPR). Experimental data from devolatilization tests carried out in an oxy-coal entrained flow reactor were considered and CFD simulations of the reactor were performed. Jointly evaluating experiments and simulations, a novel yield model was validated against the data via consistency analysis. In parallel, a Gaussian-Process Regression was performed to improve the understanding of the uncertainty associated with devolatilization, based on the experimental measurements. Potential model forms that could predict yield during devolatilization were obtained. The set of model forms obtained via GPR includes the yield model that was proven to be consistent with the data. Finally, the overall procedure has resulted in a novel yield model for coal devolatilization and in a valuable evaluation of uncertainty in the data, in the model form, and in the model parameters.
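
    A schematic of the Gaussian-Process Regression step using scikit-learn (the study's actual inputs, units, and kernel choices are not reproduced; the data are invented): learn volatile yield as a smooth, noisy function of particle temperature, with predictive uncertainty.

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, WhiteKernel

        # Schematic GPR step; data, units, and kernel are invented stand-ins.
        T = np.array([[1200.0], [1400.0], [1600.0], [1800.0], [2000.0]])  # K
        y = np.array([0.35, 0.48, 0.55, 0.60, 0.62])  # volatile yield fraction

        gpr = GaussianProcessRegressor(kernel=RBF(length_scale=200.0) + WhiteKernel(1e-4))
        gpr.fit(T, y)
        mean, std = gpr.predict(np.array([[1700.0]]), return_std=True)
        print(f"predicted yield: {mean[0]:.3f} +/- {2 * std[0]:.3f}")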

  8. Growing Larger Crystals for Neutron Diffraction

    NASA Technical Reports Server (NTRS)

    Pusey, Marc

    2003-01-01

    Obtaining crystals of suitable size and high quality has been a major bottleneck in macromolecular crystallography. With the advent of advanced X-ray sources and methods, the question of size has rapidly dwindled, almost to the point where if one can see the crystal then it was big enough. Quality is another issue, and major national and commercial efforts were established to take advantage of the microgravity environment in an effort to obtain higher-quality crystals. Studies of the macromolecule crystallization process were carried out in many labs in an effort to understand what affected the resultant crystal quality on Earth, and how microgravity improved the process. While technological improvements are steadily reducing the minimum crystal size required, neutron diffraction structural studies still require considerably larger crystals, by several orders of magnitude, than X-ray studies. From a crystal growth physics perspective there is no reason why these 'large' crystals cannot be obtained: the question is generally one of supply rather than of any limiting mechanism. This talk will discuss our laboratory's current model for macromolecule crystal growth, with highlights pertaining to the growth of crystals suitable for neutron diffraction studies.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hurt, Christopher J.; Freels, James D.; Hobbs, Randy W.

    There has been a considerable effort over the previous few years to demonstrate and optimize the production of plutonium-238 (²³⁸Pu) at the High Flux Isotope Reactor (HFIR). This effort has involved resources from multiple divisions and facilities at the Oak Ridge National Laboratory (ORNL) to demonstrate the fabrication, irradiation, and chemical processing of targets containing neptunium-237 (²³⁷Np) dioxide (NpO₂)/aluminum (Al) cermet pellets. A critical preliminary step to irradiation at the HFIR is to demonstrate the safety of the target under irradiation via documented experiment safety analyses. The steady-state thermal safety analyses of the target are simulated in a finite element model with the COMSOL Multiphysics code that determines, among other crucial parameters, the limiting maximum temperature in the target. Safety analysis efforts for this model discussed in the present report include: (1) initial modeling of single and reduced-length pellet capsules in order to generate an experimental knowledge base, incorporating non-linear contact heat transfer and fission gas equations; (2) modeling efforts for prototypical designs of partially loaded and fully loaded targets using limited available knowledge of fabrication and irradiation characteristics; and (3) the most recent and comprehensive modeling effort, a fully coupled thermo-mechanical approach over the entire fully loaded target domain incorporating burn-up-dependent irradiation behavior and measured target and pellet properties, hereafter referred to as the production model. These models are used to conservatively determine several important steady-state parameters, including target stresses and temperatures, the limiting condition of which is the maximum temperature with respect to the melting point. The single pellet model results provide a basis for the safety of the irradiations, followed by parametric analyses in the initial prototypical designs that were necessary due to the limited fabrication and irradiation data available. The calculated parameters in the final production target model are the most accurate and comprehensive, while still conservative. Over 210 permutations in irradiation time and position were evaluated, supported by the most recent inputs and highest-fidelity methodology. The results of these analyses show that the models presented in this report provide a robust and reliable basis for previous, current and future experiment safety analyses. In addition, they reveal an evolving knowledge of the steady-state behavior of the NpO₂/Al pellets under irradiation for a variety of target encapsulations and potential conditions.

  10. Report of work done for technical assistance agreement 1269 between Sandia National Laboratories and the Watkins-Johnson Company: Chemical reaction mechanisms for computational models of SiO2 CVD

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ho, P.; Johannes, J.; Kudriavtsev, V.

    The use of computational modeling to improve equipment and process designs for chemical vapor deposition (CVD) reactors is becoming increasingly common. Commercial codes are available that facilitate the modeling of chemically reacting flows, but chemical reaction mechanisms must be separately developed for each system of interest. One of the products of the Watkins-Johnson Company (WJ) is a reactor marketed to semiconductor manufacturers for the atmospheric-pressure chemical vapor deposition (APCVD) of silicon oxide films. In this process, TEOS (tetraethoxysilane, Si(OC2H5)4) and ozone (O3) are injected (in nitrogen and oxygen carrier gases) over hot silicon wafers that are being carried through the system on a moving belt. As part of their equipment improvement process, WJ is developing computational models of this tool. In this effort, they are collaborating with Sandia National Laboratories (SNL) to draw on Sandia's experience base in understanding and modeling the chemistry of CVD processes.

  11. Support of surgical process modeling by using adaptable software user interfaces

    NASA Astrophysics Data System (ADS)

    Neumuth, T.; Kaschek, B.; Czygan, M.; Goldstein, D.; Strauß, G.; Meixensberger, J.; Burgert, O.

    2010-03-01

    Surgical Process Modeling (SPM) is a powerful method for acquiring data about the evolution of surgical procedures. Surgical Process Models are used in a variety of use cases, including evaluation studies, requirements analysis, procedure optimization, surgical education, and workflow management scheme design. This work proposes the use of adaptive, situation-aware user interfaces in observation support software for SPM. We developed a method that supports the observer's modeling work by means of an ontological knowledge base, which drives the graphical user interface and restricts the terminology search space according to the current situation. The evaluation study shows that the adaptive user interface significantly decreased the observer's workload: 54 SPM observation protocols were analyzed with the NASA Task Load Index, and the adaptive interface significantly disburdened the observer on the workload criteria of effort, mental demand, and temporal demand, helping the observer concentrate on the essential task of modeling the surgical process.

  12. Neural Signatures of Value Comparison in Human Cingulate Cortex during Decisions Requiring an Effort-Reward Trade-off

    PubMed Central

    Kennerley, Steven W.; Friston, Karl; Bestmann, Sven

    2016-01-01

    Integrating costs and benefits is crucial for optimal decision-making. Although much is known about decisions that involve outcome-related costs (e.g., delay, risk), many of our choices are attached to actions and require an evaluation of the associated motor costs. Yet how the brain incorporates motor costs into choices remains largely unclear. We used human fMRI during choices involving monetary reward and physical effort to identify brain regions that serve as a choice comparator for effort-reward trade-offs. By independently varying both options' effort and reward levels, we were able to identify the neural signature of a comparator mechanism. A network involving supplementary motor area and the caudal portion of dorsal anterior cingulate cortex encoded the difference in reward (positively) and effort levels (negatively) between chosen and unchosen choice options. We next modeled effort-discounted subjective values using a novel behavioral model. This revealed that the same network of regions involving dorsal anterior cingulate cortex and supplementary motor area encoded the difference between the chosen and unchosen options' subjective values, and that activity was best described using a concave model of effort-discounting. In addition, this signal reflected how precisely value determined participants' choices. By contrast, separate signals in supplementary motor area and ventromedial prefrontal cortex correlated with participants' tendency to avoid effort and seek reward, respectively. This suggests that the critical neural signature of decision-making for choices involving motor costs is found in human cingulate cortex and not ventromedial prefrontal cortex as typically reported for outcome-based choice. Furthermore, distinct frontal circuits seem to drive behavior toward reward maximization and effort minimization. SIGNIFICANCE STATEMENT The neural processes that govern the trade-off between expected benefits and motor costs remain largely unknown. This is striking because energetic requirements play an integral role in our day-to-day choices and instrumental behavior, and a diminished willingness to exert effort is a characteristic feature of a range of neurological disorders. We use a new behavioral characterization of how humans trade off reward maximization with effort minimization to examine the neural signatures that underpin such choices, using BOLD MRI neuroimaging data. We find the critical neural signature of decision-making, a signal that reflects the comparison of value between choice options, in human cingulate cortex, whereas two distinct brain circuits drive behavior toward reward maximization or effort minimization. PMID:27683898

  13. Neural Signatures of Value Comparison in Human Cingulate Cortex during Decisions Requiring an Effort-Reward Trade-off.

    PubMed

    Klein-Flügge, Miriam C; Kennerley, Steven W; Friston, Karl; Bestmann, Sven

    2016-09-28

    Integrating costs and benefits is crucial for optimal decision-making. Although much is known about decisions that involve outcome-related costs (e.g., delay, risk), many of our choices are attached to actions and require an evaluation of the associated motor costs. Yet how the brain incorporates motor costs into choices remains largely unclear. We used human fMRI during choices involving monetary reward and physical effort to identify brain regions that serve as a choice comparator for effort-reward trade-offs. By independently varying both options' effort and reward levels, we were able to identify the neural signature of a comparator mechanism. A network involving supplementary motor area and the caudal portion of dorsal anterior cingulate cortex encoded the difference in reward (positively) and effort levels (negatively) between chosen and unchosen choice options. We next modeled effort-discounted subjective values using a novel behavioral model. This revealed that the same network of regions involving dorsal anterior cingulate cortex and supplementary motor area encoded the difference between the chosen and unchosen options' subjective values, and that activity was best described using a concave model of effort-discounting. In addition, this signal reflected how precisely value determined participants' choices. By contrast, separate signals in supplementary motor area and ventromedial prefrontal cortex correlated with participants' tendency to avoid effort and seek reward, respectively. This suggests that the critical neural signature of decision-making for choices involving motor costs is found in human cingulate cortex and not ventromedial prefrontal cortex as typically reported for outcome-based choice. Furthermore, distinct frontal circuits seem to drive behavior toward reward maximization and effort minimization. The neural processes that govern the trade-off between expected benefits and motor costs remain largely unknown. This is striking because energetic requirements play an integral role in our day-to-day choices and instrumental behavior, and a diminished willingness to exert effort is a characteristic feature of a range of neurological disorders. We use a new behavioral characterization of how humans trade off reward maximization with effort minimization to examine the neural signatures that underpin such choices, using BOLD MRI neuroimaging data. We find the critical neural signature of decision-making, a signal that reflects the comparison of value between choice options, in human cingulate cortex, whereas two distinct brain circuits drive behavior toward reward maximization or effort minimization. Copyright © 2016 Klein-Flügge et al.
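
    The concave effort-discounting idea described above can be sketched compactly. The parameterization below (a power-law effort cost inside a softmax choice rule) and every number in it are illustrative assumptions, not the authors' fitted model:

    ```python
    import math

    def subjective_value(reward, effort, k=0.6, p=2.0):
        # concave effort discounting: cost rises steeper than linearly (p > 1),
        # so subjective value is a concave function of effort
        return reward - k * effort ** p

    def p_choose_a(option_a, option_b, beta=3.0):
        """Softmax: beta sets how precisely value determines choice."""
        va = subjective_value(*option_a)
        vb = subjective_value(*option_b)
        return 1.0 / (1.0 + math.exp(-beta * (va - vb)))

    # high reward at high effort vs. modest reward at low effort (made-up units)
    print(round(p_choose_a((2.0, 1.5), (1.0, 0.5)), 2))   # ~0.35: effort deters
    ```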

  14. Final report for LDRD project 11-0783 : directed robots for increased military manpower effectiveness.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rohrer, Brandon Robinson; Rothganger, Fredrick H.; Wagner, John S.

    The purpose of this LDRD is to develop technology allowing warfighters to provide high-level commands to their unmanned assets, freeing them to command a group of them or commit the bulk of their attention elsewhere. To this end, a brain-emulating cognition and control architecture (BECCA) was developed, incorporating novel and uniquely capable feature creation and reinforcement learning algorithms. BECCA was demonstrated on both a mobile manipulator platform and on a seven degree of freedom serial link robot arm. Existing military ground robots are almost universally teleoperated and occupy the complete attention of an operator. They may remove a soldier frommore » harm's way, but they do not necessarily reduce manpower requirements. Current research efforts to solve the problem of autonomous operation in an unstructured, dynamic environment fall short of the desired performance. In order to increase the effectiveness of unmanned vehicle (UV) operators, we proposed to develop robots that can be 'directed' rather than remote-controlled. They are instructed and trained by human operators, rather than driven. The technical approach is modeled closely on psychological and neuroscientific models of human learning. Two Sandia-developed models are utilized in this effort: the Sandia Cognitive Framework (SCF), a cognitive psychology-based model of human processes, and BECCA, a psychophysical-based model of learning, motor control, and conceptualization. Together, these models span the functional space from perceptuo-motor abilities, to high-level motivational and attentional processes.« less

  15. Markov Modeling of Component Fault Growth over a Derived Domain of Feasible Output Control Effort Modifications

    NASA Technical Reports Server (NTRS)

    Bole, Brian; Goebel, Kai; Vachtsevanos, George

    2012-01-01

    This paper introduces a novel Markov process formulation of stochastic fault growth modeling, in order to facilitate the development and analysis of prognostics-based control adaptation. A metric representing the relative deviation between the nominal output of a system and the net output actually enacted by an implemented prognostics-based control routine will be used to define the action space of the formulated Markov process. The state space of the Markov process will be defined in terms of an abstracted metric representing the relative health remaining in each of the system's components. The proposed formulation of component fault dynamics will conveniently relate feasible system output performance modifications to predictions of future component health deterioration.
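
    The formulation lends itself to a compact simulation sketch. The following is a minimal illustration of the idea, with hypothetical health states, actions, and degradation rates (none taken from the paper): the action, a relative output-modification level, selects the transition probabilities over discretized component-health states.

    ```python
    import random

    HEALTH_STATES = [0, 1, 2, 3]        # 3 = full health, 0 = failed (absorbing)
    ACTIONS = [0.0, 0.5, 1.0]           # demanded fraction of nominal output

    def transition_probs(state, action):
        """One-step transition distribution over health states."""
        if state == 0:                  # failure is absorbing
            return {0: 1.0}
        p_degrade = 0.02 + 0.10 * action   # harder use -> faster fault growth
        return {state: 1.0 - p_degrade, state - 1: p_degrade}

    def step(state, action):
        r, acc = random.random(), 0.0
        for s, p in transition_probs(state, action).items():
            acc += p
            if r < acc:
                return s
        return state

    # simulate steps to failure under a constant full-output demand
    random.seed(0)
    state, t = 3, 0
    while state > 0:
        state, t = step(state, ACTIONS[2]), t + 1
    print("steps to failure at full demand:", t)
    ```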

  16. Running Head: Implementing Six Sigma Efforts

    ERIC Educational Resources Information Center

    Lindsay, Jamie Eleaitia Mae

    2005-01-01

    Six Sigma is an organization-wide program that provides a common set of goals, language, and methodology for improving the overall quality of the processes within the organization (Davis & Heineke 2004). Six Sigma's main concern is the customer: what will customers want and need? Six Sigma also has a model that helps it get implemented, the DMAIC model…

  17. A Strategic Model to Address Issues of Student Achievement

    ERIC Educational Resources Information Center

    Fontana, Leonard; Johnson, Elease; Green, Peggy; Macia, Jose; Wright, Ted; Daniel, Yanick; Distefano Diaz, Mary F.; Obenauf, Steve

    2006-01-01

    This article describes an interactive and collaborative strategic planning process by a community college in which student retention and success became a focus of a re-accreditation endeavor. The underlying assumption of this strategic planning effort was that engaging all groups that have a stake in student retention at the beginning of the…

  18. Cultural Adaptation of the Strengthening Families Program 10-14 to Italian Families

    ERIC Educational Resources Information Center

    Ortega, Enrique; Giannotta, Fabrizia; Latina, Delia; Ciairano, Silvia

    2012-01-01

    Background: The family context has proven to be a useful target in which to apply prevention efforts aimed at child and adolescent health risk behaviors. There are currently a variety of cultural adaptation models that serve to guide the international adaptation of intervention programs. Objective: The cultural adaptation process and program…

  19. Partnering through Training and Practice to Achieve Performance Improvement

    ERIC Educational Resources Information Center

    Lyons, Paul R.

    2010-01-01

    This article presents a partnership effort among managers, trainers, and employees to spring to life performance improvement using the performance templates (P-T) approach. P-T represents a process model as well as a method of training leading to performance improvement. Not only does it add to our repertoire of training and performance management…

  20. Drawing the Line: The Cultural Cartography of Utilization Recommendations for Mental Health Problems

    ERIC Educational Resources Information Center

    Olafsdottir, Sigrun; Pescosolido, Bernice A.

    2009-01-01

    In the 1990s, sociologists began to rethink the failure of utilization models to explain whether and why individuals accessed formal treatment systems. This effort focused on reconceptualizing the underlying assumptions and processes that shaped utilization patterns. While we have built a better understanding of how social networks structure…

  1. Environmental Education and Networking in Mafeteng Primary Schools: A Participatory Approach

    ERIC Educational Resources Information Center

    Bitso, Constance

    2006-01-01

    This paper explores a participatory process of Environmental Education (EE) networking in Mafeteng primary schools. It gives an overview of the existing EE efforts in Lesotho, particularly the models schools of the National Curriculum Development Centre. It also provides information about Lesotho Environmental Information Network as the body that…

  2. Possibilities: A Framework for Modeling Students' Deductive Reasoning in Physics

    ERIC Educational Resources Information Center

    Gaffney, Jonathan David Housley

    2010-01-01

    Students often make errors when trying to solve qualitative or conceptual physics problems, and while many successful instructional interventions have been generated to prevent such errors, the process of deduction that students use when solving physics problems has not been thoroughly studied. In an effort to better understand that reasoning…

  3. Study Abroad: The Reality of Building Dynamic Group Learning.

    ERIC Educational Resources Information Center

    Ransbury, Molly K.; Harris, Sandra A.

    1994-01-01

    The collaborative effort of a professor of human development with expertise in group process and a general education professor with expertise in Greek mythology and culture uses a case study format to apply theoretical models of group dynamics to the travel and learning experience of study abroad. Implications for course design and group process…

  4. Indicators of Informal and Formal Decision-Making about a Socioscientific Issue

    ERIC Educational Resources Information Center

    Dauer, Jenny M.; Lute, Michelle L.; Straka, Olivia

    2017-01-01

    We propose two contrasting types of student decision-making based on social and cognitive psychology models of separate mental processes for problem solving. Informal decision-making uses intuitive reasoning and is subject to cognitive biases, whereas formal decision-making uses effortful, logical reasoning. We explored indicators of students'…

  5. They're Hiring in Hong Kong

    ERIC Educational Resources Information Center

    Hvistendahl, Mara

    2009-01-01

    Over the past several years, Hong Kong has made a determined effort to raise its profile by positioning its universities to compete globally for students, scholars, and research projects. In the process, it is refashioning its higher-education system from the British three-year model into a four-year system aligned with those of the United States…

  6. Non-Effective National Territory: A Characteristic of Third World States.

    ERIC Educational Resources Information Center

    Walter, Bob J.

    In an effort to improve understanding and to provide better solutions to the world's political problems, this paper examines national territory or states in terms of their functional processes and their spatial structures. Examples from Third World states are provided. The author first presents a model of political territory. It has a boundary…

  7. Unifying Different Theories of Learning: Theoretical Framework and Empirical Evidence

    ERIC Educational Resources Information Center

    Phan, Huy Phuong

    2008-01-01

    The main aim of this research study was to test out a conceptual model encompassing the theoretical frameworks of achievement goals, study processing strategies, effort, and reflective thinking practice. In particular, it was postulated that the causal influences of achievement goals on academic performance are direct and indirect through study…

  8. Rethinking the Measurement of Training and Development in the Professions: A Conceptual Model

    ERIC Educational Resources Information Center

    Lynch, Doug; Thomas, Chris; Green, Wendy; Gottfried, Michael; Varga, Matthew

    2010-01-01

    The 21st century is often called the "age of talent." Globalization has influenced both organizational processes and employee training, creating an increased need for educated, skilled, and adaptable employees. Training and development has become an integral part of most organizations' efforts to develop and maintain competitive…

  9. A decision modeling for phasor measurement unit location selection in smart grid systems

    NASA Astrophysics Data System (ADS)

    Lee, Seung Yup

    As a key technology for enhancing the smart grid, the Phasor Measurement Unit (PMU) provides synchronized phasor measurements of voltages and currents across a wide-area electric power grid. Despite the various benefits of its application, one of the critical issues in utilizing PMUs is the optimal selection of unit sites. The main aim of this research is to develop a decision support system that can be used in resource allocation tasks for smart grid system analysis. In an effort to suggest a robust decision model and standardize the decision modeling process, a harmonized modeling framework, which considers the operational circumstances of components, is proposed in connection with a deterministic approach utilizing integer programming. With the results obtained from the optimal PMU placement problem, the advantages and potential of the harmonized modeling process are assessed and discussed.
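
    The underlying placement problem is commonly posed as a set-cover-style integer program: place the fewest PMUs such that every bus is observed, where a PMU observes its own bus and its neighbors. The sketch below illustrates that formulation on a hypothetical 7-bus network, solved by brute-force enumeration rather than an integer-programming solver; the network and the observability rule are illustrative assumptions, not the paper's model.

    ```python
    from itertools import combinations

    # adjacency of a hypothetical 7-bus network (illustrative only)
    adj = {1: [2], 2: [1, 3, 6, 7], 3: [2, 4, 6], 4: [3, 5, 7],
           5: [4], 6: [2, 3], 7: [2, 4]}

    def observed(placement):
        """A PMU observes its own bus and every adjacent bus."""
        seen = set()
        for bus in placement:
            seen.add(bus)
            seen.update(adj[bus])
        return seen == set(adj)

    best = None
    for k in range(1, len(adj) + 1):       # smallest k first => minimal placement
        for combo in combinations(adj, k):
            if observed(combo):
                best = combo
                break
        if best:
            break
    print("minimal PMU placement:", best)  # buses (2, 4) for this network
    ```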

  10. Basic Modeling of the Solar Atmosphere and Spectrum

    NASA Technical Reports Server (NTRS)

    Avrett, Eugene H.; Wagner, William J. (Technical Monitor)

    2000-01-01

    During the last three years we have continued the development of extensive computer programs for constructing realistic models of the solar atmosphere and for calculating detailed spectra to use in the interpretation of solar observations. This research involves two major interrelated efforts: work by Avrett and Loeser on the Pandora computer program for optically thick non-LTE modeling of the solar atmosphere including a wide range of physical processes, and work by Kurucz on the detailed high-resolution synthesis of the solar spectrum using data for over 58 million atomic and molecular lines. Our objective is to construct atmospheric models from which the calculated spectra agree as well as possible with high-and low-resolution observations over a wide wavelength range. Such modeling leads to an improved understanding of the physical processes responsible for the structure and behavior of the atmosphere.

  11. Cancer growth and metastasis as a metaphor of Go gaming: An Ising model approach.

    PubMed

    Barradas-Bautista, Didier; Alvarado-Mentado, Matias; Agostino, Mark; Cocho, Germinal

    2018-01-01

    This work aims at modeling and simulating the metastasis of cancer via an analogy between the cancer process and the board game Go. In the game of Go, the black stones that play first can be read as a metaphor for the birth, growth, and metastasis of cancer, while the white stones played on the second turn can be read as the inhibition of cancer invasion. Mathematical modeling and algorithmic simulation of Go may therefore benefit efforts to deploy therapies against cancer by providing insight into cellular growth and expansion over a tissue area. We use the Ising Hamiltonian, which models the energy exchange among interacting particles, to model the cancer dynamics. Parameters in the energy function represent the biochemical elements that induce cancer birth, growth, and metastasis, as well as the biochemical processes of the immune system's defense.
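
    As a rough illustration of the approach, the sketch below runs Metropolis updates of an Ising-style lattice in which +1 cells stand for cancerous tissue (the black stones) and -1 cells for healthy tissue; the coupling, field, and temperature values are illustrative assumptions, not the authors' Hamiltonian parameters.

    ```python
    import math, random

    N, J, h, T = 20, 1.0, 0.2, 2.0         # lattice size, coupling, field, temperature
    grid = [[-1] * N for _ in range(N)]    # -1 = healthy tissue
    grid[N // 2][N // 2] = 1               # seed "cancerous" cell (+1)

    def delta_energy(i, j):
        """Energy change from flipping cell (i, j) in E = -J*sum(s_i*s_j) - h*sum(s_i)."""
        s = grid[i][j]
        nbrs = (grid[(i - 1) % N][j] + grid[(i + 1) % N][j]
                + grid[i][(j - 1) % N] + grid[i][(j + 1) % N])
        return 2 * s * (J * nbrs + h)

    random.seed(0)
    for _ in range(50_000):                # Metropolis updates
        i, j = random.randrange(N), random.randrange(N)
        dE = delta_energy(i, j)
        if dE <= 0 or random.random() < math.exp(-dE / T):
            grid[i][j] *= -1

    print("cancerous cells:", sum(row.count(1) for row in grid))
    ```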

  12. Progress Toward Modeling Spectroscopic Signatures of Mix on Omega and NIF

    NASA Astrophysics Data System (ADS)

    Tregillis, I. L.; Schmitt, M. J.; Hsu, S. C.; Wysocki, F. J.; Cobble, J. A.; Murphy, T. J.

    2011-10-01

    Defect-induced mix processes may degrade the performance of ICF and ICF-like targets at Omega and NIF. An improved understanding of the relevant physics requires an experimental program built on a foundation of radiation-hydrodynamic simulations plus reliable synthetic diagnostic outputs. To that end, the Applications of Ignition (AoI) and Defect Implosion Experiment (DIME) efforts at LANL have focused on directly driven plastic capsules containing high-Z dopants and manufactured with an equatorial "trench" defect. One of the key diagnostic techniques for detecting and diagnosing the migration of dopant material into the hot core is Multi-Monochromatic X-ray Imaging (MMI). This talk will focus on recent efforts to model spectroscopic signatures of mix processes in AoI/DIME capsules via simulated MMI-type diagnostic instruments. It will also include data from recent Omega shots and calculations in support of Tier 1 experiments at NIF in FY2012. This work is supported by US DOE/NNSA, performed at LANL, operated by LANS LLC under contract DE-AC52-06NA25396.

  13. Optical Measurements at the Combustor Exit of the HIFiRE 2 Ground Test Engine

    NASA Technical Reports Server (NTRS)

    Brown, Michael S.; Herring, Gregory C.; Cabell, Karen; Hass, Neal; Barhorst, Todd F.; Gruber, Mark

    2012-01-01

    The development of optical techniques capable of measuring in-stream flow properties of air-breathing hypersonic engines is a goal of the Aerospace Propulsion Division at AFRL. Of particular interest are techniques, such as tunable diode laser absorption spectroscopy, that can be implemented in both ground and flight test efforts. We recently executed a measurement campaign at the combustor exit of the HIFiRE 2 ground test engine during Phase II operation of the engine. Data were collected in anticipation of similar data sets to be collected during the flight experiment. The ground test optical data provide a means to evaluate signal processing algorithms, particularly those associated with limited line-of-sight tomography. Equally important, these in-stream data were collected to complement data acquired with surface-mounted instrumentation and the accompanying flowpath modeling efforts, both CFD and lower-order modeling. Here we discuss the specifics of hardware and data collection, along with a coarse-grained look at the acquired data and our approach to processing and analyzing it.

  14. Stochastic and Deterministic Models for the Metastatic Emission Process: Formalisms and Crosslinks.

    PubMed

    Gomez, Christophe; Hartung, Niklas

    2018-01-01

    Although the detection of metastases radically changes prognosis of and treatment decisions for a cancer patient, clinically undetectable micrometastases hamper a consistent classification into localized or metastatic disease. This chapter discusses mathematical modeling efforts that could help to estimate the metastatic risk in such a situation. We focus on two approaches: (1) a stochastic framework describing metastatic emission events at random times, formalized via Poisson processes, and (2) a deterministic framework describing the micrometastatic state through a size-structured density function in a partial differential equation model. Three aspects are addressed in this chapter. First, a motivation for the Poisson process framework is presented and modeling hypotheses and mechanisms are introduced. Second, we extend the Poisson model to account for secondary metastatic emission. Third, we highlight an inherent crosslink between the stochastic and deterministic frameworks and discuss its implications. For increased accessibility the chapter is split into an informal presentation of the results using a minimum of mathematical formalism and a rigorous mathematical treatment for more theoretically interested readers.
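
    Framework (1) can be illustrated with a short simulation: metastatic emission times are drawn from an inhomogeneous Poisson process whose rate grows with primary-tumour size, sampled here by the standard thinning (acceptance-rejection) algorithm. The exponential growth law and the mu * V^alpha rate form are common modeling choices used for illustration only, with made-up parameter values.

    ```python
    import math, random

    def tumour_size(t, v0=1e-3, a=0.3):
        return v0 * math.exp(a * t)           # exponential growth, cm^3

    def emission_rate(t, mu=0.5, alpha=0.67):
        return mu * tumour_size(t) ** alpha   # size-dependent emission rate

    def sample_emissions(t_max):
        """Thinning (acceptance-rejection) for an inhomogeneous Poisson process."""
        lam_max = emission_rate(t_max)        # rate is increasing, so max at t_max
        t, events = 0.0, []
        while True:
            t += random.expovariate(lam_max)  # candidate from homogeneous process
            if t > t_max:
                return events
            if random.random() < emission_rate(t) / lam_max:
                events.append(t)              # accept with prob. lam(t)/lam_max

    random.seed(1)
    print("emission times (months):",
          [round(t, 1) for t in sample_emissions(24.0)])
    ```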

  15. Rolling Process Modeling Report: Finite-Element Prediction of Roll Separating Force and Rolling Defects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Soulami, Ayoub; Lavender, Curt A.; Paxton, Dean M.

    2014-04-23

    Pacific Northwest National Laboratory (PNNL) has been investigating manufacturing processes for the uranium-10% molybdenum (U-10Mo) alloy plate-type fuel for the U.S. high-performance research reactors. This work supports the Convert Program of the U.S. Department of Energy's National Nuclear Security Administration (DOE/NNSA) Global Threat Reduction Initiative. This report documents modeling results of PNNL's efforts to perform finite-element simulations to predict roll separating forces and rolling defects. Simulations were performed using a finite-element model developed with the commercial code LS-Dyna. Simulations of the hot rolling of U-10Mo coupons encapsulated in low-carbon steel have been conducted following two different schedules. Model predictions of the roll-separation force and roll-pack thicknesses at different stages of the rolling process were compared with experimental measurements. This report discusses various attributes of the rolled coupons revealed by the model (e.g., dog-boning and thickness non-uniformity).

  16. Utilizing Controlled Vibrations in a Microgravity Environment to Understand and Promote Microstructural Homogeneity During Floating-Zone Crystal Growth

    NASA Technical Reports Server (NTRS)

    Grugel, Richard N.

    1999-01-01

    It has been demonstrated in floating-zone configurations utilizing silicone oil and nitrate salts that mechanically induced vibration effectively minimizes detrimental, gravity independent, thermocapillary flow. The processing parameters leading to crystal improvement and aspects of the on-going modeling effort are discussed. Plans for applying the crystal growth technique to commercially relevant materials, e.g., silicon, as well as the value of processing in a microgravity environment are presented.

  17. Cross Directorate Proposal: Nanostructured Materials for Munitions and Propellants-Production, Modeling, and Characterization

    DTIC Science & Technology

    2016-07-15

    …towards hydration and decomposition, along with probing their hydration mechanisms; we are now exploring processing and deposition effects for this… oxidizer films and tested for their reactivity. Hydration mechanism for HI3O8 → HIO3: previous efforts by our group investigating the hydration… mechanism of I2O5 → HI3O8 reflected that the hydration mechanism proceeded through a nucleation and growth process followed by a diffusion-limited…

  18. Improved Traceability of Mission Concept to Requirements Using Model Based Systems Engineering

    NASA Technical Reports Server (NTRS)

    Reil, Robin

    2014-01-01

    Model Based Systems Engineering (MBSE) has recently been gaining significant support as a means to improve on the traditional document-based systems engineering (DBSE) approach to engineering complex systems. In the spacecraft design domain, there are many perceived and proposed benefits of an MBSE approach, but little analysis has been presented to determine the tangible benefits of such an approach (e.g., time and cost saved, increased product quality). This thesis presents direct examples of how developing a small satellite system model can improve traceability of the mission concept to its requirements. A comparison of the processes and approaches for MBSE and DBSE is made using the NASA Ames Research Center SporeSat CubeSat mission as a case study. A model of the SporeSat mission is built using the Systems Modeling Language standard and No Magic's MagicDraw modeling tool. The model incorporates mission concept and requirement information from the mission's original DBSE design efforts. Active dependency relationships are modeled to analyze the completeness and consistency of the requirements with respect to the mission concept. Overall experience and methodology are presented for both the MBSE and the original DBSE design efforts of SporeSat.

  19. A physiome interoperability roadmap for personalized drug development

    PubMed Central

    2016-01-01

    The goal of developing therapies and dosage regimes for characterized subgroups of the general population can be facilitated by the use of simulation models able to incorporate information about inter-individual variability in drug disposition (pharmacokinetics), toxicity and response effect (pharmacodynamics). Such observed variability can have multiple causes at various scales, ranging from gross anatomical differences to differences in genome sequence. Relevant data for many of these aspects, particularly related to molecular assays (known as ‘-omics’), are available in online resources, but identification and assignment to appropriate model variables and parameters is a significant bottleneck in the model development process. Through its efforts to standardize annotation with consequent increase in data usability, the human physiome project has a vital role in improving productivity in model development and, thus, the development of personalized therapy regimes. Here, we review the current status of personalized medicine in clinical practice, outline some of the challenges that must be overcome in order to expand its applicability, and discuss the relevance of personalized medicine to the more widespread challenges being faced in drug discovery and development. We then review some of (i) the key data resources available for use in model development and (ii) the potential areas where advances made within the physiome modelling community could contribute to physiologically based pharmacokinetic and physiologically based pharmacokinetic/pharmacodynamic modelling in support of personalized drug development. We conclude by proposing a roadmap to further guide the physiome community in its on-going efforts to improve data usability, and integration with modelling efforts in the support of personalized medicine development. PMID:27051513

  20. Towards improved capability and confidence in coupled atmospheric and wildland fire modeling

    NASA Astrophysics Data System (ADS)

    Sauer, Jeremy A.

    This dissertation work is aimed at improving the capability of, and confidence in, a modernized and improved version of Los Alamos National Laboratory's coupled atmospheric and wildland fire dynamics model, Higrad-Firetec. Higrad is the hydrodynamics component of this large eddy simulation model; it solves the three-dimensional, fully compressible Navier-Stokes equations, incorporating a dynamic eddy viscosity formulation through a two-scale turbulence closure scheme. Firetec is the vegetation, drag forcing, and combustion physics portion that is integrated with Higrad. The modern version of Higrad-Firetec incorporates multiple numerical methodologies and high-performance computing aspects, which combine to yield a unique tool capable of augmenting theoretical and observational investigations in order to better understand the multi-scale, multi-phase, and multi-physics phenomena involved in coupled atmospheric and environmental dynamics. More specifically, the current work includes extended functionality and validation efforts targeting component processes in coupled atmospheric and wildland fire scenarios. Since observational data of sufficient quality and resolution to validate the fully coupled atmosphere-wildfire scenario simply do not exist, we instead seek to validate components of the prohibitively convoluted full process. This manuscript provides, first, an introduction and background to the application space of Higrad-Firetec. Second, we document the model formulation, solution procedure, and a simple scalar transport verification exercise. Third, we validate model results against observational data for time-averaged flow field metrics in and above four idealized forest canopies. Fourth, we carry out a validation effort for the non-buoyant jet-in-crossflow scenario (to which an analogy can be made for atmosphere-wildfire interactions), comparing model results to laboratory data of both steady-in-time and unsteady-in-time metrics. Finally, an extension of the model's multi-phase physics is implemented, allowing for the representation of multiple collocated fuels as separately evolving constituents, leading to differences in the resulting rate of spread and total burned area. In combination, these efforts demonstrate improved capability, increased validation of component functionality, and the unique applicability of the Higrad-Firetec modeling framework. As a result, this work provides a substantially more robust foundation for new, more widely acceptable investigations into the complexities of coupled atmospheric and wildland fire behavior.

  1. Sensitivity Analysis Reveals Critical Factors that Affect Wetland Methane Emissions using Soil Biogeochemistry Model

    NASA Astrophysics Data System (ADS)

    Alonso-Contes, C.; Gerber, S.; Bliznyuk, N.; Duerr, I.

    2017-12-01

    Wetlands contribute approximately 20 to 40% of global methane emissions. We built a methane model for tropical and subtropical forests that allows for inundated conditions, following the approaches used in more complex global biogeochemical emission models (LPJWhyMe and CLM4Me). The model was designed to replace model formulations with field-collected and remotely sensed data for two essential drivers: plant productivity and hydrology. This allows us to focus directly on the central processes of methane production, consumption, and transport. One of our long-term goals is to make the model available to scientists interested in including methane modeling in their location of study. Sensitivity analysis results help focus field data collection efforts. Here, we present results from a pilot global sensitivity analysis of the model, carried out to determine which parameters and processes contribute most to the uncertainty of modeled methane emissions. Results show that parameters related to water table behavior, carbon input (in the form of plant productivity), and rooting depth affect simulated methane emissions the most. Current efforts include repeating the sensitivity analysis on methane emission outputs from an updated model that incorporates a soil heat flux routine, in order to determine the extent to which soil temperature parameters affect CH4 emissions. We are also conducting field data collection during Summer 2017 for comparison among three different landscapes located at the Ordway-Swisher Biological Station in Melrose, FL, gathering soil moisture and CH4 emission data from four different wetland types. Having data from four wetland types allows calibration of the model to diverse soil, water, and vegetation characteristics.
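
    As a toy illustration of this kind of screening (a one-at-a-time perturbation, far simpler than the global sensitivity analysis described above), the sketch below perturbs each driver of a made-up emission response in turn; the response function and all parameter values are illustrative assumptions only.

    ```python
    def ch4_emission(water_table, npp, root_depth):
        # toy response: production scales with NPP, saturation with water table,
        # and plant-mediated transport with rooting depth
        return npp * max(water_table, 0.0) * (0.5 + 0.5 * root_depth)

    base = {"water_table": 0.3, "npp": 1.0, "root_depth": 0.6}

    for name in base:
        hi = dict(base, **{name: base[name] * 1.1})   # +10% perturbation
        lo = dict(base, **{name: base[name] * 0.9})   # -10% perturbation
        swing = ch4_emission(**hi) - ch4_emission(**lo)
        print(f"{name:12s} swing for +/-10%: {swing / ch4_emission(**base):+.1%}")
    ```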

  2. Accessing the inaccessible: making (successful) field observations at tidewater glacier termini

    NASA Astrophysics Data System (ADS)

    Kienholz, C.; Amundson, J. M.; Jackson, R. H.; Motyka, R. J.; Nash, J. D.; Sutherland, D.

    2017-12-01

    Glaciers terminating in ocean water (tidewater glaciers) show complex dynamic behavior driven predominantly by processes at the ice-ocean interface (sedimentation, erosion, iceberg calving, submarine melting). A quantitative understanding of these processes is required, for example, to better assess tidewater glaciers' fate in our rapidly warming environment. A lack of observations close to glacier termini, due to the unpredictable risks from calving, hampers this understanding. In an effort to remedy this lack of knowledge, we initiated a large field-based effort at LeConte Glacier, southeast Alaska, in 2016. LeConte Glacier is a regional analog for many tidewater glaciers, but is more accessible and observable, and thus an ideal target for our multi-disciplinary effort. Our ongoing campaigns comprise measurements from novel autonomous vessels (temperature, salinity, and current) in the immediate proximity of the glacier terminus and additional surveys (including multibeam bathymetry) from boats and moorings in the proglacial fjord. These measurements are complemented by iceberg and glacier velocity measurements from time-lapse cameras and a portable radar interferometer situated above LeConte Bay. GPS-based velocity observations and melt measurements are conducted on the glacier. These measurements provide the necessary input for process-based understanding and numerical modeling of the glacier and fjord systems. In the presentation, we discuss promising initial results and lessons learned from the campaign.

  3. A Holistic Approach to Systems Development

    NASA Technical Reports Server (NTRS)

    Wong, Douglas T.

    2008-01-01

    Introduces a holistic and iterative design process. The process is continuous but can be loosely divided into four stages, with more effort spent early in the design. It is human-centered and multidisciplinary, with an emphasis on life-cycle cost and extensive use of modeling, simulation, mockups, human subjects, and proven technologies. Human-centered design doesn't mean the human factors discipline is the most important; many disciplines should be involved in the design: subsystem vendors, configuration management, operations research, manufacturing engineering, simulation/modeling, cost engineering, hardware engineering, software engineering, test and evaluation, human factors, electromagnetic compatibility, integrated logistics support, reliability/maintainability/availability, safety engineering, test equipment, training systems, design-to-cost, life-cycle cost, application engineering, etc.

  4. A stochastic optimization model under modeling uncertainty and parameter certainty for groundwater remediation design--part I. Model development.

    PubMed

    He, L; Huang, G H; Lu, H W

    2010-04-15

    Solving groundwater remediation optimization problems based on proxy simulators can usually yield optimal solutions differing from the "true" ones of the problem. This study presents a new stochastic optimization model under modeling uncertainty and parameter certainty (SOMUM), and the associated solution method, for simultaneously addressing modeling uncertainty associated with simulator residuals and optimizing groundwater remediation processes. This is a new attempt, different from previous modeling efforts: whereas earlier work focused on addressing uncertainty in physical parameters (e.g., soil porosity), this work aims to deal with uncertainty in the mathematical simulator (arising from model residuals). Compared to existing modeling approaches, in which only parameter uncertainty is considered, the model has the advantages of providing mean-variance analysis for contaminant concentrations, mitigating the effects of modeling uncertainties on optimal remediation strategies, offering system designers a confidence level for optimal remediation strategies, and reducing computational cost in optimization processes. 2009 Elsevier B.V. All rights reserved.

  5. On Using SysML, DoDAF 2.0 and UPDM to Model the Architecture for the NOAA's Joint Polar Satellite System (JPSS) Ground System (GS)

    NASA Technical Reports Server (NTRS)

    Hayden, Jeffrey L.; Jeffries, Alan

    2012-01-01

    The JPSS Ground System is a flexible system of systems responsible for telemetry, tracking & command (TT&C), data acquisition, routing, and data processing services for a varied fleet of satellites to support weather prediction, modeling, and climate modeling. To assist in this engineering effort, architecture modeling tools are being employed to translate the former NPOESS baseline to the new JPSS baseline. The paper will focus on the methodology for the systems engineering process and the use of these architecture modeling tools within that process. The Department of Defense Architecture Framework version 2.0 (DoDAF 2.0) viewpoints and views that are being used to describe the JPSS GS architecture are discussed. The Unified Profile for DoDAF and MODAF (UPDM) and the Systems Modeling Language (SysML), as provided by extensions to the MagicDraw UML modeling tool, are used to develop the diagrams and tables that make up the architecture model. The model development process and structure are discussed, examples are shown, and details of handling the complexities of a large System of Systems (SoS), such as the JPSS GS, with an equally complex modeling tool, are described.

  6. Prioritizing Arctic Observations with Limited Resources

    NASA Astrophysics Data System (ADS)

    Kelly, B.; Starkweather, S.

    2012-12-01

    U.S. Federal agencies recently completed a five-year research plan for the Arctic, including plans to enhance efforts toward an Arctic Observing Network (AON). Following on numerous national and international planning efforts, the five-year plan identifies nine priority areas, including enhancing observing system design, assessing the priorities of local residents, and improving data access. AON progress to date has been realized through bottom-up funding decisions and some top-down design optimization approaches, which have resulted in valuable yet ad hoc progress toward Arctic research imperatives. We suggest that advancing AON beyond theoretical design and ad hoc efforts, with the engagement of multiple U.S. Federal agencies, will require a structured, input-based planning approach to prioritization that recognizes budget realities. Completing a long list of worthy observing efforts appears to be unsustainable and inadequate for responding to the rapid changes taking place in the Arctic. Society would be better served by more rapid implementation of sustained, long-term observations focused on those climate feedbacks with the greatest potential negative impacts. Several emerging theoretical frameworks have pointed to the need to enhance iterative, capacity-building dialog between observationalists, modelers, and stakeholders as a way to identify these broadest potential benefits. We concur, and suggest that those dialogs need to be facilitated and sustained over long periods. Efforts to isolate observational programs from process research are, we believe, impeding progress. At the same time, we note that bottom-up funding decisions, while useful for prioritizing process research, are less appropriate for building observing systems.

  7. Impact of remote sensing upon the planning, management, and development of water resources

    NASA Technical Reports Server (NTRS)

    Castruccio, P. A.; Loats, H. L.; Fowler, T. R.; Frech, S. L.

    1975-01-01

    Principal water resources users were surveyed to determine the impact of remote data streams on hydrologic computer models. Analysis of responses demonstrated that: most water resources effort suitable to remote sensing inputs is conducted through federal agencies or through federally stimulated research; and, most hydrologic models suitable to remote sensing data are federally developed. Computer usage by major water resources users was analyzed to determine the trends of usage and costs for the principal hydrologic users/models. The laws and empirical relationships governing the growth of the data processing loads were described and applied to project the future data loads. Data loads for ERTS CCT image processing were computed and projected through the 1985 era.

  8. Design and Analysis of a Preconcentrator for the ChemLab

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    WONG,CHUNGNIN C.; FLEMMING,JEB H.; MANGINELL,RONALD P.

    2000-07-17

    Preconcentration is a critical analytical procedure when designing a microsystem for trace chemical detection, because it can purify a sample mixture and boost a small analyte concentration to a much higher level, allowing better analysis. This paper describes the development of a micro-fabricated planar preconcentrator for the µChemLab™ at Sandia. To guide the design, an analytical model has been developed to predict the analyte transport, adsorption, and desorption processes in the preconcentrator. Experiments have also been conducted to analyze the adsorption and desorption processes and to validate the model. This combined effort of modeling, simulation, and testing has led us to build a reliable, efficient preconcentrator with good performance.
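
    A minimal sketch of the kind of adsorption/desorption balance such an analytical model might contain is shown below: a Langmuir-type uptake term competes with an Arrhenius-like desorption term, so heating the sorbent releases the collected analyte as a concentrated plug. The rate law and every parameter value here are illustrative assumptions, not Sandia's model.

    ```python
    import math

    q_max, k_a, conc = 1.0, 5.0, 0.01   # sorbent capacity, uptake rate, inlet conc.

    def k_d(T):
        """Arrhenius-like desorption rate (made-up prefactor and barrier)."""
        return 1e8 * math.exp(-8000.0 / T)

    q, dt = 0.0, 0.01                   # fractional loading, time step (s)
    for step in range(20_000):          # 100 s sampling at 300 K, 100 s desorb at 500 K
        T = 300.0 if step < 10_000 else 500.0
        dq = k_a * conc * (q_max - q) - k_d(T) * q   # Langmuir-type balance
        q += dt * dq
        if step == 9_999:
            print("loading after sampling:", round(q, 3))
    print("residual after heat pulse:", round(q, 4))
    ```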

  9. Perceptual asymmetry in texture perception.

    PubMed

    Williams, D; Julesz, B

    1992-07-15

    A fundamental property of human visual perception is our ability to distinguish between textures. A concerted effort has been made to account for texture segregation in terms of linear spatial filter models and their nonlinear extensions. However, for certain texture pairs the ease of discrimination changes when the role of figure and ground are reversed. This asymmetry poses a problem for both linear and nonlinear models. We have isolated a property of texture perception that can account for this asymmetry in discrimination: subjective closure. This property, which is also responsible for visual illusions, appears to be explainable by early visual processes alone. Our results force a reexamination of the process of human texture segregation and of some recent models that were introduced to explain it.

  10. Why are you telling me that? A conceptual model of the social function of autobiographical memory.

    PubMed

    Alea, Nicole; Bluck, Susan

    2003-03-01

    In an effort to stimulate and guide empirical work within a functional framework, this paper provides a conceptual model of the social functions of autobiographical memory (AM) across the lifespan. The model delineates the processes and variables involved when AMs are shared to serve social functions. Components of the model include: lifespan contextual influences, the qualitative characteristics of memory (emotionality and level of detail recalled), the speaker's characteristics (age, gender, and personality), the familiarity and similarity of the listener to the speaker, the level of responsiveness during the memory-sharing process, and the nature of the social relationship in which the memory sharing occurs (valence and length of the relationship). These components are shown to influence the type of social function served and/or, the extent to which social functions are served. Directions for future empirical work to substantiate the model and hypotheses derived from the model are provided.

  11. Using logic models in a community-based agricultural injury prevention project.

    PubMed

    Helitzer, Deborah; Willging, Cathleen; Hathorn, Gary; Benally, Jeannie

    2009-01-01

    The National Institute for Occupational Safety and Health has long promoted the logic model as a useful tool in an evaluator's portfolio. Because a logic model supports a systematic approach to designing interventions, it is equally useful for program planners. Undertaken with community stakeholders, a logic model process articulates the underlying foundations of a particular programmatic effort and enhances program design and evaluation. Most often presented as sequenced diagrams or flow charts, logic models demonstrate relationships among the following components: statement of a problem, various causal and mitigating factors related to that problem, available resources to address the problem, theoretical foundations of the selected intervention, intervention goals and planned activities, and anticipated short- and long-term outcomes. This article describes a case example of how a logic model process was used to help community stakeholders on the Navajo Nation conceive, design, implement, and evaluate agricultural injury prevention projects.

  12. Stratiform chromite deposit model

    USGS Publications Warehouse

    Schulte, Ruth F.; Taylor, Ryan D.; Piatak, Nadine M.; Seal, Robert R.

    2010-01-01

    Stratiform chromite deposits are of great economic importance, yet their origin and evolution remain highly debated. Layered igneous intrusions such as the Bushveld, Great Dyke, Kemi, and Stillwater Complexes, provide opportunities for studying magmatic differentiation processes and assimilation within the crust, as well as related ore-deposit formation. Chromite-rich seams within layered intrusions host the majority of the world's chromium reserves and may contain significant platinum-group-element (PGE) mineralization. This model of stratiform chromite deposits is part of an effort by the U.S. Geological Survey's Mineral Resources Program to update existing models and develop new descriptive mineral deposit models to supplement previously published models for use in mineral-resource and mineral-environmental assessments. The model focuses on features that may be common to all stratiform chromite deposits as a way to gain insight into the processes that gave rise to their emplacement and to the significant economic resources contained in them.

  13. SIGNUM: A Matlab, TIN-based landscape evolution model

    NASA Astrophysics Data System (ADS)

    Refice, A.; Giachetta, E.; Capolongo, D.

    2012-08-01

    Several numerical landscape evolution models (LEMs) have been developed to date, and many are available as open source codes. Most are written in efficient programming languages such as Fortran or C, but often require additional coding effort to plug in to more user-friendly data analysis and/or visualization tools to ease interpretation and scientific insight. In this paper, we present an effort to port a common core of accepted physical principles governing landscape evolution directly into a high-level language and data analysis environment such as Matlab. SIGNUM (an acronym for Simple Integrated Geomorphological Numerical Model) is an independent and self-contained Matlab, TIN-based landscape evolution model, built to simulate topography development at various space and time scales. SIGNUM is presently capable of simulating hillslope processes such as linear and nonlinear diffusion, fluvial incision into bedrock, spatially varying surface uplift (which can be used to simulate changes in base level), thrusting and faulting, as well as the effects of climate changes. Although based on accepted and well-known processes and algorithms in its present version, it is built with a modular structure that makes it easy to modify and upgrade the simulated physical processes to suit virtually any user's needs. The code is conceived as an open-source project, and is thus an ideal tool for both research and didactic purposes, thanks to the high-level nature of the Matlab environment and its popularity among the scientific community. In this paper the simulation code is presented together with some simple examples of surface evolution, and guidelines for the development of new modules and algorithms are proposed.
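
    Although SIGNUM itself is a Matlab, TIN-based code, its linear hillslope-diffusion component can be illustrated with a simple grid-based sketch of dz/dt = kappa * laplacian(z), shown here in Python with an explicit Euler step; the grid, kappa, and time step are illustrative only.

    ```python
    import copy

    kappa, dx, dt = 0.01, 10.0, 100.0       # m^2/yr, m, yr (illustrative)
    z = [[0.0] * 5 for _ in range(5)]
    z[2][2] = 10.0                          # an initial bump to diffuse away

    def diffuse(z):
        """One explicit Euler step of dz/dt = kappa * laplacian(z) (fixed edges)."""
        zn = copy.deepcopy(z)
        for i in range(1, len(z) - 1):
            for j in range(1, len(z[0]) - 1):
                lap = (z[i + 1][j] + z[i - 1][j] + z[i][j + 1] + z[i][j - 1]
                       - 4 * z[i][j]) / dx ** 2
                zn[i][j] = z[i][j] + dt * kappa * lap
        return zn

    for _ in range(200):                    # 20 kyr of hillslope smoothing
        z = diffuse(z)
    print("peak elevation after diffusion:", round(z[2][2], 3))
    ```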

  14. Resolving Some Paradoxes in the Thermal Decomposition Mechanism of Acetaldehyde

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sivaramakrishnan, Raghu; Michael, Joe V.; Harding, Lawrence B.

    2015-07-16

    The mechanism for the thermal decomposition of acetaldehyde has been revisited with an analysis of literature kinetics experiments using theoretical kinetics. The present modeling study was motivated by recent observations, with very sensitive diagnostics, of some unexpected products in high-temperature micro-tubular reactor experiments on the thermal decomposition of CH3CHO and its deuterated analogs, CH3CDO, CD3CHO, and CD3CDO. The observations of these products prompted the authors of these studies to suggest that the enol tautomer, CH2CHOH (vinyl alcohol), is a primary intermediate in the thermal decomposition of acetaldehyde. The present modeling efforts on acetaldehyde decomposition incorporate a master equation re-analysis of the CH3CHO potential energy surface (PES). The lowest energy process on this PES is an isomerization of CH3CHO to CH2CHOH. However, the subsequent product channels for CH2CHOH are substantially higher in energy, and the only unimolecular process that can be thermally accessed is a re-isomerization to CH3CHO. The incorporation of these new theoretical kinetics predictions into models for selected literature experiments on CH3CHO thermal decomposition confirms our earlier experiment- and theory-based conclusions that the dominant decomposition process in CH3CHO at high temperatures is C-C bond fission, with a minor contribution (~10-20%) from the roaming mechanism to form CH4 and CO. The present modeling efforts also incorporate a master-equation analysis of the H + CH2CHOH potential energy surface. This bimolecular reaction is the primary mechanism for removal of CH2CHOH, which can accumulate in minor amounts at high temperatures, T > 1000 K, in most lab-scale experiments that use large initial concentrations of CH3CHO. Our modeling efforts indicate that the observations of ketene, water, and acetylene in the recent micro-tubular experiments are primarily due to bimolecular reactions of CH3CHO and CH2CHOH with H atoms, and have no bearing on the unimolecular decomposition mechanism of CH3CHO. The present simulations also indicate that experiments using these micro-tubular reactors, when interpreted with the aid of high-level theoretical calculations and kinetics modeling, can offer insights into the chemistry of elusive intermediates in the high-temperature pyrolysis of organic molecules.

  15. Activated sludge model (ASM) based modelling of membrane bioreactor (MBR) processes: a critical review with special regard to MBR specificities.

    PubMed

    Fenu, A; Guglielmi, G; Jimenez, J; Spèrandio, M; Saroj, D; Lesjean, B; Brepols, C; Thoeye, C; Nopens, I

    2010-08-01

    Membrane bioreactors (MBRs) have been increasingly employed for municipal and industrial wastewater treatment in the last decade. Efforts to model such wastewater treatment systems have always targeted both the biological processes (treatment quality target) and the various aspects of engineering (cost-effective design and operation). The development of Activated Sludge Models (ASM) was an important evolution in the modelling of Conventional Activated Sludge (CAS) processes, and their use is now very well established. However, although they were initially developed to describe CAS processes, they have simply been transferred and applied to MBR processes. Recent studies on MBR biological processes have reported several crucial specificities: medium to very high sludge retention times, high mixed liquor concentration, accumulation of soluble microbial products (SMP) rejected by the membrane filtration step, and high aeration rates for scouring purposes. These aspects raise the question as to what extent the ASM framework is applicable to MBR processes. Several studies highlighting some of the aforementioned issues are scattered through the literature. Hence, through a concise and structured overview of the past developments and current state-of-the-art in biological modelling of MBRs, this review explores ASM-based modelling applied to MBR processes. The work aims to synthesize previous studies and differentiates between unmodified and modified applications of ASM to MBR. Particular emphasis is placed on influent fractionation, biokinetics, and soluble microbial products (SMP)/exo-polymeric substances (EPS) modelling, and suggestions are put forward as to good modelling practice with regard to MBR modelling, both for end-users and academia. A last section highlights shortcomings and future needs for improved biological modelling of MBR processes. (c) 2010 Elsevier Ltd. All rights reserved.

  16. A generic biogeochemical module for earth system models

    NASA Astrophysics Data System (ADS)

    Fang, Y.; Huang, M.; Liu, C.; Li, H.-Y.; Leung, L. R.

    2013-06-01

    Physical and biogeochemical processes regulate soil carbon dynamics and the CO2 flux to and from the atmosphere, influencing global climate change. Integration of these processes into earth system models (e.g., community land models such as CLM), however, currently faces three major challenges: (1) extensive effort is required to modify model structures and rewrite computer programs to incorporate new or updated processes as new knowledge is generated; (2) the computational cost of simulating biogeochemical processes in land models is prohibitive, owing to large variations in the rates of biogeochemical processes; and (3) various mathematical representations of biogeochemical processes exist to capture different aspects of the fundamental mechanisms, but systematic evaluation of these representations is difficult, if not impossible. To address these challenges, we propose a new computational framework to easily incorporate physical and biogeochemical processes into land models. The framework consists of a new biogeochemical module with a generic algorithm and a reaction database, so that new and updated processes can be incorporated into land models without manually setting up the ordinary differential equations to be solved numerically. The reaction database describes nutrient flow through terrestrial ecosystems in plants, litter, and soil. This framework facilitates effective comparison of biogeochemical cycles in an ecosystem using different conceptual models under the same land modeling framework. The approach was first implemented in CLM and benchmarked against simulations from the original CLM-CN code. A case study was then provided to demonstrate the advantages of using the new approach to incorporate a phosphorus cycle into CLM. To our knowledge, the phosphorus-incorporated CLM is a new model that can be used to simulate phosphorus limitation on the productivity of terrestrial ecosystems.
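
    A minimal sketch of the "generic algorithm plus reaction database" idea follows: each process is declared as a stoichiometry and a rate expression, and the ODE right-hand side is assembled automatically rather than hand-coded. The pools, reactions, and rate constants here are invented placeholders, not the actual CLM reaction database.

```python
# Minimal sketch: a reaction database that generates the ODE system generically.
# Pools, reactions, and rate constants are hypothetical placeholders.
import numpy as np
from scipy.integrate import solve_ivp

pools = ["litter_C", "soil_C", "CO2"]

# Each reaction: stoichiometry per pool plus a rate law f(state) -> flux.
reactions = [
    {"stoich": {"litter_C": -1.0, "soil_C": +0.6, "CO2": +0.4},
     "rate": lambda y: 0.05 * y["litter_C"]},      # litter decomposition
    {"stoich": {"soil_C": -1.0, "CO2": +1.0},
     "rate": lambda y: 0.01 * y["soil_C"]},        # soil C mineralisation
]

def rhs(t, yvec):
    y = dict(zip(pools, yvec))
    dy = np.zeros(len(pools))
    for rxn in reactions:              # generic assembly: no hand-written ODEs
        flux = rxn["rate"](y)
        for pool, coeff in rxn["stoich"].items():
            dy[pools.index(pool)] += coeff * flux
    return dy

sol = solve_ivp(rhs, (0.0, 100.0), [100.0, 500.0, 0.0])
print(dict(zip(pools, sol.y[:, -1])))
```

    Adding a new process then means appending one entry to the reaction list rather than rewriting the solver, which is the maintainability argument the abstract makes.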

  17. Recrystallization and Grain Growth Kinetics in Binary Alpha Titanium-Aluminum Alloys

    NASA Astrophysics Data System (ADS)

    Trump, Anna Marie

    Titanium alloys are used in a variety of important naval and aerospace applications and often undergo thermomechanical processing, which leads to recrystallization and grain growth. Both of these processes have a significant impact on the mechanical properties of the material, so understanding their kinetics is crucial to predicting the final properties. Three alloys with varying aluminum concentrations are studied, allowing direct quantification of the effect of aluminum content on the kinetics of recrystallization and grain growth. Aluminum is the most common alpha-stabilizing alloying element used in titanium alloys; however, its effect on these processes has not previously been studied. This work is also part of a larger Integrated Computational Materials Engineering (ICME) effort whose goal is to combine computational and experimental work to develop computationally efficient models that predict material microstructure and properties from processing history. The static recrystallization kinetics are measured using an electron backscatter diffraction (EBSD) technique, and a significant retardation in the kinetics is observed with increasing aluminum concentration. An analytical model is then used to capture these results and successfully predicts the effect of solute concentration on the time to 50% recrystallization. The model reveals that this solute effect is due to a combination of decreased grain boundary mobility and decreased driving force with increasing aluminum concentration. The effect of microstructural inhomogeneities is also experimentally quantified, and the results are validated with a phase field model for recrystallization. These inhomogeneities explain the experimentally measured Avrami exponent, which is lower than the theoretical value calculated from the JMAK model. Similar to its effect on recrystallization, the addition of aluminum also significantly slows the grain growth kinetics. This is generally attributed to solute drag due to segregation of solute atoms at the grain boundaries; however, aluminum segregation is not observed in these alloys. The mechanism behind this result is explained and used to validate the prediction of an existing solute drag model.
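
    For reference, the JMAK (Avrami) relation mentioned above, X(t) = 1 - exp(-k t^n), can be used to see how a smaller rate constant shifts the time to 50% recrystallization; the sketch below uses illustrative k and n values, not parameters fitted to the Ti-Al data.

```python
# Minimal sketch of JMAK (Avrami) recrystallization kinetics.
# k and n values are illustrative only.
import numpy as np

def jmak(t, k, n):
    """Recrystallized fraction X(t) = 1 - exp(-k * t^n)."""
    return 1.0 - np.exp(-k * np.power(t, n))

def t50(k, n):
    """Time to 50% recrystallization: solve 1 - exp(-k * t^n) = 0.5."""
    return (np.log(2.0) / k) ** (1.0 / n)

# Slower kinetics (smaller k) shifts t50 to longer times, qualitatively
# mimicking the retardation seen with added solute.
for k in (1e-3, 1e-4, 1e-5):
    print(f"k = {k:.0e}: t50 = {t50(k, n=2.0):.1f} (arbitrary time units)")
```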

  18. Recent Developments in Toxico-Cheminformatics: A New ...

    EPA Pesticide Factsheets

    Efforts to improve public access to chemical toxicity information resources, coupled with new high-throughput screening (HTS) data and efforts to systematize legacy toxicity studies, have the potential to significantly improve predictive capabilities in toxicology. Important recent developments include: 1) large and growing public resources that link chemical structures to biological activity and toxicity data in searchable format, and that offer more nuanced and varied representations of activity; 2) standardized relational data models that capture relevant details of chemical treatment and effects of published in vivo experiments; and 3) the generation of large amounts of new data from public efforts that are employing HTS technologies to probe a wide range of bioactivity and cellular processes across large swaths of chemical space. Most recently, EPA’s DSSTox project has published several new EPA chemical data inventories (IRIS, HPV, ToxCast) and added an on-line capability for structure (substructure or similarity)-searching through all or parts of the published DSSTox data files. These efforts are, for the first time in many cases, opening up a structure-paved two-way highway between previously inaccessible or isolated public chemical data repositories and large public resources, such as PubChem. In addition, public initiatives (such as ToxML) are developing systematized data models of toxicity study areas, and introducing standardized templates, contr
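
    A minimal sketch of the substructure- and similarity-searching capability described above follows, using RDKit (assumed available) and arbitrary example SMILES; the actual DSSTox search is a web-based service, and the chemicals and inventory here are placeholders.

```python
# Minimal sketch: fingerprint similarity and substructure search with RDKit.
# Query and inventory SMILES are arbitrary examples, not DSSTox records.
from rdkit import Chem
from rdkit.Chem import AllChem
from rdkit import DataStructs

query = Chem.MolFromSmiles("c1ccccc1O")   # phenol as an example query
inventory = {
    "chem_A": "c1ccc(Cl)cc1O",            # a chlorophenol
    "chem_B": "CCO",                      # ethanol
}

q_fp = AllChem.GetMorganFingerprintAsBitVect(query, 2, nBits=2048)
for name, smi in inventory.items():
    mol = Chem.MolFromSmiles(smi)
    fp = AllChem.GetMorganFingerprintAsBitVect(mol, 2, nBits=2048)
    sim = DataStructs.TanimotoSimilarity(q_fp, fp)   # similarity search
    has_sub = mol.HasSubstructMatch(query)           # substructure search
    print(f"{name}: Tanimoto = {sim:.2f}, substructure match = {has_sub}")
```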

  19. Review of simulation techniques for Aquifer Thermal Energy Storage (ATES)

    NASA Astrophysics Data System (ADS)

    Mercer, J. W.; Faust, C. R.; Miller, W. J.; Pearson, F. J., Jr.

    1981-03-01

    The analysis of aquifer thermal energy storage (ATES) systems relies on results from mathematical and geochemical models. Therefore, the state-of-the-art models relevant to ATES were reviewed and evaluated. These models describe the important processes active in ATES, including ground-water flow, heat transport (heat flow), solute transport (movement of contaminants), and geochemical reactions. In general, available models of the saturated ground-water environment are adequate to address most concerns associated with ATES, that is, design, operation, and environmental assessment. Where models are not adequate, development should be preceded by efforts to identify the significant physical phenomena and relate model parameters to measurable quantities.
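
    As a toy illustration of the heat-transport process such models solve, the sketch below marches a 1-D advection-conduction equation for aquifer temperature with explicit finite differences; all parameter values are illustrative, not drawn from the reviewed models.

```python
# Minimal sketch: 1-D advection-conduction of temperature in a flowing
# aquifer, explicit finite differences. All parameters are illustrative.
import numpy as np

nx, L = 100, 100.0            # grid points, domain length (m)
dx = L / nx
v = 1.0e-5                    # effective thermal front velocity (m/s)
alpha = 1.0e-6                # effective thermal diffusivity (m2/s)
dt = min(0.25 * dx**2 / alpha,   # explicit diffusive stability limit
         0.5 * dx / v)           # advective (CFL) limit

T = np.full(nx, 10.0)         # ambient aquifer temperature (C)
T[0] = 60.0                   # injected warm water at the left boundary

for _ in range(100):          # march in time (~58 days with these values)
    adv = -v * (T[1:-1] - T[:-2]) / dx                    # upwind advection
    dif = alpha * (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2  # conduction
    T[1:-1] += dt * (adv + dif)
    T[0] = 60.0

print("thermal front (T > 35 C) reaches ~", dx * int(np.argmin(T > 35.0)), "m")
```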

  20. Composable Framework Support for Software-FMEA Through Model Execution

    NASA Astrophysics Data System (ADS)

    Kocsis, Imre; Patricia, Andras; Brancati, Francesco; Rossi, Francesco

    2016-08-01

    Performing Failure Modes and Effects Analysis (FMEA) during software architecture design is becoming a basic requirement in an increasing number of domains; however, due to the lack of standardized early-design-phase model execution, classic SW-FMEA approaches carry significant risks and are human-effort-intensive even in processes that use Model-Driven Engineering. Recently, modelling languages with standardized executable semantics have emerged. Building on earlier results, this paper describes framework support for generating executable error propagation models from such models during software architecture design. The approach carries the promise of increased precision, decreased risk, and more automated execution for SW-FMEA during dependability-critical system development.
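
    The following sketch illustrates, under invented assumptions, what "executing" an error propagation model can look like: components form a directed graph, each with a rule mapping incoming error modes to outgoing ones, and an FMEA row is evaluated by propagating an injected failure mode through the topology. The architecture and rules are hypothetical examples, not the paper's framework.

```python
# Minimal sketch: executing an error propagation model over a component graph.
# Topology and transfer rules are invented examples.
from collections import deque

# component -> downstream components
topology = {
    "sensor":     ["filter"],
    "filter":     ["controller"],
    "controller": ["actuator"],
    "actuator":   [],
}

# per-component rule: incoming error mode -> outgoing error mode
# ("omission" = missing output, "value" = wrong value, None = masked)
rules = {
    "sensor":     {"value": "value", "omission": "omission"},
    "filter":     {"value": None, "omission": "omission"},   # masks value errors
    "controller": {"value": "value", "omission": "value"},   # stale data -> wrong command
    "actuator":   {"value": "value", "omission": "omission"},
}

def propagate(origin, mode):
    """Execute one FMEA row: inject `mode` at `origin`, report affected components."""
    effects, frontier = {}, deque([(origin, mode)])
    while frontier:
        comp, incoming = frontier.popleft()
        outgoing = rules[comp].get(incoming)
        if outgoing is None:
            continue                       # error masked at this component
        effects[comp] = outgoing
        for nxt in topology[comp]:
            frontier.append((nxt, outgoing))
    return effects

print(propagate("sensor", "value"))      # masked by the filter
print(propagate("sensor", "omission"))   # reaches the actuator as a value error
```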
