40 CFR 600.512-12 - Model year report.
Code of Federal Regulations, 2013 CFR
2013-07-01
... CFR parts 531 or 533 as applicable, and the applicable fleet average CO2 emission standards. Model... standards. Model year reports shall include a statement that the method of measuring vehicle track width... models and the applicable in-use CREE emission standard. The list of models shall include the applicable...
40 CFR 600.512-12 - Model year report.
Code of Federal Regulations, 2012 CFR
2012-07-01
... CFR parts 531 or 533 as applicable, and the applicable fleet average CO2 emission standards. Model... standards. Model year reports shall include a statement that the method of measuring vehicle track width... models and the applicable in-use CREE emission standard. The list of models shall include the applicable...
40 CFR 600.512-12 - Model year report.
Code of Federal Regulations, 2014 CFR
2014-07-01
... CFR parts 531 or 533 as applicable, and the applicable fleet average CO2 emission standards. Model... standards. Model year reports shall include a statement that the method of measuring vehicle track width... models and the applicable in-use CREE emission standard. The list of models shall include the applicable...
The Missing Stakeholder Group: Why Patients Should be Involved in Health Economic Modelling.
van Voorn, George A K; Vemer, Pepijn; Hamerlijnck, Dominique; Ramos, Isaac Corro; Teunissen, Geertruida J; Al, Maiwenn; Feenstra, Talitha L
2016-04-01
Evaluations of healthcare interventions, e.g. new drugs or other new treatment strategies, commonly include a cost-effectiveness analysis (CEA) that is based on the application of health economic (HE) models. As end users, patients are important stakeholders regarding the outcomes of CEAs, yet their knowledge of HE model development and application, or their involvement therein, is absent. This paper considers possible benefits and risks of patient involvement in HE model development and application for modellers and patients. An exploratory review of the literature has been performed on stakeholder-involved modelling in various disciplines. In addition, Dutch patient experts have been interviewed about their experience in, and opinion about, the application of HE models. Patients have little to no knowledge of HE models and are seldom involved in HE model development and application. Benefits of becoming involved would include a greater understanding and possible acceptance by patients of HE model application, improved model validation, and a more direct infusion of patient expertise. Risks would include patient bias and increased costs of modelling. Patient involvement in HE modelling seems to carry several benefits as well as risks. We claim that the benefits may outweigh the risks and that patients should become involved.
Theoretical models for application in school health education research.
Parcel, G S
1984-01-01
Theoretical models that may be useful to research studies in school health education are reviewed. Selected, well-defined theories include social learning theory, problem-behavior theory, theory of reasoned action, communications theory, coping theory, social competence, and social and family theories. Also reviewed are multiple theory models including models of health-related behavior, the PRECEDE Framework, social-psychological approaches and the Activated Health Education Model. Two major reviews of teaching models are also discussed. The paper concludes with a brief outline of the general applications of theory to the field of school health education including applications to basic research, development and design of interventions, program evaluation, and program utilization.
Sakhteman, Amirhossein; Zare, Bijan
2016-01-01
An interactive application, Modelface, is presented for the Modeller software on the Windows platform. The application is able to run all steps of homology modeling, including PDB-to-FASTA generation, running Clustal, model building, and loop refinement. Other modules of Modeller, including energy calculation, energy minimization, and the ability to make single-point mutations in PDB structures, are also implemented inside Modelface. The API is a simple batch-based application with no memory occupation and is free of charge for academic use. The application is also able to repair missing atom types in PDB structures, making it suitable for many molecular modeling studies such as docking and molecular dynamics simulation. Some successful instances of modeling studies using Modelface are also reported. PMID:28243276
Application of triggered lightning numerical models to the F106B and extension to other aircraft
NASA Technical Reports Server (NTRS)
Ng, Poh H.; Dalke, Roger A.; Horembala, Jim; Rudolph, Terence; Perala, Rodney A.
1988-01-01
The goal of the F106B Thunderstorm Research Program is to characterize the lightning environment for aircraft in flight. This report describes the application of numerical electromagnetic models to this problem. Topics include: (1) Extensive application of linear triggered lightning to F106B data; (2) Electrostatic analysis of F106B field mill data; (3) Application of subgrid modeling to F106B nose region, including both static and nonlinear models; (4) Extension of F106B results to other aircraft of varying sizes and shapes; and (5) Application of nonlinear model to interaction of F106B with lightning leader-return stroke event.
Review of Development Survey of Phase Change Material Models in Building Applications
Akeiber, Hussein J.; Wahid, Mazlan A.; Hussen, Hasanen M.; Mohammad, Abdulrahman Th.
2014-01-01
The application of phase change materials (PCMs) in green buildings has been increasing rapidly. PCM applications in green buildings include several development models. This paper briefly surveys the recent research and development activities of PCM technology in building applications. Firstly, a basic description of PCMs and their principles is provided; the classification and applications of PCMs are also included. Secondly, PCM models in buildings are reviewed and discussed according to the wall, roof, floor, and cooling systems. Finally, conclusions are presented based on the collected data. PMID:25313367
The Model Life-cycle: Training Module
The model life-cycle includes identification of problems and the subsequent development, evaluation, and application of the model. Objectives: define 'model life-cycle', explore the stages of the model life-cycle, and identify strategies for model development, evaluation, and application.
nmsBuilder: Freeware to create subject-specific musculoskeletal models for OpenSim.
Valente, Giordano; Crimi, Gianluigi; Vanella, Nicola; Schileo, Enrico; Taddei, Fulvia
2017-12-01
Musculoskeletal modeling and simulations of movement have been increasingly used in orthopedic and neurological scenarios, with increased attention to subject-specific applications. In general, musculoskeletal modeling applications have been facilitated by the development of dedicated software tools; however, subject-specific studies have been limited also by time-consuming modeling workflows and high skilled expertise required. In addition, no reference tools exist to standardize the process of musculoskeletal model creation and make it more efficient. Here we present a freely available software application, nmsBuilder 2.0, to create musculoskeletal models in the file format of OpenSim, a widely-used open-source platform for musculoskeletal modeling and simulation. nmsBuilder 2.0 is the result of a major refactoring of a previous implementation that moved a first step toward an efficient workflow for subject-specific model creation. nmsBuilder includes a graphical user interface that provides access to all functionalities, based on a framework for computer-aided medicine written in C++. The operations implemented can be used in a workflow to create OpenSim musculoskeletal models from 3D surfaces. A first step includes data processing to create supporting objects necessary to create models, e.g. surfaces, anatomical landmarks, reference systems; and a second step includes the creation of OpenSim objects, e.g. bodies, joints, muscles, and the corresponding model. We present a case study using nmsBuilder 2.0: the creation of an MRI-based musculoskeletal model of the lower limb. The model included four rigid bodies, five degrees of freedom and 43 musculotendon actuators, and was created from 3D surfaces of the segmented images of a healthy subject through the modeling workflow implemented in the software application. We have presented nmsBuilder 2.0 for the creation of musculoskeletal OpenSim models from image-based data, and made it freely available via nmsbuilder.org. This application provides an efficient workflow for model creation and helps standardize the process. We hope this would help promote personalized applications in musculoskeletal biomechanics, including larger sample size studies, and might also represent a basis for future developments for specific applications. Copyright © 2017 Elsevier B.V. All rights reserved.
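nmsBuilder itself is a GUI tool, but the OpenSim models it writes can also be assembled programmatically. The sketch below is a minimal, hypothetical example of building and saving a tiny model with the OpenSim 4.x Python scripting API (assumed to be installed as the opensim package); the body, joint, and muscle names and all numeric values are illustrative only and are not taken from the lower-limb case study above.

```python
# Minimal sketch: assembling a tiny OpenSim model programmatically.
# Assumes the OpenSim 4.x Python bindings are installed as "opensim".
# All names and numbers are illustrative, not from the nmsBuilder case study.
import opensim as osim

model = osim.Model()
model.setName("toy_lower_limb")

# One rigid body attached to ground by a pin joint (a stand-in for a hip).
thigh = osim.Body("thigh", 8.0, osim.Vec3(0, -0.2, 0),
                  osim.Inertia(0.1, 0.02, 0.1))
model.addBody(thigh)

hip = osim.PinJoint("hip",
                    model.getGround(), osim.Vec3(0), osim.Vec3(0),
                    thigh, osim.Vec3(0, 0.2, 0), osim.Vec3(0))
model.addJoint(hip)

# One musculotendon actuator spanning the joint.
muscle = osim.Millard2012EquilibriumMuscle("hip_flexor", 1000.0, 0.12, 0.20, 0.0)
muscle.addNewPathPoint("origin", model.getGround(), osim.Vec3(0.05, 0.0, 0.0))
muscle.addNewPathPoint("insertion", thigh, osim.Vec3(0.02, 0.1, 0.0))
model.addForce(muscle)

model.finalizeConnections()
model.printToXML("toy_lower_limb.osim")  # same .osim format that nmsBuilder writes
```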
ERIC Educational Resources Information Center
Andrews, Dee H.; Dineen, Toni; Bell, Herbert H.
1999-01-01
Discusses the use of constructive modeling and virtual simulation in team training; describes a military application of constructive modeling, including technology issues and communication protocols; considers possible improvements; and discusses applications in team-learning environments other than military, including industry and education. (LRW)
Photochemical Modeling Applications
Provides access to modeling applications involving photochemical models, including modeling of ozone, particulate matter (PM), and mercury for national and regional EPA regulations such as the Clean Air Interstate Rule (CAIR) and the Clean Air Mercury Rule (CAMR).
Remote sensing applications to hydrologic modeling
NASA Technical Reports Server (NTRS)
Dozier, J.; Estes, J. E.; Simonett, D. S.; Davis, R.; Frew, J.; Marks, D.; Schiffman, K.; Souza, M.; Witebsky, E.
1977-01-01
An energy balance snowmelt model for rugged terrain was devised and coupled to a flow model. A literature review of remote sensing applications to hydrologic modeling was included along with a software development outline.
Atmospheric, climatic and environmental research
NASA Technical Reports Server (NTRS)
Broecker, Wallace S.; Gornitz, Vivien M.
1992-01-01
Work performed on the three tasks during the report period is summarized. The climate and atmospheric modeling studies included work on climate model development and applications, paleoclimate studies, climate change applications, and SAGE II. Climate applications of Earth and planetary observations included studies on cloud climatology and planetary studies. Studies on the chemistry of the Earth and the environment are briefly described. Publications based on the above research are listed; two of these papers are included in the appendices.
Examination of various turbulence models for application in liquid rocket thrust chambers
NASA Technical Reports Server (NTRS)
Hung, R. J.
1991-01-01
There is a large variety of turbulence models available. These models include direct numerical simulation, large eddy simulation, the Reynolds stress/flux model, the zero-equation model, the one-equation model, the two-equation k-epsilon model, the multiple-scale model, etc. Each turbulence model contains different physical assumptions and requirements. The nature of turbulence is characterized by randomness, irregularity, diffusivity, and dissipation. The capabilities of the turbulence models, including physical strengths, weaknesses, and limitations, as well as numerical and computational considerations, are reviewed. Recommendations are made for the potential application of a turbulence model in thrust chamber and performance prediction programs. The full Reynolds stress model is recommended. At a workshop convened specifically to assess turbulence models for applications in liquid rocket thrust chambers, most of the experts present also favored the Reynolds stress model.
Computer-Aided Geometry Modeling
NASA Technical Reports Server (NTRS)
Shoosmith, J. N. (Compiler); Fulton, R. E. (Compiler)
1984-01-01
Techniques in computer-aided geometry modeling and their application are addressed. Mathematical modeling, solid geometry models, management of geometric data, development of geometry standards, and interactive and graphic procedures are discussed. The applications include aeronautical and aerospace structures design, fluid flow modeling, and gas turbine design.
The Sheperd equation and chaos identification.
Gregson, Robert A M
2010-04-01
An equation created by Sheperd (1982) to model stability in exploited fish populations has been found to have a wider application, and it exhibits complicated internal dynamics, including phases of strict periodicity and of chaos. It may be potentially applicable to other psychophysiological contexts. The problems of determining goodness-of-fit, and the comparative performance of alternative models including the Sheperd model, are briefly addressed.
NASA Astrophysics Data System (ADS)
Lezon, Timothy R.; Shrivastava, Indira H.; Yang, Zheng; Bahar, Ivet
The following sections are included: * Introduction * Theory and Assumptions * Statistical mechanical foundations * Anisotropic network models * Gaussian network model * Rigid block models * Treatment of perturbations * Langevin dynamics * Applications * Membrane proteins * Viruses * Conclusion * References
Leveraging OpenStudio's Application Programming Interfaces: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Long, N.; Ball, B.; Goldwasser, D.
2013-11-01
OpenStudio development efforts have been focused on providing Application Programming Interfaces (APIs) where users are able to extend OpenStudio without the need to compile the open source libraries. This paper will discuss the basic purposes and functionalities of the core libraries that have been wrapped with APIs, including the Building Model, Results Processing, Advanced Analysis, Uncertainty Quantification, and Data Interoperability through Translators. Several building energy modeling applications have been produced using OpenStudio's API and Software Development Kits (SDK), including the United States Department of Energy's Asset Score Calculator, a mobile-based audit tool, an energy design assistance reporting protocol, and a portfolio-scale incentive optimization analysis methodology. Each of these software applications will be discussed briefly, along with how the APIs were leveraged for various uses including high-level modeling, data transformations from detailed building audits, error checking/quality assurance of models, and use of high-performance computing for mass simulations.
Macro Level Simulation Model Of Space Shuttle Processing
NASA Technical Reports Server (NTRS)
2000-01-01
The contents include: 1) Space Shuttle Processing Simulation Model; 2) Knowledge Acquisition; 3) Simulation Input Analysis; 4) Model Applications in Current Shuttle Environment; and 5) Model Applications for Future Reusable Launch Vehicles (RLV's). This paper is presented in viewgraph form.
DOT National Transportation Integrated Search
1974-08-01
Volume 5 describes the DELTA Simulation Model. It includes all documentation of the DELTA (Determine Effective Levels of Task Automation) computer simulation developed by TRW for use in the Automation Applications Study. Volume 5A includes a user's m...
Web-based application for inverting one-dimensional magnetotelluric data using Python
NASA Astrophysics Data System (ADS)
Suryanto, Wiwit; Irnaka, Theodosius Marwan
2016-11-01
One-dimensional modeling of magnetotelluric (MT) data has been performed using an online application on a web-based virtual private server. The application was developed with the Python language using the Django framework with HTML and CSS components. The input data, including the apparent resistivity and phase as a function of period or frequency with standard deviation, can be entered through an interactive web page that can be freely accessed at https://komputasi.geofisika.ugm.ac.id. The subsurface models, represented by resistivity as a function of depth, are iteratively improved by changing the model parameters, such as the resistivity and the layer depth, based on the observed apparent resistivity and phase data. The output of the application displayed on the screen presents resistivity as a function of depth and includes the RMS error for each iteration. Synthetic and real data were used in comparative tests of the application's performance, and it is shown that the application developed accurate subsurface resistivity models. Hence, this application can be used for practical one-dimensional modeling of MT data.
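As a point of reference for what such a tool computes, the sketch below is a minimal, self-contained implementation of the standard 1D magnetotelluric forward problem (layered-earth impedance recursion), from which apparent resistivity and phase follow. It is an independent illustration of the textbook formulation, not code taken from the web application described above.

```python
# Minimal sketch of the standard 1D MT forward model (layered halfspace).
# Independent illustration; not taken from the web application described above.
import numpy as np

MU0 = 4e-7 * np.pi  # magnetic permeability of free space (H/m)

def mt1d_forward(resistivities, thicknesses, periods):
    """Apparent resistivity (ohm-m) and phase (deg) for a layered earth.

    resistivities: resistivity of each layer, top to bottom (bottom is a halfspace)
    thicknesses:   thickness of each layer except the bottom halfspace
    periods:       periods (s) at which to evaluate the response
    """
    rho_a, phase = [], []
    for T in periods:
        omega = 2.0 * np.pi / T
        # Impedance of the bottom halfspace.
        Z = np.sqrt(1j * omega * MU0 * resistivities[-1])
        # Recurse upward through the finite layers.
        for rho, h in zip(resistivities[-2::-1], thicknesses[::-1]):
            k = np.sqrt(1j * omega * MU0 / rho)   # propagation constant
            Zi = 1j * omega * MU0 / k             # intrinsic impedance of the layer
            t = np.tanh(k * h)
            Z = Zi * (Z + Zi * t) / (Zi + Z * t)
        rho_a.append(abs(Z) ** 2 / (omega * MU0))
        phase.append(np.degrees(np.angle(Z)))
    return np.array(rho_a), np.array(phase)

# Example: 100 ohm-m layer, 1 km thick, over a 10 ohm-m halfspace.
rho_a, ph = mt1d_forward([100.0, 10.0], [1000.0], periods=np.logspace(-2, 3, 20))
```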
NASA Technical Reports Server (NTRS)
Lee, H. P.
1977-01-01
The NASTRAN Thermal Analyzer Manual describes the fundamental and theoretical treatment of the finite element method, with emphasis on the derivations of the constituent matrices of different elements and solution algorithms. Necessary information and data relating to the practical applications of engineering modeling are included.
Agent-based modeling: Methods and techniques for simulating human systems
Bonabeau, Eric
2002-01-01
Agent-based modeling is a powerful simulation modeling technique that has seen a number of applications in the last few years, including applications to real-world business problems. After the basic principles of agent-based simulation are briefly introduced, its four areas of application are discussed by using real-world applications: flow simulation, organizational simulation, market simulation, and diffusion simulation. For each category, one or several business applications are described and analyzed. PMID:12011407
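To make the "diffusion simulation" category concrete, the following is a minimal, generic agent-based adoption model on a random contact network. It is a hedged illustration of the technique only; the network structure and parameters are assumptions, not a reproduction of any model from the article.

```python
# Minimal generic agent-based diffusion (adoption) model.
# Illustrative only; parameters and structure are assumptions, not from the article.
import random

def simulate_diffusion(n_agents=200, n_contacts=4, p_adopt=0.1,
                       n_steps=50, seed=42):
    rng = random.Random(seed)
    # Each agent gets a fixed random set of contacts.
    contacts = [rng.sample(range(n_agents), n_contacts) for _ in range(n_agents)]
    adopted = [False] * n_agents
    adopted[rng.randrange(n_agents)] = True  # seed the innovation with one agent

    history = []
    for _ in range(n_steps):
        new_state = adopted[:]
        for i in range(n_agents):
            if adopted[i]:
                continue
            # Each adopted contact independently persuades with probability p_adopt.
            exposures = sum(adopted[j] for j in contacts[i])
            if exposures and rng.random() < 1 - (1 - p_adopt) ** exposures:
                new_state[i] = True
        adopted = new_state
        history.append(sum(adopted))
    return history

print(simulate_diffusion()[-1], "adopters after 50 steps")
```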
Kotrmaneetaweetong, Unchana; Choopen, Hhakuan; Chowchuen, Bowornsilp
2012-11-01
The objectives of the present study were 1) to study the application of the sufficiency economy philosophy in community development as a model for future application in the community health care program of the Tawanchai Center, and 2) to study the administrative model for a self-sufficiency economy community in Bankhambong Community, Sa-ard Sub-district, Nampong District, Khon Kaen Province. The integrated study model included qualitative research, collecting data from documents, textbooks, articles, reports, theoretical concepts, previous research, and interviews with relevant persons, and quantitative research, collecting data from questionnaires. The findings included objectives for a sufficiency economy development model that people can understand, and use of the sufficiency economy model, which comprises expenditure-reducing activities, income-increasing activities, saving activities, learning activities, and activities for the preservation of the environment and sustainable natural resources. Expenditure-reducing activities included household gardening and avoiding allurements leading to ruin. Income-increasing activities included supplementary occupations and appropriate use of technology. Saving activities included creating saving groups at the household and community levels. Learning activities included community use of local wisdom and households learning the philosophy of sufficiency economy in daily living. Preservation of the environment and sustainable natural resources activities included the use of sustainable raw materials in occupations. Mutual generosity activities included helping one another and solving problems for poor and disabled persons. The community development in Bankhambong Community, Sa-ard Sub-district, Nampong District, Khon Kaen Province followed all of the above scope and guidelines and is a model for application of the sufficiency economy philosophy. We recommend methods for successful implementation, including starting from a group process with the capability of learning, so as to create strong and adequate knowledge to apply the sufficiency economy model and to cover health care.
Banta, Edward R.; Poeter, Eileen P.; Doherty, John E.; Hill, Mary C.
2006-01-01
The Joint Universal Parameter IdenTification and Evaluation of Reliability Application Programming Interface (JUPITER API) improves the computer programming resources available to those developing applications (computer programs) for model analysis. The JUPITER API consists of eleven Fortran-90 modules that provide for encapsulation of data and operations on that data. Each module contains one or more entities: data, data types, subroutines, functions, and generic interfaces. The modules do not constitute computer programs themselves; instead, they are used to construct computer programs. Such computer programs are called applications of the API. The API provides common modeling operations for use by a variety of computer applications. The models being analyzed are referred to here as process models, and may, for example, represent the physics, chemistry, and(or) biology of a field or laboratory system. Process models commonly are constructed using published models such as MODFLOW (Harbaugh et al., 2000; Harbaugh, 2005), MT3DMS (Zheng and Wang, 1996), HSPF (Bicknell et al., 1997), PRMS (Leavesley and Stannard, 1995), and many others. The process model may be accessed by a JUPITER API application as an external program, or it may be implemented as a subroutine within a JUPITER API application. In either case, execution of the model takes place in a framework designed by the application programmer. This framework can be designed to take advantage of any parallel processing capabilities possessed by the process model, as well as the parallel-processing capabilities of the JUPITER API. Model analyses for which the JUPITER API could be useful include, for example: comparing model results to observed values to determine how well the model reproduces system processes and characteristics; using sensitivity analysis to determine the information provided by observations about parameters and predictions of interest; determining the additional data needed to improve selected model predictions; using calibration methods to modify parameter values and other aspects of the model; comparing predictions to regulatory limits; quantifying the uncertainty of predictions based on the results of one or many simulations using inferential or Monte Carlo methods; and determining how to manage the system to achieve stated objectives. The capabilities provided by the JUPITER API include, for example, communication with process models, parallel computations, compressed storage of matrices, and flexible input capabilities. The input capabilities use input blocks suitable for lists or arrays of data. The input blocks needed for one application can be included within one data file or distributed among many files. Data exchange between different JUPITER API applications or between applications and other programs is supported by data-exchange files. The JUPITER API has already been used to construct a number of applications. Three simple example applications are presented in this report. More complicated applications include the universal inverse code UCODE_2005 (Poeter et al., 2005), the multi-model analysis MMA (Eileen P. Poeter, Mary C. Hill, E.R. Banta, S.W. Mehl, and Steen Christensen, written commun., 2006), and a code named OPR_PPR (Matthew J. Tonkin, Claire R. Tiedeman, Mary C. Hill, and D. Matthew Ely, written commun., 2006). This report describes a set of underlying organizational concepts and complete specifics about the JUPITER API.
While understanding the organizational concept presented is useful to understanding the modules, other organizational concepts can be used in applications constructed using the JUPITER API.
Challenges in Soft Computing: Case Study with Louisville MSD CSO Modeling
NASA Astrophysics Data System (ADS)
Ormsbee, L.; Tufail, M.
2005-12-01
The principal constituents of soft computing include fuzzy logic, neural computing, evolutionary computation, machine learning, and probabilistic reasoning. There are numerous applications of these constituents (both individually and in combinations of two or more) in the area of water resources and environmental systems. These range from the development of data-driven models to optimal control strategies that assist in a more informed and intelligent decision-making process. Availability of data is critical to such applications, and scarce data may lead to models that do not represent the response function over the entire domain. At the same time, too much data has a tendency to lead to over-constraining of the problem. This paper will describe the application of a subset of these soft computing techniques (neural computing and genetic algorithms) to the Beargrass Creek watershed in Louisville, Kentucky. The applications include the development of inductive models as substitutes for more complex process-based models to predict the water quality of key constituents (such as dissolved oxygen) and their use in an optimization framework for optimal load reductions. Such a process will facilitate the development of total maximum daily loads for the impaired water bodies in the watershed. Some of the challenges faced in this application include 1) uncertainty in data sets, 2) model application, and 3) development of cause-and-effect relationships between water quality constituents and watershed parameters through use of inductive models. The paper will discuss these challenges and how they affect the desired goals of the project.
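As a concrete, hedged illustration of the inductive-surrogate-plus-evolutionary-optimization pattern described above (and not the actual Beargrass Creek models or data), the sketch below trains a small scikit-learn neural network on synthetic load/dissolved-oxygen data and then searches for load reductions with SciPy's differential evolution, standing in for a genetic algorithm.

```python
# Hedged sketch of an inductive surrogate model used inside an optimizer.
# Data, variable names, and the objective are synthetic stand-ins, not project data.
import numpy as np
from sklearn.neural_network import MLPRegressor
from scipy.optimize import differential_evolution

rng = np.random.default_rng(0)

# Synthetic training data: two pollutant "loads" -> dissolved oxygen (mg/L).
loads = rng.uniform(0.0, 10.0, size=(500, 2))
do_obs = 9.0 - 0.4 * loads[:, 0] - 0.25 * loads[:, 1] + rng.normal(0, 0.2, 500)

surrogate = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000,
                         random_state=0).fit(loads, do_obs)

def objective(x):
    """Cost = total fractional load reduction, penalized if predicted DO < 5 mg/L."""
    current = np.array([8.0, 6.0])            # assumed current loads
    reduced = current * (1.0 - x)             # x = fractional reductions per source
    do_pred = surrogate.predict(reduced.reshape(1, -1))[0]
    penalty = 1e3 * max(0.0, 5.0 - do_pred)   # assumed DO standard of 5 mg/L
    return x.sum() + penalty

result = differential_evolution(objective, bounds=[(0, 1), (0, 1)], seed=1)
print("fractional load reductions:", result.x)
```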
NASA Technical Reports Server (NTRS)
Cashion, Kenneth D.; Whitehurst, Charles A.
1987-01-01
The activities of the Earth Resources Laboratory (ERL) for the past seventeen years are reviewed with particular reference to four typical applications demonstrating the use of remotely sensed data in a geobased information system context. The applications discussed are: a fire control model for the Olympic National Park; wildlife habitat modeling; a resource inventory system including a potential soil erosion model; and a corridor analysis model for locating routes between geographical locations. Some future applications are also discussed.
Space shuttle propulsion estimation development verification
NASA Technical Reports Server (NTRS)
Rogers, Robert M.
1989-01-01
The application of extended Kalman filtering to estimating the Space Shuttle propulsion performance, i.e., specific impulse, from flight data in a post-flight processing computer program is detailed. The flight data used include inertial platform acceleration, SRB head pressure, SSME chamber pressure and flow rates, and ground-based radar tracking data. The key feature in this application is the model used for the SRBs, which is a nominal or reference quasi-static internal ballistics model normalized to the propellant burn depth. Dynamic states of mass overboard and propellant burn depth are included in the filter model to account for real-time deviations from the reference model used. Aerodynamic, plume, wind, and main engine uncertainties are also included for an integrated system model. Assuming uncertainty within the propulsion system model and attempting to estimate its deviations represents a new application of parameter estimation for rocket-powered vehicles. Illustrations from the results of applying this estimation approach to several missions show good-quality propulsion estimates.
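For reference, the textbook form of the extended Kalman filter predict and update steps that this kind of post-flight estimator builds on is given below in generic notation; these are not the report's specific state definitions, and in this kind of application the parameters to be estimated (such as specific impulse) are typically appended to the state vector.

```latex
% Generic extended Kalman filter recursion (textbook form)
\begin{aligned}
\hat{x}_{k|k-1} &= f\!\left(\hat{x}_{k-1|k-1},\, u_k\right), \qquad
P_{k|k-1} = F_k P_{k-1|k-1} F_k^{\top} + Q_k, \\
K_k &= P_{k|k-1} H_k^{\top} \left(H_k P_{k|k-1} H_k^{\top} + R_k\right)^{-1}, \\
\hat{x}_{k|k} &= \hat{x}_{k|k-1} + K_k \left(z_k - h\!\left(\hat{x}_{k|k-1}\right)\right), \qquad
P_{k|k} = \left(I - K_k H_k\right) P_{k|k-1},
\end{aligned}
```

where F_k and H_k are the Jacobians of the state transition f and the measurement function h, and Q_k and R_k are the process and measurement noise covariances.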
Polese, Pierluigi; Torre, Manuela Del; Stecchini, Mara Lucia
2018-03-31
The use of predictive modelling tools, which mainly describe the response of microorganisms to a particular set of environmental conditions, may contribute to a better understanding of microbial behaviour in foods. In this paper, a tertiary model, in the form of a readily available and user-friendly web-based application, Praedicere Possumus (PP), is presented with research examples from our laboratories. Through the PP application, users have access to different modules, which apply a set of published models considered reliable for determining the compliance of a food product with EU safety criteria and for optimising processing through the identification of critical control points. The application pivots around a growth/no-growth boundary model, coupled with a growth model, and includes thermal and non-thermal inactivation models. Integrated functionalities, such as the fractional contribution of each inhibitory factor to growth probability (f) and the time evolution of the growth probability (P_t), have also been included. The PP application is expected to assist the food industry and food safety authorities in their common commitment towards the improvement of food safety.
Population balance modeling: current status and future prospects.
Ramkrishna, Doraiswami; Singh, Meenesh R
2014-01-01
Population balance modeling is undergoing phenomenal growth in its applications, and this growth is accompanied by multifarious reviews. This review aims to fortify the model's fundamental base, as well as point to a variety of new applications, including modeling of crystal morphology, cell growth and differentiation, gene regulatory processes, and transfer of drug resistance. This is accomplished by presenting the many faces of population balance equations that arise in the foregoing applications.
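For readers unfamiliar with the framework, a generic one-dimensional population balance equation is shown below as a standard textbook form, not a formula quoted from the review.

```latex
% Generic one-dimensional population balance equation (textbook form)
\frac{\partial n(x,t)}{\partial t}
  + \frac{\partial}{\partial x}\left[ G(x,t)\, n(x,t) \right]
  = B(x,t) - D(x,t)
```

Here n(x,t) is the number density of entities with internal coordinate x (for example crystal size or cell mass), G is the growth rate along x, and B and D are birth and death terms arising from processes such as nucleation, aggregation, and breakage.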
Simulation Modelling in Healthcare: An Umbrella Review of Systematic Literature Reviews.
Salleh, Syed; Thokala, Praveen; Brennan, Alan; Hughes, Ruby; Booth, Andrew
2017-09-01
Numerous studies examine simulation modelling in healthcare. These studies present a bewildering array of simulation techniques and applications, making it challenging to characterise the literature. The aim of this paper is to provide an overview of the level of activity of simulation modelling in healthcare and the key themes. We performed an umbrella review of systematic literature reviews of simulation modelling in healthcare. Searches were conducted of academic databases (JSTOR, Scopus, PubMed, IEEE, SAGE, ACM, Wiley Online Library, ScienceDirect) and grey literature sources, enhanced by citation searches. The articles were included if they performed a systematic review of simulation modelling techniques in healthcare. After quality assessment of all included articles, data were extracted on numbers of studies included in each review, types of applications, techniques used for simulation modelling, data sources and simulation software. The search strategy yielded a total of 117 potential articles. Following sifting, 37 heterogeneous reviews were included. Most reviews achieved moderate quality rating on a modified AMSTAR (A Measurement Tool used to Assess systematic Reviews) checklist. All the review articles described the types of applications used for simulation modelling; 15 reviews described techniques used for simulation modelling; three reviews described data sources used for simulation modelling; and six reviews described software used for simulation modelling. The remaining reviews either did not report or did not provide enough detail for the data to be extracted. Simulation modelling techniques have been used for a wide range of applications in healthcare, with a variety of software tools and data sources. The number of reviews published in recent years suggest an increased interest in simulation modelling in healthcare.
Solar-terrestrial models and application software
NASA Technical Reports Server (NTRS)
Bilitza, Dieter
1990-01-01
The empirical models related to solar-terrestrial sciences that are available in the form of computer programs are listed and described. Also included are programs that use one or more of these models for application-specific purposes. The entries are grouped according to the region of the solar-terrestrial environment to which they belong and according to the parameter which they describe. Regions considered include the ionosphere, atmosphere, magnetosphere, planets, interplanetary space, and heliosphere. Also provided is information on the accessibility of the solar-terrestrial models and on specifying the magnetic and solar activity conditions.
Land Surface Process and Air Quality Research and Applications at MSFC
NASA Technical Reports Server (NTRS)
Quattrochi, Dale; Khan, Maudood
2007-01-01
This viewgraph presentation provides an overview of land surface process and air quality research at MSFC, including atmospheric modeling, ongoing research whose objective is to undertake a comprehensive spatiotemporal analysis of the effects of accurate land surface characterization on atmospheric modeling results, and public health applications. Land use maps, plots of 10-meter air temperature, surface wind, PBL mean difference heights, NOx, ozone, and O3+NO2, and spatial growth model outputs are included. Emissions and general air quality modeling are also discussed.
Muffly, Tyler M; Barber, Matthew D; Karafa, Matthew T; Kattan, Michael W; Shniter, Abigail; Jelovsek, J Eric
2012-01-01
The purpose of the study was to develop a model that predicts an individual applicant's probability of successful placement into a surgical subspecialty fellowship program. Candidates who applied to surgical fellowships during a 3-year period were identified in a set of databases that included the electronic application materials. Of the 1281 applicants who were available for analysis, 951 applicants (74%) successfully placed into a colon and rectal surgery, thoracic surgery, vascular surgery, or pediatric surgery fellowship. The optimal final prediction model, which was based on a logistic regression, included 14 variables. This model, with a c statistic of 0.74, allowed for the determination of a useful estimate of the probability of placement for an individual candidate. Of the factors that are available at the time of fellowship application, 14 were used to predict accurately the proportion of applicants who will successfully gain a fellowship position. Copyright © 2012 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
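As a generic illustration of the modelling approach (not the authors' 14-variable model or data), the snippet below fits a logistic regression for placement on synthetic data and reports the c statistic, which for a binary outcome equals the area under the ROC curve.

```python
# Generic sketch: logistic regression placement model and its c statistic (ROC AUC).
# Synthetic data; the real model used 14 application variables not reproduced here.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1281                                   # number of applicants in the study
X = rng.normal(size=(n, 14))               # stand-in for 14 application predictors
logit = 0.9 + X[:, 0] + 0.5 * X[:, 1]      # arbitrary assumed true relationship
placed = rng.random(n) < 1 / (1 + np.exp(-logit))

X_tr, X_te, y_tr, y_te = train_test_split(X, placed, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

prob = model.predict_proba(X_te)[:, 1]     # individual probability of placement
print("c statistic (AUC):", round(roc_auc_score(y_te, prob), 2))
```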
NASA Astrophysics Data System (ADS)
Nijssen, B.; Hamman, J.; Bohn, T. J.
2015-12-01
The Variable Infiltration Capacity (VIC) model is a macro-scale semi-distributed hydrologic model. VIC development began in the early 1990s and it has been used extensively, applied from basin to global scales. VIC has been applied in many use cases, including the construction of hydrologic data sets, trend analysis, data evaluation and assimilation, forecasting, coupled climate modeling, and climate change impact analysis. Ongoing applications of the VIC model include the University of Washington's drought monitor and forecast systems, and NASA's land data assimilation systems. The development of VIC version 5.0 focused on reconfiguring the legacy VIC source code to support a wider range of modern modeling applications. The VIC source code has been moved to a public GitHub repository to encourage participation by the model development community-at-large. The reconfiguration has separated the physical core of the model from the driver, which is responsible for memory allocation, pre- and post-processing, and I/O. VIC 5.0 includes four drivers that use the same physical model core: classic, image, CESM, and Python. The classic driver supports legacy VIC configurations and runs in the traditional time-before-space configuration. The image driver includes a space-before-time configuration, netCDF I/O, and uses MPI for parallel processing. This configuration facilitates the direct coupling of streamflow routing, reservoir, and irrigation processes within VIC. The image driver is the foundation of the CESM driver, which couples VIC to CESM's CPL7 and a prognostic atmosphere. Finally, we have added a Python driver that provides access to the functions and datatypes of VIC's physical core from a Python interface. This presentation demonstrates how reconfiguring legacy source code extends the life and applicability of a research model.
An Open Simulation System Model for Scientific Applications
NASA Technical Reports Server (NTRS)
Williams, Anthony D.
1995-01-01
A model for a generic and open environment for running multi-code or multi-application simulations, called the Open Simulation System Model (OSSM), is proposed and defined. This model attempts to meet the requirements of complex systems like the Numerical Propulsion System Simulation (NPSS). OSSM places no restrictions on the types of applications that can be integrated at any stage of its evolution. This includes applications of different disciplines, fidelities, etc. An implementation strategy is proposed that starts with a basic prototype and evolves over time to accommodate an increasing number of applications. Potential (standard) software is also identified which may aid in the design and implementation of the system.
Evolving PBPK applications in regulatory risk assessment: current situation and future goals
The presentation includes current applications of PBPK modeling in regulatory risk assessment and discusses the conflict between assuring consistency with experimental data in the current situation and the desire for animal-free model development.
Applications manual for logit models of express bus-fringe parking choices.
DOT National Transportation Integrated Search
1976-01-01
Manual computations and computerized applications of logit models are described. The models demonstrated reflect travel behavior concerning express bus-fringe parking transit. The specific travel issues addressed include the basic automobile vs. expr...
Urban development applications project. Urban technology transfer study
NASA Technical Reports Server (NTRS)
1975-01-01
Technology transfer is defined along with reasons for attempting to transfer technology. Topics discussed include theoretical models, stages of the innovation model, communication process model, behavior of industrial organizations, problem identification, technology search and match, establishment of a market mechanism, applications engineering, commercialization, and management of technology transfer.
76 FR 53137 - Bundled Payments for Care Improvement Initiative: Request for Applications
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-25
... (RFA) will test episode-based payment for acute care and associated post-acute care, using both retrospective and prospective bundled payment methods. The RFA requests applications to test models centered around acute care; these models will inform the design of future models, including care improvement for...
gPKPDSim: a SimBiology®-based GUI application for PKPD modeling in drug development.
Hosseini, Iraj; Gajjala, Anita; Bumbaca Yadav, Daniela; Sukumaran, Siddharth; Ramanujan, Saroja; Paxson, Ricardo; Gadkar, Kapil
2018-04-01
Modeling and simulation (M&S) is increasingly used in drug development to characterize pharmacokinetic-pharmacodynamic (PKPD) relationships and support various efforts such as target feasibility assessment, molecule selection, human PK projection, and preclinical and clinical dose and schedule determination. While model development typically requires mathematical modeling expertise, model exploration and simulations could in many cases be performed by scientists in various disciplines to support the design, analysis and interpretation of experimental studies. To this end, we have developed a versatile graphical user interface (GUI) application to enable easy use of any model constructed in SimBiology® to execute various common PKPD analyses. The MATLAB®-based GUI application, called gPKPDSim, has a single screen interface and provides functionalities including simulation, data fitting (parameter estimation), population simulation (exploring the impact of parameter variability on the outputs of interest), and non-compartmental PK analysis. Further, gPKPDSim is a user-friendly tool with capabilities including interactive visualization, exporting of results and generation of presentation-ready figures. gPKPDSim was designed primarily for use in preclinical and translational drug development, although broader applications exist. gPKPDSim is a MATLAB®-based open-source application and is publicly available to download from MATLAB® Central™. We illustrate the use and features of gPKPDSim using multiple PKPD models to demonstrate the wide applications of this tool in pharmaceutical sciences. Overall, gPKPDSim provides an integrated, multi-purpose, user-friendly GUI application to enable efficient use of PKPD models by scientists from various disciplines, regardless of their modeling expertise.
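As a hedged, minimal example of the kind of PK structure such tools simulate (not gPKPDSim's own code, which is MATLAB/SimBiology based), here is a one-compartment pharmacokinetic model with first-order absorption integrated in Python; the dose and rate constants are assumed values for illustration.

```python
# Minimal one-compartment PK model with first-order absorption.
# Generic illustration only; gPKPDSim itself is a MATLAB/SimBiology GUI application.
import numpy as np
from scipy.integrate import odeint

def pk_rhs(y, t, ka, ke):
    """y = [amount in gut, amount in central compartment]."""
    a_gut, a_central = y
    return [-ka * a_gut, ka * a_gut - ke * a_central]

dose, ka, ke, vd = 100.0, 1.0, 0.2, 10.0      # assumed dose (mg) and rate constants
t = np.linspace(0, 24, 97)                     # hours
amounts = odeint(pk_rhs, [dose, 0.0], t, args=(ka, ke))
conc = amounts[:, 1] / vd                      # plasma concentration (mg/L)
print("Cmax ~", round(conc.max(), 2), "mg/L at t =", round(t[conc.argmax()], 1), "h")
```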
NASA Astrophysics Data System (ADS)
Gosselin, Marie-Christine; Neufeld, Esra; Moser, Heidi; Huber, Eveline; Farcito, Silvia; Gerber, Livia; Jedensjö, Maria; Hilber, Isabel; Di Gennaro, Fabienne; Lloyd, Bryn; Cherubini, Emilio; Szczerba, Dominik; Kainz, Wolfgang; Kuster, Niels
2014-09-01
The Virtual Family computational whole-body anatomical human models were originally developed for electromagnetic (EM) exposure evaluations, in particular to study how absorption of radiofrequency radiation from external sources depends on anatomy. However, the models immediately garnered much broader interest and are now applied by over 300 research groups, many from medical applications research fields. In a first step, the Virtual Family was expanded to the Virtual Population to provide considerably broader population coverage with the inclusion of models of both sexes ranging in age from 5 to 84 years old. Although these models have proven to be invaluable for EM dosimetry, it became evident that significantly enhanced models are needed for reliable effectiveness and safety evaluations of diagnostic and therapeutic applications, including medical implants safety. This paper describes the research and development performed to obtain anatomical models that meet the requirements necessary for medical implant safety assessment applications. These include implementation of quality control procedures, re-segmentation at higher resolution, more-consistent tissue assignments, enhanced surface processing and numerous anatomical refinements. Several tools were developed to enhance the functionality of the models, including discretization tools, posing tools to expand the posture space covered, and multiple morphing tools, e.g., to develop pathological models or variations of existing ones. A comprehensive tissue properties database was compiled to complement the library of models. The results are a set of anatomically independent, accurate, and detailed models with smooth, yet feature-rich and topologically conforming surfaces. The models are therefore suited for the creation of unstructured meshes, and the possible applications of the models are extended to a wider range of solvers and physics. The impact of these improvements is shown for the MRI exposure of an adult woman with an orthopedic spinal implant. Future developments include the functionalization of the models for specific physical and physiological modeling tasks.
Use of model calibration to achieve high accuracy in analysis of computer networks
Frogner, Bjorn; Guarro, Sergio; Scharf, Guy
2004-05-11
A system and method are provided for creating a network performance prediction model, and calibrating the prediction model, through application of network load statistical analyses. The method includes characterizing the measured load on the network, which may include background load data obtained over time, and may further include directed load data representative of a transaction-level event. Probabilistic representations of load data are derived to characterize the statistical persistence of the network performance variability and to determine delays throughout the network. The probabilistic representations are applied to the network performance prediction model to adapt the model for accurate prediction of network performance. Certain embodiments of the method and system may be used for analysis of the performance of a distributed application characterized as data packet streams.
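A minimal sketch of the general idea, probabilistic characterization of measured load feeding a delay prediction, is shown below. The distributions and the queueing-style delay relation are illustrative assumptions and do not represent the patented method.

```python
# Sketch: derive an empirical load distribution from measurements and
# propagate it through a simple delay model by Monte Carlo sampling.
# The delay relation and all numbers are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for measured background utilization samples.
background_load = rng.gamma(shape=4.0, scale=0.15, size=10_000)
capacity = 1.0

def predicted_delay(load, service_time=0.01):
    """Very simple queueing-style delay estimate for utilization < 1."""
    util = np.clip(load / capacity, 0, 0.99)
    return service_time / (1.0 - util)

samples = predicted_delay(rng.choice(background_load, size=5_000))
print("median delay:", np.median(samples),
      "95th percentile:", np.percentile(samples, 95))
```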
Videogrammetric Model Deformation Measurement Technique
NASA Technical Reports Server (NTRS)
Burner, A. W.; Liu, Tian-Shu
2001-01-01
The theory, methods, and applications of the videogrammetric model deformation (VMD) measurement technique used at NASA for wind tunnel testing are presented. The VMD technique, based on non-topographic photogrammetry, can determine static and dynamic aeroelastic deformation and attitude of a wind-tunnel model. Hardware of the system includes a video-rate CCD camera, a computer with an image acquisition frame grabber board, illumination lights, and retroreflective or painted targets on a wind tunnel model. Custom software includes routines for image acquisition, target-tracking/identification, target centroid calculation, camera calibration, and deformation calculations. Applications of the VMD technique at five large NASA wind tunnels are discussed.
Puget Sound Applications of the VELMA Ecohydrological Model
This seminar will present an overview of EPA’s Visualizing Ecosystem Land Management Assessments (VELMA) model and its applications in the Puget Sound Basin. Topics will include a description of how VELMA simulates the interaction of hydrological and biogeochemical processe...
New V and V Tools for Diagnostic Modeling Environment (DME)
NASA Technical Reports Server (NTRS)
Pecheur, Charles; Nelson, Stacy; Merriam, Marshall (Technical Monitor)
2002-01-01
The purpose of this report is to provide correctness and reliability criteria for verification and validation (V&V) of Second Generation Reusable Launch Vehicle (RLV) Diagnostic Modeling Environment, describe current NASA Ames Research Center tools for V&V of Model Based Reasoning systems, and discuss the applicability of Advanced V&V to DME. This report is divided into the following three sections: (1) correctness and reliability criteria; (2) tools for V&V of Model Based Reasoning; and (3) advanced V&V applicable to DME. The Executive Summary includes an overview of the main points from each section. Supporting details, diagrams, figures, and other information are included in subsequent sections. A glossary, acronym list, appendices, and references are included at the end of this report.
2014-09-30
from individuals to the population by way of changes in either behavior or physiology, and the revised approach is called PCOD (Population...include modeling fecundity, and exploring the feasibility of incorporating acoustic disturbance and prey variability into the PCOD model...the applicability of the model to assessing the effects of acoustics on the population. We have refined and applied the PCOD model developed for
Application of a Phase-resolving, Directional Nonlinear Spectral Wave Model
NASA Astrophysics Data System (ADS)
Davis, J. R.; Sheremet, A.; Tian, M.; Hanson, J. L.
2014-12-01
We describe several applications of a phase-resolving, directional nonlinear spectral wave model. The model describes a 2D surface gravity wave field approaching a mildly sloping beach with parallel depth contours at an arbitrary angle, accounting for nonlinear, quadratic triad interactions. The model is hyperbolic, with the initial wave spectrum specified in deep water. Complex amplitudes are generated based on the random phase approximation. The numerical implementation includes unidirectional propagation as a special case. In directional mode, it solves the system of equations in the frequency-alongshore wave number space. Recent enhancements of the model include the incorporation of dissipation caused by breaking and by propagation over a viscous mud layer, and the calculation of wave-induced setup. Applications presented include: a JONSWAP spectrum with a cos^{2s} directional distribution, for shore-perpendicular and oblique propagation; a study of the evolution of a single directional triad; and several preliminary comparisons to wave spectra collected at the USACE-FRF in Duck, NC, which show encouraging results, although further validation with a wider range of beach slopes and wave conditions is needed.
The Power Prior: Theory and Applications
Ibrahim, Joseph G.; Chen, Ming-Hui; Gwon, Yeongjin; Chen, Fang
2015-01-01
The power prior has been widely used in many applications covering a large number of disciplines. The power prior is intended to be an informative prior constructed from historical data. It has been used in clinical trials, genetics, health care, psychology, environmental health, engineering, economics, and business. It has also been applied for a wide variety of models and settings, both in the experimental design and analysis contexts. In this review article, we give an A to Z exposition of the power prior and its applications to date. We review its theoretical properties, variations in its formulation, statistical contexts for which it has been used, applications, and its advantages over other informative priors. We review models for which it has been used, including generalized linear models, survival models, and random effects models. Statistical areas where the power prior has been used include model selection, experimental design, hierarchical modeling, and conjugate priors. Frequentist properties of power priors in posterior inference are established and a simulation study is conducted to further examine the empirical performance of the posterior estimates with power priors. Real data analyses are given illustrating the power prior as well as the use of the power prior in the Bayesian design of clinical trials. PMID:26346180
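For reference, the basic form of the power prior, given historical data D0, a discounting parameter a0, and an initial prior π0, is shown below (the standard fixed-a0 formulation, not a formula quoted from the article).

```latex
% Power prior with fixed discounting parameter a0
\pi(\theta \mid D_0, a_0) \;\propto\; L(\theta \mid D_0)^{a_0}\, \pi_0(\theta),
\qquad 0 \le a_0 \le 1
```

Setting a0 = 0 discards the historical data entirely, while a0 = 1 pools it fully with the current likelihood; intermediate values discount the historical information.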
40 CFR 86.000-9 - Emission standards for 2000 and later model year light-duty trucks.
Code of Federal Regulations, 2013 CFR
2013-07-01
....000-9 Emission standards for 2000 and later model year light-duty trucks. Section 86.000-9 includes...) and CO Model year Percentage 2002 40 2003 80 2004 100 Table A00-6—Useful Life Standards (G/MI) for... applicable model year's heavy light-duty trucks shall not exceed the applicable SFTP standards in table A00-6...
40 CFR 86.000-9 - Emission standards for 2000 and later model year light-duty trucks.
Code of Federal Regulations, 2012 CFR
2012-07-01
....000-9 Emission standards for 2000 and later model year light-duty trucks. Section 86.000-9 includes...) and CO Model year Percentage 2002 40 2003 80 2004 100 Table A00-6—Useful Life Standards (G/MI) for... applicable model year's heavy light-duty trucks shall not exceed the applicable SFTP standards in table A00-6...
Testing woody fuel consumption models for application in Australian southern eucalypt forest fires
J.J. Hollis; S. Matthews; Roger Ottmar; S.J. Prichard; S. Slijepcevic; N.D. Burrows; B. Ward; K.G. Tolhurst; W.R. Anderson; J S. Gould
2010-01-01
Five models for the consumption of coarse woody debris or woody fuels with a diameter larger than 0.6 cm were assessed for application in Australian southern eucalypt forest fires including: CONSUME models for (1) activity fuels, (2) natural western woody and (3) natural southern woody fuels, (4) the BURNUP model and (5) the recommendation by the Australian National...
ERIC Educational Resources Information Center
Dumais, Susan T.
2004-01-01
Presents a literature review that covers the following topics related to Latent Semantic Analysis (LSA): (1) LSA overview; (2) applications of LSA, including information retrieval (IR), information filtering, cross-language retrieval, and other IR-related LSA applications; (3) modeling human memory, including the relationship of LSA to other…
Addressing HIV in the School Setting: Application of a School Change Model
ERIC Educational Resources Information Center
Walsh, Audra St. John; Chenneville, Tiffany
2013-01-01
This paper describes best practices for responding to youth with human immunodeficiency virus (HIV) in the school setting through the application of a school change model designed by the World Health Organization. This model applies a whole school approach and includes four levels that span the continuum from universal prevention to direct…
Skill Acquisition in Ski Instruction and the Skill Model's Application to Treating Anorexia Nervosa
ERIC Educational Resources Information Center
Duesund, Liv; Jespersen, Ejgil
2004-01-01
The Dreyfus skill model has a wide range of applications to various domains, including sport, nursing, engineering, flying, and so forth. In this article, the authors discuss the skill model in connection with two different research projects concerning ski instruction and treating anorexia nervosa. The latter project has been published but not in…
CFD Code Development for Combustor Flows
NASA Technical Reports Server (NTRS)
Norris, Andrew
2003-01-01
During the lifetime of this grant, work has been performed in the areas of model development, code development, code validation, and code application. Model development has included the PDF combustion module, chemical kinetics based on thermodynamics, neural network storage of chemical kinetics, ILDM chemical kinetics, and assumed-PDF work. Many of these models were then implemented in the code, and in addition many improvements were made to the code, including the addition of new chemistry integrators, property evaluation schemes, new chemistry models, and turbulence-chemistry interaction methodology. Validation of all new models and code improvements was also performed, and the code was applied to the ZCET program and to the NPSS GEW combustor program. Several important items remain under development, including the NOx post-processing, assumed-PDF model development, and chemical kinetic development. It is expected that this work will continue under the new grant.
Preliminary Model of Porphyry Copper Deposits
Berger, Byron R.; Ayuso, Robert A.; Wynn, Jeffrey C.; Seal, Robert R.
2008-01-01
The U.S. Geological Survey (USGS) Mineral Resources Program develops mineral-deposit models for application in USGS mineral-resource assessments and other mineral resource-related activities within the USGS as well as for nongovernmental applications. Periodic updates of models are published in order to incorporate new concepts and findings on the occurrence, nature, and origin of specific mineral deposit types. This update is a preliminary model of porphyry copper deposits that begins an update process of porphyry copper models published in USGS Bulletin 1693 in 1986. This update includes a greater variety of deposit attributes than were included in the 1986 model as well as more information about each attribute. It also includes an expanded discussion of geophysical and remote sensing attributes and tools useful in resource evaluations, a summary of current theoretical concepts of porphyry copper deposit genesis, and a summary of the environmental attributes of unmined and mined deposits.
Applications of bioenergetics models to fish ecology and management: where do we go from here?
Hansen, Michael J.; Boisclair, Daniel; Brandt, Stephen B.; Hewett, Steven W.; Kitchell, James F.; Lucas, Martyn C.; Ney, John J.
1993-01-01
Papers and panel discussions given during a 1992 symposium on bioenergetics models are summarized. Bioenergetics models have been applied to a variety of research and management questions related to fish stocks, populations, food webs, and ecosystems. Applications include estimates of the intensity and dynamics of predator-prey interactions, nutrient cycling within aquatic food webs of varying trophic structure, and food requirements of single animals, whole populations, and communities of fishes. As tools in food web and ecosystem applications, bioenergetics models have been used to compare forage consumption by salmonid predators across the Laurentian Great Lakes for single populations and whole communities, and to estimate the growth potential of pelagic predators in Chesapeake Bay and Lake Ontario. Some critics say that bioenergetics models lack sufficient detail to produce reliable results in such field applications, whereas others say that the models are too complex to be useful tools for fishery managers. Nevertheless, bioenergetics models have achieved notable predictive successes. Improved estimates are needed for model parameters such as metabolic costs of activity, and more complete studies are needed of the bioenergetics of larval and juvenile fishes. Future research on bioenergetics should include laboratory and field measurements of key model parameters such as weight-dependent maximum consumption, respiration and activity, and thermal habitats actually occupied by fish. Future applications of bioenergetics models to fish populations also depend on accurate estimates of population sizes and survival rates.
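The core accounting behind these models is a mass balance of energy; in the commonly used "Wisconsin" formulation it can be written in the generic form below (a textbook summary, not a specific parameterization from the symposium).

```latex
% Generic fish bioenergetics energy budget (Wisconsin-model form)
G = C - \left( R + F + U \right)
```

Here G is growth (including reproduction), C is consumption, R is respiration (standard metabolism, activity, and specific dynamic action), F is egestion, and U is excretion, all expressed in comparable energy units per unit time.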
CAD-model-based vision for space applications
NASA Technical Reports Server (NTRS)
Shapiro, Linda G.
1988-01-01
A pose acquisition system operating in space must be able to perform well in a variety of applications, including automated guidance and inspection tasks with many different, but known, objects. Since the space station is being designed with automation in mind, there will be CAD models of all the objects, including the station itself. The construction of vision models and procedures directly from the CAD models is the goal of this project. The system being designed and implemented must convert CAD models to vision models, predict visible features from a given viewpoint using the vision models, construct view classes representing views of the objects, and use the view class model thus derived to rapidly determine the pose of an object from single images and/or stereo pairs.
NASA Technical Reports Server (NTRS)
Bradshaw, James F.; Sandefur, Paul G., Jr.; Young, Clarence P., Jr.
1991-01-01
A comprehensive study of the braze alloy selection process and strength characterization with application to wind tunnel models is presented. The applications for this study include the installation of stainless steel pressure tubing in model airfoil sections made of 18 Ni 200 grade maraging steel and the joining of wing structural components by brazing. Acceptable braze alloys for these applications are identified along with process, thermal braze cycle data, and thermal management procedures. Shear specimens are used to evaluate comparative shear strength properties for the various alloys at both room and cryogenic (-300 F) temperatures and include the effects of electroless nickel plating. Nickel plating was found to significantly enhance both the wettability and strength properties of the various braze alloys studied. The data are provided for use in selecting braze alloys for use with 18 Ni grade 200 steel in the design of wind tunnel models to be tested in ambient or cryogenic environments.
Assessment of municipal solid waste settlement models based on field-scale data analysis.
Bareither, Christopher A; Kwak, Seungbok
2015-08-01
An evaluation of municipal solid waste (MSW) settlement model performance and applicability was conducted based on analysis of two field-scale datasets: (1) Yolo and (2) Deer Track Bioreactor Experiment (DTBE). Twelve MSW settlement models were considered that included a range of compression behavior (i.e., immediate compression, mechanical creep, and biocompression) and a range of total (2-22) and optimized (2-7) model parameters. A multi-layer immediate settlement analysis developed for Yolo provides a framework to estimate initial waste thickness and waste thickness at the end of immediate compression. Model application to the Yolo test cells (conventional and bioreactor landfills) via least squares optimization yielded high coefficients of determination for all settlement models (R² > 0.83). However, empirical models (i.e., power creep, logarithmic, and hyperbolic models) are not recommended for use in MSW settlement modeling due to potentially non-representative long-term MSW behavior, limited physical significance of model parameters, and the settlement data required for model parameterization. Settlement models that combine mechanical creep and biocompression into a single mathematical function constrain time-dependent settlement to a single process with finite magnitude, which limits model applicability. Overall, all evaluated models that couple multiple compression processes (immediate, creep, and biocompression) provided accurate representations of both the Yolo and DTBE datasets. A model presented in Gourc et al. (2010) included the lowest number of total and optimized model parameters and yielded high statistical performance for all model applications (R² ≥ 0.97).
A model of cloud application assignments in software-defined storages
NASA Astrophysics Data System (ADS)
Bolodurina, Irina P.; Parfenov, Denis I.; Polezhaev, Petr N.; Shukhman, Alexander E.
2017-01-01
The aim of this study is to analyze the structure and mechanisms of interaction of typical cloud applications and to suggest approaches for optimizing their placement in storage systems. In this paper, we describe a generalized model of cloud applications comprising three basic layers: a model of the application, a model of the service, and a model of the resource. The distinctive feature of the suggested model is that it analyzes cloud resources both from the user's point of view and from the point of view of the software-defined infrastructure of the virtual data center (DC). The novelty of the model lies in describing, at the same time, the placement of application data and the state of the virtual environment, taking the network topology into account. A model of software-defined storage has been developed as a submodel within the resource model; it allows implementation of an algorithm for controlling cloud application assignments in software-defined storages. Experiments showed that this algorithm decreases cloud application response time and improves the performance of user request processing. The use of software-defined data storage also reduces the number of physical storage devices, demonstrating the efficiency of our algorithm.
NASA Technical Reports Server (NTRS)
Canfield, Stephen
1999-01-01
This work will demonstrate the integration of sensor and system dynamic data and their appropriate models using an optimal filter to create a robust, adaptable, easily reconfigurable state (motion) estimation system. This state estimation system will clearly show the application of fundamental modeling and filtering techniques. These techniques are presented at a general, first principles level, that can easily be adapted to specific applications. An example of such an application is demonstrated through the development of an integrated GPS/INS navigation system. This system acquires both global position data and inertial body data, to provide optimal estimates of current position and attitude states. The optimal states are estimated using a Kalman filter. The state estimation system will include appropriate error models for the measurement hardware. The results of this work will lead to the development of a "black-box" state estimation system that supplies current motion information (position and attitude states) that can be used to carry out guidance and control strategies. This black-box state estimation system is developed independent of the vehicle dynamics and therefore is directly applicable to a variety of vehicles. Issues in system modeling and application of Kalman filtering techniques are investigated and presented. These issues include linearized models of equations of state, models of the measurement sensors, and appropriate application and parameter setting (tuning) of the Kalman filter. The general model and subsequent algorithm is developed in Matlab for numerical testing. The results of this system are demonstrated through application to data from the X-33 Michael's 9A8 mission and are presented in plots and simple animations.
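For readers unfamiliar with the filtering step described above, the sketch below shows one predict/update cycle of a discrete Kalman filter in Python/NumPy. The constant-velocity state model, measurement model, and noise covariances are illustrative placeholders, not the GPS/INS or X-33 models used in the report.

    import numpy as np

    # Minimal discrete Kalman filter sketch: constant-velocity state,
    # position-only measurements. All matrices are illustrative placeholders.
    dt = 1.0
    F = np.array([[1.0, dt], [0.0, 1.0]])      # state transition (position, velocity)
    H = np.array([[1.0, 0.0]])                 # measurement model: observe position only
    Q = 0.01 * np.eye(2)                       # process noise covariance (tuning value)
    R = np.array([[4.0]])                      # measurement noise covariance (tuning value)

    x = np.array([[0.0], [0.0]])               # initial state estimate
    P = np.eye(2)                              # initial state covariance

    def kalman_step(x, P, z):
        x_pred = F @ x                          # predict state
        P_pred = F @ P @ F.T + Q                # predict covariance
        y = z - H @ x_pred                      # innovation
        S = H @ P_pred @ H.T + R                # innovation covariance
        K = P_pred @ H.T @ np.linalg.inv(S)     # Kalman gain
        x_new = x_pred + K @ y                  # updated state
        P_new = (np.eye(2) - K @ H) @ P_pred    # updated covariance
        return x_new, P_new

    for z in [1.1, 2.0, 2.9, 4.2, 5.1]:         # synthetic position measurements
        x, P = kalman_step(x, P, np.array([[z]]))
    print(x.ravel())                            # estimated position and velocity

Choosing Q and R against the sensor error models corresponds to the parameter setting (tuning) step the report refers to.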
Viger, Roland J.; Hay, Lauren E.; Jones, John W.; Buell, Gary R.
2010-01-01
This report documents an extension of the Precipitation Runoff Modeling System that accounts for the effect of a large number of water-holding depressions in the land surface on the hydrologic response of a basin. Several techniques for developing the inputs needed by this extension also are presented. These techniques include the delineation of the surface depressions, the generation of volume estimates for the surface depressions, and the derivation of model parameters required to describe these surface depressions. This extension is valuable for applications in basins where surface depressions are too small or numerous to conveniently model as discrete spatial units, but where the aggregated storage capacity of these units is large enough to have a substantial effect on streamflow. In addition, this report documents several new model concepts that were evaluated in conjunction with the depression storage functionality, including 'hydrologically effective' imperviousness, rates of hydraulic conductivity, and daily streamflow routing. All of these techniques are demonstrated as part of an application in the Upper Flint River Basin, Georgia. Simulated solar radiation, potential evapotranspiration, and water balances match observations well, with small errors in the first two for June and August because of temperature differences between the calibration and evaluation periods in those months. Daily runoff simulations show increasing accuracy with streamflow and a good fit overall. Including surface depression storage in the model has the effect of decreasing daily streamflow for all but the lowest flow values. The report discusses the choices and resultant effects involved in delineating and parameterizing these features. The remaining enhancements to the model and its application provide a more realistic description of basin geography and hydrology that serves to constrain the calibration process to more physically realistic parameter values.
Applicator modeling for electromagnetic thermotherapy of cervix cancer.
Rezaeealam, Behrooz
2015-03-01
This report proposes an induction heating coil design that can be used to produce strong magnetic fields around ferromagnetic implants located in the cervix of the uterus. The effect of coil design on the uniformity and extent of heat generation is examined. In addition, a numerical model of the applicator is developed that includes the ferromagnetic implants and is coupled to a bioheat transfer model of the body tissue. The ability of the proposed applicator for electromagnetic thermotherapy is then investigated.
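The tissue-side coupling mentioned above is commonly expressed with the Pennes bioheat equation; the generic form below is given for orientation and is not necessarily the exact formulation used in the paper:

    \rho c \frac{\partial T}{\partial t} = \nabla \cdot (k \nabla T) + \rho_b c_b \omega_b (T_a - T) + Q_{met} + Q_{ext}

where \rho, c, and k are the tissue density, specific heat, and thermal conductivity; \rho_b, c_b, and \omega_b are the blood density, specific heat, and perfusion rate; T_a is the arterial temperature; Q_{met} is metabolic heat generation; and Q_{ext} is the heat deposited by the induction-heated implants.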
Code of Federal Regulations, 2010 CFR
2010-01-01
..., including vehicle simulations using industry standard model (need to add name and location of this open source model) to show projected fuel economy; (d) A detailed estimate of the total project costs together..., equity, and debt, and the liability of parties associated with the project; (f) Applicant's business plan...
Model Data Interoperability for the United States Integrated Ocean Observing System (IOOS)
NASA Astrophysics Data System (ADS)
Signell, Richard P.
2010-05-01
Model data interoperability for the United States Integrated Ocean Observing System (IOOS) was initiated with a focused one-year project. The problem was that there were many regional and national providers of oceanographic model data; each had unique file conventions, distribution techniques and analysis tools that made it difficult to compare model results and observational data. To solve this problem, a distributed system was built utilizing a customized middleware layer and a common data model. This allowed each model data provider to keep their existing model and data files unchanged, yet deliver model data via web services in a common form. With standards-based applications that used these web services, end users then had a common way to access data from any of the models. These applications included: (1) 2D mapping and animation in a web browser, (2) advanced 3D visualization and animation in a desktop application, and (3) a toolkit for a common scientific analysis environment. Due to the flexibility and low impact of the approach on providers, rapid progress was made. The system was implemented in all eleven US IOOS regions and at the NOAA National Coastal Data Development Center, allowing common delivery of regional and national oceanographic model forecast and archived results that cover all US waters. The system, based heavily on software technology from the NSF-sponsored Unidata Program Center, is applicable to any structured gridded data, not just oceanographic model data. There is a clear pathway to expand the system to include unstructured grid (e.g. triangular grid) data.
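Because the common data model is exposed through standards-based web services (largely Unidata technologies such as OPeNDAP/THREDDS), a client can read any provider's output with the same few lines of code. The sketch below uses the netCDF4 Python library; the URL and variable name are hypothetical placeholders, not a real IOOS endpoint.

    import netCDF4

    # Hypothetical OPeNDAP endpoint and variable name, for illustration only.
    url = "http://example.org/thredds/dodsC/regional_model/forecast.nc"
    ds = netCDF4.Dataset(url)              # open the remote dataset lazily
    temp = ds.variables["temp"]            # e.g., a model temperature field
    print(temp.dimensions, temp.shape)     # inspect the grid (time, depth, y, x)
    surface = temp[0, 0, :, :]             # transfer only the slice that is needed
    ds.close()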
Videogrammetric Model Deformation Measurement Technique for Wind Tunnel Applications
NASA Technical Reports Server (NTRS)
Barrows, Danny A.
2006-01-01
Videogrammetric measurement technique developments at NASA Langley were driven largely by the need to quantify model deformation at the National Transonic Facility (NTF). This paper summarizes recent wind tunnel applications and issues at the NTF and other NASA Langley facilities, including the Transonic Dynamics Tunnel, 31-Inch Mach 10 Tunnel, 8-Foot High Temperature Tunnel, and 20-Foot Vertical Spin Tunnel. In addition, several adaptations of wind tunnel techniques to non-wind tunnel applications are summarized. These applications include wing deformation measurements on vehicles in flight, determining aerodynamic loads based on optical elastic deformation measurements, measurements on ultra-lightweight and inflatable space structures, and the use of an object-to-image plane scaling technique to support NASA's Space Exploration program.
Cognitive engineering models in space systems
NASA Technical Reports Server (NTRS)
Mitchell, Christine M.
1993-01-01
NASA space systems, including mission operations on the ground and in space, are complex, dynamic, predominantly automated systems in which the human operator is a supervisory controller. Models of cognitive functions in complex systems are needed to describe human performance and form the theoretical basis of operator workstation design, including displays, controls, and decision aids. Currently, there are several candidate modeling methodologies. They include the Rasmussen abstraction/aggregation hierarchy and decision ladder, the goal-means network, the problem behavior graph, and the operator function model. The research conducted under the sponsorship of this grant focuses on the extension of the theoretical structure of the operator function model and its application to NASA Johnson mission operations and space station applications. The initial portion of this research consists of two parts. The first is a series of technical exchanges between NASA Johnson and Georgia Tech researchers. The purpose is to identify candidate applications for the current operator function model; prospects include mission operations and the Data Management System Testbed. The second portion will address extensions of the operator function model to tailor it to the specific needs of Johnson applications. At this point, we have accomplished two things. During a series of conversations with JSC researchers, we have defined the technical goal of the research supported by this grant to be the structural definition of the operator function model and its computer implementation, OFMspert. Both the OFM and OFMspert have matured to the point that they require infrastructure to facilitate use by researchers not involved in the evolution of the tools. The second accomplishment this year was the identification of the Payload Deployment and Retrieval System (PDRS) as a candidate system for the case study. In conjunction with government and contractor personnel in the Human-Computer Interaction Lab, the PDRS was identified as the most accessible system for the demonstration. Pursuant to this, a PDRS simulation was obtained from the HCIL and an initial knowledge engineering effort was conducted to understand the operator's tasks in the PDRS application. The preliminary results of the knowledge engineering effort and an initial formulation of an operator function model (OFM) are contained in the appendices.
PERIODIC AUTOREGRESSIVE-MOVING AVERAGE (PARMA) MODELING WITH APPLICATIONS TO WATER RESOURCES.
Vecchia, A.V.
1985-01-01
Results involving correlation properties and parameter estimation for autoregressive-moving average models with periodic parameters are presented. A multivariate representation of the PARMA model is used to derive parameter space restrictions and difference equations for the periodic autocorrelations. Close approximation to the likelihood function for Gaussian PARMA processes results in efficient maximum-likelihood estimation procedures. Terms in the Fourier expansion of the parameters are sequentially included, and a selection criterion is given for determining the optimal number of harmonics to be included. Application of the techniques is demonstrated through analysis of a monthly streamflow time series.
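For reference, a PARMA(p, q) process lets the autoregressive and moving average coefficients vary with the season \nu of period T (for example, T = 12 for monthly streamflow). In generic notation (not taken from the paper):

    X_{kT+\nu} = \sum_{i=1}^{p} \phi_i(\nu)\, X_{kT+\nu-i} + \varepsilon_{kT+\nu} + \sum_{j=1}^{q} \theta_j(\nu)\, \varepsilon_{kT+\nu-j}, \qquad \varepsilon_t \sim N\!\left(0, \sigma^2(\nu)\right)

where \phi_i(\nu), \theta_j(\nu), and \sigma^2(\nu) are periodic in \nu with period T; the Fourier expansion mentioned above parameterizes these periodic functions with a small number of harmonics.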
Mars Global Reference Atmospheric Model 2010 Version: Users Guide
NASA Technical Reports Server (NTRS)
Justh, H. L.
2014-01-01
This Technical Memorandum (TM) presents the Mars Global Reference Atmospheric Model 2010 (Mars-GRAM 2010) and its new features. Mars-GRAM is an engineering-level atmospheric model widely used for diverse mission applications. Applications include systems design, performance analysis, and operations planning for aerobraking, entry, descent and landing, and aerocapture. Additionally, this TM includes instructions on obtaining the Mars-GRAM source code and data files as well as running Mars-GRAM. It also contains sample Mars-GRAM input and output files and an example of how to incorporate Mars-GRAM as an atmospheric subroutine in a trajectory code.
The Automated Geospatial Watershed Assessment (AGWA) tool is a desktop application that uses widely available standardized spatial datasets to derive inputs for multi-scale hydrologic models (Miller et al., 2007). The required data sets include topography (DEM data), soils, clima...
The Female Voice: Applications to Bowen's Family Systems Theory.
ERIC Educational Resources Information Center
Knudson-Martin, Carmen
1994-01-01
Responds to calls from feminist scholars to address potential biases against women in theories of family therapy. Summarizes findings from studies of female development and integrates findings into expanded model of Bowen's family systems theory. Includes case example comparing expanded model with traditional application of Bowen's theory.…
40 CFR 209.9 - Contents of a remedial plan.
Code of Federal Regulations, 2010 CFR
2010-07-01
... by the remedial order, including the category and/or configuration if applicable, and the make, model year and model number, if applicable. (2) A detailed description of the present location of the... the respondent. (5) A detailed account of the costs of implementing each of the proposed plans. (b...
Demonstration of the Capabilities of the KINEROS2 – AGWA 3.0 Suite of Modeling Tools
This poster and computer demonstration illustrates a sampling of the wide range of applications that are possible using the KINEROS2 - AGWA suite of modeling tools. Applications include: 1) Incorporation of Low Impact Development (LID) features; 2) A real-time flash flood forecas...
Performance modeling codes for the QuakeSim problem solving environment
NASA Technical Reports Server (NTRS)
Parker, J. W.; Donnellan, A.; Lyzenga, G.; Rundle, J.; Tullis, T.
2003-01-01
The QuakeSim Problem Solving Environment uses a web-services approach to unify and deploy diverse remote data sources and processing services within a browser environment. Here we focus on the high-performance crustal modeling applications that will be included in this set of remote but interoperable applications.
Banta, E.R.; Hill, M.C.; Poeter, E.; Doherty, J.E.; Babendreier, J.
2008-01-01
The open-source, public domain JUPITER (Joint Universal Parameter IdenTification and Evaluation of Reliability) API (Application Programming Interface) provides conventions and Fortran-90 modules to develop applications (computer programs) for analyzing process models. The input and output conventions allow application users to access various applications and the analysis methods they embody with a minimum of time and effort. Process models simulate, for example, physical, chemical, and (or) biological systems of interest using phenomenological, theoretical, or heuristic approaches. The types of model analyses supported by the JUPITER API include, but are not limited to, sensitivity analysis, data needs assessment, calibration, uncertainty analysis, model discrimination, and optimization. The advantages provided by the JUPITER API for users and programmers allow for rapid programming and testing of new ideas. Application-specific coding can be in languages other than the Fortran-90 of the API. This article briefly describes the capabilities and utility of the JUPITER API, lists existing applications, and uses UCODE_2005 as an example.
DOE Office of Scientific and Technical Information (OSTI.GOV)
The ECE application forecasts annual costs of preventive and corrective maintenance for budgeting purposes. Features within the application enable users to change the specifications of the model to customize the forecast to best fit their needs and to support “what if” analysis. Based on the user's selections, the ECE model forecasts annual maintenance costs. Preventive maintenance costs include the cost of labor to perform preventive maintenance activities at the specified frequency and labor rate. Corrective maintenance costs include the cost of labor and the cost of replacement parts. The application presents forecasted maintenance costs for the next five years in two tables: costs by year and costs by site.
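The cost structure described above reduces to simple arithmetic per forecast year. The sketch below illustrates the roll-up; every activity name, frequency, and rate is an illustrative assumption, not a value from the ECE application.

    # Hedged sketch of the annual maintenance-cost roll-up described above.
    preventive = [
        # (activities per year, labor hours per activity, labor rate in $/hr)
        (12, 2.0, 85.0),    # e.g., monthly inspection
        (1, 16.0, 85.0),    # e.g., annual overhaul
    ]
    corrective = [
        # (expected failures per year, labor hours per repair, labor rate, parts cost)
        (0.5, 8.0, 95.0, 1200.0),
    ]

    def annual_cost():
        pm = sum(n * hrs * rate for n, hrs, rate in preventive)
        cm = sum(n * (hrs * rate + parts) for n, hrs, rate, parts in corrective)
        return pm + cm

    # Five-year forecast (costs held constant year to year in this sketch).
    for year in range(1, 6):
        print(year, round(annual_cost(), 2))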
The Shuttle Radar Topography Mission: A Global DEM
NASA Technical Reports Server (NTRS)
Farr, Tom G.; Kobrick, Mike
2000-01-01
Digital topographic data are critical for a variety of civilian, commercial, and military applications. Scientists use Digital Elevation Models (DEMs) to map drainage patterns and ecosystems, and to monitor land surface changes over time. The mountain-building effects of tectonics and the climatic effects of erosion can also be modeled with DEMs. The data's military applications include mission planning and rehearsal, modeling, and simulation. Commercial applications include determining locations for cellular phone towers, enhanced ground proximity warning systems for aircraft, and improved maps for backpackers. The Shuttle Radar Topography Mission (SRTM) (Fig. 1) is a cooperative project between NASA and the National Imagery and Mapping Agency (NIMA) of the U.S. Department of Defense. The mission is designed to use a single-pass radar interferometer to produce a digital elevation model of the Earth's land surface between about 60 degrees north and south latitude. The DEM will have 30 m pixel spacing and about 15 m vertical errors.
Project JOVE. [microgravity experiments and applications
NASA Technical Reports Server (NTRS)
Lyell, M. J.
1994-01-01
The goal of this project is to investigate new areas of research pertaining to free-surface and interface fluid mechanics and/or microgravity which have potential commercial applications. This paper presents an introduction to ferrohydrodynamics (FHD) and discusses some applications. Computational methods for solving free surface flow problems are also presented in detail. Both have diverse applications in industry and in microgravity fluids applications. Three different modeling schemes for FHD flows are addressed, and the governing equations, including Maxwell's equations, are introduced. In the area of computational modeling of free surface flows, both Eulerian and Lagrangian schemes are discussed. The state of the art in computational methods applied to free surface flows is elucidated; in particular, adaptive grids and re-zoning methods are discussed. Additional research results are addressed, and copies of the publications produced under the JOVE Project are included.
Calibration of Complex Subsurface Reaction Models Using a Surrogate-Model Approach
Application of model assessment techniques to complex subsurface reaction models involves numerous difficulties, including non-trivial model selection, parameter non-uniqueness, and excessive computational burden. To overcome these difficulties, this study introduces SAMM (Simult...
30 CFR 7.303 - Application requirements.
Code of Federal Regulations, 2010 CFR
2010-07-01
... APPROVAL OF MINING PRODUCTS TESTING BY APPLICANT OR THIRD PARTY Electric Motor Assemblies § 7.303 Application requirements. (a) An application for approval of a motor assembly shall include a composite drawing or drawings with the following information: (1) Model (type), frame size, and rating of the motor...
Xie, Tianwu; Zaidi, Habib
2016-01-01
The development of multimodality preclinical imaging techniques and the rapid growth of realistic computer simulation tools have promoted the construction and application of computational laboratory animal models in preclinical research. Since the early 1990s, over 120 realistic computational animal models have been reported in the literature and used as surrogates to characterize the anatomy of actual animals for the simulation of preclinical studies involving the use of bioluminescence tomography, fluorescence molecular tomography, positron emission tomography, single-photon emission computed tomography, microcomputed tomography, magnetic resonance imaging, and optical imaging. Other applications include electromagnetic field simulation, ionizing and nonionizing radiation dosimetry, and the development and evaluation of new methodologies for multimodality image coregistration, segmentation, and reconstruction of small animal images. This paper provides a comprehensive review of the history and fundamental technologies used for the development of computational small animal models with a particular focus on their application in preclinical imaging as well as nonionizing and ionizing radiation dosimetry calculations. An overview is given of the overall process involved in the design of these models, including the fundamental elements used for the construction of different types of computational models, the identification of original anatomical data, the simulation tools used for solving various computational problems, and the applications of computational animal models in preclinical research. The authors also analyze the characteristics of the different categories of computational models (stylized, voxel-based, and boundary representation) and discuss the technical challenges faced at the present time as well as research needs in the future.
Nonlinear Constitutive Relations for High Temperature Application, 1984
NASA Technical Reports Server (NTRS)
1985-01-01
Nonlinear constitutive relations for high temperature applications were discussed. The state of the art in nonlinear constitutive modeling of high temperature materials was reviewed and the need for future research and development efforts in this area was identified. Considerable research efforts are urgently needed in the development of nonlinear constitutive relations for high temperature applications prompted by recent advances in high temperature materials technology and new demands on material and component performance. Topics discussed include: constitutive modeling, numerical methods, material testing, and structural applications.
Applying the scientific method to small catchment studies: A review of the Panola Mountain experience
Hooper, R.P.
2001-01-01
A hallmark of the scientific method is its iterative application to a problem to increase and refine the understanding of the underlying processes controlling it. A successful iterative application of the scientific method to catchment science (including the fields of hillslope hydrology and biogeochemistry) has been hindered by two factors. First, the scale at which controlled experiments can be performed is much smaller than the scale of the phenomenon of interest. Second, computer simulation models generally have not been used as hypothesis-testing tools as rigorously as they might have been. Model evaluation often has gone only so far as evaluation of goodness of fit, rather than a full structural analysis, which is more useful when treating the model as a hypothesis. An iterative application of a simple mixing model to the Panola Mountain Research Watershed is reviewed to illustrate the increase in understanding gained by this approach and to discern general principles that may be applicable to other studies. The lessons learned include the need for an explicitly stated conceptual model of the catchment, the definition of objective measures of its applicability, and a clear linkage between the scale of observations and the scale of predictions.
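The simple mixing model referred to above is typically an end-member mixing calculation that resolves streamflow into source-water fractions using a conservative tracer. A generic two-component form (not necessarily the exact Panola formulation) is:

    f_1 = \frac{C_{stream} - C_2}{C_1 - C_2}, \qquad f_2 = 1 - f_1

where C_{stream} is the tracer concentration in streamflow and C_1, C_2 are the concentrations of the two end members; treating the computed fractions as predictions to be tested against independent observations is what turns the model into a testable hypothesis rather than a goodness-of-fit exercise.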
McCardell, A; Davison, L; Edwards, A
2005-01-01
Designers of on-site wastewater management systems have six opportunities to remove pollutants of concern from the aqueous waste stream before it reaches ground or surface waters. These opportunities occur at source, at point of collection (primary treatment), secondary treatment, tertiary treatment, land application and buffers. This paper presents a computer based model for the sizing of on-site system land application areas applicable to the Lismore area in Northern New South Wales, a region of high rainfall. Inputs to the model include daily climatic data, soil type, number of people loading the system and size of housing allotment. Constraints include allowable phosphorus export, nitrogen export and hydraulic percolation. In the Lismore area nitrogen is the nutrient of most concern. In areas close to environmentally sensitive waterways, and in dense developments, the allowable annual nitrogen export becomes the main factor determining the land application area size. The model offers system designers the opportunity to test various combinations of nitrogen attenuation strategies (source control, secondary treatment) in order to create a solution which offers an acceptable nitrogen export rate while meeting the client's household and financial needs. The model runs on an Excel spreadsheet and has been developed by Lismore City Council.
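Where nitrogen export governs the design, the core sizing step reduces to dividing the annual nitrogen load leaving treatment by the allowable areal export rate. The sketch below illustrates that calculation; all parameter names and values are illustrative assumptions, not the Lismore model's.

    # Hedged sketch of nitrogen-limited land application area sizing.
    people = 4
    n_load_per_person = 4.5        # kg N per person per year in the effluent (assumption)
    secondary_removal = 0.30       # fraction of N removed by secondary treatment (assumption)
    allowable_export = 25.0        # kg N per hectare per year permitted to leave the site

    annual_n = people * n_load_per_person * (1.0 - secondary_removal)   # kg N/yr applied
    area_ha = annual_n / allowable_export                               # hectares required
    print(f"Required land application area: {area_ha * 10000:.0f} m^2")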
Dynamic Analyses Including Joints Of Truss Structures
NASA Technical Reports Server (NTRS)
Belvin, W. Keith
1991-01-01
Method for mathematically modeling joints to assess influences of joints on dynamic response of truss structures developed in study. Only structures with low-frequency oscillations considered; only Coulomb friction and viscous damping included in analysis. Focus of effort to obtain finite-element mathematical models of joints exhibiting load-vs.-deflection behavior similar to measured load-vs.-deflection behavior of real joints. Experiments performed to determine stiffness and damping nonlinearities typical of joint hardware. Algorithm for computing coefficients of analytical joint models based on test data developed to enable study of linear and nonlinear effects of joints on global structural response. Besides intended application to large space structures, applications in nonaerospace community include ground-based antennas and earthquake-resistant steel-framed buildings.
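A joint element of the kind described is often written as a stiffness term plus viscous and Coulomb dissipation; the generic form below is for orientation only and is not the specific finite-element formulation of the study:

    F(x, \dot{x}) = k\,x + c\,\dot{x} + F_c\,\mathrm{sgn}(\dot{x})

where k is the joint stiffness, c the viscous damping coefficient, and F_c the Coulomb friction force; the test-derived coefficients mentioned above correspond to fitting k, c, and F_c (possibly deflection-dependent) to measured load-versus-deflection curves.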
An assessment and application of turbulence models for hypersonic flows
NASA Technical Reports Server (NTRS)
Coakley, T. J.; Viegas, J. R.; Huang, P. G.; Rubesin, M. W.
1990-01-01
The current approach to the accurate computation of complex high-speed flows is to solve the Reynolds-averaged Navier-Stokes equations using finite difference methods. An integral part of this approach consists of the development and application of mathematical turbulence models, which are necessary for predicting the aerothermodynamic loads on the vehicle and the performance of the propulsion plant. Computations of several high-speed turbulent flows using various turbulence models are described, and the models are evaluated by comparing computations with the results of experimental measurements. The cases investigated include flows over insulated and cooled flat plates with Mach numbers ranging from 2 to 8 and wall temperature ratios ranging from 0.2 to 1.0. The turbulence models investigated include zero-equation, two-equation, and Reynolds-stress transport models.
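The two-equation models referred to are typically of the k-epsilon family, in which an eddy viscosity is formed from the turbulence kinetic energy k and its dissipation rate \varepsilon; the standard relation (given here generically, not as the paper's specific variants) is:

    \mu_t = C_\mu \, \rho \, \frac{k^2}{\varepsilon}, \qquad C_\mu \approx 0.09

with transport equations solved for k and \varepsilon alongside the Reynolds-averaged Navier-Stokes equations.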
ERIC Educational Resources Information Center
Wilde, Carroll O.
The Poisson probability distribution is seen to provide a mathematical model from which useful information can be obtained in practical applications. The distribution and some situations to which it applies are studied, and ways to find answers to practical questions are noted. The unit includes exercises and a model exam, and provides answers to…
The power prior: theory and applications.
Ibrahim, Joseph G; Chen, Ming-Hui; Gwon, Yeongjin; Chen, Fang
2015-12-10
The power prior has been widely used in many applications covering a large number of disciplines. The power prior is intended to be an informative prior constructed from historical data. It has been used in clinical trials, genetics, health care, psychology, environmental health, engineering, economics, and business. It has also been applied for a wide variety of models and settings, both in the experimental design and analysis contexts. In this review article, we give an A-to-Z exposition of the power prior and its applications to date. We review its theoretical properties, variations in its formulation, statistical contexts for which it has been used, applications, and its advantages over other informative priors. We review models for which it has been used, including generalized linear models, survival models, and random effects models. Statistical areas where the power prior has been used include model selection, experimental design, hierarchical modeling, and conjugate priors. Frequentist properties of power priors in posterior inference are established, and a simulation study is conducted to further examine the empirical performance of the posterior estimates with power priors. Real data analyses are given illustrating the power prior as well as the use of the power prior in the Bayesian design of clinical trials.
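In its basic form, the power prior raises the historical-data likelihood to a discounting power a_0 and multiplies it by an initial prior:

    \pi(\theta \mid D_0, a_0) \propto L(\theta \mid D_0)^{a_0}\, \pi_0(\theta), \qquad 0 \le a_0 \le 1

where D_0 is the historical data, L(\theta \mid D_0) its likelihood, and \pi_0(\theta) the initial prior; a_0 = 0 discards the historical data entirely, while a_0 = 1 pools it fully with the current data.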
ME science as mobile learning based on virtual reality
NASA Astrophysics Data System (ADS)
Fradika, H. D.; Surjono, H. D.
2018-04-01
The purpose of this article is to describe ME Science (Mobile Education Science), a mobile learning application for the Fisika Inti (Nuclear Physics) course. ME Science is a product of research and development (R&D) carried out using the Alessi and Trollip model. The Alessi and Trollip model consists of three stages: (a) planning, which includes analysis of problems, goals, needs, and ideas for the product to be developed; (b) design, which includes collecting materials, designing the material content, creating the storyboard, and evaluating and reviewing the product; and (c) development, which includes development of the product, alpha testing, revision of the product, validation of the product, beta testing, and evaluation of the product. This article describes ME Science only through the product development stage. The development produced a mobile learning application based on virtual reality that runs on Android-based smartphones. The application includes a brief description of the learning material, quizzes, a video summary of the material, and learning material presented in virtual reality.
Clinical application of the five-factor model.
Widiger, Thomas A; Presnall, Jennifer Ruth
2013-12-01
The Five-Factor Model (FFM) has become the predominant dimensional model of general personality structure. The purpose of this paper is to suggest a clinical application. A substantial body of research indicates that the personality disorders included within the American Psychiatric Association's (APA) Diagnostic and Statistical Manual of Mental Disorders (DSM) can be understood as extreme and/or maladaptive variants of the FFM (the acronym "DSM" refers to any particular edition of the APA DSM). In addition, the current proposal for the forthcoming fifth edition of the DSM (i.e., DSM-5) is shifting closely toward an FFM dimensional trait model of personality disorder. Advantages of this shifting conceptualization are discussed, including treatment planning.
Overview of the Meso-NH model version 5.4 and its applications
NASA Astrophysics Data System (ADS)
Lac, Christine; Chaboureau, Jean-Pierre; Masson, Valéry; Pinty, Jean-Pierre; Tulet, Pierre; Escobar, Juan; Leriche, Maud; Barthe, Christelle; Aouizerats, Benjamin; Augros, Clotilde; Aumond, Pierre; Auguste, Franck; Bechtold, Peter; Berthet, Sarah; Bielli, Soline; Bosseur, Frédéric; Caumont, Olivier; Cohard, Jean-Martial; Colin, Jeanne; Couvreux, Fleur; Cuxart, Joan; Delautier, Gaëlle; Dauhut, Thibaut; Ducrocq, Véronique; Filippi, Jean-Baptiste; Gazen, Didier; Geoffroy, Olivier; Gheusi, François; Honnert, Rachel; Lafore, Jean-Philippe; Lebeaupin Brossier, Cindy; Libois, Quentin; Lunet, Thibaut; Mari, Céline; Maric, Tomislav; Mascart, Patrick; Mogé, Maxime; Molinié, Gilles; Nuissier, Olivier; Pantillon, Florian; Peyrillé, Philippe; Pergaud, Julien; Perraud, Emilie; Pianezze, Joris; Redelsperger, Jean-Luc; Ricard, Didier; Richard, Evelyne; Riette, Sébastien; Rodier, Quentin; Schoetter, Robert; Seyfried, Léo; Stein, Joël; Suhre, Karsten; Taufour, Marie; Thouron, Odile; Turner, Sandra; Verrelle, Antoine; Vié, Benoît; Visentin, Florian; Vionnet, Vincent; Wautelet, Philippe
2018-05-01
This paper presents the Meso-NH model version 5.4. Meso-NH is an atmospheric non-hydrostatic research model that is applied to a broad range of resolutions, from synoptic to turbulent scales, and is designed for studies of physics and chemistry. It is a limited-area model employing advanced numerical techniques, including monotonic advection schemes for scalar transport and fourth-order centered or odd-order WENO advection schemes for momentum. The model includes state-of-the-art physics parameterization schemes that are important to represent convective-scale phenomena and turbulent eddies, as well as flows at larger scales. In addition, Meso-NH has been expanded to provide capabilities for a range of Earth system prediction applications such as chemistry and aerosols, electricity and lightning, hydrology, wildland fires, volcanic eruptions, and cyclones with ocean coupling. Here, we present the main innovations to the dynamics and physics of the code since the pioneering paper of Lafore et al. (1998) and provide an overview of recent applications and couplings.
NASA Astrophysics Data System (ADS)
Miner, Nadine Elizabeth
1998-09-01
This dissertation presents a new wavelet-based method for synthesizing perceptually convincing, dynamic sounds using parameterized sound models. The sound synthesis method is applicable to a variety of applications including Virtual Reality (VR), multi-media, entertainment, and the World Wide Web (WWW). A unique contribution of this research is the modeling of the stochastic, or non-pitched, sound components. This stochastic-based modeling approach leads to perceptually compelling sound synthesis. Two preliminary studies conducted provide data on multi-sensory interaction and audio-visual synchronization timing. These results contributed to the design of the new sound synthesis method. The method uses a four-phase development process, including analysis, parameterization, synthesis and validation, to create the wavelet-based sound models. A patent is pending for this dynamic sound synthesis method, which provides perceptually-realistic, real-time sound generation. This dissertation also presents a battery of perceptual experiments developed to verify the sound synthesis results. These experiments are applicable for validation of any sound synthesis technique.
Proceedings of the 13th biennial conference on carbon. Extended abstracts and program
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
1977-01-01
Properties of carbon are covered including: mechanical and frictional properties; chemical reactivity and surfaces; aerospace applications; carbonization and graphitization; industrial applications; electrical and thermal properties; biomaterials applications; fibers and composites; nuclear applications; activated carbon and adsorption; advances in carbon characterization; and micromechanics and modeling. (GHT)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lilienthal, P.
1997-12-01
This paper describes three different computer codes that have been written to model village power applications. The reasons driving the development of these codes include: the existence of only limited field data; the diversity of applications that can be modeled; the ability of models to support cost and performance comparisons; and the insights into cost structures that simulations generate. The models discussed are: Hybrid2, a public code that provides detailed engineering simulations to analyze the performance of a particular configuration; HOMER, the hybrid optimization model for electric renewables, which provides economic screening for sensitivity analyses; and VIPOR, the village power model, which is a network optimization model for comparing mini-grids to individual systems. Examples of the output of these codes are presented for specific applications.
Application Note: Power Grid Modeling With Xyce.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sholander, Peter E.
This application note describes how to model steady-state power flows and transient events in electric power grids with the SPICE-compatible Xyce(TM) Parallel Electronic Simulator developed at Sandia National Labs. The note provides a brief tutorial on the basic devices (branches, bus shunts, transformers, and generators) found in power grids. The focus is on the features supported and the assumptions made by the Xyce models for power grid elements. It then provides a detailed explanation, including working Xyce netlists, for simulating some simple power grid examples such as the IEEE 14-bus test case.
Near-Earth Space Radiation Models
NASA Technical Reports Server (NTRS)
Xapsos, Michael A.; O'Neill, Patrick M.; O'Brien, T. Paul
2012-01-01
A review of models of the near-Earth space radiation environment is presented, including recent developments in trapped proton and electron models, galactic cosmic ray models, and solar particle event models geared toward spacecraft electronics applications.
Early Admissions at Selective Colleges. NBER Working Paper No. 14844
ERIC Educational Resources Information Center
Avery, Christopher; Levin, Jonathan D.
2009-01-01
Early admissions is widely used by selective colleges and universities. We identify some basic facts about early admissions policies, including the admissions advantage enjoyed by early applicants and patterns in application behavior, and propose a game-theoretic model that matches these facts. The key feature of the model is that colleges want to…
Simulation Assisted Risk Assessment: Blast Overpressure Modeling
NASA Technical Reports Server (NTRS)
Lawrence, Scott L.; Gee, Ken; Mathias, Donovan; Olsen, Michael
2006-01-01
A probabilistic risk assessment (PRA) approach has been developed and applied to the risk analysis of capsule abort during ascent. The PRA is used to assist in the identification of modeling and simulation applications that can significantly impact the understanding of crew risk during this potentially dangerous maneuver. The PRA approach is also being used to identify the appropriate level of fidelity for the modeling of those critical failure modes. The Apollo launch escape system (LES) was chosen as a test problem for application of this approach. Failure modes that have been modeled and/or simulated to date include explosive overpressure-based failure, explosive fragment-based failure, land landing failures (range limits exceeded either near launch or on Mode III trajectories ending on the African continent), capsule-booster re-contact during separation, and failure due to plume-induced instability. These failure modes have been investigated using analysis tools in a variety of technical disciplines at various levels of fidelity. The current paper focuses on the development and application of a blast overpressure model for the prediction of structural failure due to overpressure, including the application of high-fidelity analysis to predict near-field and headwind effects.
Application of Consider Covariance to the Extended Kalman Filter
NASA Technical Reports Server (NTRS)
Lundberg, John B.
1996-01-01
The extended Kalman filter (EKF) is the basis for many applications of filtering theory to real-time problems where estimates of the state of a dynamical system are to be computed based upon some set of observations. The form of the EKF may vary somewhat from one application to another, but the fundamental principles are typically unchanged among these various applications. As is the case in many filtering applications, models of the dynamical system (differential equations describing the state variables) and models of the relationship between the observations and the state variables are created. These models typically employ a set of constants whose values are established by means of theory or experimental procedure. Since the estimates of the state are formed assuming that the models are perfect, any modeling errors will affect the accuracy of the computed estimates. Note that the modeling errors may be errors of commission (errors in terms included in the model) or omission (errors in terms excluded from the model). Consequently, it becomes imperative when evaluating the performance of real-time filters to evaluate the effect of modeling errors on the estimates of the state.
NASA Astrophysics Data System (ADS)
Shugart, Herman H.; Wang, Bin; Fischer, Rico; Ma, Jianyong; Fang, Jing; Yan, Xiaodong; Huth, Andreas; Armstrong, Amanda H.
2018-03-01
Individual-based models (IBMs) of complex systems emerged in the 1960s and early 1970s, across diverse disciplines from astronomy to zoology. Ecological IBMs arose with seemingly independent origins out of the tradition of understanding the dynamics of ecosystems from a 'bottom-up' accounting of the interactions of the parts. Individual trees are principal among the parts of forests. Because these models are computationally demanding, they have prospered as the power of digital computers has increased exponentially over the decades following the 1970s. This review will focus on a class of forest IBMs called gap models. Gap models simulate the changes in forests by simulating the birth, growth and death of each individual tree on a small plot of land. The summation of these plots comprises a forest (or a set of sample plots on a forested landscape or region). Other, more aggregated forest IBMs have been used in global applications, including cohort-based models, ecosystem demography models, etc. Gap models have been used to provide the parameters for these bulk models. Currently, gap models have grown from local-scale to continental-scale and even global-scale applications to assess the potential consequences of climate change on natural forests. Modifications to the models have enabled simulation of disturbances including fire, insect outbreak and harvest. Our objective in this review is to provide the reader with an overview of the history, motivation and applications, including theoretical applications, of these models. In a time of concern over global changes, gap models are essential tools to understand forest responses to climate change, modified disturbance regimes and other change agents. Development of forest surveys to provide the starting points for simulations and better estimates of the behavior of the diversity of tree species in response to the environment are continuing needs for improvement for these and other IBMs.
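The birth-growth-death bookkeeping on a small plot that defines a gap model can be sketched in a few lines. The growth, mortality, and recruitment rules below are deliberately simplistic placeholders, not those of any published gap model.

    import random

    # Minimal gap-model sketch: yearly birth, growth, and death of individual
    # trees on one small plot. All rules and constants are placeholder assumptions.
    random.seed(1)
    plot = [{"dbh": random.uniform(5.0, 40.0)} for _ in range(50)]   # diameters in cm

    def simulate_year(plot):
        crowding = sum(t["dbh"] for t in plot)                   # crude competition index
        survivors = []
        for tree in plot:
            tree["dbh"] += max(0.05, 1.0 - crowding / 2000.0)    # growth slows with crowding
            if random.random() > 0.02:                           # 2% background mortality
                survivors.append(tree)
        recruits = [{"dbh": 1.0} for _ in range(random.randint(0, 3))]  # new saplings
        return survivors + recruits

    for year in range(100):
        plot = simulate_year(plot)
    n = len(plot)
    print(n, round(sum(t["dbh"] for t in plot) / max(n, 1), 1))  # tree count, mean diameter

Summing many such independently simulated plots is what yields stand- or landscape-level behavior.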
Hierarchical models of very large problems, dilemmas, prospects, and an agenda for the future
NASA Technical Reports Server (NTRS)
Richardson, J. M., Jr.
1975-01-01
Interdisciplinary approaches to the modeling of global problems are discussed in terms of multilevel cooperation. A multilevel regionalized model of the Lake Erie Basin is analyzed along with a multilevel regionalized world modeling project. Other topics discussed include: a stratified model of interacting region in a world system, and the application of the model to the world food crisis in south Asia. Recommended research for future development of integrated models is included.
Centrifuge Rotor Models: A Comparison of the Euler-Lagrange and the Bond Graph Modeling Approach
NASA Technical Reports Server (NTRS)
Granda, Jose J.; Ramakrishnan, Jayant; Nguyen, Louis H.
2006-01-01
A viewgraph presentation on centrifuge rotor models with a comparison using Euler-Lagrange and bond graph methods is shown. The topics include: 1) Objectives; 2) MOdeling Approach Comparisons; 3) Model Structures; and 4) Application.
Spatio-Temporal Data Analysis at Scale Using Models Based on Gaussian Processes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stein, Michael
Gaussian processes are the most commonly used statistical model for spatial and spatio-temporal processes that vary continuously. They are broadly applicable in the physical sciences and engineering and are also frequently used to approximate the output of complex computer models, deterministic or stochastic. We undertook research related to theory, computation, and applications of Gaussian processes as well as some work on estimating extremes of distributions for which a Gaussian process assumption might be inappropriate. Our theoretical contributions include the development of new classes of spatial-temporal covariance functions with desirable properties and new results showing that certain covariance models lead to predictions with undesirable properties. To understand how Gaussian process models behave when applied to deterministic computer models, we derived what we believe to be the first significant results on the large sample properties of estimators of parameters of Gaussian processes when the actual process is a simple deterministic function. Finally, we investigated some theoretical issues related to maxima of observations with varying upper bounds and found that, depending on the circumstances, standard large sample results for maxima may or may not hold. Our computational innovations include methods for analyzing large spatial datasets when observations fall on a partially observed grid and methods for estimating parameters of a Gaussian process model from observations taken by a polar-orbiting satellite. In our application of Gaussian process models to deterministic computer experiments, we carried out some matrix computations that would have been infeasible using even extended precision arithmetic by focusing on special cases in which all elements of the matrices under study are rational and using exact arithmetic. The applications we studied include total column ozone as measured from a polar-orbiting satellite, sea surface temperatures over the Pacific Ocean, and annual temperature extremes at a site in New York City. In each of these applications, our theoretical and computational innovations were directly motivated by the challenges posed by analyzing these and similar types of data.
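For orientation, the prediction step that underlies most of these applications is ordinary Gaussian-process (kriging) conditioning. The NumPy sketch below uses a squared-exponential covariance with arbitrary hyperparameters; it is illustrative only and does not use the covariance classes developed in the project.

    import numpy as np

    # Hedged sketch of Gaussian-process prediction with a squared-exponential
    # covariance; hyperparameters and data are illustrative only.
    def sqexp(a, b, ell=1.0, sigma2=1.0):
        d = a[:, None] - b[None, :]
        return sigma2 * np.exp(-0.5 * (d / ell) ** 2)

    rng = np.random.default_rng(0)
    x = np.linspace(0.0, 5.0, 20)                    # observation locations
    y = np.sin(x) + 0.1 * rng.standard_normal(20)    # noisy observations
    xs = np.linspace(0.0, 5.0, 100)                  # prediction locations

    K = sqexp(x, x) + 0.01 * np.eye(20)              # observation covariance (with noise)
    Ks = sqexp(xs, x)                                # cross covariance
    mean = Ks @ np.linalg.solve(K, y)                # predictive mean
    cov = sqexp(xs, xs) - Ks @ np.linalg.solve(K, Ks.T)
    std = np.sqrt(np.clip(np.diag(cov), 0.0, None))  # predictive standard deviation
    print(mean[:3], std[:3])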
Interactive access and management for four-dimensional environmental data sets using McIDAS
NASA Technical Reports Server (NTRS)
Hibbard, William L.; Tripoli, Gregory J.
1993-01-01
Significant accomplishments in the past year are presented and include the following: (1) enhancements to VIS-5D; (2) Implementation of the VIS AD System; and (3) numerical modeling applications. Focus of current research and plans for next year in the following areas are briefly discussed: (1) continued development and application of the VIS-AD system; (2) further enhancements to VIS-5D; and (3) plans for modeling applications.
Integration of Web-based and PC-based clinical research databases.
Brandt, C A; Sun, K; Charpentier, P; Nadkarni, P M
2004-01-01
We have created a Web-based repository or data library of information about measurement instruments used in studies of multi-factorial geriatric health conditions (the Geriatrics Research Instrument Library - GRIL) based upon existing features of two separate clinical study data management systems. GRIL allows browsing, searching, and selecting measurement instruments based upon criteria such as keywords and areas of applicability. Measurement instruments selected can be printed and/or included in an automatically generated standalone microcomputer database application, which can be downloaded by investigators for use in data collection and data management. Integration of database applications requires the creation of a common semantic model, and mapping from each system to this model. Various database schema conflicts at the table and attribute level must be identified and resolved prior to integration. Using a conflict taxonomy and a mapping schema facilitates this process. Critical conflicts at the table level that required resolution included name and relationship differences. A major benefit of integration efforts is the sharing of features and cross-fertilization of applications created for similar purposes in different operating environments. Integration of applications mandates some degree of metadata model unification.
Applications of artificial neural networks (ANNs) in food science.
Huang, Yiqun; Kangas, Lars J; Rasco, Barbara A
2007-01-01
Artificial neural networks (ANNs) have been applied in almost every aspect of food science over the past two decades, although most applications are in the development stage. ANNs are useful tools for food safety and quality analyses, which include modeling of microbial growth and from this predicting food safety, interpreting spectroscopic data, and predicting physical, chemical, functional and sensory properties of various food products during processing and distribution. ANNs hold a great deal of promise for modeling complex tasks in process control and simulation and in applications of machine perception including machine vision and electronic nose for food safety and quality control. This review discusses the basic theory of the ANN technology and its applications in food science, providing food scientists and the research community an overview of the current research and future trend of the applications of ANN technology in the field.
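As a minimal illustration of the microbial growth modeling use case mentioned above, the sketch below fits a small feed-forward network to synthetic data; scikit-learn's MLPRegressor stands in for whatever ANN toolkits the reviewed studies actually used, and the response surface is invented for the example.

    import numpy as np
    from sklearn.neural_network import MLPRegressor

    # Synthetic 'growth rate' data as a function of temperature (C) and pH;
    # the functional form and noise level are invented for illustration.
    rng = np.random.default_rng(0)
    X = rng.uniform([4.0, 3.5], [40.0, 7.5], size=(300, 2))
    y = 0.05 * X[:, 0] * (X[:, 1] - 3.0) + rng.normal(0.0, 0.05, 300)

    model = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0)
    model.fit(X[:250], y[:250])                      # train on the first 250 samples
    print(round(model.score(X[250:], y[250:]), 3))   # R^2 on the held-out samples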
DOE Office of Scientific and Technical Information (OSTI.GOV)
Benbennick, M.E.; Broton, M.S.; Fuoto, J.S.
This report describes a model tracking system for a low-level radioactive waste (LLW) disposal facility license application. In particular, the model tracks interrogatories (questions, requests for information, comments) and responses. A set of requirements and desired features for the model tracking system was developed, including required structure and computer screens. Nine tracking systems were then reviewed against the model system requirements and only two were found to meet all requirements. Using Kepner-Tregoe decision analysis, a model tracking system was selected.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson, Katherine H.; Cutler, Dylan S.; Olis, Daniel R.
REopt is a techno-economic decision support model used to optimize energy systems for buildings, campuses, communities, and microgrids. The primary application of the model is for optimizing the integration and operation of behind-the-meter energy assets. This report provides an overview of the model, including its capabilities and typical applications; inputs and outputs; economic calculations; technology descriptions; and model parameters, variables, and equations. The model is highly flexible, and is continually evolving to meet the needs of each analysis. Therefore, this report is not an exhaustive description of all capabilities, but rather a summary of the core components of the model.
Section 405 of the Clean Water Act requires the U.S. EPA to develop and issue regulations that identify: 1) uses for sludge including disposal; 2) specific factors (including costs) to be taken into account in determining the measures and practices applicable for each use or disp...
InSTREAM: the individual-based stream trout research and environmental assessment model
Steven F. Railsback; Bret C. Harvey; Stephen K. Jackson; Roland H. Lamberson
2009-01-01
This report documents Version 4.2 of InSTREAM, including its formulation, software, and application to research and management problems. InSTREAM is a simulation model designed to understand how stream and river salmonid populations respond to habitat alteration, including altered flow, temperature, and turbidity regimes and changes in channel morphology. The model...
The application of CFD to the modelling of fires in complex geometries
NASA Astrophysics Data System (ADS)
Burns, A. D.; Clarke, D. S.; Guilbert, P.; Jones, I. P.; Simcox, S.; Wilkes, N. S.
The application of Computational Fluid Dynamics (CFD) to industrial safety is a challenging activity. In particular it involves the interaction of several different physical processes, including turbulence, combustion, radiation, buoyancy, compressible flow and shock waves in complex three-dimensional geometries. In addition, there may be multi-phase effects arising, for example, from sprinkler systems for extinguishing fires. The FLOW3D software (1-3) from Computational Fluid Dynamics Services (CFDS) is in widespread use in industrial safety problems, both within AEA Technology, and also by CFDS's commercial customers, for example references (4-13). This paper discusses some other applications of FLOW3D to safety problems. These applications illustrate the coupling of the gas flows with radiation models and combustion models, particularly for complex geometries where simpler radiation models are not applicable.
Boussinesq Modeling for Inlets, Harbors, and Structures (Bouss-2D)
2015-10-30
a wide variety of coastal and ocean engineering and naval architecture problems, including: transformation of waves over small to medium spatial...and outputs, and GIS data used in modeling. Recent applications include: Pillar Point Harbor, Oyster Point Marina, CA; Mouth of Columbia River
Modeling and Characterization of Near-Crack-Tip Plasticity from Micro- to Nano-Scales
NASA Technical Reports Server (NTRS)
Glaessgen, Edward H.; Saether, Erik; Hochhalter, Jacob; Smith, Stephen W.; Ransom, Jonathan B.; Yamakov, Vesselin; Gupta, Vipul
2010-01-01
Methodologies for understanding the plastic deformation mechanisms related to crack propagation at the nano-, meso- and micro-length scales are being developed. These efforts include the development and application of several computational methods including atomistic simulation, discrete dislocation plasticity, strain gradient plasticity and crystal plasticity; and experimental methods including electron backscattered diffraction and video image correlation. Additionally, methodologies for multi-scale modeling and characterization that can be used to bridge the relevant length scales from nanometers to millimeters are being developed. The paper focuses on the discussion of newly developed methodologies in these areas and their application to understanding damage processes in aluminum and its alloys.
Modeling and Characterization of Near-Crack-Tip Plasticity from Micro- to Nano-Scales
NASA Technical Reports Server (NTRS)
Glaessgen, Edward H.; Saether, Erik; Hochhalter, Jacob; Smith, Stephen W.; Ransom, Jonathan B.; Yamakov, Vesselin; Gupta, Vipul
2011-01-01
Methodologies for understanding the plastic deformation mechanisms related to crack propagation at the nano-, meso- and micro-length scales are being developed. These efforts include the development and application of several computational methods including atomistic simulation, discrete dislocation plasticity, strain gradient plasticity and crystal plasticity; and experimental methods including electron backscattered diffraction and video image correlation. Additionally, methodologies for multi-scale modeling and characterization that can be used to bridge the relevant length scales from nanometers to millimeters are being developed. The paper focuses on the discussion of newly developed methodologies in these areas and their application to understanding damage processes in aluminum and its alloys.
NASA Astrophysics Data System (ADS)
Binder, Claudia; Garcia-Santos, Glenda; Andreoli, Romano; Diaz, Jaime; Feola, Giuseppe; Wittensoeldner, Moritz; Yang, Jing
2016-04-01
This study presents an integrative and spatially explicit modeling approach for analyzing human and environmental exposure from pesticide application by smallholders in the potato-producing Andean region of Colombia. The modeling approach fulfills the following criteria: (i) it includes environmental and human compartments; (ii) it contains a behavioral decision-making model for estimating the effect of policies on pesticide flows to humans and the environment; (iii) it is spatially explicit; and (iv) it is modular and easily expandable to include additional modules, crops or technologies. The model was calibrated and validated for the Vereda La Hoya and was used to explore the effect of different policy measures in the region. The model has moderate data requirements and can be adapted relatively easily to other regions in developing countries with similar conditions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xie, Tianwu; Zaidi, Habib, E-mail: habib.zaidi@hcuge.ch; Geneva Neuroscience Center, Geneva University, Geneva CH-1205
The development of multimodality preclinical imaging techniques and the rapid growth of realistic computer simulation tools have promoted the construction and application of computational laboratory animal models in preclinical research. Since the early 1990s, over 120 realistic computational animal models have been reported in the literature and used as surrogates to characterize the anatomy of actual animals for the simulation of preclinical studies involving the use of bioluminescence tomography, fluorescence molecular tomography, positron emission tomography, single-photon emission computed tomography, microcomputed tomography, magnetic resonance imaging, and optical imaging. Other applications include electromagnetic field simulation, ionizing and nonionizing radiation dosimetry, and the development and evaluation of new methodologies for multimodality image coregistration, segmentation, and reconstruction of small animal images. This paper provides a comprehensive review of the history and fundamental technologies used for the development of computational small animal models, with a particular focus on their application in preclinical imaging as well as nonionizing and ionizing radiation dosimetry calculations. An overview is given of the overall process involved in the design of these models, including the fundamental elements used for the construction of different types of computational models, the identification of original anatomical data, the simulation tools used for solving various computational problems, and the applications of computational animal models in preclinical research. The authors also analyze the characteristics of the different categories of computational models (stylized, voxel-based, and boundary representation) and discuss the technical challenges faced at the present time as well as research needs in the future.
Househ, Mowafa S.; Shubair, Mamdouh M.; Yunus, Faisel; Jamal, Amr; Aldossari, Bakheet
2015-01-01
Background: The aim of this paper is to present a usability analysis of the consumer ratings of key diabetes mHealth applications using an adapted Health IT Usability Evaluation Model (Health-ITUEM). Methods: A qualitative content analysis method was used to analyze publicly available consumer reported data posted on the Android Market and Google Play for four leading diabetes mHealth applications. Health-ITUEM concepts including information needs, flexibility/customizability, learnability, performance speed, and competency guided the categorization and analysis of the data. Health impact was an additional category that was included in the study. A total of 405 consumers’ ratings collected from January 9, 2014 to February 17, 2014 were included in the study. Results: Overall, the consumers’ ratings of the leading diabetes mHealth applications for both usability and health impacts were positive. The performance speed of the mHealth application and the information needs of the consumers were the primary usability factors impacting the use of the diabetes mHealth applications. There was also evidence on the positive health impacts of such applications. Conclusions: Consumers are more likely to use diabetes related mHealth applications that perform well and meet their information needs. Furthermore, there is preliminary evidence that diabetes mHealth applications can have positive impact on the health of patients. PMID:26635437
Househ, Mowafa S; Shubair, Mamdouh M; Yunus, Faisel; Jamal, Amr; Aldossari, Bakheet
2015-10-01
The aim of this paper is to present a usability analysis of the consumer ratings of key diabetes mHealth applications using an adapted Health IT Usability Evaluation Model (Health-ITUEM). A qualitative content analysis method was used to analyze publicly available consumer reported data posted on the Android Market and Google Play for four leading diabetes mHealth applications. Health-ITUEM concepts including information needs, flexibility/customizability, learnability, performance speed, and competency guided the categorization and analysis of the data. Health impact was an additional category that was included in the study. A total of 405 consumers' ratings collected from January 9, 2014 to February 17, 2014 were included in the study. Overall, the consumers' ratings of the leading diabetes mHealth applications for both usability and health impacts were positive. The performance speed of the mHealth application and the information needs of the consumers were the primary usability factors impacting the use of the diabetes mHealth applications. There was also evidence on the positive health impacts of such applications. Consumers are more likely to use diabetes related mHealth applications that perform well and meet their information needs. Furthermore, there is preliminary evidence that diabetes mHealth applications can have positive impact on the health of patients.
An investigation of modelling and design for software service applications.
Anjum, Maria; Budgen, David
2017-01-01
Software services offer the opportunity to use a component-based approach for the design of applications. However, this needs a deeper understanding of how to develop service-based applications in a systematic manner, and of the set of properties that need to be included in the 'design model'. We have used a realistic application to explore systematically how service-based designs can be created and described. We first identified the key properties of an SOA (service oriented architecture) and then undertook a single-case case study to explore its use in the development of a design for a large-scale application in energy engineering, modelling this with existing notations wherever possible. We evaluated the resulting design model using two walkthroughs with both domain and application experts. We were able to successfully develop a design model around the ten properties identified, and to describe it by adapting existing design notations. A component-based approach to designing such systems does appear to be feasible. However, it needs the assistance of a more integrated set of notations for describing the resulting design model.
NASA Astrophysics Data System (ADS)
Hobbs, J.; Turmon, M.; David, C. H.; Reager, J. T., II; Famiglietti, J. S.
2017-12-01
NASA's Western States Water Mission (WSWM) combines remote sensing of the terrestrial water cycle with hydrological models to provide high-resolution state estimates for multiple variables. The effort includes both land surface and river routing models that are subject to several sources of uncertainty, including errors in the model forcing and model structural uncertainty. Computational and storage constraints prohibit extensive ensemble simulations, so this work outlines efficient but flexible approaches for estimating and reporting uncertainty. Calibrated by remote sensing and in situ data where available, we illustrate the application of these techniques in producing state estimates with associated uncertainties at kilometer-scale resolution for key variables such as soil moisture, groundwater, and streamflow.
Model selection for logistic regression models
NASA Astrophysics Data System (ADS)
Duller, Christine
2012-09-01
Model selection for logistic regression models decides which of some given potential regressors have an effect and hence should be included in the final model. The second interesting question is whether a certain factor is heterogeneous among some subsets, i.e. whether the model should include a random intercept or not. In this paper these questions will be answered with classical as well as with Bayesian methods. The applications show some results of recent research projects in medicine and business administration.
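For the classical side of the question, a minimal sketch of subset selection by BIC for a logistic regression is shown below (the Bayesian treatment and the random-intercept question are not reproduced); the data are simulated and the regressor names are hypothetical.

```python
# Sketch of classical variable selection for a logistic regression via BIC
# (illustrative only; simulated data, hypothetical regressor names).
import itertools
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500
X = rng.normal(size=(n, 3))                      # candidate regressors x1, x2, x3
logit_p = 0.5 + 1.5 * X[:, 0] - 1.0 * X[:, 1]    # x3 has no true effect
y = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

names = ["x1", "x2", "x3"]
best = None
for k in range(len(names) + 1):
    for subset in itertools.combinations(range(len(names)), k):
        design = sm.add_constant(X[:, list(subset)]) if subset else np.ones((n, 1))
        res = sm.Logit(y, design).fit(disp=0)
        if best is None or res.bic < best[0]:
            best = (res.bic, [names[i] for i in subset])

print("selected regressors:", best[1], "BIC:", round(best[0], 1))
```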
Approaches for the Application of Physiologically Based ...
This draft report of Approaches for the Application of Physiologically Based Pharmacokinetic (PBPK) Models and Supporting Data in Risk Assessment addresses the application and evaluation of PBPK models for risk assessment purposes. These models represent an important class of dosimetry models that are useful for predicting internal dose at target organs for risk assessment applications. Topics covered include: the types of data required for the use of PBPK models in risk assessment, the evaluation of PBPK models for use in risk assessment, and the application of these models to address uncertainties resulting from extrapolations (e.g. interspecies extrapolation) often used in risk assessment. In addition, appendices are provided that include a compilation of chemical partition coefficients and rate constants, algorithms for estimating chemical-specific parameters, and a list of publications relating to PBPK modeling. This report is primarily meant to serve as a learning tool for EPA scientists and risk assessors who may be less familiar with the field. In addition, this report can be informative to PBPK modelers within and outside the Agency, as it provides an assessment of the types of data and models that the EPA requires for consideration of a model for use in risk assessment.
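For readers less familiar with the model class, the sketch below shows a deliberately minimal flow-limited PBPK-style system of three lumped compartments solved with scipy; the structure is generic and every parameter value is a hypothetical placeholder, not a value from the EPA report.

```python
# Minimal flow-limited PBPK-style sketch (not from the EPA report): three lumped
# compartments (blood, liver, rest of body) with hepatic clearance. All parameter
# values are hypothetical placeholders.
import numpy as np
from scipy.integrate import solve_ivp

Vb, Vl, Vr = 5.0, 1.8, 63.0        # compartment volumes (L)
Ql, Qr     = 90.0, 260.0           # blood flows (L/h)
Pl, Pr     = 2.0, 4.0              # tissue:blood partition coefficients
CLint      = 30.0                  # intrinsic hepatic clearance (L/h)
dose       = 100.0                 # IV bolus (mg) into blood

def rhs(t, y):
    Cb, Cl, Cr = y
    dCb = (Ql * Cl / Pl + Qr * Cr / Pr - (Ql + Qr) * Cb) / Vb
    dCl = (Ql * (Cb - Cl / Pl) - CLint * Cl / Pl) / Vl
    dCr = (Qr * (Cb - Cr / Pr)) / Vr
    return [dCb, dCl, dCr]

sol = solve_ivp(rhs, (0, 24), [dose / Vb, 0.0, 0.0], t_eval=np.linspace(0, 24, 25))
print(sol.y[0])  # predicted blood concentration (mg/L) over 24 h
```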
Modeling of Diamond Field-Emitter-Arrays for high brightness photocathode applications
NASA Astrophysics Data System (ADS)
Kwan, Thomas; Huang, Chengkun; Piryatinski, Andrei; Lewellen, John; Nichols, Kimberly; Choi, Bo; Pavlenko, Vitaly; Shchegolkov, Dmitry; Nguyen, Dinh; Andrews, Heather; Simakov, Evgenya
2017-10-01
We propose to employ Diamond Field-Emitter-Arrays (DFEAs) as high-current-density ultra-low-emittance photocathodes for compact laser-driven dielectric accelerators capable of generating ultra-high brightness electron beams for advanced applications. We develop a semi-classical Monte-Carlo photoemission model for DFEAs that includes carriers' transport to the emitter surface and tunneling through the surface under external fields. The model accounts for the electronic structure size quantization affecting the transport and tunneling process within the sharp diamond tips. We compare this first-principles model with other field emission models, such as the Child-Langmuir and Murphy-Good models. By further including effects of carrier photoexcitation, we perform simulations of the DFEAs' photoemission quantum yield and the emitted electron beam. Details of the theoretical model and validation against preliminary experimental data will be presented. Work supported by the LDRD program at LANL.
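For orientation, the two classical expressions named in the abstract can be sketched as follows; the elementary Fowler-Nordheim form is used here as a stand-in for the Murphy-Good model (which adds image-charge corrections), the constants are standard textbook values, and the example field and work function are hypothetical.

```python
# Illustrative comparison (not the paper's Monte Carlo model) of two classical
# field-emission/space-charge expressions mentioned in the abstract.
import numpy as np

eps0, q_e, m_e = 8.854e-12, 1.602e-19, 9.109e-31     # SI constants

def child_langmuir(V, d):
    """Space-charge-limited current density (A/m^2) for gap voltage V (V) and gap d (m)."""
    return (4.0 * eps0 / 9.0) * np.sqrt(2.0 * q_e / m_e) * V**1.5 / d**2

def fowler_nordheim(E, phi):
    """Elementary Fowler-Nordheim current density (A/m^2) for surface field E (V/m), work function phi (eV)."""
    a, b = 1.541434e-6, 6.830890e9                   # A eV V^-2 and eV^-1.5 V m^-1
    return (a * E**2 / phi) * np.exp(-b * phi**1.5 / E)

print(child_langmuir(V=1.0e3, d=1.0e-3))             # 1 kV across a 1 mm gap
print(fowler_nordheim(E=5.0e9, phi=5.0))             # assumed 5 GV/m field, ~5 eV work function
```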
Multilevel and Latent Variable Modeling with Composite Links and Exploded Likelihoods
ERIC Educational Resources Information Center
Rabe-Hesketh, Sophia; Skrondal, Anders
2007-01-01
Composite links and exploded likelihoods are powerful yet simple tools for specifying a wide range of latent variable models. Applications considered include survival or duration models, models for rankings, small area estimation with census information, models for ordinal responses, item response models with guessing, randomized response models,…
Computational Nanotechnology of Molecular Materials, Electronics and Machines
NASA Technical Reports Server (NTRS)
Srivastava, D.; Biegel, Bryan A. (Technical Monitor)
2002-01-01
This viewgraph presentation covers carbon nanotubes, their characteristics, and their potential future applications. The presentation includes predictions on the development of nanostructures and their applications, the thermal characteristics of carbon nanotubes, mechano-chemical effects upon carbon nanotubes, molecular electronics, and models for possible future nanostructure devices. The presentation also proposes a neural model for signal processing.
Decision support systems and the healthcare strategic planning process: a case study.
Lundquist, D L; Norris, R M
1991-01-01
The repertoire of applications that comprises health-care decision support systems (DSS) includes analyses of clinical, financial, and operational activities. As a whole, these applications facilitate developing comprehensive and interrelated business and medical models that support the complex decisions required to successfully manage today's health-care organizations. Kennestone Regional Health Care System's use of DSS to facilitate strategic planning has precipitated marked changes in the organization's method of determining capital allocations. This case study discusses Kennestone's use of DSS in the strategic planning process, including profiles of key DSS modeling components.
The ABC (in any D) of logarithmic CFT
NASA Astrophysics Data System (ADS)
Hogervorst, Matthijs; Paulos, Miguel; Vichi, Alessandro
2017-10-01
Logarithmic conformal field theories have a vast range of applications, from critical percolation to systems with quenched disorder. In this paper we thoroughly examine the structure of these theories based on their symmetry properties. Our analysis is model-independent and holds for any spacetime dimension. Our results include a determination of the general form of correlation functions and conformal block decompositions, clearing the path for future bootstrap applications. Several examples are discussed in detail, including logarithmic generalized free fields, holographic models, self-avoiding random walks and critical percolation.
Dataset for petroleum based stock markets and GAUSS codes for SAMEM.
Khalifa, Ahmed A A; Bertuccelli, Pietro; Otranto, Edoardo
2017-02-01
This article includes a unique balanced daily data set (Monday, Tuesday and Wednesday observations) of oil and natural gas volatility and the stock markets of the oil-rich economies of Saudi Arabia, Qatar, Kuwait, Abu Dhabi, Dubai, Bahrain and Oman, spanning Oct. 18, 2006-July 30, 2015. Additionally, we have included unique GAUSS codes for estimating the spillover asymmetric multiplicative error model (SAMEM) with application to petroleum-based stock markets. The data, the model and the codes have many applications in business and social science.
NASA Astrophysics Data System (ADS)
Cheng, C. M.; Peng, Z. K.; Zhang, W. M.; Meng, G.
2017-03-01
Nonlinear problems have drawn great interest and extensive attention from engineers, physicists, mathematicians and many other scientists because most real systems are inherently nonlinear in nature. To model and analyze nonlinear systems, many mathematical theories and methods have been developed, including the Volterra series. In this paper, the basic definition of the Volterra series is recapitulated, together with some frequency domain concepts which are derived from the Volterra series, including the general frequency response function (GFRF), the nonlinear output frequency response function (NOFRF), the output frequency response function (OFRF) and the associated frequency response function (AFRF). The relationships between the Volterra series and other nonlinear system models and nonlinear problem-solving methods are discussed, including the Taylor series, Wiener series, NARMAX model, Hammerstein model, Wiener model, Wiener-Hammerstein model, harmonic balance method, perturbation method and Adomian decomposition. The challenging problems and the state of the art in the study of series convergence and kernel identification are comprehensively introduced. In addition, a detailed review is given on the applications of the Volterra series in mechanical engineering, aeroelasticity problems, control engineering, and electronic and electrical engineering.
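For reference, the standard time-domain Volterra expansion and the definition of the n-th order GFRF (the multidimensional Fourier transform of the n-th kernel) can be written as:

```latex
% Standard time-domain Volterra expansion and the associated n-th order GFRF
\begin{align}
  y(t) &= \sum_{n=1}^{N} \int_{-\infty}^{\infty}\!\!\cdots\!\int_{-\infty}^{\infty}
          h_n(\tau_1,\ldots,\tau_n) \prod_{i=1}^{n} u(t-\tau_i)\, d\tau_1\cdots d\tau_n, \\
  H_n(j\omega_1,\ldots,j\omega_n) &= \int_{-\infty}^{\infty}\!\!\cdots\!\int_{-\infty}^{\infty}
          h_n(\tau_1,\ldots,\tau_n)\, e^{-j(\omega_1\tau_1+\cdots+\omega_n\tau_n)}\, d\tau_1\cdots d\tau_n.
\end{align}
```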
Diffuse-Interface Methods in Fluid Mechanics
NASA Technical Reports Server (NTRS)
Anderson, D. M.; McFadden, G. B.; Wheeler, A. A.
1997-01-01
The authors review the development of diffuse-interface models of hydrodynamics and their application to a wide variety of interfacial phenomena. The authors discuss the issues involved in formulating diffuse-interface models for single-component and binary fluids. Recent applications and computations using these models are discussed in each case. Further, the authors address issues including sharp-interface analyses that relate these models to the classical free-boundary problem, related computational approaches to describe interfacial phenomena, and related approaches describing fully-miscible fluids.
Radial basis function and its application in tourism management
NASA Astrophysics Data System (ADS)
Hu, Shan-Feng; Zhu, Hong-Bin; Zhao, Lei
2018-05-01
In this work, several applications and the performance of the radial basis function (RBF) are first briefly reviewed. After that, the binomial function combined with three different RBFs, including the multiquadric (MQ), inverse quadric (IQ) and inverse multiquadric (IMQ) distributions, is adopted to model the tourism data of Huangshan in China. Simulation results show that all the models match the sample data very well. It is found that among the three models, the IMQ-RBF model is more suitable for forecasting the tourist flow.
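A minimal sketch of the three RBF kernels named above and of a plain RBF interpolation fit is given below; the monthly visitor numbers are synthetic stand-ins for the Huangshan data and the shape parameter c is an assumed value.

```python
# Sketch of the multiquadric, inverse quadric and inverse multiquadric kernels and a
# tiny one-dimensional RBF interpolation; data and shape parameter are hypothetical.
import numpy as np

def mq(r, c=1.0):  return np.sqrt(r**2 + c**2)         # multiquadric
def iq(r, c=1.0):  return 1.0 / (r**2 + c**2)          # inverse quadric
def imq(r, c=1.0): return 1.0 / np.sqrt(r**2 + c**2)   # inverse multiquadric

def rbf_fit_predict(x, y, x_new, phi):
    """Solve A w = y with A_ij = phi(|x_i - x_j|), then evaluate at x_new."""
    A = phi(np.abs(x[:, None] - x[None, :]))
    w = np.linalg.solve(A, y)
    return phi(np.abs(x_new[:, None] - x[None, :])) @ w

months = np.arange(1, 13, dtype=float)
visits = np.array([20, 22, 35, 50, 48, 60, 75, 80, 55, 45, 30, 25], dtype=float)  # synthetic
for name, phi in [("MQ", mq), ("IQ", iq), ("IMQ", imq)]:
    print(name, rbf_fit_predict(months, visits, np.array([6.5]), phi))
```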
Frontal view reconstruction for iris recognition
Santos-Villalobos, Hector J; Bolme, David S; Boehnen, Chris Bensing
2015-02-17
Iris recognition can be accomplished for a wide variety of eye images by correcting input images with an off-angle gaze. A variety of techniques, such as limbus modeling, corneal refraction modeling, optical flows, and genetic algorithms, can be used. A variety of techniques, including aspherical eye modeling, corneal refraction modeling, ray tracing, and the like, can be employed. Precomputed transforms can enhance performance for use in commercial applications. With application of the technologies, images with significantly unfavorable gaze angles can be successfully recognized.
A Distributed Snow Evolution Modeling System (SnowModel)
NASA Astrophysics Data System (ADS)
Liston, G. E.; Elder, K.
2004-12-01
A spatially distributed snow-evolution modeling system (SnowModel) has been specifically designed to be applicable over a wide range of snow landscapes, climates, and conditions. To reach this goal, SnowModel is composed of four sub-models: MicroMet defines the meteorological forcing conditions, EnBal calculates surface energy exchanges, SnowMass simulates snow depth and water-equivalent evolution, and SnowTran-3D accounts for snow redistribution by wind. While other distributed snow models exist, SnowModel is unique in that it includes a well-tested blowing-snow sub-model (SnowTran-3D) for application in windy arctic, alpine, and prairie environments where snowdrifts are common. These environments comprise 68% of the seasonally snow-covered Northern Hemisphere land surface. SnowModel also accounts for snow processes occurring in forested environments (e.g., canopy interception related processes). SnowModel is designed to simulate snow-related physical processes occurring at spatial scales of 5-m and greater, and temporal scales of 1-hour and greater. These include: accumulation from precipitation; wind redistribution and sublimation; loading, unloading, and sublimation within forest canopies; snow-density evolution; and snowpack ripening and melt. To enhance its wide applicability, SnowModel includes the physical calculations required to simulate snow evolution within each of the global snow classes defined by Sturm et al. (1995), e.g., tundra, taiga, alpine, prairie, maritime, and ephemeral snow covers. The three, 25-km by 25-km, Cold Land Processes Experiment (CLPX) mesoscale study areas (MSAs: Fraser, North Park, and Rabbit Ears) are used as SnowModel simulation examples to highlight model strengths, weaknesses, and features in forested, semi-forested, alpine, and shrubland environments.
Perl Embedded in PTC's Pro/ENGINEER, Version 1
DOE Office of Scientific and Technical Information (OSTI.GOV)
2003-12-22
Pro-PERL (AKA Pro/PERL) is a Perl extension to the PTC Pro/TOOLKIT API to the PTC Pro/ENGINEER CAD application including an embedded interpreter. It can be used to automate and customize Pro/ENGINEER, create Vendor Neutral Archive (VNA) format files and re-create CAD models from the VNA files. This has applications in sanitizing classified CAD models created in a classified environment for transfer to an open environment, creating template models for modification to finished models by non-expert users, and transfer of design intent data to other modeling technologies.
Measurement of the hyperelastic properties of 44 pathological ex vivo breast tissue samples
NASA Astrophysics Data System (ADS)
O'Hagan, Joseph J.; Samani, Abbas
2009-04-01
The elastic and hyperelastic properties of biological soft tissues have been of interest to the medical community. There are several biomedical applications where parameters characterizing such properties are critical for a reliable clinical outcome. These applications include surgery planning, needle biopsy and brachytherapy, where tissue biomechanical modeling is involved. Another important application is interpreting nonlinear elastography images. While there has been considerable research on the measurement of the linear elastic modulus of small tissue samples, little research has been conducted for measuring parameters that characterize the nonlinear elasticity of tissues included in tissue slice specimens. This work presents hyperelastic measurement results of 44 pathological ex vivo breast tissue samples. For each sample, five hyperelastic models have been used, including the Yeoh, N = 2 polynomial, N = 1 Ogden, Arruda-Boyce, and Veronda-Westmann models. Results show that the Yeoh, polynomial and Ogden models are the most accurate in terms of fitting experimental data. The results indicate that almost all of the parameters corresponding to the pathological tissues range from two times to over two orders of magnitude larger than those of normal tissues, with C11 showing the most significant difference. Furthermore, statistical analysis indicates that C02 of the Yeoh model, and C11 and C20 of the polynomial model have very good potential for cancer classification as they show statistically significant differences for various cancer types, especially for invasive lobular carcinoma. In addition to the potential for use in cancer classification, the presented data are very important for applications such as surgery planning and virtual reality based clinician training systems where accurate nonlinear tissue response modeling is required.
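For reference, commonly written incompressible strain-energy forms for the three best-fitting models are sketched below; the paper's own parameter labels (e.g. C02, C11, C20) may follow a different indexing convention, and the Ogden prefactor convention varies between references.

```latex
% Commonly used incompressible strain-energy forms (I_1, I_2: strain invariants;
% \lambda_i: principal stretches); indexing and prefactor conventions vary by source.
\begin{align}
  W_{\text{Yeoh}} &= \sum_{i=1}^{3} C_{i0}\,(I_1 - 3)^{i}, \\
  W_{\text{poly},\,N=2} &= \sum_{i+j=1}^{2} C_{ij}\,(I_1 - 3)^{i}(I_2 - 3)^{j}, \\
  W_{\text{Ogden},\,N=1} &= \frac{\mu}{\alpha}\left(\lambda_1^{\alpha} + \lambda_2^{\alpha} + \lambda_3^{\alpha} - 3\right).
\end{align}
```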
Nonlinear ultrasonic pulsed measurements and applications to metal processing and fatigue
NASA Astrophysics Data System (ADS)
Yost, William T.; Cantrell, John H.; Na, Jeong K.
2001-04-01
Nonlinear ultrasonics research at NASA-Langley Research Center emphasizes development of experimental techniques and modeling, with applications to metal fatigue and metals processing. This review work includes a summary of results from our recent efforts in technique refinement, modeling of fatigue related microstructure contributions, and measurements on fatigued turbine blades. Also presented are data on 17-4PH and 410-Cb stainless steels. The results are in good agreement with the models.
44 CFR 80.13 - Application information.
Code of Federal Regulations, 2010 CFR
2010-10-01
... ACQUISITION AND RELOCATION FOR OPEN SPACE Requirements Prior to Award § 80.13 Application information. (a) An application for acquisition of property for the purpose of open space must include: (1) A photograph that... deed restriction language, which shall be consistent with the FEMA model deed restriction that the...
44 CFR 80.13 - Application information.
Code of Federal Regulations, 2014 CFR
2014-10-01
... ACQUISITION AND RELOCATION FOR OPEN SPACE Requirements Prior to Award § 80.13 Application information. (a) An application for acquisition of property for the purpose of open space must include: (1) A photograph that... deed restriction language, which shall be consistent with the FEMA model deed restriction that the...
44 CFR 80.13 - Application information.
Code of Federal Regulations, 2011 CFR
2011-10-01
... ACQUISITION AND RELOCATION FOR OPEN SPACE Requirements Prior to Award § 80.13 Application information. (a) An application for acquisition of property for the purpose of open space must include: (1) A photograph that... deed restriction language, which shall be consistent with the FEMA model deed restriction that the...
NASA Astrophysics Data System (ADS)
Irwan; Gustientiedina; Sunarti; Desnelita, Yenny
2017-12-01
The purpose of this study is to design a counseling model application for a decision-making and consultation system. The application serves as an alternative for guidance and individual career development for students, covering career knowledge, planning and alternative options, and uses an expert tool based on knowledge and rules to provide solutions for students' career decisions. This research produces a counseling model application that gathers important information about student career development and facilitates individual student development through its service forms, connecting each student's plan with a career that matches their talent, interest, ability, knowledge, personality and other supporting factors. This application model can be used as a tool to obtain information faster and more flexibly for student guidance and counseling, and can therefore help students make selections and decisions appropriate to their choice of work.
Virtual Libraries: Interactive Support Software and an Application in Chaotic Models.
ERIC Educational Resources Information Center
Katsirikou, Anthi; Skiadas, Christos; Apostolou, Apostolos; Rompogiannakis, Giannis
This paper begins with a discussion of the characteristics and the singularity of chaotic systems, including dynamic systems theory, chaotic orbit, fractals, chaotic attractors, and characteristics of chaotic systems. The second section addresses the digital libraries (DL) concept and the appropriateness of chaotic models, including definition and…
Genoviz Software Development Kit: Java tool kit for building genomics visualization applications.
Helt, Gregg A; Nicol, John W; Erwin, Ed; Blossom, Eric; Blanchard, Steven G; Chervitz, Stephen A; Harmon, Cyrus; Loraine, Ann E
2009-08-25
Visualization software can expose previously undiscovered patterns in genomic data and advance biological science. The Genoviz Software Development Kit (SDK) is an open source, Java-based framework designed for rapid assembly of visualization software applications for genomics. The Genoviz SDK framework provides a mechanism for incorporating adaptive, dynamic zooming into applications, a desirable feature of genome viewers. Visualization capabilities of the Genoviz SDK include automated layout of features along genetic or genomic axes; support for user interactions with graphical elements (Glyphs) in a map; a variety of Glyph sub-classes that promote experimentation with new ways of representing data in graphical formats; and support for adaptive, semantic zooming, whereby objects change their appearance depending on zoom level and zooming rate adapts to the current scale. Freely available demonstration and production quality applications, including the Integrated Genome Browser, illustrate Genoviz SDK capabilities. Separation between graphics components and genomic data models makes it easy for developers to add visualization capability to pre-existing applications or build new applications using third-party data models. Source code, documentation, sample applications, and tutorials are available at http://genoviz.sourceforge.net/.
Machine Learning Methods for Analysis of Metabolic Data and Metabolic Pathway Modeling
Cuperlovic-Culf, Miroslava
2018-01-01
Machine learning uses experimental data to optimize clustering or classification of samples or features, or to develop, augment or verify models that can be used to predict behavior or properties of systems. It is expected that machine learning will help provide actionable knowledge from a variety of big data including metabolomics data, as well as results of metabolism models. A variety of machine learning methods has been applied in bioinformatics and metabolism analyses including self-organizing maps, support vector machines, the kernel machine, Bayesian networks or fuzzy logic. To a lesser extent, machine learning has also been utilized to take advantage of the increasing availability of genomics and metabolomics data for the optimization of metabolic network models and their analysis. In this context, machine learning has aided the development of metabolic networks, the calculation of parameters for stoichiometric and kinetic models, as well as the analysis of major features in the model for the optimal application of bioreactors. Examples of this very interesting, albeit highly complex, application of machine learning for metabolism modeling will be the primary focus of this review presenting several different types of applications for model optimization, parameter determination or system analysis using models, as well as the utilization of several different types of machine learning technologies. PMID:29324649
Machine Learning Methods for Analysis of Metabolic Data and Metabolic Pathway Modeling.
Cuperlovic-Culf, Miroslava
2018-01-11
Machine learning uses experimental data to optimize clustering or classification of samples or features, or to develop, augment or verify models that can be used to predict behavior or properties of systems. It is expected that machine learning will help provide actionable knowledge from a variety of big data including metabolomics data, as well as results of metabolism models. A variety of machine learning methods has been applied in bioinformatics and metabolism analyses including self-organizing maps, support vector machines, the kernel machine, Bayesian networks or fuzzy logic. To a lesser extent, machine learning has also been utilized to take advantage of the increasing availability of genomics and metabolomics data for the optimization of metabolic network models and their analysis. In this context, machine learning has aided the development of metabolic networks, the calculation of parameters for stoichiometric and kinetic models, as well as the analysis of major features in the model for the optimal application of bioreactors. Examples of this very interesting, albeit highly complex, application of machine learning for metabolism modeling will be the primary focus of this review presenting several different types of applications for model optimization, parameter determination or system analysis using models, as well as the utilization of several different types of machine learning technologies.
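As a toy illustration of one method family named in the review (support vector machines) applied to a metabolomics-style classification task, the sketch below uses simulated concentration data; nothing here reproduces the review's examples.

```python
# Toy sketch: SVM classification of simulated metabolite concentration tables
# (case/control labels are synthetic).
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_samples, n_metabolites = 80, 50
X = rng.lognormal(mean=0.0, sigma=1.0, size=(n_samples, n_metabolites))  # concentrations
y = (X[:, 0] * 1.5 + X[:, 1] > np.median(X[:, 0] * 1.5 + X[:, 1])).astype(int)  # case/control

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(clf, X, y, cv=5)
print("cross-validated accuracy:", scores.mean().round(2))
```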
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bogue, J; Parsai, E
Purpose: The current generation of inflatable multichannel brachytherapy applicators, such as the Varian Capri, has limited implementation to only vaginal and rectal cancers. While there are similar designs utilizing rigid, non-inflatable applicators, these alternatives could cause increased dose to surrounding tissue due to air gaps. Modification of the Capri could allow for easier treatment planning by reducing the number of channels, and increased versatility by modifying the applicator to include an attachable single tandem for cervical applications or multiple tandems for endometrial applications. Methods: A Varian Capri applicator was simulated in water to replicate a patient. Multiple plans were optimized to deliver a prescribed dose of 100 cGy at 5 mm from the exterior of the applicator using six to thirteen existing channels. The current model was expanded upon to include a detachable tandem or multiple tandems to increase its functionality for both cervical and endometrial cancers. Models were constructed in both three-dimensional rendering software and Monte Carlo codes to allow prototyping and simulations. Results: Treatment plans utilizing six to thirteen channels produced limited dosimetric differences between channel arrangements, with a seven-channel plan very closely approximating the thirteen channels. It was concluded that only seven channels would be necessary in future simulations to give an accurate representation of the applicator. Tandem attachments were prototyped for the applicator to demonstrate the ease with which they could be included. Future simulations in treatment planning software and Monte Carlo results will be presented to further define the ideal applicator geometry. Conclusion: The current Capri applicator design could be easily modified to increase applicability to include cervical and endometrial treatments in addition to vaginal and rectal cancers. This new design helps provide a more versatile single-use applicator that can easily be inserted and further reduces dose to critical structures during brachytherapy treatments.
Modeling wildlife populations with HexSim
HexSim is a framework for constructing spatially-explicit, individual-based computer models designed for simulating terrestrial wildlife population dynamics and interactions. HexSim is useful for a broad set of modeling applications including population viability analysis for on...
Modelling biogas production of solid waste: application of the BGP model to a synthetic landfill
NASA Astrophysics Data System (ADS)
Rodrigo-Ilarri, Javier; Segura-Sobrino, Francisco
2013-04-01
Production of biogas as a result of the decomposition of the organic matter contained in solid waste landfills is still an issue to be fully understood. Reports on this matter are rarely included in the engineering construction projects of solid waste landfills, even though it can be an issue of critical importance while operating the landfill and after its closure. This paper presents an application of the BGP (Bio-Gas-Production) model to a synthetic landfill. The evolution in time of the concentrations of the different chemical compounds of biogas is studied. The results obtained show the impact on air quality of the different management alternatives that are usually implemented in real landfills.
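The abstract does not give the BGP model's equations, so the sketch below shows only a generic first-order-decay landfill gas generation calculation (LandGEM-style) with hypothetical parameter values, to illustrate the kind of computation such models perform.

```python
# Generic first-order-decay landfill gas generation sketch (NOT the BGP model);
# all parameter values are hypothetical placeholders.
import numpy as np

k  = 0.05      # decay rate constant (1/yr)
L0 = 100.0     # methane generation potential (m^3 CH4 per tonne of waste)
waste_per_year = np.full(10, 50_000.0)   # tonnes placed in each of the first 10 years

def methane_generation(year, waste_per_year, k, L0):
    """CH4 generation rate (m^3/yr) in a given year from all waste placed before it."""
    ages = year - np.arange(len(waste_per_year))      # age of each yearly waste layer
    active = ages > 0
    return np.sum(k * L0 * waste_per_year[active] * np.exp(-k * ages[active]))

for yr in (5, 10, 20, 40):
    print(yr, round(methane_generation(yr, waste_per_year, k, L0)))
```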
NASA Technical Reports Server (NTRS)
Parton, William J.; Ojima, Dennis S.; Schimel, David S.; Kittel, Timothy G. F.
1992-01-01
During the past decade, a growing need to conduct regional assessments of long-term trends of ecosystem behavior and the technology to meet this need have converged. The Century model is the product of research efforts initially intended to develop a general model of plant-soil ecosystem dynamics for the North American central grasslands. This model is now being used to simulate plant production, nutrient cycling, and soil organic matter dynamics for grassland, crop, forest, and shrub ecosystems in various regions of the world, including temperate and tropical ecosystems. This paper will focus on the philosophical approach used to develop the structure of Century. The steps included were model simplification, parameterization, and testing. In addition, the importance of acquiring regional data bases for model testing and the present regional application of Century in the Great Plains, which focus on regional ecosystem dynamics and the effect of altering environmental conditions, are discussed.
CFD Modeling Activities at the NASA Stennis Space Center
NASA Technical Reports Server (NTRS)
Allgood, Daniel
2007-01-01
A viewgraph presentation on NASA Stennis Space Center's Computational Fluid Dynamics (CFD) Modeling activities is shown. The topics include: 1) Overview of NASA Stennis Space Center; 2) Role of Computational Modeling at NASA-SSC; 3) Computational Modeling Tools and Resources; and 4) CFD Modeling Applications.
Review of game theory applications for situation awareness
NASA Astrophysics Data System (ADS)
Blasch, Erik; Shen, Dan; Pham, Khanh D.; Chen, Genshe
2015-05-01
Game theoretical methods have been used for spectral awareness, space situational awareness (SSA), cyber situational awareness (CSA), and Intelligence, Surveillance, and Reconnaissance situation awareness (ISA). In each of these cases, awareness is supported by sensor estimation for assessment, and the situation is determined from the actions of multiple players. Game theory assumes rational actors in a defined scenario; however, variations in social, cultural and behavioral factors add to the dynamic nature of the context. In a dynamic data-driven application system (DDDAS), modeling must include not only the measurements but also how models are used by different actors with different priorities. In this paper, we highlight the applications of game theory by reviewing the literature to determine the current state of the art and future needs. Future developments would include building towards knowledge awareness with information technology (e.g., data aggregation, access, indexing); multiscale analysis (e.g., space, time, and frequency); and software methods (e.g., architectures, cloud computing, protocols).
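As a minimal illustration of the game-theoretic core of these applications, the sketch below enumerates the pure-strategy Nash equilibria of a hypothetical 2x2 defender/attacker game; the payoff numbers are arbitrary placeholders.

```python
# Tiny illustration: enumerate pure-strategy Nash equilibria of a hypothetical
# 2x2 defender/attacker game (payoffs are illustrative only).
import numpy as np

# Rows: defender actions, columns: attacker actions
defender = np.array([[3, 0],
                     [5, 1]])
attacker = np.array([[3, 5],
                     [0, 1]])

equilibria = []
for i in range(2):           # defender's action
    for j in range(2):       # attacker's action
        best_def = defender[i, j] >= defender[:, j].max()
        best_att = attacker[i, j] >= attacker[i, :].max()
        if best_def and best_att:
            equilibria.append((i, j))
print("pure-strategy Nash equilibria:", equilibria)
```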
Finite elements of nonlinear continua.
NASA Technical Reports Server (NTRS)
Oden, J. T.
1972-01-01
The finite element method is extended to a broad class of practical nonlinear problems, treating both theory and applications from a general and unifying point of view. The thermomechanical principles of continuous media and the properties of the finite element method are outlined, and are brought together to produce discrete physical models of nonlinear continua. The mathematical properties of the models are analyzed, and the numerical solution of the equations governing the discrete models is examined. The application of the models to nonlinear problems in finite elasticity, viscoelasticity, heat conduction, and thermoviscoelasticity is discussed. Other specific topics include the topological properties of finite element models, applications to linear and nonlinear boundary value problems, convergence, continuum thermodynamics, finite elasticity, solutions to nonlinear partial differential equations, and discrete models of the nonlinear thermomechanical behavior of dissipative media.
Artificial intelligence based models for stream-flow forecasting: 2000-2015
NASA Astrophysics Data System (ADS)
Yaseen, Zaher Mundher; El-shafie, Ahmed; Jaafar, Othman; Afan, Haitham Abdulmohsin; Sayl, Khamis Naba
2015-11-01
The use of Artificial Intelligence (AI) has increased since the middle of the 20th century, as seen in its application to a wide range of engineering and science problems. The last two decades, for example, have seen a dramatic increase in the development and application of various types of AI approaches for stream-flow forecasting. Generally speaking, AI has exhibited significant progress in forecasting and modeling non-linear hydrological applications and in capturing the noise complexity in the dataset. This paper explores the state-of-the-art application of AI in stream-flow forecasting, focusing on defining the data-driven methods of AI and the advantages of complementary models, as well as the literature and their possible future application in modeling and forecasting stream-flow. The review also identifies the major challenges and opportunities for prospective research, including a new scheme for modeling the inflow, a novel method for preprocessing time series frequency based on Fast Orthogonal Search (FOS) techniques, and Swarm Intelligence (SI) as an optimization approach.
76 FR 39256 - Airworthiness Directives; Dassault Aviation Model FALCON 7X Airplanes
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-06
... include modifying the applicable wiring and layout, a general visual inspection for absence of marks of... March 3, 2010. (2) Modify the applicable wiring and layout, in accordance with the Accomplishment... modifying the applicable wiring and layout, in accordance with Dassault Mandatory Service Bulletin 7X- 006...
Applications of Parsing Theory to Computer-Assisted Instruction.
ERIC Educational Resources Information Center
Markosian, Lawrence Z.; Ager, Tryg A.
1983-01-01
Applications of an LR-1 parsing algorithm to intelligent programs for computer assisted instruction in symbolic logic and foreign languages are discussed. The system has been adequately used for diverse instructional applications, including analysis of student input, generation of pattern drills, and modeling the student's understanding of the…
10 CFR 503.34 - Inability to comply with applicable environmental requirements.
Code of Federal Regulations, 2014 CFR
2014-01-01
... environmental compliance of the facility, including an analysis of its ability to meet applicable standards and... will be based solely on an analysis of the petitioner's capacity to physically achieve applicable... exemption. All such analysis must be based on accepted analytical techniques, such as air quality modeling...
10 CFR 503.34 - Inability to comply with applicable environmental requirements.
Code of Federal Regulations, 2013 CFR
2013-01-01
... environmental compliance of the facility, including an analysis of its ability to meet applicable standards and... will be based solely on an analysis of the petitioner's capacity to physically achieve applicable... exemption. All such analysis must be based on accepted analytical techniques, such as air quality modeling...
Application of variable-gain output feedback for high-alpha control
NASA Technical Reports Server (NTRS)
Ostroff, Aaron J.
1990-01-01
A variable-gain, optimal, discrete, output feedback design approach that is applied to a nonlinear flight regime is described. The flight regime covers a wide angle-of-attack range that includes stall and post-stall. The paper includes brief descriptions of the variable-gain formulation, the discrete-control structure and flight equations used to apply the design approach, and the high-performance airplane model used in the application. Both linear and nonlinear analyses are shown for a longitudinal four-model design case with angles of attack of 5, 15, 35, and 60 deg. Linear and nonlinear simulations are compared for a single-point longitudinal design at 60 deg angle of attack. Nonlinear simulations for the four-model, multi-mode, variable-gain design include a longitudinal pitch-up and pitch-down maneuver and high angle-of-attack regulation during a lateral maneuver.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
Every year, each manufacturer of passenger cars, light-duty trucks, motorcycles, or heavy-duty engines submits to EPA an application for certification. In the application, the manufacturer gives a detailed technical description of the vehicles or engines he intends to market during the upcoming model year. These engineering data include explanations and/or drawings that describe engine/vehicle parameters such as basic engine design, fuel systems, ignition systems, and exhaust and evaporative emission control systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Micah Johnson, Andrew Slaughter
PIKA is a MOOSE-based application for modeling the micro-structure evolution of seasonal snow. The model will be useful for environmental, atmospheric, and climate scientists. Possible applications include energy balance models, ice sheet modeling, and avalanche forecasting. The model implements physics from published, peer-reviewed articles. The main purpose is to foster university and laboratory collaboration to build a larger multi-scale snow model using MOOSE. The main feature of the code is that it is implemented using the MOOSE framework, thus making features such as multiphysics coupling, adaptive mesh refinement, and parallel scalability native to the application. PIKA implements three equations: the phase-field equation for tracking the evolution of the ice-air interface within seasonal snow at the grain scale; the heat equation for computing the temperature of both the ice and air within the snow; and the mass transport equation for monitoring the diffusion of water vapor in the pore space of the snow.
Assessing the Application of a Geographic Presence-Only Model for Land Suitability Mapping
Heumann, Benjamin W.; Walsh, Stephen J.; McDaniel, Phillip M.
2011-01-01
Recent advances in ecological modeling have focused on novel methods for characterizing the environment that use presence-only data and machine-learning algorithms to predict the likelihood of species occurrence. These novel methods may have great potential for land suitability applications in the developing world where detailed land cover information is often unavailable or incomplete. This paper assesses the adaptation and application of the presence-only geographic species distribution model, MaxEnt, for agricultural crop suitability mapping in rural Thailand, where lowland paddy rice and upland field crops predominate. To assess this modeling approach, three independent crop presence datasets were used, including a social-demographic survey of farm households, a remote sensing classification of land use/land cover, and ground control points used for geodetic and thematic reference; these datasets vary in their geographic distribution and sample size. Disparate environmental data were integrated to characterize environmental settings across Nang Rong District, a region of approximately 1,300 sq. km in size. Results indicate that the MaxEnt model is capable of modeling crop suitability for upland and lowland crops, including rice varieties, although model results varied between datasets due to the high sensitivity of the model to the distribution of observed crop locations in geographic and environmental space. Accuracy assessments indicate that model outcomes were influenced by the sample size and the distribution of sample points in geographic and environmental space. The need for further research into accuracy assessments of presence-only models lacking true absence data is discussed. We conclude that the MaxEnt model can provide good estimates of crop suitability, but many areas need to be carefully scrutinized, including the geographic distribution of input data and the assessment methods, to ensure realistic modeling results. PMID:21860606
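MaxEnt itself is distributed as standalone software; a closely related and commonly used approximation is a presence-versus-background classifier, sketched below with simulated covariates and presence points (all names and values are hypothetical, not the Nang Rong data).

```python
# Presence-versus-background sketch (a common approximation to MaxEnt-style modeling),
# fit to simulated presence points over hypothetical environmental covariates.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
background = rng.uniform([0, 0], [1, 1], size=(1000, 2))   # (elevation, wetness), scaled 0-1
presence = rng.normal([0.2, 0.8], 0.08, size=(100, 2))     # hypothetical lowland, wet crop sites

X = np.vstack([presence, background])
y = np.concatenate([np.ones(len(presence)), np.zeros(len(background))])
model = LogisticRegression().fit(X, y)

grid = np.array([[0.2, 0.8], [0.8, 0.2]])                  # a lowland-wet cell vs an upland-dry cell
print(model.predict_proba(grid)[:, 1])                     # relative suitability scores
```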
Li, Zhengkai; Spaulding, Malcolm; French McCay, Deborah; Crowley, Deborah; Payne, James R
2017-01-15
An oil droplet size model was developed for a variety of turbulent conditions based on non-dimensional analysis of disruptive and restorative forces, which is applicable to oil droplet formation under both surface breaking-wave and subsurface-blowout conditions, with or without dispersant application. This new model was calibrated and successfully validated with droplet size data obtained from controlled laboratory studies of dispersant-treated and non-treated oil in subsea dispersant tank tests and field surveys, including the Deep Spill experimental release and the Deepwater Horizon blowout oil spill. This model is an advancement over prior models, as it explicitly addresses the effects of the dispersed-phase viscosity resulting from dispersant application, and constrains the maximum stable droplet size based on the Rayleigh-Taylor instability that is invoked for a release from a large aperture. Copyright © 2016 Elsevier Ltd. All rights reserved.
An investigation of modelling and design for software service applications
2017-01-01
Software services offer the opportunity to use a component-based approach for the design of applications. However, this needs a deeper understanding of how to develop service-based applications in a systematic manner, and of the set of properties that need to be included in the ‘design model’. We have used a realistic application to explore systematically how service-based designs can be created and described. We first identified the key properties of an SOA (service oriented architecture) and then undertook a single-case case study to explore its use in the development of a design for a large-scale application in energy engineering, modelling this with existing notations wherever possible. We evaluated the resulting design model using two walkthroughs with both domain and application experts. We were able to successfully develop a design model around the ten properties identified, and to describe it by adapting existing design notations. A component-based approach to designing such systems does appear to be feasible. However, it needs the assistance of a more integrated set of notations for describing the resulting design model. PMID:28489905
2017-04-13
...OmpSs: a basic algorithm on image processing applications, a mini application representative of an ocean modelling code, a parallel benchmark, and a communication avoiding version of the QR algorithm. Further, several improvements to the OmpSs model were... movement; and a port of the dynamic load balancing library to OmpSs. Finally, several updates to the tools infrastructure were accomplished, including: an...
Operation of the computer model for microenvironment atomic oxygen exposure
NASA Technical Reports Server (NTRS)
Bourassa, R. J.; Gillis, J. R.; Gruenbaum, P. E.
1995-01-01
A computer model for microenvironment atomic oxygen exposure has been developed to extend atomic oxygen modeling capability to include shadowing and reflections. The model uses average exposure conditions established by the direct exposure model and extends the application of these conditions to treat surfaces of arbitrary shape and orientation.
40 CFR 194.23 - Models and computer codes.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 26 2013-07-01 2013-07-01 false Models and computer codes. 194.23... General Requirements § 194.23 Models and computer codes. (a) Any compliance application shall include: (1... obtain stable solutions; (iv) Computer models accurately implement the numerical models; i.e., computer...
40 CFR 194.23 - Models and computer codes.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 26 2012-07-01 2011-07-01 true Models and computer codes. 194.23... General Requirements § 194.23 Models and computer codes. (a) Any compliance application shall include: (1... obtain stable solutions; (iv) Computer models accurately implement the numerical models; i.e., computer...
40 CFR 194.23 - Models and computer codes.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 25 2014-07-01 2014-07-01 false Models and computer codes. 194.23... General Requirements § 194.23 Models and computer codes. (a) Any compliance application shall include: (1... obtain stable solutions; (iv) Computer models accurately implement the numerical models; i.e., computer...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Portone, Teresa; Niederhaus, John Henry; Sanchez, Jason James
This report introduces the concepts of Bayesian model selection, which provides a systematic means of calibrating and selecting an optimal model to represent a phenomenon. This has many potential applications, including the comparison of constitutive models. The ideas described herein are applied to a model selection problem between different yield models for hardened steel under extreme loading conditions.
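The report's full Bayesian workflow is not reproduced here; a common large-sample shortcut is the BIC approximation to the Bayes factor, sketched below for two hypothetical yield-curve forms fitted to simulated data.

```python
# BIC approximation to Bayesian model comparison for two hypothetical yield-curve
# forms (linear vs. saturating) fitted to simulated stress-strain data.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)
strain = np.linspace(0.0, 0.2, 40)
stress = 900.0 * (1.0 - np.exp(-30.0 * strain)) + rng.normal(0, 15, strain.size)  # synthetic

def linear(x, a, b):     return a + b * x
def saturating(x, s, r): return s * (1.0 - np.exp(-r * x))

def bic(model, p0):
    p, _ = curve_fit(model, strain, stress, p0=p0, maxfev=10000)
    rss = np.sum((stress - model(strain, *p)) ** 2)
    n, k = strain.size, len(p0)
    return n * np.log(rss / n) + k * np.log(n)   # Gaussian-error BIC up to a constant

bics = {"linear": bic(linear, [0.0, 1.0]),
        "saturating": bic(saturating, [stress.max(), 10.0])}
print(bics)   # lower BIC ~ higher approximate posterior model probability
```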
40 CFR 194.23 - Models and computer codes.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 24 2010-07-01 2010-07-01 false Models and computer codes. 194.23... General Requirements § 194.23 Models and computer codes. (a) Any compliance application shall include: (1... obtain stable solutions; (iv) Computer models accurately implement the numerical models; i.e., computer...
40 CFR 194.23 - Models and computer codes.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 25 2011-07-01 2011-07-01 false Models and computer codes. 194.23... General Requirements § 194.23 Models and computer codes. (a) Any compliance application shall include: (1... obtain stable solutions; (iv) Computer models accurately implement the numerical models; i.e., computer...
NASA Technical Reports Server (NTRS)
Johnson, Barry
1992-01-01
The topics covered include the following: (1) CO2 laser kinetics modeling; (2) gas lifetimes in pulsed CO2 lasers; (3) frequency chirp and laser pulse spectral analysis; (4) LAWS A' Design Study; and (5) discharge circuit components for LAWS. The appendices include LAWS Memos, computer modeling of pulsed CO2 lasers for lidar applications, discharge circuit considerations for pulsed CO2 lidars, and a presentation made at the Code RC Review.
Design for inadvertent damage in composite laminates
NASA Technical Reports Server (NTRS)
Singhal, Surendra N.; Chamis, Christos C.
1992-01-01
Simplified predictive methods and models to computationally simulate durability and damage in polymer matrix composite materials/structures are described. The models include (1) progressive fracture, (2) progressively damaged structural behavior, (3) progressive fracture in aggressive environments, (4) stress concentrations, and (5) impact resistance. Several examples are included to illustrate applications of the models and to identify significant parameters and sensitivities. Comparisons with limited experimental data are made.
ABSTRACT
Proposed applications of increasingly sophisticated biologically-based computational models, such as physiologically-based pharmacokinetic (PBPK) models, raise the issue of how to evaluate whether the models are adequate for proposed uses including safety or risk ...
pyomocontrib_simplemodel v. 1.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hart, William
2017-03-02
Pyomo supports the formulation and analysis of mathematical models for complex optimization applications. This library extends the API of Pyomo to include a simple modeling representation: a list of objectives and constraints.
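For reference, here is a minimal sketch of the core Pyomo API that the simplemodel representation condenses into a list of objectives and constraints; the toy problem data and the GLPK solver choice are illustrative assumptions, not part of the package description.

```python
# Minimal sketch of the underlying Pyomo API that pyomocontrib_simplemodel wraps:
# a tiny linear program stated as an objective plus constraints.
# The problem data and solver choice ('glpk') are illustrative assumptions.
from pyomo.environ import (ConcreteModel, Var, Objective, Constraint,
                           NonNegativeReals, maximize, SolverFactory)

model = ConcreteModel()
model.x = Var(domain=NonNegativeReals)
model.y = Var(domain=NonNegativeReals)

# The objective and constraint list -- the pieces simplemodel lets you append directly.
model.profit = Objective(expr=3 * model.x + 2 * model.y, sense=maximize)
model.capacity = Constraint(expr=model.x + model.y <= 4)
model.labor = Constraint(expr=2 * model.x + model.y <= 6)

SolverFactory('glpk').solve(model)          # requires a GLPK installation
print(model.x.value, model.y.value)
```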
NASA Technical Reports Server (NTRS)
Campbell, William J.; Short, Nicholas M., Jr.; Roelofs, Larry H.; Dorfman, Erik
1991-01-01
A methodology for optimizing organization of data obtained by NASA earth and space missions is discussed. The methodology uses a concept based on semantic data modeling techniques implemented in a hierarchical storage model. The modeling is used to organize objects in mass storage devices, relational database systems, and object-oriented databases. The semantic data modeling at the metadata record level is examined, including the simulation of a knowledge base and semantic metadata storage issues. The semantic data model hierarchy and its application for efficient data storage is addressed, as is the mapping of the application structure to the mass storage.
Zhang, Zhenjun; Li, Yang; Liao, Zhenhua; Liu, Weiqiang
2016-12-01
Based on the application of finite element analysis in spine biomechanics, the research progress of the finite element method applied to lumbar spine mechanics is reviewed and its prospects are discussed. Related work, including lumbar ontology modeling, clinical application research, and occupational injury and protection, is summarized. The main research areas of the finite element method are as follows: more accurate modeling processes, optimized simulation methods, diversified evaluation of clinical effects, and the clinical application of artificial lumbar discs. In light of recent research progress, application prospects for the finite element method, such as automation and individualization of the modeling process, evaluation and analysis of new surgical methods, and simulation of mechanical damage and dynamic response, are discussed. The purpose of this paper is to provide theoretical reference and practical guidance for clinical lumbar problems by reviewing the application of the finite element method in the field of lumbar spine biomechanics.
Hansen, Trine Lund; Bhander, Gurbakhash S; Christensen, Thomas Højlund; Bruun, Sander; Jensen, Lars Stoumann
2006-04-01
A model capable of quantifying the potential environmental impacts of agricultural application of composted or anaerobically digested source-separated organic municipal solid waste (MSW) is presented. In addition to the direct impacts, the model accounts for savings by avoiding the production and use of commercial fertilizers. The model is part of a larger model, Environmental Assessment of Solid Waste Systems and Technology (EASEWASTE), developed as a decision-support model, focusing on assessment of alternative waste management options. The environmental impacts of the land application of processed organic waste are quantified by emission coefficients referring to the composition of the processed waste and related to specific crop rotation as well as soil type. The model contains several default parameters based on literature data, field experiments and modelling by the agro-ecosystem model, Daisy. All data can be modified by the user allowing application of the model to other situations. A case study including four scenarios was performed to illustrate the use of the model. One tonne of nitrogen in composted and anaerobically digested MSW was applied as fertilizer to loamy and sandy soil at a plant farm in western Denmark. Application of the processed organic waste mainly affected the environmental impact categories global warming (0.4-0.7 PE), acidification (-0.06 (saving)-1.6 PE), nutrient enrichment (-1.0 (saving)-3.1 PE), and toxicity. The main contributors to these categories were nitrous oxide formation (global warming), ammonia volatilization (acidification and nutrient enrichment), nitrate losses (nutrient enrichment and groundwater contamination), and heavy metal input to soil (toxicity potentials). The local agricultural conditions as well as the composition of the processed MSW showed large influence on the environmental impacts. A range of benefits, mainly related to improved soil quality from long-term application of the processed organic waste, could not be generally quantified with respect to the chosen life cycle assessment impact categories and were therefore not included in the model. These effects should be considered in conjunction with the results of the life cycle assessment.
Knowledge Interaction Design for Creative Knowledge Work
NASA Astrophysics Data System (ADS)
Nakakoji, Kumiyo; Yamamoto, Yasuhiro
This paper describes our approach for the development of application systems for creative knowledge work, particularly for early stages of information design tasks. Being a cognitive tool serving as a means of externalization, an application system affects how the user is engaged in the creative process through its visual interaction design. Knowledge interaction design, described in this paper, is a framework in which a set of application systems for different information design domains is developed based on an interaction model, which is designed for a particular model of a thinking process. We have developed two sets of application systems using the knowledge interaction design framework: one includes systems for linear information design, such as writing, movie-editing, and video-analysis; the other includes systems for network information design, such as file-system navigation and hypertext authoring. Our experience shows that the resulting systems encourage users to follow a certain cognitive path through a graceful user experience.
Earth Survey Applications Division. [a bibliography
NASA Technical Reports Server (NTRS)
Carpenter, L. (Editor)
1981-01-01
Accomplishments of research and data analysis conducted to study physical parameters and processes inside the Earth and on the Earth's surface, to define techniques and systems for remotely sensing the processes and measuring the parameters of scientific and applications interest, and the transfer of promising operational applications techniques to the user community of Earth resources monitors, managers, and decision makers are described. Research areas covered include: geobotany, magnetic field modeling, crustal studies, crustal dynamics, sea surface topography, land resources, remote sensing of vegetation and soils, and hydrological sciences. Major accomplishments include: production of global maps of magnetic anomalies using Magsat data; computation of the global mean sea surface using GEOS-3 and Seasat altimetry data; delineation of the effects of topography on the interpretation of remotely-sensed data; application of snowmelt runoff models to water resources management; and mapping of snow depth over wheat growing areas using Nimbus microwave data.
Environment Modeling Using Runtime Values for JPF-Android
NASA Technical Reports Server (NTRS)
van der Merwe, Heila; Tkachuk, Oksana; Nel, Seal; van der Merwe, Brink; Visser, Willem
2015-01-01
Software applications are developed to be executed in a specific environment. This environment includes external native libraries to add functionality to the application and drivers to fire the application execution. For testing and verification, the environment of an application is abstracted and simplified using models or stubs. Empty stubs, returning default values, are simple to generate automatically, but they do not perform well when the application expects specific return values. Symbolic execution is used to find input parameters for drivers and return values for library stubs, but it struggles to detect the values of complex objects. In this work-in-progress paper, we explore an approach to generate drivers and stubs based on values collected during runtime instead of using default values. Entry-points and methods that need to be modeled are instrumented to log their parameters and return values. The instrumented applications are then executed using a driver and instrumented libraries. The values collected during runtime are used to generate driver and stub values on-the-fly that improve coverage during verification by enabling the execution of code that previously crashed or was missed. We are implementing this approach to improve the environment model of JPF-Android, our model checking and analysis tool for Android applications.
Uncertainty, ensembles and air quality dispersion modeling: applications and challenges
NASA Astrophysics Data System (ADS)
Dabberdt, Walter F.; Miller, Erik
The past two decades have seen significant advances in mesoscale meteorological modeling research and applications, such as the development of sophisticated and now widely used advanced mesoscale prognostic models, large eddy simulation models, four-dimensional data assimilation, adjoint models, adaptive and targeted observational strategies, and ensemble and probabilistic forecasts. Some of these advances are now being applied to urban air quality modeling and applications. Looking forward, it is anticipated that the high-priority air quality issues for the near-to-intermediate future will likely include: (1) routine operational forecasting of adverse air quality episodes; (2) real-time high-level support to emergency response activities; and (3) quantification of model uncertainty. Special attention is focused here on the quantification of model uncertainty through the use of ensemble simulations. Application to emergency-response dispersion modeling is illustrated using an actual event that involved the accidental release of the toxic chemical oleum. Both surface footprints of mass concentration and the associated probability distributions at individual receptors are seen to provide valuable quantitative indicators of the range of expected concentrations and their associated uncertainty.
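A minimal sketch of how ensemble output can be summarized at a single receptor, in the spirit of the probability distributions mentioned above; the member concentrations and the threshold are synthetic placeholders.

```python
# Sketch: summarizing ensemble dispersion-model output at a single receptor.
# The member concentrations and the health threshold are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(1)
members = rng.lognormal(mean=np.log(50.0), sigma=0.6, size=30)  # ug/m3 from 30 runs

threshold = 100.0  # illustrative level of concern
p10, p50, p90 = np.percentile(members, [10, 50, 90])
prob_exceed = np.mean(members > threshold)

print(f"median {p50:.0f} ug/m3, 10-90% range {p10:.0f}-{p90:.0f} ug/m3")
print(f"probability of exceeding {threshold:.0f} ug/m3: {prob_exceed:.2f}")
```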
Nonlinear modeling of chaotic time series: Theory and applications
NASA Astrophysics Data System (ADS)
Casdagli, M.; Eubank, S.; Farmer, J. D.; Gibson, J.; Desjardins, D.; Hunter, N.; Theiler, J.
We review recent developments in the modeling and prediction of nonlinear time series. In some cases, apparent randomness in time series may be due to chaotic behavior of a nonlinear but deterministic system. In such cases, it is possible to exploit the determinism to make short term forecasts that are much more accurate than one could make from a linear stochastic model. This is done by first reconstructing a state space, and then using nonlinear function approximation methods to create a dynamical model. Nonlinear models are valuable not only as short term forecasters, but also as diagnostic tools for identifying and quantifying low-dimensional chaotic behavior. During the past few years, methods for nonlinear modeling have developed rapidly, and have already led to several applications where nonlinear models motivated by chaotic dynamics provide superior predictions to linear models. These applications include prediction of fluid flows, sunspots, mechanical vibrations, ice ages, measles epidemics, and human speech.
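A minimal sketch of the two steps named above, state-space (delay-coordinate) reconstruction followed by a nonlinear nearest-neighbor forecast, applied to a synthetic chaotic series; the embedding parameters and the logistic-map data are illustrative assumptions.

```python
# Sketch: delay-coordinate reconstruction plus nearest-neighbor forecasting
# for a chaotic series (logistic map). Embedding parameters are illustrative.
import numpy as np

# Generate a chaotic time series.
x = np.empty(600)
x[0] = 0.4
for t in range(599):
    x[t + 1] = 3.9 * x[t] * (1.0 - x[t])

dim, tau = 3, 1                      # embedding dimension and delay
train, test_idx = x[:500], 520       # forecast the value following index 520

def embed(series, dim, tau):
    n = len(series) - (dim - 1) * tau
    return np.column_stack([series[i * tau:i * tau + n] for i in range(dim)])

states = embed(train, dim, tau)                 # reconstructed state space
targets = train[(dim - 1) * tau + 1:]           # next value for each state
states = states[:-1]                            # align states with targets

query = x[test_idx - (dim - 1) * tau:test_idx + 1:tau]   # current state
nearest = np.argmin(np.linalg.norm(states - query, axis=1))
print("forecast:", targets[nearest], "actual:", x[test_idx + 1])
```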
Skill Assessment for Coupled Biological/Physical Models of Marine Systems.
Stow, Craig A; Jolliff, Jason; McGillicuddy, Dennis J; Doney, Scott C; Allen, J Icarus; Friedrichs, Marjorie A M; Rose, Kenneth A; Wallhead, Philip
2009-02-20
Coupled biological/physical models of marine systems serve many purposes including the synthesis of information, hypothesis generation, and as a tool for numerical experimentation. However, marine system models are increasingly used for prediction to support high-stakes decision-making. In such applications it is imperative that a rigorous model skill assessment is conducted so that the model's capabilities are tested and understood. Herein, we review several metrics and approaches useful to evaluate model skill. The definition of skill and the determination of the skill level necessary for a given application is context specific and no single metric is likely to reveal all aspects of model skill. Thus, we recommend the use of several metrics, in concert, to provide a more thorough appraisal. The routine application and presentation of rigorous skill assessment metrics will also serve the broader interests of the modeling community, ultimately resulting in improved forecasting abilities as well as helping us recognize our limitations.
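A minimal sketch of a few skill metrics commonly used in such assessments (bias, RMSE, correlation, and a Nash-Sutcliffe-style modeling efficiency); the observation and prediction arrays are synthetic placeholders, and the metric set is not the paper's complete list.

```python
# Sketch: a few common model-skill metrics for paired observation/prediction
# series. The example arrays are synthetic placeholders.
import numpy as np

obs = np.array([2.1, 3.4, 4.0, 5.2, 6.8, 7.1])
mod = np.array([2.4, 3.1, 4.3, 5.0, 6.2, 7.9])

bias = np.mean(mod - obs)
rmse = np.sqrt(np.mean((mod - obs) ** 2))
corr = np.corrcoef(obs, mod)[0, 1]
# Modeling efficiency (Nash-Sutcliffe style): 1 is perfect, <0 is worse than the mean.
mef = 1.0 - np.sum((mod - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

print(f"bias={bias:.2f} rmse={rmse:.2f} r={corr:.2f} MEF={mef:.2f}")
```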
In vitro skin models and tissue engineering protocols for skin graft applications.
Naves, Lucas B; Dhand, Chetna; Almeida, Luis; Rajamani, Lakshminarayanan; Ramakrishna, Seeram
2016-11-30
In this review, we present a brief introduction to the skin structure, a concise compilation of skin-related disorders, and a thorough discussion of different in vitro skin models, artificial skin substitutes, skin grafts, and dermal tissue engineering protocols. The advantages of the development of in vitro skin disorder models, such as UV radiation and the prototype model, melanoma model, wound healing model, psoriasis model, and full-thickness model, are also discussed. Different types of skin grafts, including allografts, autografts, allogeneic, and xenogeneic grafts, are described in detail with their associated applications. We also discuss different tissue engineering protocols for the design of various types of skin substitutes and their commercial outcomes. Brief highlights are given of the new generation of three-dimensional printed scaffolds for tissue regeneration applications. © 2016 The Author(s). Published by Portland Press Limited on behalf of the Biochemical Society.
Fortran Programs for Weapon Systems Analysis
1990-06-01
interested in ballistics and related work. The programs include skeletal combat models, a set of discrete-event timing routines, mathematical and... 4.3 LinEqs: Solve Linear Equations Like a Textbook ...military applications as it is of computer science. This crisis occurs in all fields, including the modeling of logistics, mobility, ballistics, and combat
Design and implementation of space physics multi-model application integration based on web
NASA Astrophysics Data System (ADS)
Jiang, Wenping; Zou, Ziming
With the development of research on the space environment and space science, providing a networked online computing environment for space weather, space environment, and space physics models to the Chinese scientific community has become increasingly important in recent years. Currently, there are two software modes for a space physics multi-model application integrated system (SPMAIS): C/S and B/S. The C/S mode, which is traditional and stand-alone, requires a team or workshop from many disciplines and specialties to build its own multi-model application integrated system, and the client must be deployed in different physical regions when users visit the integrated system. This requirement brings two shortcomings: it reduces the efficiency of researchers who use the models to compute, and it makes accessing the data inconvenient. Therefore, it is necessary to create a shared network resource access environment that helps users quickly reach the computing resources of space physics models through a terminal for conducting space science research and forecasting the space environment. The SPMAIS is developed in B/S mode around high-performance, first-principles computational models of the space environment and uses these models to predict "Space Weather", to understand space mission data, and to further our understanding of the solar system. The main goal of the SPMAIS is to provide an easy and convenient user-driven online model operating environment. Up to now, the SPMAIS contains dozens of space environment models, including the international AP8/AE8, IGRF, and T96 models, as well as a solar proton prediction model, a geomagnetic transmission model, and others developed by Chinese scientists. Another function of the SPMAIS is to integrate space observation data sets, which provide input data for online high-speed model computing. In this paper, the service-oriented architecture (SOA) concept, which divides the system into independent modules according to different business needs, is applied to solve the problem of physical separation between multiple models. The classic MVC (Model View Controller) software design pattern is used to build the architecture of the SPMAIS. JSP + servlet + JavaBean technology is used to integrate the web application programs of the space physics multi-model system; it solves the problem of multiple users requesting the same model-computing job and effectively balances the computing tasks of each server. In addition, we complete the following tasks: establishing a standard graphical user interface based on Java Applet application programs; designing the interface between model computing and the visualization of model computing results; realizing three-dimensional network visualization without plug-ins; using Java3D technology to achieve three-dimensional network scene interaction; and improving the ability to interact with web pages and dynamic execution capabilities, including rendering of three-dimensional graphics and control of fonts and colors. Through the design and implementation of the web-based SPMAIS, we provide an online computing and application runtime environment for space physics multi-models. Practical application shows that researchers can benefit from our system in space physics research and engineering applications.
NASA Astrophysics Data System (ADS)
Darema, F.
2016-12-01
InfoSymbiotics/DDDAS embodies the power of Dynamic Data Driven Applications Systems (DDDAS), a concept whereby an executing application model is dynamically integrated, in a feedback loop, with the real-time data-acquisition and control components, as well as other data sources of the application system. Advanced capabilities can be created through such new computational approaches in modeling and simulation, and in instrumentation methods, and include enhancing the accuracy of the application model and speeding up the computation to allow faster and more comprehensive models of a system, creating decision support systems with the accuracy of full-scale simulations. In addition, the notion of controlling instrumentation processes by the executing application results in more efficient management of application data and addresses the challenge of how to architect and dynamically manage large sets of heterogeneous sensors and controllers, an advance over the static and ad hoc approaches of today; with DDDAS these sets of resources can be managed adaptively and in optimized ways. Large-Scale-Dynamic-Data encompasses the next wave of Big Data, namely dynamic data arising from ubiquitous sensing and control in engineered, natural, and societal systems, through multitudes of heterogeneous sensors and controllers instrumenting these systems, where opportunities and challenges at these "large scales" relate not only to data size but also to heterogeneity in data, data collection modalities, fidelities, and timescales, ranging from real-time to archival data. In tandem with this important dimension of dynamic data, there is an extended view of Big Computing, which includes the collective computing by networked assemblies of multitudes of sensors and controllers; these resources, ranging from the high end to the real time, are seamlessly integrated and unified, comprising Large-Scale-Big-Computing. InfoSymbiotics/DDDAS engenders transformative impact in many application domains, ranging from the nano-scale to the terra-scale and to the extra-terra-scale. The talk will address opportunities for new capabilities together with corresponding research challenges, with illustrative examples from several application areas including environmental sciences, geosciences, and space sciences.
NASA Data for Water Resources Applications
NASA Technical Reports Server (NTRS)
Toll, David; Houser, Paul; Arsenault, Kristi; Entin, Jared
2004-01-01
Water Management Applications is one of twelve elements in the Earth Science Enterprise's National Applications Program. NASA Goddard Space Flight Center is supporting the Applications Program by partnering with other organizations to use NASA project results, such as those from satellite instruments and Earth system models, to enhance the organizations' critical needs. The focus thus far has been: 1) estimating water storage, including snowpack and soil moisture; 2) modeling and predicting water fluxes such as evapotranspiration (ET), precipitation, and river runoff; and 3) remote sensing of water quality, including both point source (e.g., turbidity and productivity) and non-point source (e.g., land cover conversion such as forest to agriculture yielding higher nutrient runoff). The objectives of the partnering cover three steps: 1) Evaluation, 2) Verification and Validation, and 3) Benchmark Report. We are working with U.S. federal agencies including the Environmental Protection Agency (EPA), the Bureau of Reclamation (USBR), and the Department of Agriculture (USDA). We are using several of their Decision Support System (DSS) tools, including BASINS used by EPA, Riverware and the AWARDS ET ToolBox used by USBR, and SWAT used by USDA and EPA. Regional application sites using NASA data across the U.S. are currently being evaluated for the DSS tools. The NASA data emphasized thus far are from the Land Data Assimilation Systems (LDAS) and MODIS satellite products. We are currently in the first two steps of evaluation and verification and validation.
The atmospheric boundary layer — advances in knowledge and application
NASA Astrophysics Data System (ADS)
Garratt, J. R.; Hess, G. D.; Physick, W. L.; Bougeault, P.
1996-02-01
We summarise major activities and advances in boundary-layer knowledge in the 25 years since 1970, with emphasis on the application of this knowledge to surface and boundary-layer parametrisation schemes in numerical models of the atmosphere. Progress in three areas is discussed: (i) the mesoscale modelling of selected phenomena; (ii) numerical weather prediction; and (iii) climate simulations. Future trends are identified, including the incorporation into models of advanced cloud schemes and interactive canopy schemes, and the nesting of high resolution boundary-layer schemes in global climate models.
Assessment and application of Reynolds stress closure models to high-speed compressible flows
NASA Technical Reports Server (NTRS)
Gatski, T. B.; Sarkar, S.; Speziale, C. G.; Balakrishnan, L.; Abid, R.; Anderson, E. C.
1990-01-01
The paper presents results from the development of higher order closure models for the phenomenological modeling of high-speed compressible flows. The work presented includes the introduction of an improved pressure-strain correlation model applicable in both the low- and high-speed regimes, as well as modifications to the isotropic dissipation rate to account for dilatational effects. Finally, the question of stiffness commonly associated with the solution of two-equation and Reynolds stress transport equations in wall-bounded flows is examined, and ways of relaxing these restrictions are discussed.
Common IED exploitation target set ontology
NASA Astrophysics Data System (ADS)
Russomanno, David J.; Qualls, Joseph; Wowczuk, Zenovy; Franken, Paul; Robinson, William
2010-04-01
The Common IED Exploitation Target Set (CIEDETS) ontology provides a comprehensive semantic data model for capturing knowledge about sensors, platforms, missions, environments, and other aspects of systems under test. The ontology also includes representative IEDs; modeled as explosives, camouflage, concealment objects, and other background objects, which comprise an overall threat scene. The ontology is represented using the Web Ontology Language and the SPARQL Protocol and RDF Query Language, which ensures portability of the acquired knowledge base across applications. The resulting knowledge base is a component of the CIEDETS application, which is intended to support the end user sensor test and evaluation community. CIEDETS associates a system under test to a subset of cataloged threats based on the probability that the system will detect the threat. The associations between systems under test, threats, and the detection probabilities are established based on a hybrid reasoning strategy, which applies a combination of heuristics and simplified modeling techniques. Besides supporting the CIEDETS application, which is focused on efficient and consistent system testing, the ontology can be leveraged in a myriad of other applications, including serving as a knowledge source for mission planning tools.
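A minimal sketch of storing and querying a small RDF graph with SPARQL via the rdflib library; the namespace, classes, and properties (Sensor, detects, detectionProbability) are hypothetical stand-ins, not the actual CIEDETS vocabulary.

```python
# Sketch: querying a small RDF graph with SPARQL via rdflib. The namespace,
# classes, and properties are hypothetical stand-ins, not the CIEDETS ontology.
from rdflib import Graph, Namespace, Literal, RDF

EX = Namespace("http://example.org/ciedets#")
g = Graph()
g.add((EX.SensorA, RDF.type, EX.Sensor))
g.add((EX.SensorA, EX.detects, EX.Threat1))
g.add((EX.SensorA, EX.detectionProbability, Literal(0.8)))

results = g.query("""
    PREFIX ex: <http://example.org/ciedets#>
    SELECT ?sensor ?threat ?p WHERE {
        ?sensor a ex:Sensor ;
                ex:detects ?threat ;
                ex:detectionProbability ?p .
    }
""")
for sensor, threat, p in results:
    print(sensor, threat, p)
```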
Detailed Modeling of Physical Processes in Electron Sources for Accelerator Applications
NASA Astrophysics Data System (ADS)
Chubenko, Oksana; Afanasev, Andrei
2017-01-01
At present, electron sources are essential in a wide range of applications - from common technical use to exploring the nature of matter. Depending on the application requirements, different methods and materials are used to generate electrons. State-of-the-art accelerator applications set a number of often-conflicting requirements for electron sources (e.g., quantum efficiency vs. polarization, current density vs. lifetime, etc). Development of advanced electron sources includes modeling and design of cathodes, material growth, fabrication of cathodes, and cathode testing. The detailed simulation and modeling of physical processes is required in order to shed light on the exact mechanisms of electron emission and to develop new-generation electron sources with optimized efficiency. The purpose of the present work is to study physical processes in advanced electron sources and develop scientific tools, which could be used to predict electron emission from novel nano-structured materials. In particular, the area of interest includes bulk/superlattice gallium arsenide (bulk/SL GaAs) photo-emitters and nitrogen-incorporated ultrananocrystalline diamond ((N)UNCD) photo/field-emitters. Work supported by The George Washington University and Euclid TechLabs LLC.
Complete modeling of rotary ultrasonic motors actuated by traveling flexural waves
NASA Astrophysics Data System (ADS)
Bao, Xiaoqi; Bar-Cohen, Yoseph
2000-06-01
Ultrasonic rotary motors have the potential to meet NASA needs, and they are being developed as actuators for miniature telerobotic applications. These motors are being adapted for operation in harsh space environments that include cryogenic temperatures and vacuum, and analytical tools for the design of efficient motors are being developed. A hybrid analytical model was developed to address a complete ultrasonic motor as a system. Included in this model is the influence of the rotor dynamics, which was determined experimentally to be important to the motor performance. The analysis employs a 3D finite element model to express the dynamic characteristics of the stator with piezoelectric elements and the rotor. The details of the stator, including the teeth, piezoelectric ceramic, geometry, bonding layer, etc., are included to support practical USM designs. A brush model is used for the interface layer and Coulomb's law for the friction between the stator and the rotor. The theoretical predictions were corroborated experimentally for the motor. In parallel, efforts have been made to determine the thermal and vacuum performance of these motors. To explore telerobotic applications for USMs, a robotic arm was constructed with such motors.
Auditory models for speech analysis
NASA Astrophysics Data System (ADS)
Maybury, Mark T.
This paper reviews the psychophysical basis for auditory models and discusses their application to automatic speech recognition. First an overview of the human auditory system is presented, followed by a review of current knowledge gleaned from neurological and psychoacoustic experimentation. Next, a general framework describes established peripheral auditory models which are based on well-understood properties of the peripheral auditory system. This is followed by a discussion of current enhancements to these models to include nonlinearities and synchrony information as well as other higher auditory functions. Finally, the initial performance of auditory models in the task of speech recognition is examined and additional applications are mentioned.
Energy Models and the Policy Process.
ERIC Educational Resources Information Center
De Man, Reinier
1983-01-01
Describes the function of econometric and technological models in the policy process, and shows how different positions in the Dutch energy discussion are reflected by the application of different model methodologies. Discussion includes the energy policy context, a conceptual framework for using energy models, and energy scenarios in policy…
A theoretical study of radar return and radiometric emission from the sea
NASA Technical Reports Server (NTRS)
Peake, W. H.
1972-01-01
The applicability of the various electromagnetic models of scattering from the ocean are reviewed. These models include the small perturbation method, the geometric optics solution, the composite model, and the exact integral equation solution. The restrictions on the electromagnetic models are discussed.
Modeling and Analysis of Power Processing Systems (MAPPS). Volume 1: Technical report
NASA Technical Reports Server (NTRS)
Lee, F. C.; Rahman, S.; Carter, R. A.; Wu, C. H.; Yu, Y.; Chang, R.
1980-01-01
Computer aided design and analysis techniques were applied to power processing equipment. Topics covered include: (1) discrete time domain analysis of switching regulators for performance analysis; (2) design optimization of power converters using augmented Lagrangian penalty function technique; (3) investigation of current-injected multiloop controlled switching regulators; and (4) application of optimization for Navy VSTOL energy power system. The generation of the mathematical models and the development and application of computer aided design techniques to solve the different mathematical models are discussed. Recommendations are made for future work that would enhance the application of the computer aided design techniques for power processing systems.
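A minimal sketch of the generic augmented Lagrangian penalty-function technique named in item (2), applied to a toy equality-constrained problem rather than the MAPPS converter-design formulation; the objective, constraint, and penalty weight are illustrative assumptions.

```python
# Sketch: an augmented Lagrangian loop for a toy equality-constrained problem,
# minimize (x-2)^2 + (y-1)^2 subject to x + y = 2. Not the MAPPS converter-design
# formulation -- just the generic technique the report names.
import numpy as np
from scipy.optimize import minimize

def objective(z):
    x, y = z
    return (x - 2.0) ** 2 + (y - 1.0) ** 2

def constraint(z):          # h(z) = 0 at feasibility
    return z[0] + z[1] - 2.0

lam, rho = 0.0, 10.0        # multiplier estimate and penalty weight
z = np.array([0.0, 0.0])

for _ in range(20):
    aug = lambda z: objective(z) + lam * constraint(z) + 0.5 * rho * constraint(z) ** 2
    z = minimize(aug, z).x                 # inner unconstrained solve
    lam += rho * constraint(z)             # multiplier update
    if abs(constraint(z)) < 1e-8:
        break

print("solution:", z, "constraint violation:", constraint(z))
```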
Biologically-based pharmacokinetic models are being increasingly used in the risk assessment of environmental chemicals. These models are based on biological, mathematical, statistical and engineering principles. Their potential uses in risk assessment include extrapolation betwe...
Regional CMS Modeling: Southwest Florida Gulf Coast
2016-05-01
by Kelly R. Legault and Tanya M. Beck PURPOSE: This Coastal and Hydraulics Engineering technical note (CHETN) describes a regional application of...the U.S. Army Corps of Engineers (USACE) Engineer Research and Development Center (ERDC) Coastal Modeling System (CMS). This application spans three...Active federal projects include the Pinellas County Shore Protection Project (SPP), Tampa Harbor Deepening Project, Manatee County SPP at Anna Maria
Multi-sensor Improved Sea-Surface Temperature (MISST) for IOOS - Navy Component
2013-09-30
application and data fusion techniques. 2. Parameterization of IR and MW retrieval differences, with consideration of diurnal warming and cool-skin effects...associated retrieval confidence, standard deviation (STD), and diurnal warming estimates to the application user community in the new GDS 2.0 GHRSST...including coral reefs, ocean modeling in the Gulf of Mexico, improved lake temperatures, numerical data assimilation by ocean models, numerical
NASA Technical Reports Server (NTRS)
Taylor, Lawrence W., Jr.; Rajiyah, H.
1991-01-01
Partial differential equations for modeling the structural dynamics and control systems of flexible spacecraft are applied here in order to facilitate systems analysis and optimization of these spacecraft. Example applications are given, including the structural dynamics of SCOLE, the Solar Array Flight Experiment, the Mini-MAST truss, and the LACE satellite. The development of related software is briefly addressed.
Influence of safety measures on the risks of transporting dangerous goods through road tunnels.
Saccomanno, Frank; Haastrup, Palle
2002-12-01
Quantitative risk assessment (QRA) models are used to estimate the risks of transporting dangerous goods and to assess the merits of introducing alternative risk reduction measures for different transportation scenarios and assumptions. A comprehensive QRA model recently was developed in Europe for application to road tunnels. This model can assess the merits of a limited number of "native safety measures." In this article, we introduce a procedure for extending its scope to include the treatment of a number of important "nonnative safety measures" of interest to tunnel operators and decisionmakers. Nonnative safety measures were not included in the original model specification. The suggested procedure makes use of expert judgment and Monte Carlo simulation methods to model uncertainty in the revised risk estimates. The results of a case study application are presented that involve the risks of transporting a given volume of flammable liquid through a 10-km road tunnel.
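A minimal sketch of the Monte Carlo step: propagating expert-judgment uncertainty through a simple frequency-times-consequence risk estimate for a tunnel transport scenario; all distributions and parameter values are illustrative placeholders, not the QRA model's.

```python
# Sketch: Monte Carlo propagation of expert-judgment uncertainty through a
# simple risk estimate (expected fatalities per year = frequency x consequence).
# All distributions and parameter values are illustrative placeholders.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Expert-elicited uncertainty on accident frequency (per vehicle-km) and on the
# effect of a new safety measure (fractional reduction in consequences).
frequency = rng.lognormal(mean=np.log(1e-8), sigma=0.5, size=n)
consequence = rng.triangular(left=1, mode=5, right=30, size=n)   # fatalities/event
measure_effect = rng.uniform(0.2, 0.6, size=n)                   # 20-60% reduction

traffic = 2.0e6  # dangerous-goods vehicle-km per year through the tunnel
risk = frequency * traffic * consequence * (1.0 - measure_effect)

p5, p50, p95 = np.percentile(risk, [5, 50, 95])
print(f"expected fatalities/yr: median {p50:.2e}, 90% interval {p5:.2e}-{p95:.2e}")
```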
Review of Zero-D and 1-D Models of Blood Flow in the Cardiovascular System
2011-01-01
Background Zero-dimensional (lumped parameter) and one-dimensional models, based on simplified representations of the components of the cardiovascular system, can contribute strongly to our understanding of circulatory physiology. Zero-D models provide a concise way to evaluate the haemodynamic interactions among the cardiovascular organs, whilst one-D (distributed parameter) models add the facility to represent efficiently the effects of pulse wave transmission in the arterial network at greatly reduced computational expense compared to higher dimensional computational fluid dynamics studies. There is extensive literature on both types of models. Method and Results The purpose of this review article is to summarise published 0D and 1D models of the cardiovascular system, to explore their limitations and range of application, and to provide an indication of the physiological phenomena that can be included in these representations. The review of 0D models collects together in one place a description of the range of models that have been used to describe the various characteristics of cardiovascular response, together with the factors that influence it. Such models generally feature the major components of the system, such as the heart, the heart valves and the vasculature. The models are categorised in terms of the features of the system that they are able to represent, their complexity and range of application: representations of effects including pressure-dependent vessel properties, interaction between the heart chambers, neuro-regulation and auto-regulation are explored. The examination of 1D models covers various methods for the assembly, discretisation and solution of the governing equations, in conjunction with a report of the definition and treatment of boundary conditions. Increasingly, 0D and 1D models are used in multi-scale models, in which their primary role is to provide boundary conditions for sophisticated, and often patient-specific, 2D and 3D models, and this application is also addressed. As an example of 0D cardiovascular modelling, a small selection of simple models has been represented in the CellML mark-up language and uploaded to the CellML model repository http://models.cellml.org/. They are freely available to the research and education communities. Conclusion Each published cardiovascular model has merit for particular applications. This review categorises 0D and 1D models, highlights their advantages and disadvantages, and thus provides guidance on the selection of models to assist various cardiovascular modelling studies. It also identifies directions for further development, as well as current challenges in the wider use of these models, including their use to provide boundary conditions for local 3D models and their translation to clinical application. PMID:21521508
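As a concrete example of the simplest 0D representation discussed above, the following sketch integrates a two-element Windkessel model, C*dP/dt = Q_in(t) - P/R; the parameter values and the inflow waveform are illustrative, not patient-specific.

```python
# Sketch: a two-element Windkessel, the simplest 0D (lumped-parameter) model of
# the arterial system: C dP/dt = Q_in(t) - P/R. Parameter values and the inflow
# waveform are illustrative, not patient-specific.
import numpy as np
from scipy.integrate import solve_ivp

R = 1.0    # peripheral resistance [mmHg*s/mL]
C = 1.5    # arterial compliance   [mL/mmHg]
T = 0.8    # cardiac period [s]

def q_in(t):
    """Pulsatile inflow: half-sine ejection during the first 0.3 s of each beat."""
    phase = t % T
    return 400.0 * np.sin(np.pi * phase / 0.3) if phase < 0.3 else 0.0

def dpdt(t, p):
    return [(q_in(t) - p[0] / R) / C]

sol = solve_ivp(dpdt, t_span=(0.0, 8.0), y0=[80.0], max_step=0.005)
print(f"pressure range over last beats: {sol.y[0][-200:].min():.0f}-"
      f"{sol.y[0][-200:].max():.0f} mmHg")
```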
Progressive Damage Modeling of Durable Bonded Joint Technology
NASA Technical Reports Server (NTRS)
Leone, Frank A.; Davila, Carlos G.; Lin, Shih-Yung; Smeltzer, Stan; Girolamo, Donato; Ghose, Sayata; Guzman, Juan C.; McCarville, Duglas A.
2013-01-01
The development of durable bonded joint technology for assembling composite structures for launch vehicles is being pursued for the U.S. Space Launch System. The present work is related to the development and application of progressive damage modeling techniques to bonded joint technology applicable to a wide range of sandwich structures for a Heavy Lift Launch Vehicle. The joint designs studied in this work include a conventional composite splice joint and a NASA-patented Durable Redundant Joint. Both designs involve a honeycomb sandwich with carbon/epoxy facesheets joined with adhesively bonded doublers. Progressive damage modeling allows for the prediction of the initiation and evolution of damage. For structures that include multiple materials, the number of potential failure mechanisms that must be considered increases the complexity of the analyses. Potential failure mechanisms include fiber fracture, matrix cracking, delamination, core crushing, adhesive failure, and their interactions. The joints were modeled using Abaqus parametric finite element models, in which damage was modeled with user-written subroutines. Each ply was meshed discretely, and layers of cohesive elements were used to account for delaminations and to model the adhesive layers. Good correlation with experimental results was achieved both in terms of load-displacement history and predicted failure mechanisms.
Metabolomics in Small Fish Toxicology: Assessing the Impacts of Model EDCs
Although lagging behind applications targeted to human endpoints, metabolomics offers great potential in environmental applications, including ecotoxicology. Indeed, the advantages of metabolomics (relative to other ‘omic techniques) may be more tangible in ecotoxicology because...
ERIC Educational Resources Information Center
Cocking, Rodney R.; Mestre, Jose P.
The focus of this paper is on cognitive science as a model for understanding the application of human skills toward effective problem-solving. Sections include: (1) "Introduction" (discussing information processing framework, expert-novice distinctions, schema theory, and learning process); (2) "Application: The Expert-Novice…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-25
... principles, specifically the Instructional Theory Into Practice (ITIP) model. The applicant team must include... commitment to work within the proposed budget. In addition to the narrative and attachments, the applicant...
Strawman payload data for science and applications space platforms
NASA Technical Reports Server (NTRS)
1980-01-01
The need for a free flying science and applications space platform to host compatible long duration experiment groupings in Earth orbit is discussed. Experiment level information on strawman payload models is presented which serves to identify and quantify the requirements for the space platform system. A description data base on the strawman payload model is presented along with experiment level and group level summaries. Payloads identified in the strawman model include the disciplines of resources observations and environmental observations.
NASA Astrophysics Data System (ADS)
Ogawa, Tatsuhiko; Hashimoto, Shintaro; Sato, Tatsuhiko; Niita, Koji
2014-06-01
A new nuclear de-excitation model, intended for accurate simulation of isomeric transition of excited nuclei, was incorporated into PHITS and applied to various situations to clarify the impact of the model. The case studies show that precise treatment of gamma de-excitation and consideration for isomer production are important for various applications such as detector performance prediction, radiation shielding calculations and the estimation of radioactive inventory including isomers.
2000-06-20
smoothing and regression (which includes curve fitting) are two principal forecasting model types utilized in the vast majority of forecasting applications ... model were compared against the VA Office of Policy and Planning forecasting study commissioned with the actuarial firm of Milliman & Robertson (M & R... Application to the Veterans Healthcare System The development of a model to forecast future VEV needs, utilization, and cost of the Acute Care and
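A minimal sketch of one of the forecasting model types named in the excerpt, simple exponential smoothing; the demand series and smoothing constant are illustrative assumptions, not data from the study.

```python
# Sketch: simple exponential smoothing, one of the basic forecasting model types
# the report refers to. The demand series and smoothing constant are illustrative.
def exponential_smoothing(series, alpha):
    """Return one-step-ahead forecasts; the last value is the next-period forecast."""
    forecast = series[0]
    forecasts = [forecast]
    for y in series[1:]:
        forecast = alpha * y + (1.0 - alpha) * forecast
        forecasts.append(forecast)
    return forecasts

demand = [120, 132, 128, 141, 150, 147, 158]   # e.g., monthly clinic visits
print(exponential_smoothing(demand, alpha=0.3)[-1])
```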
On unified modeling, theory, and method for solving multi-scale global optimization problems
NASA Astrophysics Data System (ADS)
Gao, David Yang
2016-10-01
A unified model is proposed for general optimization problems in multi-scale complex systems. Based on this model and necessary assumptions in physics, the canonical duality theory is presented in a precise way to include traditional duality theories and popular methods as special applications. Two conjectures on NP-hardness are proposed, which should play important roles for correctly understanding and efficiently solving challenging real-world problems. Applications are illustrated for both nonconvex continuous optimization and mixed integer nonlinear programming.
1994-07-01
provide additional information for the user / policy analyst: Eichers, D., Sola, M., McLernan, G., EPICC User’s Manual , Systems Research and Applications...maintenance, and a set of on-line help screens. Each are further discussed below and a full discussion is included in the EPICC User’s Manual . Menu Based...written documentation (user’s manual ) that will be provided with the model. 55 The next chapter discusses the validation of the inventory projection and
McCay, Deborah French
2003-01-01
Natural resource damage assessment (NRDA) models for oil spills have been under development since 1984. Generally applicable (simplified) versions with built-in data sets are included in US government regulations for NRDAs in US waters. The most recent version of these models is SIMAP (Spill Impact Model Application Package), which contains oil fates and effects models that may be applied to any spill event and location in marine or freshwater environments. It is often not cost-effective or even possible to quantify spill impacts using field data collections. Modeling allows quantification of spill impacts using as much site-specific data as available, either as input or as validation of model results. SIMAP was used for the North Cape oil spill in Rhode Island (USA) in January 1996, for injury quantification in the first and largest NRDA case to be performed under the 1996 Oil Pollution Act NRDA regulations. The case was successfully settled in 1999. This paper, which contains a description of the model and application to the North Cape spill, delineates and demonstrates the approach.
Identification and stochastic control of helicopter dynamic modes
NASA Technical Reports Server (NTRS)
Molusis, J. A.; Bar-Shalom, Y.
1983-01-01
A general treatment of parameter identification and stochastic control for use on helicopter dynamic systems is presented. Rotor dynamic models, including specific applications to rotor blade flapping and the helicopter ground resonance problem, are emphasized. Dynamic systems which are governed by periodic coefficients as well as constant coefficient models are addressed. The dynamic systems are modeled by linear state variable equations which are used in the identification and stochastic control formulation. The pure identification problem as well as the stochastic control problem, which includes combined identification and control for dynamic systems, is addressed. The stochastic control problem includes the effect of parameter uncertainty on the solution and the concept of learning and how this is affected by the control's dual effect. The identification formulation requires algorithms suitable for on-line use, and thus recursive identification algorithms are considered. The applications presented use the recursive extended Kalman filter for parameter identification, which has excellent convergence for systems without process noise.
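A minimal sketch of recursive parameter identification with an extended Kalman filter, here estimating a single unknown coefficient of a scalar linear system by state augmentation; this is far simpler than the rotor models above, and all system values are illustrative assumptions.

```python
# Sketch: a recursive extended Kalman filter identifying the unknown coefficient
# "a" of a scalar system x[k+1] = a*x[k] + w, y[k] = x[k] + v, by augmenting the
# state with the parameter. All values are illustrative.
import numpy as np

rng = np.random.default_rng(3)
a_true, q, r = 0.85, 0.04, 0.01          # true coefficient, process and measurement variance
x, y = [1.0], []
for _ in range(300):                      # simulate measurement data
    x.append(a_true * x[-1] + rng.normal(0.0, np.sqrt(q)))
    y.append(x[-1] + rng.normal(0.0, np.sqrt(r)))

z = np.array([0.0, 0.5])                  # augmented state estimate [x, a]
P = np.diag([1.0, 1.0])
Q = np.diag([q, 1e-6])                    # small random walk keeps "a" adaptable
H = np.array([[1.0, 0.0]])

for yk in y:
    # Predict: f(z) = [a*x, a], Jacobian F = [[a, x], [0, 1]]
    F = np.array([[z[1], z[0]], [0.0, 1.0]])
    z = np.array([z[1] * z[0], z[1]])
    P = F @ P @ F.T + Q
    # Update with the new measurement
    S = H @ P @ H.T + r
    K = (P @ H.T) / S
    z = z + (K * (yk - z[0])).ravel()
    P = (np.eye(2) - K @ H) @ P

print("identified coefficient a:", z[1])
```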
Application for managing model-based material properties for simulation-based engineering
Hoffman, Edward L [Alameda, CA
2009-03-03
An application for generating a property set associated with a constitutive model of a material includes a first program module adapted to receive test data associated with the material and to extract loading conditions from the test data. A material model driver is adapted to receive the loading conditions and a property set and operable in response to the loading conditions and the property set to generate a model response for the material. A numerical optimization module is adapted to receive the test data and the model response and operable in response to the test data and the model response to generate the property set.
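A minimal sketch of the numerical-optimization step described above: adjusting constitutive-model parameters until the model response matches test data; the saturation-hardening form and all values are hypothetical stand-ins for whatever material model driver the application wraps.

```python
# Sketch of the numerical-optimization step: adjust constitutive-model
# parameters until the model response matches test data. The saturation-
# hardening form sigma = s0 + (s_inf - s0)*(1 - exp(-h*eps)) is an illustrative
# stand-in for the material model driver.
import numpy as np
from scipy.optimize import least_squares

def model_response(params, strain):
    s0, s_inf, h = params
    return s0 + (s_inf - s0) * (1.0 - np.exp(-h * strain))

# "Test data" extracted from a (synthetic) tension test.
strain = np.linspace(0.0, 0.1, 25)
true_params = (200.0, 350.0, 60.0)
rng = np.random.default_rng(7)
stress = model_response(true_params, strain) + rng.normal(0.0, 3.0, strain.size)

def residuals(params):
    return model_response(params, strain) - stress

fit = least_squares(residuals, x0=[150.0, 300.0, 30.0])
print("calibrated property set:", fit.x)
```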
A discrete-element model for viscoelastic deformation and fracture of glacial ice
NASA Astrophysics Data System (ADS)
Riikilä, T. I.; Tallinen, T.; Åström, J.; Timonen, J.
2015-10-01
A discrete-element model was developed to study the behavior of viscoelastic materials that are allowed to fracture. Although applicable to many materials, the model was developed primarily for ice dynamics. A realistic model of glacial ice must include elasticity, brittle fracture and slow viscous deformations. Here the model is described in detail and tested with several benchmark simulations. The model was used to simulate various ice-specific applications, yielding flow rates compatible with Glen's law and, under fragmentation, fragment-size distributions that agreed with known analytical and experimental results.
Thermophysical modelling for high-resolution digital terrain models
NASA Astrophysics Data System (ADS)
Pelivan, I.
2018-07-01
A method is presented for efficiently calculating surface temperatures for highly resolved celestial body shapes. A thorough investigation of the conditions needed to reach model convergence shows that the speed of surface temperature convergence depends on factors such as the quality of initial boundary conditions, thermal inertia, illumination conditions, and resolution of the numerical depth grid. The optimization process to shorten the simulation time while increasing or maintaining the accuracy of model results includes the introduction of facet-specific boundary conditions such as pre-computed temperature estimates and pre-evaluated simulation times. The individual facet treatment also allows for assigning other facet-specific properties such as local thermal inertia. The approach outlined in this paper is particularly useful for very detailed digital terrain models in combination with unfavourable illumination conditions, such as little to no sunlight for a period of time as experienced locally on comet 67P/Churyumov-Gerasimenko. Possible science applications include thermal analysis of highly resolved local (landing) sites experiencing seasonal, environment, and lander shadowing. In combination with an appropriate roughness model, the method is very suitable for application to disc-integrated and disc-resolved data. Further applications are seen where the complexity of the task has led to severe shape or thermophysical model simplifications, such as in studying surface activity or thermal cracking.
Thermophysical modeling for high-resolution digital terrain models
NASA Astrophysics Data System (ADS)
Pelivan, I.
2018-04-01
A method is presented for efficiently calculating surface temperatures for highly resolved celestial body shapes. A thorough investigation of the conditions needed to reach model convergence shows that the speed of surface temperature convergence depends on factors such as the quality of initial boundary conditions, thermal inertia, illumination conditions, and resolution of the numerical depth grid. The optimization process to shorten the simulation time while increasing or maintaining the accuracy of model results includes the introduction of facet-specific boundary conditions such as pre-computed temperature estimates and pre-evaluated simulation times. The individual facet treatment also allows for assigning other facet-specific properties such as local thermal inertia. The approach outlined in this paper is particularly useful for very detailed digital terrain models in combination with unfavorable illumination conditions, such as little to no sunlight for a period of time as experienced locally on comet 67P/Churyumov-Gerasimenko. Possible science applications include thermal analysis of highly resolved local (landing) sites experiencing seasonal, environment and lander shadowing. In combination with an appropriate roughness model, the method is very suitable for application to disk-integrated and disk-resolved data. Further applications are seen where the complexity of the task has led to severe shape or thermophysical model simplifications, such as in studying surface activity or thermal cracking.
Multi-Material ALE with AMR for Modeling Hot Plasmas and Cold Fragmenting Materials
NASA Astrophysics Data System (ADS)
Koniges, Alice; Masters, Nathan; Fisher, Aaron; Eder, David; Liu, Wangyi; Anderson, Robert; Benson, David; Bertozzi, Andrea
2015-02-01
We have developed a new 3D multi-physics multi-material code, ALE-AMR, which combines Arbitrary Lagrangian Eulerian (ALE) hydrodynamics with Adaptive Mesh Refinement (AMR) to connect the continuum to the microstructural regimes. The code is unique in its ability to model hot radiating plasmas and cold fragmenting solids. New numerical techniques were developed for many of the physics packages to work efficiently on a dynamically moving and adapting mesh. We use interface reconstruction based on volume fractions of the material components within mixed zones and reconstruct interfaces as needed. This interface reconstruction model is also used for void coalescence and fragmentation. A flexible strength/failure framework allows for pluggable material models, which may require material history arrays to determine the level of accumulated damage or the evolving yield stress in J2 plasticity models. For some applications, laser rays propagate through a virtual composite mesh consisting of the finest-resolution representation of the modeled space. A new 2nd-order accurate diffusion solver has been implemented for the thermal conduction and radiation transport packages. One application area is the modeling of laser/target effects including debris/shrapnel generation. Other application areas include warm dense matter, EUV lithography, and material wall interactions for fusion devices.
Application of an Integrated HPC Reliability Prediction Framework to HMMWV Suspension System
2010-09-13
model number M966 (TOW Missile Carrier, Basic Armor without weapons), since they were available. Tires used for all simulations were the bias-type...vehicle fleet, including consideration of all kinds of uncertainty, especially including model uncertainty. The end result will be a tool to use...building an adequate vehicle reliability prediction framework for military vehicles is the accurate modeling of the integration of various types of
Rasmussen's legacy: A paradigm change in engineering for safety.
Leveson, Nancy G
2017-03-01
This paper describes three applications of Rasmussen's ideas to systems engineering practice. The first is the application of the abstraction hierarchy to engineering specifications, particularly requirements specification. The second is the use of Rasmussen's ideas in safety modeling and analysis to create a new, more powerful type of accident causation model that extends traditional models to better handle human-operated, software-intensive, sociotechnical systems. Because this new model has a formal, mathematical foundation built on systems theory (as was Rasmussen's original model), new modeling and analysis tools become possible. The third application is to engineering hazard analysis. Engineers have traditionally either omitted humans from consideration in system hazard analysis or treated them rather superficially, for example, by assuming that they behave randomly. Applying Rasmussen's model of human error within a powerful new hazard analysis technique allows human behavior to be included in engineering hazard analysis. Copyright © 2016 Elsevier Ltd. All rights reserved.
Current problems in applied mathematics and mathematical modeling
NASA Astrophysics Data System (ADS)
Alekseev, A. S.
Papers are presented on mathematical modeling noting applications to such fields as geophysics, chemistry, atmospheric optics, and immunology. Attention is also given to models of ocean current fluxes, atmospheric and marine interactions, and atmospheric pollution. The articles include studies of catalytic reactors, models of global climate phenomena, and computer-assisted atmospheric models.
Lim, Cherry; Wannapinij, Prapass; White, Lisa; Day, Nicholas P J; Cooper, Ben S; Peacock, Sharon J; Limmathurotsakul, Direk
2013-01-01
Estimates of the sensitivity and specificity for new diagnostic tests based on evaluation against a known gold standard are imprecise when the accuracy of the gold standard is imperfect. Bayesian latent class models (LCMs) can be helpful under these circumstances, but the necessary analysis requires expertise in computational programming. Here, we describe open-access web-based applications that allow non-experts to apply Bayesian LCMs to their own data sets via a user-friendly interface. Applications for Bayesian LCMs were constructed on a web server using R and WinBUGS programs. The models provided (http://mice.tropmedres.ac) include two Bayesian LCMs: the two-tests in two-population model (Hui and Walter model) and the three-tests in one-population model (Walter and Irwig model). Both models are available with simplified and advanced interfaces. In the former, all settings for Bayesian statistics are fixed as defaults. Users input their data set into a table provided on the webpage. Disease prevalence and accuracy of diagnostic tests are then estimated using the Bayesian LCM, and provided on the web page within a few minutes. With the advanced interfaces, experienced researchers can modify all settings in the models as needed. These settings include correlation among diagnostic test results and prior distributions for all unknown parameters. The web pages provide worked examples with both models using the original data sets presented by Hui and Walter in 1980, and by Walter and Irwig in 1988. We also illustrate the utility of the advanced interface using the Walter and Irwig model on a data set from a recent melioidosis study. The results obtained from the web-based applications were comparable to those published previously. The newly developed web-based applications are open-access and provide an important new resource for researchers worldwide to evaluate new diagnostic tests.
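A minimal sketch of a Hui-Walter two-test, two-population latent class model written with the PyMC library (assumed installed; PyMC v4+ import shown), conceptually similar to what the web application fits; the cell counts and priors are illustrative, not the application's defaults.

```python
# Sketch: a Hui-Walter two-test, two-population Bayesian latent class model in
# PyMC. Cell counts are synthetic; cells are ordered (T1+,T2+), (T1+,T2-),
# (T1-,T2+), (T1-,T2-), and conditional independence of the two tests given
# true disease status is assumed.
import numpy as np
import pymc as pm

counts = np.array([[70, 10, 12, 108],     # population 1
                   [25,  8, 14, 153]])    # population 2
n = counts.sum(axis=1)

with pm.Model() as hui_walter:
    se1 = pm.Beta("se1", 2, 1)            # sensitivity of test 1
    sp1 = pm.Beta("sp1", 2, 1)            # specificity of test 1
    se2 = pm.Beta("se2", 2, 1)
    sp2 = pm.Beta("sp2", 2, 1)
    prev = pm.Beta("prev", 1, 1, shape=2) # prevalence in each population

    for j in range(2):
        pi = prev[j]
        p = pm.math.stack([
            pi * se1 * se2 + (1 - pi) * (1 - sp1) * (1 - sp2),
            pi * se1 * (1 - se2) + (1 - pi) * (1 - sp1) * sp2,
            pi * (1 - se1) * se2 + (1 - pi) * sp1 * (1 - sp2),
            pi * (1 - se1) * (1 - se2) + (1 - pi) * sp1 * sp2,
        ])
        pm.Multinomial(f"obs_{j}", n=n[j], p=p, observed=counts[j])

    trace = pm.sample(2000, tune=1000)
```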
ERIC Educational Resources Information Center
Shin, Tacksoo
2012-01-01
This study introduced various nonlinear growth models, including the quadratic conventional polynomial model, the fractional polynomial model, the Sigmoid model, the growth model with negative exponential functions, the multidimensional scaling technique, and the unstructured growth curve model. It investigated which growth models effectively…
NASA Astrophysics Data System (ADS)
Carr, Michael J.; Gazel, Esteban
2017-04-01
We provide here an open version of the Igpet software, called t-Igpet to emphasize its application for teaching and research in forward modeling of igneous geochemistry. There are three programs: a norm utility; a petrologic mixing program using least squares; and Igpet, a graphics program that includes many forms of numerical modeling. Igpet is a multifaceted tool that provides the following basic capabilities: igneous rock identification using the IUGS (International Union of Geological Sciences) classification and several supplementary diagrams; tectonic discrimination diagrams; pseudo-quaternary projections; least squares fitting of lines, polynomials and hyperbolae; magma mixing using two endmembers; and histograms, x-y plots, ternary plots and spider-diagrams. The advanced capabilities of Igpet are multi-element mixing and magma evolution modeling. Mixing models are particularly useful for understanding the isotopic variations in rock suites that evolved by mixing different sources. The important melting models include batch melting, fractional melting and aggregated fractional melting. Crystallization models include equilibrium and fractional crystallization and AFC (assimilation and fractional crystallization). Theses, reports and proposals concerning igneous petrology are improved by numerical modeling; for reviewed publications, some elements of modeling are practically a requirement. Our intention in providing this software is to facilitate improved communication and lower entry barriers to research, especially for students.
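As a minimal, hedged illustration of the least-squares mixing idea mentioned above (not Igpet's implementation, which handles multi-element and multi-endmember problems), the following sketch estimates the fraction of one endmember in a two-endmember mixture; the oxide analyses are hypothetical.

```python
# Minimal two-endmember least-squares mixing sketch (illustrative only; not
# Igpet's implementation). Solve sample ~ f*A + (1-f)*B for the fraction f of
# endmember A, using hypothetical major-element oxide analyses (wt%).
import numpy as np

oxides = ["SiO2", "TiO2", "Al2O3", "FeO", "MgO", "CaO", "Na2O", "K2O"]
A      = np.array([49.0, 1.2, 15.5, 10.0, 8.5, 11.0, 2.5, 0.4])   # basaltic endmember (hypothetical)
B      = np.array([72.0, 0.3, 13.5,  2.5, 0.5,  1.5, 3.8, 4.2])   # rhyolitic endmember (hypothetical)
sample = np.array([58.0, 0.9, 14.7,  7.1, 5.3,  7.3, 3.0, 1.9])   # hybrid sample (hypothetical)

# sample - B = f * (A - B)  ->  one-parameter least-squares estimate of f
d = A - B
f = np.dot(sample - B, d) / np.dot(d, d)
residuals = sample - (f * A + (1 - f) * B)
print(f"fraction of endmember A: {f:.3f}")
print("sum of squared residuals:", round(float(np.sum(residuals**2)), 3))
```

The residual sum of squares is the usual goodness-of-fit check for this kind of mixing calculation: a large value suggests the sample cannot be explained by the two chosen endmembers alone.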
RRAWFLOW: Rainfall-Response Aquifer and Watershed Flow Model (v1.11)
NASA Astrophysics Data System (ADS)
Long, A. J.
2014-09-01
The Rainfall-Response Aquifer and Watershed Flow Model (RRAWFLOW) is a lumped-parameter model that simulates streamflow, springflow, groundwater level, solute transport, or cave drip for a measurement point in response to a system input of precipitation, recharge, or solute injection. The RRAWFLOW open-source code is written in the R language and is included in the Supplement to this article along with an example model of springflow. RRAWFLOW includes a time-series process to estimate recharge from precipitation and simulates the response to recharge by convolution, i.e., the unit hydrograph approach. Gamma functions are used for estimation of parametric impulse-response functions (IRFs); a combination of two gamma functions results in a double-peaked IRF. A spline fit to a set of control points is introduced as a new method for estimation of nonparametric IRFs. Other options include the use of user-defined IRFs and different methods to simulate time-variant systems. For many applications, lumped models simulate the system response with accuracy equal to that of distributed models; moreover, the ease of model construction and calibration makes lumped models a good choice for many applications. RRAWFLOW provides professional hydrologists and students with an accessible and versatile tool for lumped-parameter modeling.
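A minimal sketch of the convolution (unit-hydrograph) step described above, assuming a single parametric gamma impulse-response function; it is an illustrative Python analogue, not the RRAWFLOW R code, and the recharge series is synthetic.

```python
# Sketch of the impulse-response convolution step described above (the actual
# RRAWFLOW code is written in R; this is an illustrative Python analogue).
import numpy as np
from scipy.stats import gamma

dt = 1.0                                   # time step (days)
t = np.arange(0, 120, dt)                  # IRF support
irf = gamma.pdf(t, a=3.0, scale=5.0)       # parametric gamma IRF (assumed shape and scale)
irf /= irf.sum() * dt                      # normalize to unit area

rng = np.random.default_rng(0)
recharge = rng.exponential(0.5, size=365) * (rng.random(365) < 0.2)  # synthetic recharge pulses

# Convolve recharge with the IRF to obtain the simulated response (e.g., springflow)
response = np.convolve(recharge, irf)[:recharge.size] * dt
print("peak simulated response:", round(float(response.max()), 3))
```

A double-peaked IRF of the kind mentioned above would simply be a weighted sum of two such gamma functions before the normalization step.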
40 CFR 1037.225 - Amending applications for certification.
Code of Federal Regulations, 2014 CFR
2014-07-01
...-data vehicle or emission modeling for the vehicle family is not appropriate to show compliance for the new or modified vehicle configuration, include new test data or emission modeling showing that the new...
40 CFR 1037.225 - Amending applications for certification.
Code of Federal Regulations, 2012 CFR
2012-07-01
...-data vehicle or emission modeling for the vehicle family is not appropriate to show compliance for the new or modified vehicle configuration, include new test data or emission modeling showing that the new...
40 CFR 1037.225 - Amending applications for certification.
Code of Federal Regulations, 2013 CFR
2013-07-01
...-data vehicle or emission modeling for the vehicle family is not appropriate to show compliance for the new or modified vehicle configuration, include new test data or emission modeling showing that the new...
NASA Technical Reports Server (NTRS)
Marr, W. A., Jr.
1972-01-01
The behavior of finite element models employing different constitutive relations to describe the stress-strain behavior of soils is investigated. Three models, which assume small strain theory is applicable, include a nondilatant, a dilatant and a strain hardening constitutive relation. Two models are formulated using large strain theory and include a hyperbolic and a Tresca elastic perfectly plastic constitutive relation. These finite element models are used to analyze retaining walls and footings. Methods of improving the finite element solutions are investigated. For nonlinear problems better solutions can be obtained by using smaller load increment sizes and more iterations per load increment than by increasing the number of elements. Suitable methods of treating tension stresses and stresses which exceed the yield criteria are discussed.
Cognitive engineering models in space systems
NASA Technical Reports Server (NTRS)
Mitchell, Christine M.
1992-01-01
NASA space systems, including mission operations on the ground and in space, are complex, dynamic, predominantly automated systems in which the human operator is a supervisory controller. The human operator monitors and fine-tunes computer-based control systems and is responsible for ensuring safe and efficient system operation. In such systems, the potential consequences of human mistakes and errors may be very large, even though the probability of such events is low. Thus, models of cognitive functions in complex systems are needed to describe human performance and form the theoretical basis of operator workstation design, including displays, controls, and decision support aids. The operator function model represents normative operator behavior, that is, the expected operator activities given the current system state. The extension of the theoretical structure of the operator function model and its application to NASA Johnson mission operations and space station applications are discussed.
A Cognitive Diagnosis Model for Cognitively Based Multiple-Choice Options
ERIC Educational Resources Information Center
de la Torre, Jimmy
2009-01-01
Cognitive or skills diagnosis models are discrete latent variable models developed specifically for the purpose of identifying the presence or absence of multiple fine-grained skills. However, applications of these models typically involve dichotomous or dichotomized data, including data from multiple-choice (MC) assessments that are scored as…
Two Undergraduate Process Modeling Courses Taught Using Inductive Learning Methods
ERIC Educational Resources Information Center
Soroush, Masoud; Weinberger, Charles B.
2010-01-01
This manuscript presents a successful application of inductive learning in process modeling. It describes two process modeling courses that use inductive learning methods such as inquiry learning and problem-based learning, among others. The courses include a novel collection of multi-disciplinary complementary process modeling examples. They were…
Bock, Michael; Lyndall, Jennifer; Barber, Timothy; Fuchsman, Phyllis; Perruchon, Elyse; Capdevielle, Marie
2010-10-01
The fate and partitioning of the antimicrobial compound, triclosan, in wastewater treatment plants (WWTPs) is evaluated using a probabilistic fugacity model to predict the range of triclosan concentrations in effluent and secondary biosolids. The WWTP model predicts 84% to 92% triclosan removal, which is within the range of measured removal efficiencies (typically 70% to 98%). Triclosan is predominantly removed by sorption and subsequent settling of organic particulates during primary treatment and by aerobic biodegradation during secondary treatment. Median modeled removal efficiency due to sorption is 40% for all treatment phases and 31% in the primary treatment phase. Median modeled removal efficiency due to biodegradation is 48% for all treatment phases and 44% in the secondary treatment phase. Important factors contributing to variation in predicted triclosan concentrations in effluent and biosolids include influent concentrations, solids concentrations in settling tanks, and factors related to solids retention time. Measured triclosan concentrations in biosolids and non-United States (US) effluent are consistent with model predictions. However, median concentrations in US effluent are over-predicted with this model, suggesting that differences in some aspect of treatment practices not incorporated in the model (e.g., disinfection methods) may affect triclosan removal from effluent. Model applications include predicting changes in environmental loadings associated with new triclosan applications and supporting risk analyses for biosolids-amended land and effluent receiving waters. © 2010 SETAC.
Probabilistic application of a fugacity model to predict triclosan fate during wastewater treatment.
Bock, Michael; Lyndall, Jennifer; Barber, Timothy; Fuchsman, Phyllis; Perruchon, Elyse; Capdevielle, Marie
2010-07-01
The fate and partitioning of the antimicrobial compound, triclosan, in wastewater treatment plants (WWTPs) is evaluated using a probabilistic fugacity model to predict the range of triclosan concentrations in effluent and secondary biosolids. The WWTP model predicts 84% to 92% triclosan removal, which is within the range of measured removal efficiencies (typically 70% to 98%). Triclosan is predominantly removed by sorption and subsequent settling of organic particulates during primary treatment and by aerobic biodegradation during secondary treatment. Median modeled removal efficiency due to sorption is 40% for all treatment phases and 31% in the primary treatment phase. Median modeled removal efficiency due to biodegradation is 48% for all treatment phases and 44% in the secondary treatment phase. Important factors contributing to variation in predicted triclosan concentrations in effluent and biosolids include influent concentrations, solids concentrations in settling tanks, and factors related to solids retention time. Measured triclosan concentrations in biosolids and non-United States (US) effluent are consistent with model predictions. However, median concentrations in US effluent are over-predicted with this model, suggesting that differences in some aspect of treatment practices not incorporated in the model (e.g., disinfection methods) may affect triclosan removal from effluent. Model applications include predicting changes in environmental loadings associated with new triclosan applications and supporting risk analyses for biosolids-amended land and effluent receiving waters. (c) 2010 SETAC.
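The fugacity calculations themselves are beyond a short example, but the probabilistic flavor of the analysis can be suggested with a toy Monte Carlo mass balance; the influent distribution and removal-fraction ranges below are assumptions chosen only to be broadly consistent with the figures quoted above, not the authors' model.

```python
# Toy Monte Carlo removal sketch, loosely inspired by the probabilistic WWTP
# modelling described above. This is NOT the authors' fugacity model; every
# distribution below is an assumption for illustration only.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

influent = rng.lognormal(mean=np.log(5.0), sigma=0.5, size=n)  # influent triclosan, ug/L (assumed)
f_sorbed = rng.uniform(0.32, 0.48, n)   # fraction of influent removed with settled solids (assumed)
f_biodeg = rng.uniform(0.40, 0.52, n)   # fraction of influent removed by biodegradation (assumed)

effluent = influent * np.clip(1.0 - f_sorbed - f_biodeg, 0.0, None)
removal = 1.0 - effluent / influent

print("median removal: {:.0%}".format(np.median(removal)))
print("5th-95th percentile effluent (ug/L): {:.2f} - {:.2f}".format(
    *np.percentile(effluent, [5, 95])))
```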
Finite element modelling of the foot for clinical application: A systematic review.
Behforootan, Sara; Chatzistergos, Panagiotis; Naemi, Roozbeh; Chockalingam, Nachiappan
2017-01-01
Over the last two decades finite element modelling has been widely used to give new insight on foot and footwear biomechanics. However its actual contribution for the improvement of the therapeutic outcome of different pathological conditions of the foot, such as the diabetic foot, remains relatively limited. This is mainly because finite element modelling has only been used within the research domain. Clinically applicable finite element modelling can open the way for novel diagnostic techniques and novel methods for treatment planning/optimisation which would significantly enhance clinical practice. In this context this review aims to provide an overview of modelling techniques in the field of foot and footwear biomechanics and to investigate their applicability in a clinical setting. Even though no integrated modelling system exists that could be directly used in the clinic and considerable progress is still required, current literature includes a comprehensive toolbox for future work towards clinically applicable finite element modelling. The key challenges include collecting the information that is needed for geometry design, the assignment of material properties and loading on a patient-specific basis and in a cost-effective and non-invasive way. The ultimate challenge for the implementation of any computational system into clinical practice is to ensure that it can produce reliable results for any person that belongs in the population for which it was developed. Consequently this highlights the need for thorough and extensive validation of each individual step of the modelling process as well as for the overall validation of the final integrated system. Copyright © 2016 IPEM. Published by Elsevier Ltd. All rights reserved.
AVAILABLE MICRO-ACTIVITY DATA AND THEIR APPLICABILITY TO AGGREGATE EXPOSURE MODELING
Several human exposure models have been developed in recent years to address children's aggregate and cumulative exposures to pesticides under the Food Quality Protection Act of 1996. These models estimate children's exposures via all significant routes and pathways including ...
Venus Global Reference Atmospheric Model
NASA Technical Reports Server (NTRS)
Justh, Hilary L.
2017-01-01
Venus Global Reference Atmospheric Model (Venus-GRAM) is an engineering-level atmospheric model developed by MSFC that is widely used for diverse mission applications, including systems design, performance analysis, and operations planning for aerobraking, Entry, Descent and Landing, and aerocapture. It is not a forecast model. Outputs include density, temperature, pressure, wind components, and chemical composition, and the model provides dispersions of thermodynamic parameters, winds, and density. Optional trajectory and auxiliary profile input files are supported. Venus-GRAM has been used in multiple studies and proposals, including the NASA Engineering and Safety Center (NESC) Autonomous Aerobraking study and various Discovery proposals. Released in 2005, Venus-GRAM is available at https://software.nasa.gov/software/MFS-32314-1.
A constitutive model for the forces of a magnetic bearing including eddy currents
NASA Technical Reports Server (NTRS)
Taylor, D. L.; Hebbale, K. V.
1993-01-01
A multiple magnet bearing can be developed from N individual electromagnets. The constitutive relationships for a single magnet in such a bearing are presented. Analytical expressions are developed for a magnet with poles arranged circumferentially. Maxwell's field equations are used so the model easily includes the effects of induced eddy currents due to the rotation of the journal. Eddy currents must be included in any dynamic model because they are the only speed-dependent parameter and may lead to a critical speed for the bearing. The model is applicable to bearings using attraction or repulsion.
Reliability of IGBT in a STATCOM for Harmonic Compensation and Power Factor Correction
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gopi Reddy, Lakshmi Reddy; Tolbert, Leon M; Ozpineci, Burak
With smart grid integration, there is a need to characterize the reliability of a power system by including the reliability of power semiconductors in grid-related applications. In this paper, the reliability of IGBTs in a STATCOM is presented for two different applications, power factor correction and harmonic elimination. The STATCOM model is developed in EMTP, and analytical equations for average conduction losses in an IGBT and a diode are derived and compared with experimental data. A commonly used reliability model is then applied to predict the reliability of the IGBTs.
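The averaged conduction-loss expressions referred to above typically take the form P = V0·I_avg + r·I_rms²; a hedged sketch using textbook current approximations for a sinusoidally modulated two-level converter leg follows. The device parameters and operating point are assumptions, not values from the paper.

```python
# Sketch of averaged conduction-loss estimates for an IGBT and its antiparallel
# diode of the usual form P = V0*I_avg + r*I_rms**2. Device parameters and the
# sinusoidal-modulation operating point are illustrative, not from the paper.
import numpy as np

I_peak = 100.0        # peak device current, A (assumed)
m, pf = 0.9, 0.95     # modulation index and power factor (assumed)

# Textbook average/RMS current approximations for one device of a two-level VSC leg
I_avg_igbt = I_peak * (1 / (2 * np.pi) + m * pf / 8)
I_rms_igbt = I_peak * np.sqrt(1 / 8 + m * pf / (3 * np.pi))
I_avg_diode = I_peak * (1 / (2 * np.pi) - m * pf / 8)
I_rms_diode = I_peak * np.sqrt(1 / 8 - m * pf / (3 * np.pi))

V_ce0, r_ce = 1.0, 0.008   # IGBT threshold voltage (V) and slope resistance (ohm), assumed
V_f0, r_d = 0.9, 0.006     # diode threshold voltage and slope resistance, assumed

P_igbt = V_ce0 * I_avg_igbt + r_ce * I_rms_igbt**2
P_diode = V_f0 * I_avg_diode + r_d * I_rms_diode**2
print(f"IGBT conduction loss ~ {P_igbt:.1f} W, diode ~ {P_diode:.1f} W")
```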
Enabling Real-time Water Decision Support Services Using Model as a Service
NASA Astrophysics Data System (ADS)
Zhao, T.; Minsker, B. S.; Lee, J. S.; Salas, F. R.; Maidment, D. R.; David, C. H.
2014-12-01
Through application of computational methods and an integrated information system, data and river modeling services can help researchers and decision makers more rapidly understand river conditions under alternative scenarios. To enable this capability, workflows (i.e., analysis and model steps) are created and published as Web services delivered through an internet browser, including model inputs, a published workflow service, and visualized outputs. The RAPID model, which is a river routing model developed at the University of Texas at Austin for parallel computation of river discharge, has been implemented as a workflow and published as a Web application. This allows non-technical users to remotely execute the model and visualize results as a service through a simple Web interface. The model service and Web application have been prototyped in the San Antonio and Guadalupe River Basin in Texas, with input from university and agency partners. In the future, optimization model workflows will be developed to link with the RAPID model workflow to provide real-time water allocation decision support services.
NASA Astrophysics Data System (ADS)
gochis, David; hooper, Rick; parodi, Antonio; Jha, Shantenu; Yu, Wei; Zaslavsky, Ilya; Ganapati, Dinesh
2014-05-01
The community WRF-Hydro system is currently being used in a variety of flood prediction and regional hydroclimate impacts assessment applications around the world. Despite its increasingly wide use, certain cyberinfrastructure bottlenecks exist in the setup, execution and post-processing of WRF-Hydro model runs. These bottlenecks result in wasted time, labor, data-transfer bandwidth, and computational resources. Appropriate development and use of cyberinfrastructure to set up and manage WRF-Hydro modeling applications will streamline the entire workflow of hydrologic model predictions. This talk will present recent advances in the development and use of new open-source cyberinfrastructure tools for the WRF-Hydro architecture. These tools include new web-accessible pre-processing applications, supercomputer job management applications and automated verification and visualization applications. The tools will be described successively and then demonstrated in a set of flash-flood use cases for recent destructive flood events in the U.S. and in Europe. Throughout, emphasis is placed on the implementation and use of community data standards for data exchange.
Application distribution model and related security attacks in VANET
NASA Astrophysics Data System (ADS)
Nikaein, Navid; Kanti Datta, Soumya; Marecar, Irshad; Bonnet, Christian
2013-03-01
In this paper, we present a model for application distribution and related security attacks in dense vehicular ad hoc networks (VANETs) and sparse VANETs, which form a delay-tolerant network (DTN). We study the vulnerabilities of VANETs to evaluate the attack scenarios and introduce a new attacker's model as an extension to the work done in [6]. A VANET model is then proposed that supports application distribution through proxy app stores on top of mobile platforms installed in vehicles. The steps of application distribution are studied in detail. We have identified key attacks (e.g. malware, spamming and phishing, software attacks and threats to location privacy) for dense VANETs and two attack scenarios for sparse VANETs. It has been shown that attacks can be launched by distributing malicious applications and injecting malicious code into the On Board Unit (OBU) by exploiting OBU software security holes. Consequences of such security attacks have been described. Finally, countermeasures, including the concept of a sandbox, are also presented in depth.
Code of Federal Regulations, 2011 CFR
2011-01-01
... lien accommodation or subordination and including the amount and maturity of the proposed loan, a... provisions of applicable model codes for any buildings to be constructed, as required by 7 CFR 1792.104; and...
Modeling and optimization of energy storage system for microgrid
NASA Astrophysics Data System (ADS)
Qiu, Xin
The vanadium redox flow battery (VRB) is well suited for microgrid and renewable energy applications. This thesis provides a practical analysis of the battery itself and its application in microgrid systems. The first paper analyzes VRB use in a microgrid system. The first part of the paper develops a reduced-order circuit model of the VRB and analyzes its experimental performance efficiency during deployment. Statistical methods and neural network approximation are used to estimate the system parameters. The second part of the paper addresses the implementation issues of the VRB application in a photovoltaic-based microgrid system. A new dc-dc converter is proposed to provide improved charging performance. The paper was published in IEEE Transactions on Smart Grid, Vol. 5, No. 4, July 2014. The second paper studies VRB use within a microgrid system from a practical perspective. A reduced-order circuit model of the VRB is introduced that includes the losses from the balance of plant, including system and environmental controls. The proposed model includes the circulation pumps and the HVAC system that regulates the environment of the VRB enclosure. In this paper, the VRB model is extended to include the ESS environmental controls, giving a more realistic efficiency profile. The paper was submitted to IEEE Transactions on Sustainable Energy. The third paper discusses the optimal control strategy when the VRB works with another type of battery in a microgrid system, extending the work of the first paper. A high-level control strategy is developed to coordinate a lead-acid battery and a VRB with reinforcement learning. The paper is to be submitted to IEEE Transactions on Smart Grid.
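A generic reduced-order equivalent-circuit sketch of the style referred to in the thesis is given below: an open-circuit voltage curve, coulomb-counting state of charge, and a lumped ohmic drop. It is not the VRB model developed in the thesis (which also covers balance-of-plant losses such as pumps and HVAC), and every parameter is an assumption.

```python
# Generic reduced-order battery equivalent-circuit sketch (open-circuit voltage
# plus a lumped series resistance). NOT the thesis's VRB model; balance-of-plant
# losses are omitted and all parameters below are assumptions.
import numpy as np

capacity_Ah = 50.0   # nominal stack capacity, Ah (assumed)
R_stack = 0.02       # lumped stack resistance, ohm (assumed)
cells = 40           # cells in series (assumed)

def ocv_cell(soc):
    """Assumed monotone per-cell open-circuit voltage curve, V (linearized)."""
    return 1.25 + 0.25 * soc

def simulate(current_A, dt_s=1.0, soc0=0.9, steps=3600):
    """Constant-current discharge: coulomb-counting SOC plus ohmic voltage drop."""
    soc, volts = soc0, []
    for _ in range(steps):
        soc = max(soc - current_A * dt_s / (capacity_Ah * 3600.0), 0.0)
        volts.append(cells * ocv_cell(soc) - current_A * R_stack)
    return np.array(volts)

v = simulate(current_A=25.0)    # one-hour 25 A discharge
print(f"terminal voltage: start {v[0]:.1f} V, end {v[-1]:.1f} V")
```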
Modeling and simulation of evacuation behavior using fuzzy logic in a goal finding application
NASA Astrophysics Data System (ADS)
Sharma, Sharad; Ogunlana, Kola; Sree, Swetha
2016-05-01
Modeling and simulation have been widely used as training and educational tools for depicting different evacuation strategies and damage-control decisions during evacuation. However, there are few simulation environments that can include human behavior with low to high levels of fidelity. It is well known that crowd stampedes induced by panic lead to fatalities as people are crushed or trampled. Our proposed goal-finding application can be used to model situations that are difficult to test in real life due to safety considerations. It is able to include agent characteristics and behaviors. Findings of this model are very encouraging, as agents are able to assume various roles and utilize fuzzy logic on the way to reaching their goals. Fuzzy logic is used to model stress, panic and the uncertainty of emotions. The fuzzy rules link these parts together while feeding into behavioral rules. The contribution of this paper lies in our approach of utilizing fuzzy logic to show learning and adaptive behavior of agents in a goal-finding application. The proposed application will aid in running multiple evacuation drills for what-if scenarios by incorporating human behavioral characteristics and can scale from a room to a building. Our results show that the inclusion of fuzzy attributes made the evacuation times of the agents closer to those of real-time drills.
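A minimal sketch of how triangular membership functions and a few Mamdani-style rules could map perceived danger and crowd density to an agent's panic level; the membership breakpoints, rules, and defuzzification are assumptions for illustration, not the rule base used in the paper.

```python
# Minimal fuzzy-inference sketch (illustrative only; the membership functions
# and rules are assumptions, not those used in the paper). Maps perceived
# danger and crowd density to a panic level for an evacuating agent.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-9), (c - x) / (c - b + 1e-9)), 0.0)

def panic_level(danger, density):
    # Fuzzify the inputs (all variables scaled to 0..1)
    danger_high = tri(danger, 0.4, 1.0, 1.6)     # "danger is high"
    danger_low = tri(danger, -0.6, 0.0, 0.6)     # "danger is low"
    density_high = tri(density, 0.4, 1.0, 1.6)   # "crowd density is high"

    # Rules with min for AND, then a weighted (singleton) defuzzification
    w_high = min(danger_high, density_high)      # IF danger high AND density high THEN panic high (~0.9)
    w_med = danger_high * (1 - density_high)     # IF danger high AND density not high THEN panic medium (~0.5)
    w_low = danger_low                           # IF danger low THEN panic low (~0.1)
    weights = np.array([w_high, w_med, w_low])
    outputs = np.array([0.9, 0.5, 0.1])
    return float(np.dot(weights, outputs) / (weights.sum() + 1e-9))

print(round(panic_level(danger=0.8, density=0.7), 2))   # crowded and dangerous -> high panic
print(round(panic_level(danger=0.1, density=0.9), 2))   # crowded but safe -> low panic
```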
Reuse: A knowledge-based approach
NASA Technical Reports Server (NTRS)
Iscoe, Neil; Liu, Zheng-Yang; Feng, Guohui
1992-01-01
This paper describes our research in automating the reuse process through the use of application domain models. Application domain models are explicit formal representations of the application knowledge necessary to understand, specify, and generate application programs. Furthermore, they provide a unified repository for the operational structure, rules, policies, and constraints of a specific application area. In our approach, domain models are expressed in terms of a transaction-based meta-modeling language. This paper has described in detail the creation and maintenance of hierarchical structures. These structures are created through a process that includes reverse engineering of data models with supplementary enhancement from application experts. Source code is also reverse engineered but is not a major source of domain model instantiation at this time. In the second phase of the software synthesis process, program specifications are interactively synthesized from an instantiated domain model. These specifications are currently integrated into a manual programming process but will eventually be used to derive executable code with mechanically assisted transformations. This research is performed within the context of programming-in-the-large types of systems. Although our goals are ambitious, we are implementing the synthesis system in an incremental manner through which we can realize tangible results. The client/server architecture is capable of supporting 16 simultaneous X/Motif users and tens of thousands of attributes and classes. Domain models have been partially synthesized from five different application areas. As additional domain models are synthesized and additional knowledge is gathered, we will inevitably add to and modify our representation. However, our current experience indicates that it will scale and expand to meet our modeling needs.
A comprehensive overview of the applications of artificial life.
Kim, Kyung-Joong; Cho, Sung-Bae
2006-01-01
We review the applications of artificial life (ALife), the creation of synthetic life on computers to study, simulate, and understand living systems. The definition and features of ALife are shown by application studies. ALife application fields treated include robot control, robot manufacturing, practical robots, computer graphics, natural phenomenon modeling, entertainment, games, music, economics, Internet, information processing, industrial design, simulation software, electronics, security, data mining, and telecommunications. In order to show the status of ALife application research, this review primarily features a survey of about 180 ALife application articles rather than a selected representation of a few articles. Evolutionary computation is the most popular method for designing such applications, but recently swarm intelligence, artificial immune networks, and agent-based modeling have also produced results. Applications were initially restricted to robotics and computer graphics, but presently many different applications in engineering areas are of interest.
MODFLOW-LGR: Practical application to a large regional dataset
NASA Astrophysics Data System (ADS)
Barnes, D.; Coulibaly, K. M.
2011-12-01
In many areas of the US, including southwest Florida, large regional-scale groundwater models have been developed to aid in decision making and water resources management. These models are subsequently used as a basis for site-specific investigations. Because the large scale of these regional models is not appropriate for local application, refinement is necessary to analyze the local effects of pumping wells and groundwater related projects at specific sites. The most commonly used approach to date is Telescopic Mesh Refinement or TMR. It allows the extraction of a subset of the large regional model with boundary conditions derived from the regional model results. The extracted model is then updated and refined for local use using a variable sized grid focused on the area of interest. MODFLOW-LGR, local grid refinement, is an alternative approach which allows model discretization at a finer resolution in areas of interest and provides coupling between the larger "parent" model and the locally refined "child." In the present work, these two approaches are tested on a mining impact assessment case in southwest Florida using a large regional dataset (The Lower West Coast Surficial Aquifer System Model). Various metrics for performance are considered. They include: computation time, water balance (as compared to the variable sized grid), calibration, implementation effort, and application advantages and limitations. The results indicate that MODFLOW-LGR is a useful tool to improve local resolution of regional scale models. While performance metrics, such as computation time, are case-dependent (model size, refinement level, stresses involved), implementation effort, particularly when regional models of suitable scale are available, can be minimized. The creation of multiple child models within a larger scale parent model makes it possible to reuse the same calibrated regional dataset with minimal modification. In cases similar to the Lower West Coast model, where a model is larger than optimal for direct application as a parent grid, a combination of TMR and LGR approaches should be used to develop a suitable parent grid.
Incorporation of Electrical Systems Models Into an Existing Thermodynamic Cycle Code
NASA Technical Reports Server (NTRS)
Freeh, Josh
2003-01-01
Integration of the entire system includes fuel cells, motors, propulsors, thermal/power management, compressors, etc. Use of existing, pre-developed NPSS capabilities includes: 1) optimization tools; 2) gas turbine models for hybrid systems; 3) increased interplay between subsystems; 4) off-design modeling capabilities; 5) altitude effects; and 6) the existing transient modeling architecture. Other factors include: 1) easier transfer between users and groups of users; 2) general aerospace industry acceptance and familiarity; and 3) a flexible analysis tool that can also be used for ground power applications.
Analysis of Brown camera distortion model
NASA Astrophysics Data System (ADS)
Nowakowski, Artur; Skarbek, Władysław
2013-10-01
Contemporary image acquisition devices introduce optical distortion into images. It results in pixel displacement and therefore needs to be compensated for in many computer vision applications. The distortion is usually modeled by the Brown distortion model, whose parameters can be included in the camera calibration task. In this paper we describe the original model and its dependencies, and analyze orthogonality with respect to radius for its decentering distortion component. We also report experiments with the camera calibration algorithm included in the OpenCV library; in particular, the stability of distortion parameter estimation is evaluated.
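For reference, the Brown (Brown-Conrady) model under discussion maps ideal normalized coordinates to distorted ones through radial terms (k1, k2, k3) and decentering/tangential terms (p1, p2), the same parameter convention OpenCV's calibration routines use; the sketch below applies it to a few points with arbitrary coefficients.

```python
# Brown (Brown-Conrady) distortion applied to normalized image coordinates, in
# the (k1, k2, k3, p1, p2) convention also used by OpenCV's calibration
# routines. The coefficient values below are arbitrary illustrations.
import numpy as np

def brown_distort(xy, k1, k2, k3, p1, p2):
    """Map ideal normalized coordinates (N, 2) to distorted coordinates."""
    x, y = xy[:, 0], xy[:, 1]
    r2 = x**2 + y**2
    radial = 1 + k1 * r2 + k2 * r2**2 + k3 * r2**3              # radial component
    x_d = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x**2)    # decentering (tangential) terms
    y_d = y * radial + p1 * (r2 + 2 * y**2) + 2 * p2 * x * y
    return np.column_stack([x_d, y_d])

pts = np.array([[0.0, 0.0], [0.2, 0.1], [0.4, -0.3]])
print(brown_distort(pts, k1=-0.25, k2=0.07, k3=0.0, p1=1e-3, p2=-5e-4))
```

Calibration inverts this mapping: the coefficients are estimated jointly with the intrinsic matrix so that distorted observations of a known target are explained by the model.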
Geographic information system/watershed model interface
Fisher, Gary T.
1989-01-01
Geographic information systems allow for the interactive analysis of spatial data related to water-resources investigations. A conceptual design for an interface between a geographic information system and a watershed model includes functions for the estimation of model parameter values. Design criteria include ease of use, minimal equipment requirements, a generic data-base management system, and use of a macro language. An application is demonstrated for a 90.1-square-kilometer subbasin of the Patuxent River near Unity, Maryland, that performs automated derivation of watershed parameters for hydrologic modeling.
Comparison and validation of point spread models for imaging in natural waters.
Hou, Weilin; Gray, Deric J; Weidemann, Alan D; Arnone, Robert A
2008-06-23
It is known that scattering by particulates within natural waters is the main cause of the blur in underwater images. Underwater images can be better restored or enhanced with knowledge of the point spread function (PSF) of the water. This will extend the performance range as well as the information retrieval from underwater electro-optical systems, which is critical in many civilian and military applications, including target and especially mine detection, search and rescue, and diver visibility. A better understanding of the physical process involved also helps to predict system performance and simulate it accurately on demand. The presented effort first reviews several PSF models, including the introduction of a semi-analytical PSF given the optical properties of the medium: scattering albedo, mean scattering angle, and optical range. The models under comparison include the empirical model of Duntley, a modified PSF model by Dolin et al., and the numerical integration of analytical forms from Wells as a benchmark of theoretical results. For experimental results, in addition to those of Duntley, we validate the above models against measured point spread functions by applying field-measured scattering properties with Monte Carlo simulations. Results from these comparisons suggest that it is both sufficient and necessary to have the three parameters listed above to model PSFs. The simplified approach introduced also provides adequate accuracy and flexibility for imaging applications, as shown by examples of restored underwater images.
NASA AVOSS Fast-Time Models for Aircraft Wake Prediction: User's Guide (APA3.8 and TDP2.1)
NASA Technical Reports Server (NTRS)
Ahmad, Nash'at N.; VanValkenburg, Randal L.; Pruis, Matthew J.; Limon Duparcmeur, Fanny M.
2016-01-01
NASA's current distribution of fast-time wake vortex decay and transport models includes APA (Version 3.8) and TDP (Version 2.1). This User's Guide provides detailed information on the model inputs, file formats, and model outputs. A brief description of the Memphis 1995, Dallas/Fort Worth 1997, and the Denver 2003 wake vortex datasets is given along with the evaluation of models. A detailed bibliography is provided which includes publications on model development, wake field experiment descriptions, and applications of the fast-time wake vortex models.
Communications Processors: Categories, Applications, and Trends
1976-03-01
allow switching from BSC to SDLC. (12) Standard protocols would ease the requirement that communications processor software convert from one... Guidelines in selecting a device for a specific application are included, with manufacturer models presented as illustrations.
Space physiology IV: mathematical modeling of the cardiovascular system in space exploration.
Keith Sharp, M; Batzel, Jerry Joseph; Montani, Jean-Pierre
2013-08-01
Mathematical modeling represents an important tool for analyzing cardiovascular function during spaceflight. This review describes how modeling of the cardiovascular system can contribute to space life science research and illustrates this process via modeling efforts to study postflight orthostatic intolerance (POI), a key issue for spaceflight. Examining this application also provides a context for considering broader applications of modeling techniques to the challenges of bioastronautics. POI, which affects a large fraction of astronauts in stand tests upon return to Earth, presents as dizziness, fainting and other symptoms, which can diminish crew performance and cause safety hazards. POI on the Moon or Mars could be more critical. In the field of bioastronautics, POI has been the dominant application of cardiovascular modeling for more than a decade, and a number of mechanisms for POI have been investigated. Modeling approaches include computational models with a range of incorporated factors and hemodynamic sophistication, and also physical models tested in parabolic and orbital flight. Mathematical methods such as parameter sensitivity analysis can help identify key system mechanisms. In the case of POI, this could lead to more effective countermeasures. Validation is a persistent issue in modeling efforts, and key considerations and needs for experimental data to synergistically improve understanding of cardiovascular responses are outlined. Future directions in cardiovascular modeling include subject-specific assessment of system status, as well as research on integrated physiological responses, leading, for instance, to assessment of subject-specific susceptibility to POI or effects of cardiovascular alterations on muscular, vision and cognitive function.
Land Surface Verification Toolkit (LVT) - A Generalized Framework for Land Surface Model Evaluation
NASA Technical Reports Server (NTRS)
Kumar, Sujay V.; Peters-Lidard, Christa D.; Santanello, Joseph; Harrison, Ken; Liu, Yuqiong; Shaw, Michael
2011-01-01
Model evaluation and verification are key in improving the usage and applicability of simulation models for real-world applications. In this article, the development and capabilities of a formal system for land surface model evaluation called the Land surface Verification Toolkit (LVT) is described. LVT is designed to provide an integrated environment for systematic land model evaluation and facilitates a range of verification approaches and analysis capabilities. LVT operates across multiple temporal and spatial scales and employs a large suite of in-situ, remotely sensed and other model and reanalysis datasets in their native formats. In addition to the traditional accuracy-based measures, LVT also includes uncertainty and ensemble diagnostics, information theory measures, spatial similarity metrics and scale decomposition techniques that provide novel ways for performing diagnostic model evaluations. Though LVT was originally designed to support the land surface modeling and data assimilation framework known as the Land Information System (LIS), it also supports hydrological data products from other, non-LIS environments. In addition, the analysis of diagnostics from various computational subsystems of LIS including data assimilation, optimization and uncertainty estimation are supported within LVT. Together, LIS and LVT provide a robust end-to-end environment for enabling the concepts of model data fusion for hydrological applications. The evolving capabilities of LVT framework are expected to facilitate rapid model evaluation efforts and aid the definition and refinement of formal evaluation procedures for the land surface modeling community.
Specialty fibers for fiber optic sensor application
NASA Astrophysics Data System (ADS)
Bennett, K.; Koh, J.; Coon, J.; Chien, C. K.; Artuso, A.; Chen, X.; Nolan, D.; Li, M.-J.
2007-09-01
Over the last several years, Fiber Optic Sensor (FOS) applications have seen an increased acceptance in many areas including oil & gas production monitoring, gyroscopes, current sensors, structural sensing and monitoring, and aerospace applications. High level optical and mechanical reliability of optical fiber is necessary to guarantee reliable performance of FOS. In this paper, we review recent research and development activities on new specialty fibers. We discuss fiber design concepts and present both modeling and experimental results. The main approaches to enhancing fiber attributes include new index profile design and fiber coating modification.
Forest growth modeling and prediction (Volumes 1 & 2).
Alan R. Ek; Stephen R. Shifley; Thomas E. Burk
1988-01-01
Proceedings of the August 23-27 IUFRO Conference, Minneapolis, Minnesota. Includes 143 manuscripts dealing with growth and yield modeling; regeneration; site characterization; effects of fertilization, genetics, and disturbance; density management; evaluation; estimation; inventory; and application.
Functional Behavioral Assessment: A School Based Model.
ERIC Educational Resources Information Center
Asmus, Jennifer M.; Vollmer, Timothy R.; Borrero, John C.
2002-01-01
This article begins by discussing requirements for functional behavioral assessment under the Individuals with Disabilities Education Act and then describes a comprehensive model for the application of behavior analysis in the schools. The model includes descriptive assessment, functional analysis, and intervention and involves the participation…
New simulation model of multicomponent crystal growth and inhibition.
Wathen, Brent; Kuiper, Michael; Walker, Virginia; Jia, Zongchao
2004-04-02
We review a novel computational model for the study of crystal structures, both on their own and in conjunction with inhibitor molecules. The model advances existing Monte Carlo (MC) simulation techniques by extending them from modeling 3D crystal surface patches to modeling entire 3D crystals, and by including the use of "complex" multicomponent molecules within the simulations. These advances make it possible to incorporate the 3D shape and non-uniform surface properties of inhibitors into simulations, and to study what effect these inhibitor properties have on the growth of whole crystals containing up to tens of millions of molecules. The application of this extended MC model to the study of antifreeze proteins (AFPs) and their effects on ice formation is reported, including the success of the technique in achieving AFP-induced ice-growth inhibition with concurrent changes to ice morphology that mimic experimental results. Simulations of ice-growth inhibition suggest that the degree of inhibition afforded by an AFP is a function of its ice-binding position relative to the underlying anisotropic growth pattern of ice. This extended MC technique is applicable to other crystal and crystal-inhibitor systems, including more complex crystal systems such as clathrates.
Model for Predicting Passage of Invasive Fish Species Through Culverts
NASA Astrophysics Data System (ADS)
Neary, V.
2010-12-01
Conservation efforts to promote or inhibit fish passage include the application of simple fish passage models to determine whether an open channel flow allows passage of a given fish species. Derivations of simple fish passage models for uniform and nonuniform flow conditions are presented. For uniform flow conditions, a model equation is developed that predicts the mean-current velocity threshold in a fishway, or velocity barrier, which causes exhaustion at a given maximum distance of ascent. The derivation of a simple expression for this exhaustion-threshold (ET) passage model is presented using kinematic principles coupled with fatigue curves for threatened and endangered fish species. Mean current velocities at or above the threshold predict failure to pass. Mean current velocities below the threshold predict successful passage. The model is therefore intuitive and easily applied to predict passage or exclusion. The ET model’s simplicity comes with limitations, however, including its application only to uniform flow, which is rarely found in the field. This limitation is addressed by deriving a model that accounts for nonuniform conditions, including backwater profiles and drawdown curves. Comparison of these models with experimental data from volitional swimming studies of fish indicates reasonable performance, but limitations are still present due to the difficulty in predicting fish behavior and passage strategies that can vary among individuals and different fish species.
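The kinematic reasoning behind the exhaustion-threshold check can be sketched as follows, assuming a generic power-law fatigue curve; the coefficients, maximum swim speed, and culvert length are placeholders, not the species data used in the paper.

```python
# Kinematic sketch of an exhaustion-threshold style passage check (illustrative
# only). A fish swimming at speed U_s against a current U_w gains ground at
# U_s - U_w and tires after an endurance time T(U_s); it passes a reach of
# length L if some attainable swim speed yields (U_s - U_w) * T(U_s) >= L.
# The fatigue-curve coefficients below are placeholders, not species data.
import numpy as np

def endurance(u_swim, a=30.0, b=-2.5):
    """Assumed power-law fatigue curve: endurance time (s) vs swim speed (m/s)."""
    return a * u_swim**b

def can_pass(u_water, length, u_max=2.0):
    """True if some swim speed up to u_max lets the fish ascend `length` metres."""
    if u_water >= u_max:
        return False
    u_s = np.linspace(u_water + 1e-3, u_max, 200)   # must exceed the current to make headway
    distance = (u_s - u_water) * endurance(u_s)     # distance of ascent before exhaustion
    return bool(np.max(distance) >= length)

for u in (0.4, 0.8, 1.2):
    print(f"mean velocity {u:.1f} m/s, 20 m culvert -> pass: {can_pass(u, 20.0)}")
```

The mean-velocity threshold of the ET model is the value of u_water at which the maximum attainable distance of ascent just equals the barrier length.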
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brewer, Shannon K.; Worthington, Thomas A.; Mollenhauer, Robert
Ecohydrology combines empiricism, data analytics, and the integration of models to characterize linkages between ecological and hydrological processes. A challenge for practitioners is determining which models best generalizes heterogeneity in hydrological behaviour, including water fluxes across spatial and temporal scales, integrating environmental and socio–economic activities to determine best watershed management practices and data requirements. We conducted a literature review and synthesis of hydrologic, hydraulic, water quality, and ecological models designed for solving interdisciplinary questions. We reviewed 1,275 papers and identified 178 models that have the capacity to answer an array of research questions about ecohydrology or ecohydraulics. Of these models, 43 were commonly applied due to their versatility, accessibility, user–friendliness, and excellent user–support. Forty–one of 43 reviewed models were linked to at least 1 other model especially: Water Quality Analysis Simulation Program (linked to 21 other models), Soil and Water Assessment Tool (19), and Hydrologic Engineering Center's River Analysis System (15). However, model integration was still relatively infrequent. There was substantial variation in model applications, possibly an artefact of the regional focus of research questions, simplicity of use, quality of user–support efforts, or a limited understanding of model applicability. Simply increasing the interoperability of model platforms, transformation of models to user–friendly forms, increasing user–support, defining the reliability and risk associated with model results, and increasing awareness of model applicability may promote increased use of models across subdisciplines. Furthermore, the current availability of models allows an array of interdisciplinary questions to be addressed, and model choice relates to several factors including research objective, model complexity, ability to link to other models, and interface choice.
Brewer, Shannon K.; Worthington, Thomas; Mollenhauer, Robert; Stewart, David; McManamay, Ryan; Guertault, Lucie; Moore, Desiree
2018-01-01
Ecohydrology combines empiricism, data analytics, and the integration of models to characterize linkages between ecological and hydrological processes. A challenge for practitioners is determining which models best generalizes heterogeneity in hydrological behaviour, including water fluxes across spatial and temporal scales, integrating environmental and socio‐economic activities to determine best watershed management practices and data requirements. We conducted a literature review and synthesis of hydrologic, hydraulic, water quality, and ecological models designed for solving interdisciplinary questions. We reviewed 1,275 papers and identified 178 models that have the capacity to answer an array of research questions about ecohydrology or ecohydraulics. Of these models, 43 were commonly applied due to their versatility, accessibility, user‐friendliness, and excellent user‐support. Forty‐one of 43 reviewed models were linked to at least 1 other model especially: Water Quality Analysis Simulation Program (linked to 21 other models), Soil and Water Assessment Tool (19), and Hydrologic Engineering Center's River Analysis System (15). However, model integration was still relatively infrequent. There was substantial variation in model applications, possibly an artefact of the regional focus of research questions, simplicity of use, quality of user‐support efforts, or a limited understanding of model applicability. Simply increasing the interoperability of model platforms, transformation of models to user‐friendly forms, increasing user‐support, defining the reliability and risk associated with model results, and increasing awareness of model applicability may promote increased use of models across subdisciplines. Nonetheless, the current availability of models allows an array of interdisciplinary questions to be addressed, and model choice relates to several factors including research objective, model complexity, ability to link to other models, and interface choice.
Brewer, Shannon K.; Worthington, Thomas A.; Mollenhauer, Robert; ...
2018-04-06
Ecohydrology combines empiricism, data analytics, and the integration of models to characterize linkages between ecological and hydrological processes. A challenge for practitioners is determining which models best generalizes heterogeneity in hydrological behaviour, including water fluxes across spatial and temporal scales, integrating environmental and socio–economic activities to determine best watershed management practices and data requirements. We conducted a literature review and synthesis of hydrologic, hydraulic, water quality, and ecological models designed for solving interdisciplinary questions. We reviewed 1,275 papers and identified 178 models that have the capacity to answer an array of research questions about ecohydrology or ecohydraulics. Of these models, 43 were commonly applied due to their versatility, accessibility, user–friendliness, and excellent user–support. Forty–one of 43 reviewed models were linked to at least 1 other model especially: Water Quality Analysis Simulation Program (linked to 21 other models), Soil and Water Assessment Tool (19), and Hydrologic Engineering Center's River Analysis System (15). However, model integration was still relatively infrequent. There was substantial variation in model applications, possibly an artefact of the regional focus of research questions, simplicity of use, quality of user–support efforts, or a limited understanding of model applicability. Simply increasing the interoperability of model platforms, transformation of models to user–friendly forms, increasing user–support, defining the reliability and risk associated with model results, and increasing awareness of model applicability may promote increased use of models across subdisciplines. Furthermore, the current availability of models allows an array of interdisciplinary questions to be addressed, and model choice relates to several factors including research objective, model complexity, ability to link to other models, and interface choice.
Fuzzy bilevel programming with multiple non-cooperative followers: model, algorithm and application
NASA Astrophysics Data System (ADS)
Ke, Hua; Huang, Hu; Ralescu, Dan A.; Wang, Lei
2016-04-01
In centralized decision problems, it is not complicated for decision-makers to make modelling technique selections under uncertainty. When a decentralized decision problem is considered, however, choosing appropriate models is no longer easy due to the difficulty in estimating the other decision-makers' inconclusive decision criteria. These decision criteria may vary with different decision-makers because of their special risk tolerances and management requirements. Considering the general differences among the decision-makers in decentralized systems, we propose a general framework of fuzzy bilevel programming including hybrid models (integrated with different modelling methods at different levels). Specifically, we discuss two of these models which may have wide applications in many fields. Furthermore, we apply the proposed two models to formulate a pricing decision problem in a decentralized supply chain with fuzzy coefficients. In order to solve these models, a hybrid intelligent algorithm integrating fuzzy simulation, neural network and particle swarm optimization based on a penalty function approach is designed. Some suggestions on the applications of these models are also presented.
Ghaedi, A M; Ghaedi, M; Karami, P
2015-03-05
The present work focused on the removal of sunset yellow (SY) dye from aqueous solution by ultrasound-assisted adsorption and stirring onto activated carbon prepared from the wood of an orange tree. The artificial neural network (ANN) model was also used for predicting the removal (%) of SY dye based on experimental data. In this study a green approach is described for the synthesis of activated carbon prepared from the wood of an orange tree and its usability for the removal of sunset yellow. This material was characterized using scanning electron microscopy (SEM) and transmission electron microscopy (TEM). The impact of variables, including initial dye concentration (mg/L), pH, adsorbent dosage (g), sonication time (min) and temperature (°C), on SY removal was studied. Fitting the experimental equilibrium data to different isotherm models such as the Langmuir, Freundlich, Temkin and Dubinin-Radushkevich models shows the suitability and applicability of the Langmuir model. Analysis of experimental adsorption data by different kinetic models, including pseudo-first and second order, Elovich and intraparticle diffusion models, indicates the applicability of the second-order equation model. The adsorbent (0.5 g) is applicable for successful removal of SY (>98%) in a short time (10 min) under ultrasound conditions. Copyright © 2014 Elsevier B.V. All rights reserved.
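Isotherm fitting of the kind reported (Langmuir q_e = q_max·K·C_e/(1 + K·C_e)) can be sketched with scipy's curve_fit; the equilibrium points below are synthetic, not the study's measurements.

```python
# Sketch of fitting the Langmuir isotherm q = q_max*K*Ce/(1 + K*Ce) to
# equilibrium adsorption data. The (Ce, qe) points below are synthetic
# illustrations, not the measurements reported in the study.
import numpy as np
from scipy.optimize import curve_fit

def langmuir(Ce, q_max, K):
    return q_max * K * Ce / (1 + K * Ce)

Ce = np.array([2.0, 5.0, 10.0, 20.0, 40.0, 80.0])       # equilibrium concentration, mg/L (synthetic)
qe = np.array([18.0, 35.0, 52.0, 68.0, 80.0, 88.0])     # equilibrium uptake, mg/g (synthetic)

(q_max, K), _ = curve_fit(langmuir, Ce, qe, p0=[100.0, 0.05])
residuals = qe - langmuir(Ce, q_max, K)
r2 = 1 - np.sum(residuals**2) / np.sum((qe - qe.mean())**2)
print(f"q_max = {q_max:.1f} mg/g, K = {K:.3f} L/mg, R^2 = {r2:.3f}")
```

The competing isotherms mentioned above (Freundlich, Temkin, Dubinin-Radushkevich) would be compared the same way, by fitting each functional form and contrasting the resulting error statistics.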
A new, but old business model for family physicians: cash.
Weber, J Michael
2013-01-01
The following study is an exploratory investigation into the opportunity identification, opportunity analysis, and strategic implications of implementing a cash-only family physician practice. The current market dynamics (i.e., increasing insurance premiums, decreasing benefits, more regulations and paperwork, and cuts in federal and state programs) suggest that there is sufficient motivation for these practitioners to change their current business model. In-depth interviews were conducted with office managers and physicians of family physician practices. The results highlighted a variety of issues, including barriers to change, strategy issues, and opportunities/benefits. The implications include theory applications, strategic marketing applications, and managerial decision-making.
Evaluating MJO Event Initiation and Decay in the Skeleton Model using an RMM-like Index
2015-11-25
climatology and document the occurrence of primary, continuing, and terminating MJO events in the skeleton model. The overall amount of MJO...solutions in a framework consistent with observations including MJO event climatology and the precursor conditions associated with the initiation and...the model along with several applications that include a comparison to the observed MJO event climatology and identification of
Systems modeling and simulation applications for critical care medicine
2012-01-01
Critical care delivery is a complex, expensive, error-prone medical specialty and remains the focal point of major improvement efforts in healthcare delivery. Various modeling and simulation techniques offer unique opportunities to better understand the interactions between clinical physiology and care delivery. The novel insights gained from the systems perspective can then be used to develop and test new treatment strategies and make critical care delivery more efficient and effective. However, modeling and simulation applications in critical care remain underutilized. This article provides an overview of major computer-based simulation techniques as applied to critical care medicine. We provide three application examples of different simulation techniques, including a) a pathophysiological model of acute lung injury, b) process modeling of critical care delivery, and c) an agent-based model to study the interaction between pathophysiology and healthcare delivery. Finally, we identify certain challenges to, and opportunities for, future research in the area. PMID:22703718
Web-based applications for building, managing and analysing kinetic models of biological systems.
Lee, Dong-Yup; Saha, Rajib; Yusufi, Faraaz Noor Khan; Park, Wonjun; Karimi, Iftekhar A
2009-01-01
Mathematical modelling and computational analysis play an essential role in improving our capability to elucidate the functions and characteristics of complex biological systems such as metabolic, regulatory and cell signalling pathways. The modelling and concomitant simulation render it possible to predict the cellular behaviour of systems under various genetically and/or environmentally perturbed conditions. This motivates systems biologists/bioengineers/bioinformaticians to develop new tools and applications, allowing non-experts to easily conduct such modelling and analysis. However, among a multitude of systems biology tools developed to date, only a handful of projects have adopted a web-based approach to kinetic modelling. In this report, we evaluate the capabilities and characteristics of current web-based tools in systems biology and identify desirable features, limitations and bottlenecks for further improvements in terms of usability and functionality. A short discussion on software architecture issues involved in web-based applications and the approaches taken by existing tools is included for those interested in developing their own simulation applications.
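The kind of kinetic model such web tools let users assemble and simulate can be illustrated with a single Michaelis-Menten conversion integrated as an ODE; the rate parameters and initial concentrations are arbitrary.

```python
# Minimal kinetic-model sketch of the kind such web tools let users assemble:
# a single Michaelis-Menten conversion S -> P integrated over time. The
# parameter values are arbitrary illustrations.
import numpy as np
from scipy.integrate import solve_ivp

Vmax, Km = 1.0, 0.5          # enzyme parameters (assumed), mM/min and mM

def rhs(t, y):
    S, P = y
    v = Vmax * S / (Km + S)  # Michaelis-Menten rate law
    return [-v, v]

sol = solve_ivp(rhs, t_span=(0, 30), y0=[5.0, 0.0], t_eval=np.linspace(0, 30, 7))
for t, S, P in zip(sol.t, sol.y[0], sol.y[1]):
    print(f"t={t:5.1f} min  S={S:5.2f} mM  P={P:5.2f} mM")
```

Full pathway models differ only in scale: more species, more rate laws, and parameter values drawn from databases or estimated against perturbation data.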
A new service-oriented grid-based method for AIoT application and implementation
NASA Astrophysics Data System (ADS)
Zou, Yiqin; Quan, Li
2017-07-01
The traditional three-layer Internet of Things (IoT) model, which includes the physical perception layer, the information transferring layer and the service application layer, cannot fully express the complexity and diversity of the agricultural engineering area. It is hard to categorize, organize and manage agricultural things with these three layers. Based on the above requirements, we propose a new service-oriented grid-based method to set up and build the agricultural IoT. Considering the heterogeneity, limitation, transparency and leveling attributes of agricultural things, we propose an abstract model for all agricultural resources. This model is service-oriented and expressed with the Open Grid Services Architecture (OGSA). Information and data of agricultural things are described and encapsulated using XML in this model. Every agricultural engineering application provides a service by enabling one application node in this service-oriented grid. The description of the Web Service Resource Framework (WSRF)-based Agricultural Internet of Things (AIoT) and the encapsulation method are also discussed in this paper for resource management in this model.
21st International Conference on DNA Computing and Molecular Programming: 8.1 Biochemistry
include information storage and biological applications of DNA systems, biomolecular chemical reaction networks, applications of self-assembled DNA...nanostructures, tile self-assembly and computation, principles and models of self-assembly, and strand displacement and biomolecular circuits. The fund
NASA Technical Reports Server (NTRS)
Goorevich, C. E.
1975-01-01
The mathematical formulation of CNTRLF, the maneuver control program for the Applications Technology Satellite-F (ATS-F), is presented. The purpose is to specify the mathematical models that are included in the design of CNTRLF.
A toy terrestrial carbon flow model
NASA Technical Reports Server (NTRS)
Parton, William J.; Running, Steven W.; Walker, Brian
1992-01-01
A generalized carbon flow model for the major terrestrial ecosystems of the world is reported. The model is a simplification of the Century model and the Forest-Biogeochemical model. Topics covered include plant production, decomposition and nutrient cycling, biomes, the utility of the carbon flow model for predicting carbon dynamics under global change, and possible applications to state-and-transition models and environmentally driven global vegetation models.
NASA Astrophysics Data System (ADS)
Cho, Dong-Woo; Lee, Jung-Seob; Jang, Jinah; Jung, Jin Woo; Park, Jeong Hun; Pati, Falguni
2015-10-01
This book introduces various 3D printing systems, biomaterials, and cells for organ printing. In view of the latest applications of several 3D printing systems, their advantages and disadvantages are also discussed. A basic understanding of the entire spectrum of organ printing provides pragmatic insight into the mechanisms, methods, and applications of this discipline. Organ printing is being applied in the tissue engineering field with the purpose of developing tissue/organ constructs for the regeneration of both hard (bone, cartilage, osteochondral) and soft tissues (heart). There are other potential application areas including tissue/organ models, disease/cancer models, and models for physiology and pathology, where in vitro 3D multicellular structures developed by organ printing are valuable.
CAD-CAM database management at Bendix Kansas City
DOE Office of Scientific and Technical Information (OSTI.GOV)
Witte, D.R.
1985-05-01
The Bendix Kansas City Division of Allied Corporation began integrating mechanical CAD-CAM capabilities into its operations in June 1980. The primary capabilities include a wireframe modeling application, a solid modeling application, and the Bendix Integrated Computer Aided Manufacturing (BICAM) System application, a set of software programs and procedures which provides user-friendly access to graphic applications and data, and user-friendly sharing of data between applications and users. BICAM also provides for enforcement of corporate/enterprise policies. Three access categories, private, local, and global, are realized through the implementation of data-management metaphors: the desk, reading rack, file cabinet, and library are for the storage, retrieval, and sharing of drawings and models. Access is provided through menu selections; searching for designs is done by a paging method or a search-by-attribute-value method. The sharing of designs between all users of Part Data is key. The BICAM System supports 375 unique users per quarter and manages over 7500 drawings and models. The BICAM System demonstrates the need for generalized models, a high-level system framework, prototyping, information-modeling methods, and an understanding of the entire enterprise. Future BICAM System implementations are planned to take advantage of this knowledge.
A dynamical systems model for nuclear power plant risk
NASA Astrophysics Data System (ADS)
Hess, Stephen Michael
The recent transition to an open access generation marketplace has forced nuclear plant operators to become much more cost conscious and focused on plant performance. Coincidentally, the regulatory perspective also is in a state of transition from a command and control framework to one that is risk-informed and performance-based. Due to these structural changes in the economics and regulatory system associated with commercial nuclear power plant operation, there is an increased need for plant management to explicitly manage nuclear safety risk. Application of probabilistic risk assessment techniques to model plant hardware has provided a significant contribution to understanding the potential initiating events and equipment failures that can lead to core damage accidents. Application of the lessons learned from these analyses has supported improved plant operation and safety over the previous decade. However, this analytical approach has not been nearly as successful in addressing the impact of plant processes and management effectiveness on the risks of plant operation. Thus, the research described in this dissertation presents a different approach to address this issue. Here we propose a dynamical model that describes the interaction of important plant processes among themselves and their overall impact on nuclear safety risk. We first provide a review of the techniques that are applied in a conventional probabilistic risk assessment of commercially operating nuclear power plants and summarize the typical results obtained. The limitations of the conventional approach and the status of research previously performed to address these limitations also are presented. Next, we present the case for the application of an alternative approach using dynamical systems theory. This includes a discussion of previous applications of dynamical models to study other important socio-economic issues. Next, we review the analytical techniques that are applicable to analysis of these models. Details of the development of the mathematical risk model are presented. This includes discussion of the processes included in the model and the identification of significant interprocess interactions. This is followed by analysis of the model that demonstrates that its dynamical evolution displays characteristics that have been observed at commercially operating plants. The model is analyzed using the previously described techniques from dynamical systems theory. From this analysis, several significant insights are obtained with respect to the effective control of nuclear safety risk. Finally, we present conclusions and recommendations for further research.
A Lake Michigan Ecosystem Model (LM-Eco) that includes a detailed description of trophic levels and their interactions was developed for Lake Michigan. The LM-Eco model constitutes a first step toward a comprehensive Lake Michigan ecosystem productivity model to investigate ecosy...
Models for estimation and simulation of crown and canopy cover
John D. Shaw
2005-01-01
Crown width measurements collected during Forest Inventory and Analysis and Forest Health Monitoring surveys are being used to develop individual tree crown width models and plot-level canopy cover models for species and forest types in the Intermountain West. Several model applications are considered in the development process, including remote sensing of plot...
The Models-3 Community Multi-scale Air Quality (CMAQ) model, first released by the USEPA in 1999 (Byun and Ching. 1999), continues to be developed and evaluated. The principal components of the CMAQ system include a comprehensive emission processor known as the Sparse Matrix O...
Physiologically based kinetic (PBK) models are used widely throughout a number of working sectors, including academia and industry, to provide insight into the dosimetry related to observed adverse health effects in humans and other species. Use of these models has increased over...
New optical and radio frequency angular tropospheric refraction models for deep space applications
NASA Technical Reports Server (NTRS)
Berman, A. L.; Rockwell, S. T.
1976-01-01
The development of angular tropospheric refraction models for optical and radio frequency usage is presented. The models are compact analytic functions, finite over the entire domain of elevation angle, and accurate over large ranges of pressure, temperature, and relative humidity. Additionally, FORTRAN subroutines for each of the models are included.
Currie, Danielle J; Smith, Carl; Jagals, Paul
2018-03-27
Policy and decision-making processes are routinely challenged by the complex and dynamic nature of environmental health problems. System dynamics modelling has demonstrated considerable value across a number of different fields to help decision-makers understand and predict the dynamic behaviour of complex systems in support of the development of effective policy actions. In this scoping review we investigate if, and in what contexts, system dynamics modelling is being used to inform policy or decision-making processes related to environmental health. Four electronic databases and the grey literature were systematically searched to identify studies that intersect the areas of environmental health, system dynamics modelling, and decision-making. Studies identified in the initial screening were further screened for their contextual, methodological and application-related relevance. Studies deemed 'relevant' or 'highly relevant' according to all three criteria were included in this review. Key themes related to the rationale, impact and limitations of using system dynamics in the context of environmental health decision-making and policy were analysed. We identified a limited number of relevant studies (n = 15), two-thirds of which were conducted between 2011 and 2016. The majority of applications occurred in non-health related sectors (n = 9) including transportation, public utilities, water, housing, food, agriculture, and urban and regional planning. Applications were primarily targeted at micro-level (local, community or grassroots) decision-making processes (n = 9), with macro-level (national or international) decision-making to a lesser degree. There was significant heterogeneity in the stated rationales for using system dynamics and the intended impact of the system dynamics model on decision-making processes. A series of user-related, technical and application-related limitations and challenges were identified. None of the reported limitations or challenges appeared unique to the application of system dynamics within the context of environmental health problems, but rather to the use of system dynamics in general. This review reveals that while system dynamics modelling is increasingly being used to inform decision-making related to environmental health, applications are currently limited. Greater application of system dynamics within this context is needed before its benefits and limitations can be fully understood.
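As a hedged illustration of the modelling style referred to here (not a model drawn from any reviewed study), the sketch below integrates a single stock with a constant inflow and a first-order outflow, the basic stock-and-flow structure of system dynamics; all parameter values are illustrative assumptions.

```python
# Minimal stock-and-flow sketch: one stock (e.g., a pollutant burden) with a
# constant inflow and a first-order outflow, integrated with Euler steps.
inflow = 10.0          # units per year (assumed)
removal_rate = 0.25    # fraction of the stock removed per year (assumed)
stock = 0.0
dt = 0.1               # time step in years

for _ in range(int(50 / dt)):           # simulate 50 years
    outflow = removal_rate * stock      # feedback: outflow grows with the stock
    stock += dt * (inflow - outflow)

print(round(stock, 2), "vs. equilibrium", inflow / removal_rate)
```

Policy levers are explored in such models by changing the inflow or removal parameters and comparing the resulting stock trajectories.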
Code of Federal Regulations, 2011 CFR
2011-01-01
..., including vehicle simulations using industry standard model (need to add name and location of this open.... All such information and data must include assumptions made in their preparation and the range of... any product (vehicle or component) to be produced by or through the project, including relevant data...
The NASA Lightning Nitrogen Oxides Model (LNOM): Recent Updates and Applications
NASA Technical Reports Server (NTRS)
Koshak, William; Peterson, Harold; Biazar, Arastoo; Khan, Maudood; Wang, Lihua; Park, Yee-Hun
2011-01-01
Improvements to the NASA Marshall Space Flight Center Lightning Nitrogen Oxides Model (LNOM) and its application to the Community Multiscale Air Quality (CMAQ) modeling system are presented. The LNOM analyzes Lightning Mapping Array (LMA) and National Lightning Detection Network(tm) (NLDN) data to estimate the raw (i.e., unmixed and otherwise environmentally unmodified) vertical profile of lightning NOx (= NO + NO2). Lightning channel length distributions and lightning 10-m segment altitude distributions are also provided. In addition to NOx production from lightning return strokes, the LNOM now includes non-return stroke lightning NOx production due to: hot core stepped and dart leaders, stepped leader corona sheath, K-changes, continuing currents, and M-components. The impact of including LNOM-estimates of lightning NOx for an August 2006 run of CMAQ is discussed.
Collaborative development of predictive toxicology applications
2010-01-01
OpenTox provides an interoperable, standards-based Framework for the support of predictive toxicology data management, algorithms, modelling, validation and reporting. It is relevant to satisfying the chemical safety assessment requirements of the REACH legislation as it supports access to experimental data, (Quantitative) Structure-Activity Relationship models, and toxicological information through an integrating platform that adheres to regulatory requirements and OECD validation principles. Initial research defined the essential components of the Framework including the approach to data access, schema and management, use of controlled vocabularies and ontologies, architecture, web service and communications protocols, and selection and integration of algorithms for predictive modelling. OpenTox provides end-user oriented tools to non-computational specialists, risk assessors, and toxicological experts in addition to Application Programming Interfaces (APIs) for developers of new applications. OpenTox actively supports public standards for data representation, interfaces, vocabularies and ontologies, Open Source approaches to core platform components, and community-based collaboration approaches, so as to progress system interoperability goals. The OpenTox Framework includes APIs and services for compounds, datasets, features, algorithms, models, ontologies, tasks, validation, and reporting which may be combined into multiple applications satisfying a variety of different user needs. OpenTox applications are based on a set of distributed, interoperable OpenTox API-compliant REST web services. The OpenTox approach to ontology allows for efficient mapping of complementary data coming from different datasets into a unifying structure having a shared terminology and representation. Two initial OpenTox applications are presented as an illustration of the potential impact of OpenTox for high-quality and consistent structure-activity relationship modelling of REACH-relevant endpoints: ToxPredict which predicts and reports on toxicities for endpoints for an input chemical structure, and ToxCreate which builds and validates a predictive toxicity model based on an input toxicology dataset. Because of the extensible nature of the standardised Framework design, barriers of interoperability between applications and content are removed, as the user may combine data, models and validation from multiple sources in a dependable and time-effective way. PMID:20807436
Collaborative development of predictive toxicology applications.
Hardy, Barry; Douglas, Nicki; Helma, Christoph; Rautenberg, Micha; Jeliazkova, Nina; Jeliazkov, Vedrin; Nikolova, Ivelina; Benigni, Romualdo; Tcheremenskaia, Olga; Kramer, Stefan; Girschick, Tobias; Buchwald, Fabian; Wicker, Joerg; Karwath, Andreas; Gütlein, Martin; Maunz, Andreas; Sarimveis, Haralambos; Melagraki, Georgia; Afantitis, Antreas; Sopasakis, Pantelis; Gallagher, David; Poroikov, Vladimir; Filimonov, Dmitry; Zakharov, Alexey; Lagunin, Alexey; Gloriozova, Tatyana; Novikov, Sergey; Skvortsova, Natalia; Druzhilovsky, Dmitry; Chawla, Sunil; Ghosh, Indira; Ray, Surajit; Patel, Hitesh; Escher, Sylvia
2010-08-31
OpenTox provides an interoperable, standards-based Framework for the support of predictive toxicology data management, algorithms, modelling, validation and reporting. It is relevant to satisfying the chemical safety assessment requirements of the REACH legislation as it supports access to experimental data, (Quantitative) Structure-Activity Relationship models, and toxicological information through an integrating platform that adheres to regulatory requirements and OECD validation principles. Initial research defined the essential components of the Framework including the approach to data access, schema and management, use of controlled vocabularies and ontologies, architecture, web service and communications protocols, and selection and integration of algorithms for predictive modelling. OpenTox provides end-user oriented tools to non-computational specialists, risk assessors, and toxicological experts in addition to Application Programming Interfaces (APIs) for developers of new applications. OpenTox actively supports public standards for data representation, interfaces, vocabularies and ontologies, Open Source approaches to core platform components, and community-based collaboration approaches, so as to progress system interoperability goals. The OpenTox Framework includes APIs and services for compounds, datasets, features, algorithms, models, ontologies, tasks, validation, and reporting which may be combined into multiple applications satisfying a variety of different user needs. OpenTox applications are based on a set of distributed, interoperable OpenTox API-compliant REST web services. The OpenTox approach to ontology allows for efficient mapping of complementary data coming from different datasets into a unifying structure having a shared terminology and representation. Two initial OpenTox applications are presented as an illustration of the potential impact of OpenTox for high-quality and consistent structure-activity relationship modelling of REACH-relevant endpoints: ToxPredict which predicts and reports on toxicities for endpoints for an input chemical structure, and ToxCreate which builds and validates a predictive toxicity model based on an input toxicology dataset. Because of the extensible nature of the standardised Framework design, barriers of interoperability between applications and content are removed, as the user may combine data, models and validation from multiple sources in a dependable and time-effective way.
Tristan code and its application
NASA Astrophysics Data System (ADS)
Nishikawa, K.-I.
Since TRISTAN, the 3-D electromagnetic particle code, was introduced in 1990, it has been used for many applications, including simulations of the global solar wind-magnetosphere interaction. The most essential ingredients of this code have been published in the ISSS-4 book. In this abstract we describe some of the issues and an application of this code to the study of the global solar wind-magnetosphere interaction, including a substorm study. The basic code (tristan.f) for the global simulation and a local simulation of reconnection with a Harris model (issrec2.f) are available at http:/www.physics.rutger.edu/˜kenichi. For beginners, the code (isssrc2.f) with simpler boundary conditions is suitable for starting to run simulations. The future of global particle simulations for a global geospace general circulation model (GGCM) with predictive capability (for the Space Weather Program) is discussed.
NASA standard: Trend analysis techniques
NASA Technical Reports Server (NTRS)
1990-01-01
Descriptive and analytical techniques for NASA trend analysis applications are presented in this standard. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. This document should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend analysis is neither a precise term nor a circumscribed methodology: it generally connotes quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data is time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this document. The basic ideas needed for qualitative and quantitative assessment of trends along with relevant examples are presented.
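As a hedged illustration of the fitting techniques named in the standard (not an excerpt from it), the sketch below fits linear, quadratic, and exponential trends to a synthetic time series; the data are placeholders, not NASA data.

```python
# Fit linear, quadratic, and exponential trend models to a synthetic series.
import numpy as np

t = np.arange(1, 21, dtype=float)
y = 5.0 * np.exp(0.12 * t) + np.random.default_rng(0).normal(0, 0.5, t.size)

lin = np.polyfit(t, y, 1)                    # y ~ a*t + b
quad = np.polyfit(t, y, 2)                   # y ~ a*t^2 + b*t + c
b_log, a_log = np.polyfit(t, np.log(y), 1)   # log y ~ b*t + log a  =>  y ~ a*exp(b*t)

print("linear:", lin.round(3), "quadratic:", quad.round(3))
print("exponential: y ~ %.2f * exp(%.3f t)" % (np.exp(a_log), b_log))
```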
Applicability of common stomatal conductance models in maize under varying soil moisture conditions.
Wang, Qiuling; He, Qijin; Zhou, Guangsheng
2018-07-01
In the context of climate warming, the varying soil moisture caused by precipitation pattern change will affect the applicability of stomatal conductance models, thereby affecting the simulation accuracy of carbon-nitrogen-water cycles in ecosystems. We studied the applicability of four common stomatal conductance models, the Jarvis, Ball-Woodrow-Berry (BWB), Ball-Berry-Leuning (BBL) and unified stomatal optimization (USO) models, based on summer maize leaf gas exchange data from a consecutive soil moisture decrease manipulation experiment. The results showed that under varying soil moisture conditions the USO model performed best, followed by the BBL and BWB models, while the Jarvis model performed worst. Soil moisture affected the relative performance of the models. Introducing a water response function improved the performance of the Jarvis, BWB, and USO models, decreasing the normalized root mean square error (NRMSE) by 15.7%, 16.6% and 3.9%, respectively; for the BBL model, however, the effect was negative, increasing the NRMSE by 5.3%. Based on the 95% confidence limits, the Jarvis, BWB, BBL and USO models were applicable within different ranges of soil relative water content (55%-65%, 56%-67%, 37%-79% and 37%-95%, respectively). Moreover, introducing a water response function improved the applicability of the Jarvis and BWB models. The USO model performed best with or without the water response function and was applicable under varying soil moisture conditions. Our results provide a basis for selecting appropriate stomatal conductance models under drought conditions. Copyright © 2018 Elsevier B.V. All rights reserved.
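For context, the commonly cited Ball-Woodrow-Berry form and a normalized RMSE metric can be sketched as follows; this is a generic illustration with assumed parameter values and sample data, not the calibration reported in the study, and the NRMSE here is normalized by the observed range, which is only one of several conventions.

```python
# Hedged sketch of the Ball-Woodrow-Berry (BWB) stomatal conductance form
# g_s = g0 + g1 * A * h_s / C_s and a range-normalized RMSE. Parameter values
# and the sample observations are illustrative assumptions.
import numpy as np

def bwb_gs(A, hs, cs, g0=0.01, g1=9.0):
    """Stomatal conductance from net assimilation A, leaf-surface relative
    humidity hs (0-1) and leaf-surface CO2 concentration cs."""
    return g0 + g1 * A * hs / cs

def nrmse(obs, pred):
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    return np.sqrt(np.mean((pred - obs) ** 2)) / (obs.max() - obs.min())

A = np.array([12.0, 18.0, 22.0])       # umol m-2 s-1 (assumed)
hs = np.array([0.55, 0.65, 0.70])      # fraction (assumed)
cs = np.array([380.0, 370.0, 360.0])   # umol mol-1 (assumed)
obs_gs = np.array([0.17, 0.30, 0.40])  # mol m-2 s-1 (assumed)

pred = bwb_gs(A, hs, cs)
print("predicted:", pred.round(3), " NRMSE:", round(float(nrmse(obs_gs, pred)), 3))
```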
A six-parameter Iwan model and its application
NASA Astrophysics Data System (ADS)
Li, Yikun; Hao, Zhiming
2016-02-01
The Iwan model is a practical tool for describing the constitutive behavior of joints. In this paper, a six-parameter Iwan model based on a truncated power-law distribution with two Dirac delta functions is proposed, which gives a more comprehensive description of joints than previous Iwan models. Its analytical expressions, including the backbone curve, unloading curves and energy dissipation, are deduced. Parameter identification procedures and the discretization method are also provided. A model application based on Segalman et al.'s experimental work with bolted joints is carried out, and the effect of different numbers of Jenkins elements in the discretization is discussed. The results indicate that the six-parameter Iwan model can accurately reproduce the experimental phenomena of joints.
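As a hedged illustration of the underlying construction (a parallel array of Jenkins elements, not the specific six-parameter distribution proposed in the paper), the following sketch computes the hysteretic restoring force of a discretized Iwan model; the stiffness and slip forces are illustrative assumptions.

```python
# Discretized parallel-series Iwan model: many Jenkins elements (linear spring
# in series with a Coulomb slider) sharing the same imposed displacement u.
import numpy as np

k = 1.0                                  # assumed stiffness of every element
f_slip = np.linspace(0.1, 1.0, 10)       # assumed slip forces of the sliders
slider = np.zeros_like(f_slip)           # internal slider positions

def iwan_force(u):
    """Restoring force for imposed displacement u, updating slider states."""
    global slider
    elastic = k * (u - slider)
    stuck = np.abs(elastic) <= f_slip
    # sliding elements saturate at their slip force; their sliders follow u
    force = np.where(stuck, elastic, np.sign(elastic) * f_slip)
    slider = np.where(stuck, slider, u - np.sign(elastic) * f_slip / k)
    return force.sum()

# a quasi-static load/unload cycle traces out a hysteresis loop
path = np.concatenate([np.linspace(0, 2, 50), np.linspace(2, -2, 100)])
loop = [iwan_force(u) for u in path]
print(round(loop[49], 3), round(loop[-1], 3))   # force at the peak and at the end of reversal
```

Increasing the number of elements smooths the backbone and hysteresis curves, which is essentially the discretization trade-off examined in the paper.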
THE LAKE MICHIGAN MASS BALANCE PROJECT: QUALITY ASSURANCE PLAN FOR MATHEMATICAL MODELLING
This report documents the quality assurance process for the development and application of the Lake Michigan Mass Balance Models. The scope includes the overall modeling framework as well as the specific submodels that are linked to form a comprehensive synthesis of physical, che...
Applicability of linear regression equation for prediction of chlorophyll content in rice leaves
NASA Astrophysics Data System (ADS)
Li, Yunmei
2005-09-01
A modeling approach is used to assess the applicability of derived equations for predicting the chlorophyll content of rice leaves at a given view direction. Two radiative transfer models, the PROSPECT model operating at leaf level and the FCR model operating at canopy level, are used in the study. The study consists of three steps: (1) simulation of bidirectional reflectance from canopies with different leaf chlorophyll contents, leaf area indices (LAI) and understorey configurations; (2) establishment of prediction relations for chlorophyll content by stepwise regression; and (3) assessment of the applicability of these relations. The results show that the prediction accuracy is affected by the understorey configuration; however, accuracy tends to improve greatly as LAI increases.
Genome Editing and Its Applications in Model Organisms.
Ma, Dongyuan; Liu, Feng
2015-12-01
Technological advances are important for innovative biological research. Development of molecular tools for DNA manipulation, such as zinc finger nucleases (ZFNs), transcription activator-like effector nucleases (TALENs), and the clustered regularly-interspaced short palindromic repeat (CRISPR)/CRISPR-associated (Cas) system, has revolutionized genome editing. These approaches can be used to develop potential therapeutic strategies to effectively treat heritable diseases. In the last few years, substantial progress has been made in CRISPR/Cas technology, including technical improvements and wide application in many model systems. This review describes recent advancements in genome editing with a particular focus on CRISPR/Cas, covering the underlying principles, technological optimization, and its application in zebrafish and other model organisms, disease modeling, and gene therapy used for personalized medicine. Copyright © 2016 The Authors. Production and hosting by Elsevier Ltd. All rights reserved.
Enhanced Framework for Modeling Urban Truck Trips
DOT National Transportation Integrated Search
1998-09-16
Recently there has been renewed interest in modeling urban truck movements. : This is potentially important for improving traffic forecasts as well as for a : host of other applications including ITS. There are unique aspects of urban : freight movem...
Prediction of Chemical Function: Model Development and Application
The United States Environmental Protection Agency’s Exposure Forecaster (ExpoCast) project is developing both statistical and mechanism-based computational models for predicting exposures to thousands of chemicals, including those in consumer products. The high-throughput (...
Surrogate Safety Assessment Model (SSAM)--software user manual
DOT National Transportation Integrated Search
2008-05-01
This document presents guidelines for the installation and use of the Surrogate Safety Assessment Model (SSAM) software. For more information regarding the SSAM application, including discussion of theoretical background and the results of a series o...
Proposal for a new CAPE-OPEN Object Model
Process simulation applications require the exchange of significant amounts of data between the flowsheet environment, unit operation model, and thermodynamic server. Packing and unpacking various data types and exchanging data using structured text-based architectures, including...
Modeling the Urban Boundary and Canopy Layers
Today, we are confronted with increasingly more sophisticated application requirements for urban modeling. These include those that address emergency response to acute exposures from toxic releases, health exposure assessments from adverse air quality, energy usage, and character...
Results of the GABLS3 diurnal-cycle benchmark for wind energy applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rodrigo, J. Sanz; Allaerts, D.; Avila, M.
We present results of the GABLS3 model intercomparison benchmark revisited for wind energy applications. The case consists of a diurnal cycle, measured at the 200-m tall Cabauw tower in the Netherlands, including a nocturnal low-level jet. The benchmark includes a sensitivity analysis of WRF simulations using two input meteorological databases and five planetary boundary-layer schemes. A reference set of mesoscale tendencies is used to drive microscale simulations using RANS k-ϵ and LES turbulence models. The validation is based on rotor-based quantities of interest. Cycle-integrated mean absolute errors are used to quantify model performance. The results of the benchmark are used to discuss input uncertainties from mesoscale modelling, different meso-micro coupling strategies (online vs offline) and consistency between RANS and LES codes when dealing with boundary-layer mean flow quantities. Altogether, all the microscale simulations produce a consistent coupling with mesoscale forcings.
Results of the GABLS3 diurnal-cycle benchmark for wind energy applications
Rodrigo, J. Sanz; Allaerts, D.; Avila, M.; ...
2017-06-13
We present results of the GABLS3 model intercomparison benchmark revisited for wind energy applications. The case consists of a diurnal cycle, measured at the 200-m tall Cabauw tower in the Netherlands, including a nocturnal low-level jet. The benchmark includes a sensitivity analysis of WRF simulations using two input meteorological databases and five planetary boundary-layer schemes. A reference set of mesoscale tendencies is used to drive microscale simulations using RANS k-ϵ and LES turbulence models. The validation is based on rotor-based quantities of interest. Cycle-integrated mean absolute errors are used to quantify model performance. The results of the benchmark are used to discuss input uncertainties from mesoscale modelling, different meso-micro coupling strategies (online vs offline) and consistency between RANS and LES codes when dealing with boundary-layer mean flow quantities. Altogether, all the microscale simulations produce a consistent coupling with mesoscale forcings.
Jauchem, James R
2011-01-01
Conducted energy weapons (CEWs) are used by law enforcement personnel to incapacitate individuals quickly and effectively, without intending to cause lethality. CEWs have been deployed for relatively long or repeated exposures in some cases. In laboratory animal models, central venous hematocrit has increased significantly after CEW exposure. Even limited applications (e.g., three 5-sec applications) resulted in statistically significant increases in hematocrit. Preexposure hematocrit was significantly higher in nonsurvivors versus survivors after more extreme CEW applications. The purpose of this technical note is to address specific questions that may be generated when examining these results. Comparisons among results of CEW applications, other electrical muscle stimulation, and exercise/voluntary muscle contraction are included. The anesthetized swine appears to be an acceptable animal model for studying changes in hematocrit and associated red blood cell changes. Potential detrimental effects of increased hematocrit, and considerations during law enforcement use, are discussed. 2010 American Academy of Forensic Sciences. Published 2010. This article is a U.S. Government work and is in the public domain in the U.S.A.
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
Building simulations are increasingly used in various applications related to energy efficient buildings. For individual buildings, applications include: design of new buildings, prediction of retrofit savings, ratings, performance path code compliance and qualification for incentives. Beyond individual building applications, larger scale applications (across the stock of buildings at various scales: national, regional and state) include: codes and standards development, utility program design, regional/state planning, and technology assessments. For these sorts of applications, a set of representative buildings are typically simulated to predict performance of the entire population of buildings. Focusing on the U.S. single-family residential building stock, this paper will describe how multiple data sources for building characteristics are combined into a highly-granular database that preserves the important interdependencies of the characteristics. We will present the sampling technique used to generate a representative set of thousands (up to hundreds of thousands) of building models. We will also present results of detailed calibrations against building stock consumption data.
Advanced Turbine Technology Applications Project (ATTAP)
NASA Technical Reports Server (NTRS)
1994-01-01
Reports technical effort by AlliedSignal Engines in sixth year of DOE/NASA funded project. Topics include: gas turbine engine design modifications of production APU to incorporate ceramic components; fabrication and processing of silicon nitride blades and nozzles; component and engine testing; and refinement and development of critical ceramics technologies, including: hot corrosion testing and environmental life predictive model; advanced NDE methods for internal flaws in ceramic components; and improved carbon pulverization modeling during impact. ATTAP project is oriented toward developing high-risk technology of ceramic structural component design and fabrication to carry forward to commercial production by 'bridging the gap' between structural ceramics in the laboratory and near-term commercial heat engine application. Current ATTAP project goal is to support accelerated commercialization of advanced, high-temperature engines for hybrid vehicles and other applications. Project objectives are to provide essential and substantial early field experience demonstrating ceramic component reliability and durability in modified, available, gas turbine engine applications; and to scale-up and improve manufacturing processes of ceramic turbine engine components and demonstrate application of these processes in the production environment.
Nonlinear modeling of chaotic time series: Theory and applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Casdagli, M.; Eubank, S.; Farmer, J.D.
1990-01-01
We review recent developments in the modeling and prediction of nonlinear time series. In some cases apparent randomness in time series may be due to chaotic behavior of a nonlinear but deterministic system. In such cases it is possible to exploit the determinism to make short term forecasts that are much more accurate than one could make from a linear stochastic model. This is done by first reconstructing a state space, and then using nonlinear function approximation methods to create a dynamical model. Nonlinear models are valuable not only as short term forecasters, but also as diagnostic tools for identifying and quantifying low-dimensional chaotic behavior. During the past few years methods for nonlinear modeling have developed rapidly, and have already led to several applications where nonlinear models motivated by chaotic dynamics provide superior predictions to linear models. These applications include prediction of fluid flows, sunspots, mechanical vibrations, ice ages, measles epidemics and human speech. 162 refs., 13 figs.
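As a hedged illustration of the general approach described (state-space reconstruction followed by local function approximation, not the authors' methods), the sketch below delay-embeds a synthetic logistic-map series and makes a nearest-neighbour one-step forecast; the embedding dimension, delay, and test series are assumptions.

```python
# Delay embedding of a scalar series plus a nearest-neighbour one-step forecast.
import numpy as np

# synthetic chaotic series: logistic map x_{n+1} = 4 x_n (1 - x_n)
x = np.empty(1000)
x[0] = 0.3
for n in range(999):
    x[n + 1] = 4.0 * x[n] * (1.0 - x[n])

m, tau = 3, 1                                     # embedding dimension and delay
vecs = np.array([x[i:i + m * tau:tau] for i in range(len(x) - m * tau)])
targets = x[m * tau:]                             # value following each embedded vector

def forecast(state):
    """Predict the next value as the target of the nearest embedded neighbour."""
    d = np.linalg.norm(vecs - state, axis=1)
    return targets[np.argmin(d)]

last = x[-m * tau::tau]                           # most recent embedded state
print("predicted:", round(forecast(last), 4), " true next:", round(4 * x[-1] * (1 - x[-1]), 4))
```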
Recent advances in plasma modeling for space applications
NASA Astrophysics Data System (ADS)
Srinivasan, Bhuvana; Scales, Wayne; Cagas, Petr; Glesner, Colin
2017-02-01
This paper presents a brief overview of the application of advanced plasma modeling techniques to several space science and engineering problems currently of significant interest. Recent advances in both kinetic and fluid modeling provide the ability to study a wide variety of problems that may be important to space plasmas including spacecraft-environment interactions, plasma-material interactions for propulsion systems such as Hall thrusters, ionospheric plasma instabilities, plasma separation from magnetic nozzles, active space experiments, and a host of additional problems. Some of the key findings are summarized here.
The use of photogrammetric and stereophotogrammetric methods in aerodynamic experiments
NASA Astrophysics Data System (ADS)
Shmyreva, V. N.; Iakovlev, V. A.
The possibilities afforded by photogrammetry and stereophotogrammetry in current aerodynamic experiments, methods of image recording, and observation data processing are briefly reviewed. Some specific experiments illustrating the application of stereophotogrammetry are described. The applications discussed include the monitoring of model position in wind tunnels, determination of model deformations and displacements, determination of the deformations of real structural elements in static strength tests, and solution of a variety of problems in hydrodynamics.
Automatic 3d Building Model Generations with Airborne LiDAR Data
NASA Astrophysics Data System (ADS)
Yastikli, N.; Cetin, Z.
2017-11-01
LiDAR systems are becoming more and more popular because they can deliver point clouds of vegetation and man-made objects on the earth's surface accurately and quickly. Nowadays, these airborne systems are frequently used in a wide range of applications, such as DEM/DSM generation, topographic mapping, object extraction, vegetation mapping, three-dimensional (3D) modelling and simulation, change detection, engineering works, map revision, coastal management and bathymetry. 3D building model generation is one of the most prominent applications of LiDAR systems and is of major importance for urban planning, illegal construction monitoring, 3D city modelling, environmental simulation, tourism, security, telecommunication, mobile navigation, etc. Manual or semi-automatic 3D building model generation is a costly and very time-consuming process for these applications. Thus, a simple and quick approach for automatic 3D building model generation is needed for the many studies that include building modelling. This study aims at automatic 3D building model generation from airborne LiDAR data. An approach is proposed that includes the automatic point-based classification of the raw LiDAR point cloud. The proposed point-based classification applies hierarchical rules for the automatic production of 3D building models. Detailed analyses of the parameters used in the hierarchical rules were performed to improve the classification results using different test areas identified in the study area. The proposed approach was tested in a study area in Zekeriyakoy, Istanbul, which contains partly open areas, forest areas and many types of buildings, using the TerraScan module of TerraSolid. The 3D building models were generated automatically using the results of the automatic point-based classification. The results verified that automatic 3D building models can be generated successfully from raw LiDAR point cloud data.
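As a hedged illustration of what point-based classification rules can look like (not the TerraScan rules used in the study), the sketch below labels a handful of toy LiDAR returns using two assumed per-point attributes, height above ground and local roughness; all thresholds and values are illustrative.

```python
# Toy rule-based classification of LiDAR returns into ground / building /
# vegetation candidates from assumed per-point attributes.
import numpy as np

height = np.array([0.1, 0.3, 6.5, 7.0, 12.0, 4.2])       # height above ground (m)
rough  = np.array([0.05, 0.04, 0.08, 0.06, 0.90, 0.75])  # local roughness (m)

labels = np.full(height.shape, "unclassified", dtype=object)
labels[height < 0.5] = "ground"
labels[(height >= 2.0) & (rough < 0.2)] = "building"     # elevated and locally planar
labels[(height >= 0.5) & (rough >= 0.2)] = "vegetation"  # elevated and rough
print(labels)
```

A hierarchical rule set of the kind used in the paper chains many such tests, each applied only to points left unclassified by earlier, more reliable rules.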
Clinical Applications of 3D Printing: Primer for Radiologists.
Ballard, David H; Trace, Anthony Paul; Ali, Sayed; Hodgdon, Taryn; Zygmont, Matthew E; DeBenedectis, Carolynn M; Smith, Stacy E; Richardson, Michael L; Patel, Midhir J; Decker, Summer J; Lenchik, Leon
2018-01-01
Three-dimensional (3D) printing refers to a number of manufacturing technologies that create physical models from digital information. Radiology is poised to advance the application of 3D printing in health care because our specialty has an established history of acquiring and managing the digital information needed to create such models. The 3D Printing Task Force of the Radiology Research Alliance presents a review of the clinical applications of this burgeoning technology, with a focus on the opportunities for radiology. Topics include uses for treatment planning, medical education, and procedural simulation, as well as patient education. Challenges for creating custom implantable devices including financial and regulatory processes for clinical application are reviewed. Precedent procedures that may translate to this new technology are discussed. The task force identifies research opportunities needed to document the value of 3D printing as it relates to patient care. Copyright © 2018 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
Building information modelling review with potential applications in tunnel engineering of China.
Zhou, Weihong; Qin, Haiyang; Qiu, Junling; Fan, Haobo; Lai, Jinxing; Wang, Ke; Wang, Lixin
2017-08-01
Building information modelling (BIM) can be applied to tunnel engineering to address a number of problems, including complex structure, extensive design, long construction cycle and increased security risks. To promote the development of tunnel engineering in China, this paper combines actual cases, including the Xingu mountain tunnel and the Shigu Mountain tunnel, to systematically analyse BIM applications in tunnel engineering in China. The results indicate that BIM technology in tunnel engineering is currently mainly applied during the design stage rather than during construction and operation stages. The application of BIM technology in tunnel engineering faces many problems, such as a lack of standards, incompatibility of different software, disorganized management, complex combination with GIS (Geographic Information System), low utilization rate and poor awareness. In this study, through a summary of related research results and engineering cases, suggestions are introduced and an outlook for the BIM application in tunnel engineering in China is presented, which provides guidance for design optimization, construction standards and later operation maintenance.
Building information modelling review with potential applications in tunnel engineering of China
Zhou, Weihong; Qin, Haiyang; Fan, Haobo; Lai, Jinxing; Wang, Ke; Wang, Lixin
2017-01-01
Building information modelling (BIM) can be applied to tunnel engineering to address a number of problems, including complex structure, extensive design, long construction cycle and increased security risks. To promote the development of tunnel engineering in China, this paper combines actual cases, including the Xingu mountain tunnel and the Shigu Mountain tunnel, to systematically analyse BIM applications in tunnel engineering in China. The results indicate that BIM technology in tunnel engineering is currently mainly applied during the design stage rather than during construction and operation stages. The application of BIM technology in tunnel engineering faces many problems, such as a lack of standards, incompatibility of different software, disorganized management, complex combination with GIS (Geographic Information System), low utilization rate and poor awareness. In this study, through a summary of related research results and engineering cases, suggestions are introduced and an outlook for the BIM application in tunnel engineering in China is presented, which provides guidance for design optimization, construction standards and later operation maintenance. PMID:28878970
Building information modelling review with potential applications in tunnel engineering of China
NASA Astrophysics Data System (ADS)
Zhou, Weihong; Qin, Haiyang; Qiu, Junling; Fan, Haobo; Lai, Jinxing; Wang, Ke; Wang, Lixin
2017-08-01
Building information modelling (BIM) can be applied to tunnel engineering to address a number of problems, including complex structure, extensive design, long construction cycle and increased security risks. To promote the development of tunnel engineering in China, this paper combines actual cases, including the Xingu mountain tunnel and the Shigu Mountain tunnel, to systematically analyse BIM applications in tunnel engineering in China. The results indicate that BIM technology in tunnel engineering is currently mainly applied during the design stage rather than during construction and operation stages. The application of BIM technology in tunnel engineering faces many problems, such as a lack of standards, incompatibility of different software, disorganized management, complex combination with GIS (Geographic Information System), low utilization rate and poor awareness. In this study, through a summary of related research results and engineering cases, suggestions are introduced and an outlook for the BIM application in tunnel engineering in China is presented, which provides guidance for design optimization, construction standards and later operation maintenance.
A Course for All Students: Foundations of Modern Engineering
ERIC Educational Resources Information Center
Best, Charles L.
1971-01-01
Describes a course for non-engineering students at Lafayette College which includes the design process in a project. Also included are the study of modeling, optimization, simulation, computer application, and simple feedback controls. (Author/TS)
Gamazo-Real, José Carlos; Vázquez-Sánchez, Ernesto; Gómez-Gil, Jaime
2010-01-01
This paper provides a technical review of position and speed sensorless methods for controlling Brushless Direct Current (BLDC) motor drives, including background analysis of sensor-based approaches, their limitations and recent advances. The performance and reliability of BLDC motor drives have improved as conventional control and sensing techniques have been enhanced through sensorless technology. Sensorless advances are then reviewed and recent developments in this area are introduced with their inherent advantages and drawbacks, including analysis of practical implementation issues and applications. The study includes a deep overview of state-of-the-art back-EMF sensing methods, which include Terminal Voltage Sensing, Third Harmonic Voltage Integration, Terminal Current Sensing, Back-EMF Integration and PWM strategies. Also, the most relevant techniques based on estimation and models are briefly analysed, such as the Sliding-mode Observer, Extended Kalman Filter, Model Reference Adaptive System, Adaptive observers (Full-order and Pseudoreduced-order) and Artificial Neural Networks.
Space Shuttle propulsion performance reconstruction from flight data
NASA Technical Reports Server (NTRS)
Rogers, Robert M.
1989-01-01
The application of extended Kalman filtering to estimating Space Shuttle Solid Rocket Booster (SRB) performance (specific impulse) from flight data in a post-flight processing computer program is described. The flight data used include inertial platform acceleration, SRB head pressure, and ground-based radar tracking data. The key feature of this application is the model used for the SRBs, which represents a reference quasi-static internal ballistics model normalized to the propellant burn depth. Dynamic states of mass overboard and propellant burn depth are included in the filter model to account for real-time deviations from the reference model. Aerodynamic, plume, wind and main engine uncertainties are included.
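For readers unfamiliar with the filtering machinery, the following is a generic extended Kalman filter sketch on a toy scalar system; it is not the SRB internal-ballistics model of the report, and the dynamics, measurement model, and noise levels are assumed for illustration.

```python
# Generic scalar extended Kalman filter: nonlinear dynamics and measurement,
# linearized through their Jacobians at each step. All values are assumed.
import numpy as np

rng = np.random.default_rng(1)
dt, Q, R = 0.1, 1e-4, 0.04       # time step, process and measurement noise variances

def f(x):  return x - 0.1 * x**2 * dt    # state propagation
def F(x):  return 1.0 - 0.2 * x * dt     # Jacobian of f
def h(x):  return x**2                   # measurement model
def H(x):  return 2.0 * x                # Jacobian of h

x_true, x_est, P = 2.0, 1.5, 1.0
for _ in range(100):
    x_true = f(x_true)
    z = h(x_true) + rng.normal(0.0, np.sqrt(R))
    # predict
    x_pred = f(x_est)
    P = F(x_est) * P * F(x_est) + Q
    # update
    K = P * H(x_pred) / (H(x_pred) * P * H(x_pred) + R)
    x_est = x_pred + K * (z - h(x_pred))
    P = (1.0 - K * H(x_pred)) * P

print("true:", round(x_true, 3), " estimated:", round(x_est, 3))
```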
Rigid aggregates: theory and applications
NASA Astrophysics Data System (ADS)
Richardson, D. C.
2005-08-01
Numerical models employing "perfect" self-gravitating rubble piles that consist of monodisperse rigid spheres with configurable contact dissipation have been used to explore collisional and rotational disruption of gravitational aggregates. Applications of these simple models include numerical simulations of planetesimal evolution, asteroid family formation, tidal disruption, and binary asteroid formation. These studies may be limited by the idealized nature of the rubble pile model, since perfect identical spheres stack and shear in a very specific, possibly over-idealized way. To investigate how constituent properties affect the overall characteristics of a gravitational aggregate, particularly its failure modes, we have generalized our numerical code to model colliding, self-gravitating, rigid aggregates made up of variable-size spheres. Euler's equations of rigid-body motion in the presence of external torques are implemented, along with a self-consistent prescription for handling non-central impacts. Simple rules for sticking and breaking are also included. Preliminary results will be presented showing the failure modes of gravitational aggregates made up of smaller, rigid, non-idealized components. Applications of this new capability include more realistic aggregate models, convenient modeling of arbitrary rigid shapes for studies of the stability of orbiting companions (replacing one or both bodies with rigid aggregates eliminates expensive interparticle collisions while preserving the shape, spin, and gravity field of the bodies), and sticky particle aggregation in dense planetary rings. This material is based upon work supported by the National Aeronautics and Space Administration under Grant No. NAG511722 issued through the Office of Space Science and by the National Science Foundation under Grant No. AST0307549.
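As a hedged illustration of the rigid-body ingredient mentioned above (not the authors' aggregate code), the sketch below integrates Euler's equations of rigid-body motion in the body frame for an assumed triaxial inertia tensor and checks the quantities conserved in the torque-free case; all numerical values are placeholders.

```python
# Integrate Euler's equations  I dw/dt = (I w) x w + torque  in the body frame.
import numpy as np
from scipy.integrate import solve_ivp

I = np.diag([1.0, 2.0, 3.0])          # assumed principal moments of inertia
I_inv = np.linalg.inv(I)
torque = np.zeros(3)                  # torque-free tumbling for this sketch

def euler_rhs(t, w):
    return I_inv @ (np.cross(I @ w, w) + torque)

w0 = np.array([0.1, 1.0, 0.1])        # spin mostly about the intermediate axis
sol = solve_ivp(euler_rhs, (0.0, 50.0), w0, max_step=0.01)

# without external torque, kinetic energy and |angular momentum| are conserved
E = 0.5 * np.einsum('it,ij,jt->t', sol.y, I, sol.y)
L = np.linalg.norm(I @ sol.y, axis=0)
print("energy drift:", float(E.max() - E.min()), " |L| drift:", float(L.max() - L.min()))
```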
Xiong, Wei; Hupert, Nathaniel; Hollingsworth, Eric B; O'Brien, Megan E; Fast, Jessica; Rodriguez, William R
2008-01-01
Background Mathematical modeling has been applied to a range of policy-level decisions on resource allocation for HIV care and treatment. We describe the application of classic operations research (OR) techniques to address logistical and resource management challenges in HIV treatment scale-up activities in resource-limited countries. Methods We review and categorize several of the major logistical and operational problems encountered over the last decade in the global scale-up of HIV care and antiretroviral treatment for people with AIDS. While there are unique features of HIV care and treatment that pose significant challenges to effective modeling and service improvement, we identify several analogous OR-based solutions that have been developed in the service, industrial, and health sectors. Results HIV treatment scale-up includes many processes that are amenable to mathematical and simulation modeling, including forecasting future demand for services; locating and sizing facilities for maximal efficiency; and determining optimal staffing levels at clinical centers. Optimization of clinical and logistical processes through modeling may improve outcomes, but successful OR-based interventions will require contextualization of response strategies, including appreciation of both existing health care systems and limitations in local health workforces. Conclusion The modeling techniques developed in the engineering field of operations research have wide potential application to the variety of logistical problems encountered in HIV treatment scale-up in resource-limited settings. Increasing the number of cross-disciplinary collaborations between engineering and public health will help speed the appropriate development and application of these tools. PMID:18680594
Mathematical Storage-Battery Models
NASA Technical Reports Server (NTRS)
Chapman, C. P.; Aston, M.
1985-01-01
Empirical formula represents performance of electrical storage batteries. Formula covers many battery types and includes numerous coefficients adjusted to fit peculiarities of each type. Battery and load parameters taken into account include power density in battery, discharge time, and electrolyte temperature. Applications include electric-vehicle "fuel" gages and powerline load leveling.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Onishi, Y.; Serne, R.J.; Arnold, E.M.
This report describes the results of a detailed literature review of radionuclide transport models applicable to rivers, estuaries, coastal waters, the Great Lakes, and impoundments. Some representative sediment transport and water quality models were also reviewed to evaluate if they can be readily adapted to radionuclide transport modeling. The review showed that most available transport models were developed for dissolved radionuclides in rivers. These models include the mechanisms of advection, dispersion, and radionuclide decay. Since the models do not include sediment and radionuclide interactions, they are best suited for simulating short-term radionuclide migration where: (1) radionuclides have small distribution coefficients; (2) sediment concentrations in receiving water bodies are very low. Only 5 of the reviewed models include full sediment and radionuclide interactions: CHMSED developed by Fields; FETRA, SERATRA, and TODAM developed by Onishi et al.; and a model developed by Shull and Gloyna. The 5 models are applicable to cases where: (1) the distribution coefficient is large; (2) sediment concentrations are high; or (3) long-term migration and accumulation are under consideration. The report also discusses radionuclide absorption/desorption distribution ratios and addresses adsorption/desorption mechanisms and their controlling processes for 25 elements under surface water conditions. These elements are: Am, Sb, C, Ce, Cm, Co, Cr, Cs, Eu, I, Fe, Mn, Np, P, Pu, Pm, Ra, Ru, Sr, Tc, Th, ³H, U, Zn and Zr.
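For orientation, the dissolved-phase river models described above are typically built around a one-dimensional advection-dispersion-decay balance of the following generic form (a sketch with generic symbols, not the formulation of any particular reviewed code; sediment sorption/desorption terms are omitted):

```latex
% C(x,t): dissolved radionuclide concentration; u: flow velocity;
% D: dispersion coefficient; \lambda: radioactive decay constant.
\[
  \frac{\partial C}{\partial t} + u\,\frac{\partial C}{\partial x}
  = D\,\frac{\partial^{2} C}{\partial x^{2}} - \lambda C
\]
```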
The Global Climate Assessment Model (GCAM) is a global integrated assessment model used for exploring future scenarios and examining strategies that address air pollution, climate change, and energy goals. GCAM includes technology-rich representations of the energy, transportati...
Space-Time Fusion Under Error in Computer Model Output: An Application to Modeling Air Quality
In the last two decades a considerable amount of research effort has been devoted to modeling air quality with public health objectives. These objectives include regulatory activities such as setting standards along with assessing the relationship between exposure to air pollutan...
Systems Biology and Bioinformatics in Medical Applications
2009-10-01
animal models, including murine (21, 22, 25, 26, 31, 36) and guinea pig (4) pneumonia models, a rat thigh infection model (27), and a rabbit endocarditis model...
Mission Impossible? Social Work Practice with Black Urban Youth Gangs.
ERIC Educational Resources Information Center
Fox, Jerry R.
1985-01-01
Describes the adaptation of social work practice skills to serve black urban youth gangs. Presents a model for practice which respects youths' right to self-determination and community needs. Model stages discussed include contact, rapport, setting goals, assigning roles, procuring resources, and evaluation. Model applicability is suggested. (NRB)
Coastal waters are modeled for a variety of purposes including eutrophication remediation and fisheries management. Combining these two approaches provides insights which are not available from either approach independently. Coupling is confounded, however, by differences in mode...
Applications of the Wilkinson Model of Writing Maturity to College Writing.
ERIC Educational Resources Information Center
Sternglass, Marilyn
1982-01-01
Examines the four-category model developed by Andrew Wilkinson at the University of Essex (England) to assess growth in writing maturity. The four measures of development are stylistic, affective, cognitive, and moral. Each has several subcategories. Includes college student essays to illustrate the model. (HTH)
The Vroom and Yetton Normative Leadership Model Applied to Public School Case Examples.
ERIC Educational Resources Information Center
Sample, John
This paper seeks to familiarize school administrators with the Vroom and Yetton Normative Leadership model by presenting its essential components and providing original case studies for its application to school settings. The five decision-making methods of the Vroom and Yetton model, including two "autocratic," two…
Occupational Therapy and Video Modeling for Children with Autism
ERIC Educational Resources Information Center
Becker, Emily Ann; Watry-Christian, Meghan; Simmons, Amanda; Van Eperen, Ashleigh
2016-01-01
This review explores the evidence in support of using video modeling for teaching children with autism. The process of implementing video modeling, the use of various perspectives, and a wide range of target skills are addressed. Additionally, several helpful clinician resources including handheld device applications, books, and websites are…
An analytical solution for percutaneous drug absorption: application and removal of the vehicle.
Simon, L; Loney, N W
2005-10-01
The methods of Laplace transform were used to solve a mathematical model developed for percutaneous drug absorption. This model includes application and removal of the vehicle from the skin. A system of two linear partial differential equations was solved for the application period. The concentration of the medicinal agent in the skin at the end of the application period was used as the initial condition to determine the distribution of the drug in the skin following instantaneous removal of the vehicle. The influences of the diffusion and partition coefficients, clearance factor and vehicle layer thickness on the amount of drug in the vehicle and the skin were discussed.
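As a hedged sketch of the kind of Fickian formulation such vehicle-skin models typically use (generic symbols, not the exact boundary-value problem solved in the paper), the application-period system can be written as:

```latex
% C_v, C_s: drug concentrations in the vehicle and skin layers;
% D_v, D_s: diffusivities; K: vehicle-skin partition coefficient.
\[
  \frac{\partial C_v}{\partial t} = D_v\,\frac{\partial^{2} C_v}{\partial x^{2}},
  \qquad
  \frac{\partial C_s}{\partial t} = D_s\,\frac{\partial^{2} C_s}{\partial x^{2}},
  \qquad
  C_s = K\,C_v \ \text{at the vehicle--skin interface.}
\]
```

The removal phase is then handled by restarting the skin equation with the end-of-application concentration profile as its initial condition, which matches the structure the abstract describes.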
Wang, Monan; Zhang, Kai; Yang, Ning
2018-04-09
To help doctors choose a treatment on the basis of mechanical analysis, this work built a computer-assisted optimization system for treatment of femoral neck fracture oriented to clinical application. The system comprises three parts: a preprocessing module, a finite element mechanical analysis module, and a post-processing module. The preprocessing module covers parametric modeling of the bone, the fracture face, and the fixation screws and their positions, as well as input and transmission of model parameters. The finite element mechanical analysis module covers mesh generation, element type setting, material property setting, contact setting, constraint and load setting, analysis method setting, and batch processing. The post-processing module covers extraction and display of batch results, image generation from batch runs, execution of the optimization program, and display of the optimal result. The system carries out the whole workflow, from input of patient-specific fracture parameters to output of the optimal fixation plan according to the optimization rules, which demonstrates its effectiveness. The system also has a friendly interface and simple operation, and its functionality can be extended quickly by modifying individual modules.
An International Survey of Industrial Applications of Formal Methods. Volume 2. Case Studies
1993-09-30
impact of the product on IBM revenues. 4. Error rates were claimed to be below industrial average and errors were minimal to fix. Formal methods, as ... critical applications. These include: i) "Software failures, particularly under first use, seem ... project to add improved modelling capability. ... Design and Implementation: These products are being ...
NASA Astrophysics Data System (ADS)
Wu, Q. H.; Ma, J. T.
1993-09-01
A primary investigation into application of genetic algorithms in optimal reactive power dispatch and voltage control is presented. The application was achieved, based on (the United Kingdom) National Grid 48 bus network model, using a novel genetic search approach. Simulation results, compared with that obtained using nonlinear programming methods, are included to show the potential of applications of the genetic search methodology in power system economical and secure operations.
NASA Technical Reports Server (NTRS)
Liu, Donhang
2014-01-01
This presentation includes a summary of NEPP-funded deliverables for the Base-Metal Electrodes (BMEs) capacitor task, development of a general reliability model for BME capacitors, and a summary and future work.
Multi-source micro-friction identification for a class of cable-driven robots with passive backbone
NASA Astrophysics Data System (ADS)
Tjahjowidodo, Tegoeh; Zhu, Ke; Dailey, Wayne; Burdet, Etienne; Campolo, Domenico
2016-12-01
This paper analyses the dynamics of cable-driven robots with a passive backbone and develops techniques for their dynamic identification, which are tested on the H-Man, a planar cabled differential transmission robot for haptic interaction. The mechanism is optimized for human-robot interaction by accounting for the cost-benefit-ratio of the system, specifically by eliminating the necessity of an external force sensor to reduce the overall cost. As a consequence, this requires an effective dynamic model for accurate force feedback applications which include friction behavior in the system. We first consider the significance of friction in both the actuator and backbone spaces. Subsequently, we study the required complexity of the stiction model for the application. Different models representing different levels of complexity are investigated, ranging from the conventional approach of Coulomb to an advanced model which includes hysteresis. The results demonstrate each model's ability to capture the dynamic behavior of the system. In general, it is concluded that there is a trade-off between model accuracy and the model cost.
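As a rough illustration of the trade-off the abstract describes, the sketch below contrasts a Coulomb friction model with a Dahl-type model, a simple hysteresis-capable formulation used here only as a stand-in for the paper's advanced model; all parameter values are hypothetical.

```python
import numpy as np

def coulomb_friction(v, Fc=1.0):
    """Classical Coulomb model: friction opposes velocity with constant magnitude."""
    return Fc * np.sign(v)

def dahl_friction(x, Fc=1.0, sigma0=50.0, dt=1e-3):
    """Dahl model: integrates an internal state so that presliding displacement
    produces hysteretic friction; a simple stand-in for hysteresis-type models."""
    F = 0.0
    out = []
    for k in range(1, len(x)):
        v = (x[k] - x[k - 1]) / dt
        dF = sigma0 * (1.0 - np.sign(v) * F / Fc) * v   # Dahl state equation, dF/dt
        F += dF * dt
        out.append(F)
    return np.array(out)

# Sinusoidal displacement: Coulomb yields a square-wave force, Dahl a hysteresis loop.
t = np.arange(0, 2.0, 1e-3)
x = 1e-3 * np.sin(2 * np.pi * t)
F_coulomb = coulomb_friction(np.gradient(x, 1e-3))
F_dahl = dahl_friction(x)
```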
NASA Astrophysics Data System (ADS)
Tashiro, Tohru
2014-03-01
We propose a new model about diffusion of a product which includes a memory of how many adopters or advertisements a non-adopter met, where (non-)adopters mean people (not) possessing the product. This effect is lacking in the Bass model. As an application, we utilize the model to fit the iPod sales data, and so the better agreement is obtained than the Bass model.
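The abstract does not give the memory-augmented equations, but the baseline it extends is the classical Bass diffusion model, shown here for reference with the usual notation (N cumulative adopters, M market potential, p innovation and q imitation coefficients).

```latex
% Classical Bass diffusion model (the baseline the abstract extends):
\frac{dN}{dt} \;=\; \Bigl( p + q\,\frac{N(t)}{M} \Bigr)\bigl(M - N(t)\bigr)
```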
DOE Office of Scientific and Technical Information (OSTI.GOV)
Henager, Charles H.; Nguyen, Ba Nghiep; Kurtz, Richard J.
2016-03-31
Finite element continuum damage models (FE-CDM) have been developed to simulate and model dual-phase joints and cracked joints for improved analysis of SiC materials in nuclear environments. This report extends the analysis from the last reporting cycle by including results from dual-phase models and from cracked joint models.
Hierarchy of simulation models for a turbofan gas engine
NASA Technical Reports Server (NTRS)
Longenbaker, W. E.; Leake, R. J.
1977-01-01
Steady-state and transient performance of an F-100-like turbofan gas engine are modeled by a computer program, DYNGEN, developed by NASA. The model employs block data maps and includes about 25 states. Low-order nonlinear analytical and linear techniques are described in terms of their application to the model. Experimental comparisons illustrating the accuracy of each model are presented.
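One common way to obtain low-order linear models such as those mentioned is to perturb a nonlinear simulation about a trim point and extract the Jacobians numerically. The sketch below shows only that generic step, applied to a hypothetical two-state system; it is not DYNGEN's own procedure.

```python
import numpy as np

def linearize(f, x0, u0, eps=1e-6):
    """Finite-difference linearization of dx/dt = f(x, u) about a trim point (x0, u0),
    returning A = df/dx and B = df/du. A generic sketch; DYNGEN's internals differ."""
    n, m = len(x0), len(u0)
    f0 = f(x0, u0)
    A = np.zeros((n, n))
    B = np.zeros((n, m))
    for i in range(n):
        dx = np.zeros(n); dx[i] = eps
        A[:, i] = (f(x0 + dx, u0) - f0) / eps
    for j in range(m):
        du = np.zeros(m); du[j] = eps
        B[:, j] = (f(x0, u0 + du) - f0) / eps
    return A, B

# Toy 2-state "engine" dynamics (hypothetical), linearized about a trim point.
f = lambda x, u: np.array([-0.5 * x[0] + 0.1 * x[1] + u[0],
                           0.2 * x[0] - 1.0 * x[1] ** 2 + 0.5 * u[0]])
A, B = linearize(f, x0=np.array([1.0, 0.4]), u0=np.array([0.2]))
```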
Turbulent Flow past High Temperature Surfaces
NASA Astrophysics Data System (ADS)
Mehmedagic, Igbal; Thangam, Siva; Carlucci, Pasquale; Buckley, Liam; Carlucci, Donald
2014-11-01
Flow over high-temperature surfaces subject to wall heating is analyzed with applications to projectile design. In this study, computations are performed using an anisotropic Reynolds-stress model to study flow past surfaces that are subject to radiative flux. The model utilizes a phenomenological treatment of the energy spectrum and diffusivities of momentum and heat to include the effects of wall heat transfer and radiative exchange. The radiative transport is modeled using Eddington approximation including the weighted effect of nongrayness of the fluid. The time-averaged equations of motion and energy are solved using the modeled form of transport equations for the turbulence kinetic energy and the scalar form of turbulence dissipation with an efficient finite-volume algorithm. The model is applied for available test cases to validate its predictive capabilities for capturing the effects of wall heat transfer. Computational results are compared with experimental data available in the literature. Applications involving the design of projectiles are summarized. Funded in part by U.S. Army, ARDEC.
A Summary of the NASA Lightning Nitrogen Oxides Model (LNOM) and Recent Results
NASA Technical Reports Server (NTRS)
Koshak, William; Peterson, Harld
2011-01-01
The NASA Marshall Space Flight Center introduced the Lightning Nitrogen Oxides Model (LNOM) a couple of years ago to combine routine state-of-the-art measurements of lightning with empirical laboratory results of lightning NOx production. The routine measurements included VHF lightning source data [such as from the North Alabama Lightning Mapping Array (LMA)], and ground flash location, peak current, and stroke multiplicity data from the National Lightning Detection Network™ (NLDN). Following these initial runs of LNOM, the model was updated to include several non-return stroke lightning NOx production mechanisms, and provided the impact of lightning NOx on an August 2006 run of CMAQ. In this study, we review the evolution of the LNOM in greater detail and discuss the model's latest upgrades and applications. Whereas previous applications were limited to five summer months of data for North Alabama thunderstorms, the most recent LNOM analyses cover several years. The latest statistics of ground and cloud flash NOx production are provided.
Applications of active adaptive noise control to jet engines
NASA Technical Reports Server (NTRS)
Shoureshi, Rahmat; Brackney, Larry
1993-01-01
During phase 2 research on the application of active noise control to jet engines, the development of multiple-input/multiple-output (MIMO) active adaptive noise control algorithms and acoustic/controls models for turbofan engines were considered. Specific goals for this research phase included: (1) implementation of a MIMO adaptive minimum variance active noise controller; and (2) turbofan engine model development. A minimum variance control law for adaptive active noise control has been developed, simulated, and implemented for single-input/single-output (SISO) systems. Since acoustic systems tend to be distributed, multiple sensors, and actuators are more appropriate. As such, the SISO minimum variance controller was extended to the MIMO case. Simulation and experimental results are presented. A state-space model of a simplified gas turbine engine is developed using the bond graph technique. The model retains important system behavior, yet is of low enough order to be useful for controller design. Expansion of the model to include multiple stages and spools is also discussed.
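For readers unfamiliar with adaptive noise cancellation, the sketch below shows a single-channel LMS canceller, a widely used scheme that is different from (and simpler than) the adaptive minimum-variance MIMO controller developed in the paper; the signals and parameters are synthetic.

```python
import numpy as np

def lms_canceller(ref, primary, n_taps=16, mu=0.01):
    """Single-channel LMS adaptive noise canceller: adapts an FIR filter so that
    the filtered reference cancels the correlated noise in the primary signal.
    A generic sketch, not the minimum-variance MIMO law of the paper."""
    w = np.zeros(n_taps)
    err = np.zeros(len(primary))
    for n in range(n_taps, len(primary)):
        x = ref[n - n_taps:n][::-1]      # most recent reference samples
        y = w @ x                         # filter output (noise estimate)
        err[n] = primary[n] - y           # residual after cancellation
        w += 2 * mu * err[n] * x          # LMS weight update
    return err, w

# Demo: tone buried in noise that is correlated with a measurable reference signal.
rng = np.random.default_rng(0)
t = np.arange(4000) / 8000.0
noise = rng.standard_normal(len(t))
primary = np.sin(2 * np.pi * 200 * t) + np.convolve(noise, [0.6, 0.3, 0.1], "same")
residual, weights = lms_canceller(noise, primary)
```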
Cross-cultural re-entry for missionaries: a new application for the Dual Process Model.
Selby, Susan; Clark, Sheila; Braunack-Mayer, Annette; Jones, Alison; Moulding, Nicole; Beilby, Justin
Nearly half a million foreign aid workers currently work worldwide, including over 140,000 missionaries. During re-entry these workers may experience significant psychological distress. This article positions previous research about psychological distress during re-entry, emphasizing loss and grief. At present there is no identifiable theoretical framework to provide a basis for assessment, management, and prevention of re-entry distress in the clinical setting. The development of theoretical concepts and frameworks surrounding loss and grief including the Dual Process Model (DPM) are discussed. All the parameters of the DPM have been shown to be appropriate for the proposed re-entry model, the Dual Process Model applied to Re-entry (DPMR). It is proposed that the DPMR is an appropriate framework to address the processes and strategies of managing re-entry loss and grief. Possible future clinical applications and limitations of the proposed model are discussed. The DPMR is offered for further validation and use in clinical practice.
Aeroelastic Analysis for Rotorcraft in Flight or in a Wind Tunnel
NASA Technical Reports Server (NTRS)
Johnson, W.
1977-01-01
An analytical model is developed for the aeroelastic behavior of a rotorcraft in flight or in a wind tunnel. A unified development is presented for a wide class of rotors, helicopters, and operating conditions. The equations of motion for the rotor are derived using an integral Newtonian method, which gives considerable physical insight into the blade inertial and aerodynamic forces. The rotor model includes coupled flap-lag bending and blade torsion degrees of freedom, and is applicable to articulated, hingeless, gimballed, and teetering rotors with an arbitrary number of blades. The aerodynamic model is valid for both high and low inflow, and for axial and nonaxial flight. The rotor rotational speed dynamics, including engine inertia and damping, and the perturbation inflow dynamics are included. For a rotor on a wind-tunnel support, a normal mode representation of the test module, strut, and balance system is used. The aeroelastic analysis for the rotorcraft in flight is applicable to a general two-rotor aircraft, including single main-rotor and tandem helicopter configurations, and side-by-side or tilting proprotor aircraft configurations.
Modeling NIF experimental designs with adaptive mesh refinement and Lagrangian hydrodynamics
NASA Astrophysics Data System (ADS)
Koniges, A. E.; Anderson, R. W.; Wang, P.; Gunney, B. T. N.; Becker, R.; Eder, D. C.; MacGowan, B. J.; Schneider, M. B.
2006-06-01
Incorporation of adaptive mesh refinement (AMR) into Lagrangian hydrodynamics algorithms allows for the creation of a highly powerful simulation tool effective for complex target designs with three-dimensional structure. We are developing an advanced modeling tool that includes AMR and traditional arbitrary Lagrangian-Eulerian (ALE) techniques. Our goal is the accurate prediction of vaporization, disintegration and fragmentation in National Ignition Facility (NIF) experimental target elements. Although our focus is on minimizing the generation of shrapnel in target designs and protecting the optics, the general techniques are applicable to modern advanced targets that include three-dimensional effects such as those associated with capsule fill tubes. Several essential computations in ordinary radiation hydrodynamics need to be redesigned in order to allow for AMR to work well with ALE, including algorithms associated with radiation transport. Additionally, for our goal of predicting fragmentation, we include elastic/plastic flow into our computations. We discuss the integration of these effects into a new ALE-AMR simulation code. Applications of this newly developed modeling tool as well as traditional ALE simulations in two and three dimensions are applied to NIF early-light target designs.
Organism and population-level ecological models for ...
Ecological risk assessment typically focuses on animal populations as endpoints for regulatory ecotoxicology. Scientists at USEPA are developing models for animal populations exposed to a wide range of chemicals, from pesticides to emerging contaminants. Modeled taxa include aquatic and terrestrial invertebrates, fish, amphibians, and birds, and the models employ a wide range of methods, from matrix-based projection models to mechanistic bioenergetics models and spatially explicit population models.
Parametric modeling of wideband piezoelectric polymer sensors: Design for optoacoustic applications
NASA Astrophysics Data System (ADS)
Fernández Vidal, A.; Ciocci Brazzano, L.; Matteo, C. L.; Sorichetti, P. A.; González, M. G.
2017-09-01
In this work, we present a three-dimensional model for the design of wideband piezoelectric polymer sensors which includes the geometry and the properties of the transducer materials. The model uses FFT and numerical integration techniques in an explicit, semi-analytical approach. To validate the model, we made electrical and mechanical measurements on homemade sensors for optoacoustic applications. Each device was implemented using a polyvinylidene fluoride thin film piezoelectric polymer with a thickness of 25 μm. The sensors had detection areas in the range between 0.5 mm² and 35 mm² and were excited by acoustic pressure pulses of 5 ns (FWHM) from a source with a diameter around 10 μm. The experimental data obtained from the measurements agree well with the model results. We discuss the relative importance of the sensor design parameters for optoacoustic applications and we provide guidelines for the optimization of devices.
Mathur, Rohit; Xing, Jia; Gilliam, Robert; Sarwar, Golam; Hogrefe, Christian; Pleim, Jonathan; Pouliot, George; Roselle, Shawn; Spero, Tanya L.; Wong, David C.; Young, Jeffrey
2018-01-01
The Community Multiscale Air Quality (CMAQ) modeling system is extended to simulate ozone, particulate matter, and related precursor distributions throughout the Northern Hemisphere. Modelled processes were examined and enhanced to suitably represent the extended space and time scales for such applications. Hemispheric scale simulations with CMAQ and the Weather Research and Forecasting (WRF) model are performed for multiple years. Model capabilities for a range of applications including episodic long-range pollutant transport, long-term trends in air pollution across the Northern Hemisphere, and air pollution-climate interactions are evaluated through detailed comparison with available surface, aloft, and remotely sensed observations. The expansion of CMAQ to simulate the hemispheric scales provides a framework to examine interactions between atmospheric processes occurring at various spatial and temporal scales with physical, chemical, and dynamical consistency. PMID:29681922
NASA Technical Reports Server (NTRS)
Lawing, P. L.
1985-01-01
A method of constructing airfoils by inscribing pressure channels on the face of opposing plates, bonding them together to form one plate with integral channels, and contour machining this plate to form an airfoil model is described. The research and development program to develop the bonding technology is described as well as the construction and testing of an airfoil model. Sample aerodynamic data sets are presented and discussed. Also, work currently under way to produce thin airfoils with camber is presented. Samples of the aft section of a 6 percent airfoil with complete pressure instrumentation including the trailing edge are pictured and described. This technique is particularly useful in fabricating models for transonic cryogenic testing, but it should find application in a wide range of model construction projects, as well as the fabrication of fuel injectors, space hardware, and other applications requiring advanced bonding technology and intricate fluid passages.
Telerobotic system performance measurement - Motivation and methods
NASA Technical Reports Server (NTRS)
Kondraske, George V.; Khoury, George J.
1992-01-01
A systems performance-based strategy for modeling and conducting experiments relevant to the design and performance characterization of telerobotic systems is described. A developmental testbed consisting of a distributed telerobotics network and initial efforts to implement the strategy described is presented. Consideration is given to the general systems performance theory (GSPT) to tackle human performance problems as a basis for: measurement of overall telerobotic system (TRS) performance; task decomposition; development of a generic TRS model; and the characterization of performance of subsystems comprising the generic model. GSPT employs a resource construct to model performance and resource economic principles to govern the interface of systems to tasks. It provides a comprehensive modeling/measurement strategy applicable to complex systems including both human and artificial components. Application is presented within the framework of a distributed telerobotics network as a testbed. Insight into the design of test protocols which elicit application-independent data is described.
Information Technology: Making It All Fit. Track VI: Outstanding Applications.
ERIC Educational Resources Information Center
CAUSE, Boulder, CO.
Seven papers from the 1988 CAUSE conference's Track VI, Outstanding Applications, are presented. They include: "Designing DB2 Data Bases Using Entity-Relationship Modeling: A Case Study--The LSU System Worker's Compensation Project" (Cynthia M. Hadden and Sara G. Zimmerman); "Integrating Information Technology: Prerequisites for…
Application of wheat yield model to United States and India. [Great Plains
NASA Technical Reports Server (NTRS)
Feyerherm, A. M. (Principal Investigator)
1977-01-01
The author has identified the following significant results. The wheat yield model was applied to the major wheat-growing areas of the US and India. In the US Great Plains, estimates from the winter and spring wheat models agreed closely with USDA-SRS values in years with the lowest yields, but underestimated in years with the highest yields. Application to the Eastern Plains and Northwest indicated the importance of cultural factors, as well as meteorological ones in the model. It also demonstrated that the model could be used, in conjunction with USDA-SRS estimates, to estimate yield losses due to factors not included in the model, particularly diseases and freezes. A fixed crop calendar for India was built from a limited amount of available plot data from that country. Application of the yield model gave measurable evidence that yield variation from state to state was due to different mixes of levels of meteorological and cultural factors.
Towards more accurate and reliable predictions for nuclear applications
NASA Astrophysics Data System (ADS)
Goriely, Stephane; Hilaire, Stephane; Dubray, Noel; Lemaître, Jean-François
2017-09-01
The need for nuclear data far from the valley of stability, for applications such as nuclear astrophysics or future nuclear facilities, challenges the robustness as well as the predictive power of present nuclear models. Most of the nuclear data evaluation and prediction are still performed on the basis of phenomenological nuclear models. For the last decades, important progress has been achieved in fundamental nuclear physics, making it now feasible to use more reliable, but also more complex microscopic or semi-microscopic models in the evaluation and prediction of nuclear data for practical applications. Nowadays mean-field models can be tuned at the same level of accuracy as the phenomenological models, renormalized on experimental data if needed, and therefore can replace the phenomenological inputs in the evaluation of nuclear data. The latest achievements to determine nuclear masses within the non-relativistic HFB approach, including the related uncertainties in the model predictions, are discussed. Similarly, recent efforts to determine fission observables within the mean-field approach are described and compared with more traditional existing models.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kumazaki, Y; Miyaura, K; Hirai, R
2015-06-15
Purpose: To develop a High Dose Rate Brachytherapy (HDR-BT) quality assurance (QA) tool for verification of source position with Oncentra applicator modeling, and to report the results of radiation source positions with this tool. Methods: We developed a HDR-BT QA phantom and automated analysis software for verification of source position with Oncentra applicator modeling for the Fletcher applicator used in the MicroSelectron HDR system. This tool is intended for end-to-end tests that mimic the clinical 3D image-guided brachytherapy (3D-IGBT) workflow. The phantom is a 30x30x3 cm cuboid phantom with radiopaque markers, which are inserted into the phantom to evaluate applicator tips and reference source positions; positions are laterally shifted 10 mm from the applicator axis. The markers are lead-based and scatter radiation to expose the films. Gafchromic RTQA2 films are placed on the applicators. The phantom includes spaces to embed the applicators. The source position is determined as the distance between the exposed source position and center position of two pairs of the first radiopaque markers. We generated a 3D-IGBT plan with applicator modeling. The first source position was 6 mm from the applicator tips, and the second source position was 10 mm from the first source position. Results: All source positions were consistent with the exposed positions within 1 mm for all Fletcher applicators using in-house software. Moreover, the distance between source positions was in good agreement with the reference distance. Applicator offset, determined as the distance from the applicator tips at the first source position in the treatment planning system, was accurate. Conclusion: Source position accuracy of applicator modeling used in 3D-IGBT was acceptable. This phantom and software will be useful as a HDR-BT QA tool for verification of source position with Oncentra applicator modeling.
Using the Research and Development in Organisations Model to Improve Transition to High School
ERIC Educational Resources Information Center
Ashton, Rebecca
2009-01-01
This article describes the application of the Research and Development in Organisations (RADIO) model to five action research projects carried out in schools around transition processes. The RADIO model is mapped onto all five studies, and adapting the model in order to include greater stakeholder participation is suggested. Reflections are made…
2010-02-01
... associated with the proposed parametric model. Several important issues are discussed, including model order selection, training screening, and time ... parameters associated with the NS-AR model. In addition, we develop model order selection, training screening, and time-series based whitening and ...
Hyper-Book: A Formal Model for Electronic Books.
ERIC Educational Resources Information Center
Catenazzi, Nadia; Sommaruga, Lorenzo
1994-01-01
Presents a model for electronic books based on the paper book metaphor. Discussion includes how the book evolves under the effects of its functional components; the use and impact of the model for organizing and presenting electronic documents in the context of electronic publishing; and the possible applications of a system based on the model.…
The Polar Ionosphere and Interplanetary Field.
1987-08-01
... model for investigating time-dependent behavior of the polar F-region ionosphere in response to varying interplanetary magnetic field (IMF) conditions. The model has been used to illustrate ionospheric behavior during geomagnetic storm conditions. Future model applications may include ...
Agro-economic impact of cattle cloning.
Faber, D C; Ferre, L B; Metzger, J; Robl, J M; Kasinathan, P
2004-01-01
The purpose of this paper is to review the economic and social implications of cloned cattle, their products, and their offspring as related to production agriculture. Cloning technology in cattle has several applications outside of traditional production agriculture. These applications can include bio-medical applications, such as the production of pharmaceuticals in the blood or milk of transgenic cattle. Cloning may also be useful in the production of research models. These models may or may not include genetic modifications. Uses in agriculture include many applications of the technology. These include making genetic copies of elite seed stock and prize winning show cattle. Other purposes may range from "insurance" to making copies of cattle that have sentimental value, similar to cloning of pets. Increased selection opportunities available with cloning may provide for improvement in genetic gain. The ultimate goal of cloning has often been envisioned as a system for producing quantity and uniformity of the perfect dairy cow. However, only if heritability were 100%, would clone mates have complete uniformity. Changes in the environment may have significant impact on the productivity and longevity of the resulting clones. Changes in consumer preferences and economic input costs may all change the definition of the perfect cow. The cost of producing such animals via cloning must be economically feasible to meet the intended applications. Present inefficiencies limit cloning opportunities to highly valued animals. Improvements are necessary to move the applications toward commercial application. Cloning has additional obstacles to conquer. Social and regulatory acceptance of cloning is paramount to its utilization in production agriculture. Regulatory acceptance will need to address the animal, its products, and its offspring. In summary, cloning is another tool in the animal biotechnology toolbox, which includes artificial insemination, sexing of semen, embryo sexing and in vitro fertilization. While it will not replace any of the above mentioned, its degree of utilization will depend on both improvement in efficiency as well as social and regulatory acceptance.
NASA Astrophysics Data System (ADS)
Memmesheimer, M.; Friese, E.; Jakobs, H. J.; Feldmann, H.; Ebel, A.; Kerschgens, M. J.
During recent years the interest in long-term applications of air pollution modeling systems (AQMS) has strongly increased. Most of these models have been developed for application to photo-oxidant episodes during the last decade. In this contribution a long-term application of the EURAD modeling system to the year 1997 is presented. Atmospheric particles are included using the Modal Aerosol Dynamics Model for Europe (MADE). Meteorological fields are simulated by the mesoscale meteorological model MM5; gas-phase chemistry has been treated with the RACM mechanism. The nesting option is used to zoom in on areas of specific interest. Horizontal grid sizes are 125 km for the regional scale, and 5 km for the local scale covering the area of North-Rhine-Westfalia (NRW). The results have been compared to observations of the air quality network of the environmental agency of NRW for the year 1997. The model results have been evaluated using the data quality objectives of the EU directive 99/30. Further improvement for application of regional-scale air quality models is needed with respect to emission data bases, coupling to global models to improve the boundary values, interaction between aerosols and clouds, and multiphase modeling.
VoPham, Trang; Hart, Jaime E; Laden, Francine; Chiang, Yao-Yi
2018-04-17
Geospatial artificial intelligence (geoAI) is an emerging scientific discipline that combines innovations in spatial science, artificial intelligence methods in machine learning (e.g., deep learning), data mining, and high-performance computing to extract knowledge from spatial big data. In environmental epidemiology, exposure modeling is a commonly used approach to conduct exposure assessment to determine the distribution of exposures in study populations. geoAI technologies provide important advantages for exposure modeling in environmental epidemiology, including the ability to incorporate large amounts of big spatial and temporal data in a variety of formats; computational efficiency; flexibility in algorithms and workflows to accommodate relevant characteristics of spatial (environmental) processes including spatial nonstationarity; and scalability to model other environmental exposures across different geographic areas. The objectives of this commentary are to provide an overview of key concepts surrounding the evolving and interdisciplinary field of geoAI including spatial data science, machine learning, deep learning, and data mining; recent geoAI applications in research; and potential future directions for geoAI in environmental epidemiology.
Gawande, Nitin A; Reinhart, Debra R; Yeh, Gour-Tsyh
2010-02-01
Biodegradation process modeling of municipal solid waste (MSW) bioreactor landfills requires the knowledge of various process reactions and corresponding kinetic parameters. Mechanistic models available to date are able to simulate biodegradation processes with the help of pre-defined species and reactions. Some of these models consider the effect of critical parameters such as moisture content, pH, and temperature. Biomass concentration is a vital parameter for any biomass growth model and often not compared with field and laboratory results. A more complex biodegradation model includes a large number of chemical and microbiological species. Increasing the number of species and user defined process reactions in the simulation requires a robust numerical tool. A generalized microbiological and chemical model, BIOKEMOD-3P, was developed to simulate biodegradation processes in three-phases (Gawande et al. 2009). This paper presents the application of this model to simulate laboratory-scale MSW bioreactors under anaerobic conditions. BIOKEMOD-3P was able to closely simulate the experimental data. The results from this study may help in application of this model to full-scale landfill operation.
NASA Technical Reports Server (NTRS)
Jackson, C. E., Jr.
1977-01-01
A sample problem library containing 20 problems covering most facets of Nastran Thermal Analyzer modeling is presented. Areas discussed include radiative interchange, arbitrary nonlinear loads, transient temperature and steady-state structural plots, temperature-dependent conductivities, simulated multi-layer insulation, and constraint techniques. The use of the major control options and important DMAP alters is demonstrated.
Annotated bibliography of structural equation modelling: technical work.
Austin, J T; Wolfle, L M
1991-05-01
Researchers must be familiar with a variety of source literature to facilitate the informed use of structural equation modelling. Knowledge can be acquired through the study of an expanding literature found in a diverse set of publishing forums. We propose that structural equation modelling publications can be roughly classified into two groups: (a) technical and (b) substantive applications. Technical materials focus on the procedures rather than substantive conclusions derived from applications. The focus of this article is the former category; included are foundational/major contributions, minor contributions, critical and evaluative reviews, integrations, simulations and computer applications, precursor and historical material, and pedagogical textbooks. After a brief introduction, we annotate 294 articles in the technical category dating back to Sewall Wright (1921).
NASA Technical Reports Server (NTRS)
Andrews, J.
1977-01-01
An optimal decision model of crop production, trade, and storage was developed for use in estimating the economic consequences of improved forecasts and estimates of worldwide crop production. The model extends earlier distribution benefits models to include production effects as well. Application to improved information systems meeting the goals set in the large area crop inventory experiment (LACIE) indicates annual benefits to the United States of $200 to $250 million for wheat, $50 to $100 million for corn, and $6 to $11 million for soybeans, using conservative assumptions on expected LANDSAT system performance.
Application of Zebrafish Model to Environmental Toxicology.
Komoike, Yuta; Matsuoka, Masato
2016-01-01
Recently, a tropical freshwater fish, the zebrafish, has come into wide use as a model organism in various fields of life science worldwide. The zebrafish model has also been applied to environmental toxicology; however, in Japan, it has not yet become widely used. In this review, we introduce the biological and historical backgrounds of zebrafish as an animal model and their breeding. We then present the current status of toxicological experiments using zebrafish treated with some important environmental contaminants, including cadmium, organic mercury, 2,3,7,8-tetrachlorodibenzo-p-dioxin, and tributyltin. Finally, the possible future application of genetically modified zebrafish to the study of environmental toxicology is discussed.
Application of Mouse Models to Research in Hearing and Balance.
Ohlemiller, Kevin K; Jones, Sherri M; Johnson, Kenneth R
2016-12-01
Laboratory mice (Mus musculus) have become the major model species for inner ear research. The major uses of mice include gene discovery, characterization, and confirmation. Every application of mice is founded on assumptions about what mice represent and how the information gained may be generalized. A host of successes support the continued use of mice to understand hearing and balance. Depending on the research question, however, some mouse models and research designs will be more appropriate than others. Here, we recount some of the history and successes of the use of mice in hearing and vestibular studies and offer guidelines to those considering how to apply mouse models.
Applications of the hybrid coordinate method to the TOPS autopilot
NASA Technical Reports Server (NTRS)
Fleischer, G. E.
1978-01-01
Preliminary results are presented from the application of the hybrid coordinate method to modeling TOPS (thermoelectric outer planet spacecraft) structural dynamics. Computer-simulated responses of the vehicle are included which illustrate the interaction of relatively flexible appendages with an autopilot control system. Comparisons were made between simplified single-axis models of the control loop, with spacecraft flexibility represented by hinged rigid bodies, and a very detailed three-axis spacecraft model whose flexible portions are described by modal coordinates. While single-axis system root loci provided reasonable qualitative indications of stability margins in this case, they were quantitatively optimistic when matched against responses of the detailed model.
NASA Astrophysics Data System (ADS)
Gavrishchaka, V. V.; Ganguli, S. B.
2001-12-01
Reliable forecasting of rare events in a complex dynamical system is a challenging problem that is important for many practical applications. Due to the nature of rare events, the data set available for construction of a statistical and/or machine learning model is often very limited and incomplete. Therefore many widely used approaches, including such robust algorithms as neural networks, can easily become inadequate for rare event prediction. Moreover, in many practical cases models with high-dimensional inputs are required. This limits applications of existing rare event modeling techniques (e.g., extreme value theory) that focus on univariate cases and are not easily extended to multivariate cases. The support vector machine (SVM) is a machine learning system that can provide optimal generalization from very limited and incomplete training data sets and can efficiently handle high-dimensional data. These features may make SVMs suitable for modeling rare events in some applications. We have applied an SVM-based system to the problem of large-amplitude substorm prediction and extreme event forecasting in stock and currency exchange markets. Encouraging preliminary results will be presented and other possible applications of the system will be discussed.
NASA Astrophysics Data System (ADS)
Masson, V.; Le Moigne, P.; Martin, E.; Faroux, S.; Alias, A.; Alkama, R.; Belamari, S.; Barbu, A.; Boone, A.; Bouyssel, F.; Brousseau, P.; Brun, E.; Calvet, J.-C.; Carrer, D.; Decharme, B.; Delire, C.; Donier, S.; Essaouini, K.; Gibelin, A.-L.; Giordani, H.; Habets, F.; Jidane, M.; Kerdraon, G.; Kourzeneva, E.; Lafaysse, M.; Lafont, S.; Lebeaupin Brossier, C.; Lemonsu, A.; Mahfouf, J.-F.; Marguinaud, P.; Mokhtari, M.; Morin, S.; Pigeon, G.; Salgado, R.; Seity, Y.; Taillefer, F.; Tanguy, G.; Tulet, P.; Vincendon, B.; Vionnet, V.; Voldoire, A.
2013-07-01
SURFEX is a new externalized land and ocean surface platform that describes the surface fluxes and the evolution of four types of surfaces: nature, town, inland water and ocean. It is mostly based on pre-existing, well-validated scientific models that are continuously improved. The motivation for the building of SURFEX is to use strictly identical scientific models in a high range of applications in order to mutualise the research and development efforts. SURFEX can be run in offline mode (0-D or 2-D runs) or in coupled mode (from mesoscale models to numerical weather prediction and climate models). An assimilation mode is included for numerical weather prediction and monitoring. In addition to momentum, heat and water fluxes, SURFEX is able to simulate fluxes of carbon dioxide, chemical species, continental aerosols, sea salt and snow particles. The main principles of the organisation of the surface are described first. Then, a survey is made of the scientific module (including the coupling strategy). Finally, the main applications of the code are summarised. The validation work undertaken shows that replacing the pre-existing surface models by SURFEX in these applications is usually associated with improved skill, as the numerous scientific developments contained in this community code are used to good advantage.
Application of remote sensing to state and regional problems
NASA Technical Reports Server (NTRS)
Miller, W. F. (Principal Investigator); Tingle, J.; Wright, L. H.; Tebbs, B.
1984-01-01
Progress was made in the hydroclimatology, habitat modeling and inventory, computer analysis, wildlife management, and data comparison programs that utilize LANDSAT and SEASAT data provided to Mississippi researchers through the remote sensing applications program. Specific topics include water runoff in central Mississippi; habitat models for the endangered gopher tortoise, coyote, and turkey; Geographic Information Systems (GIS) development; forest inventory along the Mississippi River; and the merging of LANDSAT and SEASAT data for enhanced forest type discrimination.
1993-04-01
presentations. The topics included Cryocooler Testing and Modeling, Space and Long Life Applications, Stirling Cryocoolers, Pulse Tube Refrigerators, Novel Concepts and Component Development, Low Temperature Regenerator Development, and J-T and ... Equation (12), derived in the present study, can also be used to develop a linear network model of Stirling or pulse-tube cryocoolers by ...
Model-based metabolism design: constraints for kinetic and stoichiometric models
Stalidzans, Egils; Seiman, Andrus; Peebo, Karl; Komasilovs, Vitalijs; Pentjuss, Agris
2018-01-01
The implementation of model-based designs in metabolic engineering and synthetic biology may fail. One of the reasons for this failure is that only a part of the real-world complexity is included in models. Still, some knowledge can be simplified and taken into account in the form of optimization constraints to improve the feasibility of model-based designs of metabolic pathways in organisms. Some constraints (mass balance, energy balance, and steady-state assumption) serve as a basis for many modelling approaches. There are others (total enzyme activity constraint and homeostatic constraint) proposed decades ago, but which are frequently ignored in design development. Several new approaches of cellular analysis have made possible the application of constraints like cell size, surface, and resource balance. Constraints for kinetic and stoichiometric models are grouped according to their applicability preconditions in (1) general constraints, (2) organism-level constraints, and (3) experiment-level constraints. General constraints are universal and are applicable for any system. Organism-level constraints are applicable for biological systems and usually are organism-specific, but these constraints can be applied without information about experimental conditions. To apply experimental-level constraints, peculiarities of the organism and the experimental set-up have to be taken into account to calculate the values of constraints. The limitations of applicability of particular constraints for kinetic and stoichiometric models are addressed. PMID:29472367
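In the notation usually used for stoichiometric models, the basis constraints named above (mass balance at steady state, flux bounds) and a resource-type constraint can be written as follows; the third line is schematic, and the symbols are assumptions rather than the paper's notation.

```latex
\begin{align}
  S\,v &= 0, && \text{mass balance at steady state} \\
  v_{\min} \le v &\le v_{\max}, && \text{flux (capacity/thermodynamic) bounds} \\
  \sum_i \frac{|v_i|}{k_{\mathrm{cat},i}}\,M_i &\le P_{\mathrm{tot}}, && \text{schematic total-enzyme/resource constraint}
\end{align}
```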
Description and availability of the SMARTS spectral model for photovoltaic applications
NASA Astrophysics Data System (ADS)
Myers, Daryl R.; Gueymard, Christian A.
2004-11-01
The limited spectral response range of photovoltaic (PV) devices requires that device performance be characterized with respect to widely varying terrestrial solar spectra. The FORTRAN code "Simple Model for Atmospheric Transmission of Sunshine" (SMARTS) was developed for various clear-sky solar renewable energy applications. The model is partly based on parameterizations of transmittance functions in the MODTRAN/LOWTRAN band model family of radiative transfer codes. SMARTS computes spectra with a resolution of 0.5 nanometers (nm) below 400 nm, 1.0 nm from 400 nm to 1700 nm, and 5 nm from 1700 nm to 4000 nm. Fewer than 20 input parameters are required to compute spectral irradiance distributions including spectral direct beam, total, and diffuse hemispherical radiation, and up to 30 other spectral parameters. A spreadsheet-based graphical user interface can be used to simplify the construction of input files for the model. The model is the basis for new terrestrial reference spectra developed by the American Society for Testing and Materials (ASTM) for photovoltaic and materials degradation applications. We describe the model accuracy, functionality, and the availability of source and executable code. Applications to PV rating and efficiency and the combined effects of spectral selectivity and varying atmospheric conditions are briefly discussed.
NASA Technical Reports Server (NTRS)
Watson, Michael D.; Kelley, Gary W.
2012-01-01
The Department of Defense (DoD) defined System Operational Effectiveness (SOE) model provides an exceptional framework for an affordable approach to the development and operation of space launch vehicles and their supporting infrastructure. The SOE model provides a focal point from which to direct and measure technical effectiveness and process efficiencies of space launch vehicles. The application of the SOE model to a space launch vehicle's development and operation effort leads to very specific approaches and measures that require consideration during the design phase. This paper provides a mapping of the SOE model to the development of space launch vehicles for human exploration by addressing the SOE model key points of measurement including System Performance, System Availability, Technical Effectiveness, Process Efficiency, System Effectiveness, Life Cycle Cost, and Affordable Operational Effectiveness. In addition, the application of the SOE model to the launch vehicle development process is defined providing the unique aspects of space launch vehicle production and operations in lieu of the traditional broader SOE context that examines large quantities of fielded systems. The tailoring and application of the SOE model to space launch vehicles provides some key insights into the operational design drivers, capability phasing, and operational support systems.
Russell, Solomon; Distefano, Joseph J
2006-07-01
W(3)MAMCAT is a new web-based and interactive system for building and quantifying the parameters or parameter ranges of n-compartment mammillary and catenary model structures, with input and output in the first compartment, from unstructured multiexponential (sum-of-n-exponentials) models. It handles unidentifiable as well as identifiable models and, as such, provides finite parameter interval solutions for unidentifiable models, whereas direct parameter search programs typically do not. It also tutorially develops the theory of model distinguishability for same order mammillary versus catenary models, as did its desktop application predecessor MAMCAT+. This includes expert system analysis for distinguishing mammillary from catenary structures, given input and output in similarly numbered compartments. W(3)MAMCAT provides for universal deployment via the internet and enhanced application error checking. It uses supported Microsoft technologies to form an extensible application framework for maintaining a stable and easily updatable application. Most important, anybody, anywhere, is welcome to access it using Internet Explorer 6.0 over the internet for their teaching or research needs. It is available on the Biocybernetics Laboratory website at UCLA: www.biocyb.cs.ucla.edu.
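The unstructured multiexponential model referred to is the usual sum-of-exponentials output form, written here with assumed notation; W(3)MAMCAT maps fits of this form onto n-compartment mammillary or catenary structures.

```latex
% Sum-of-n-exponentials output model (notation assumed):
y(t) \;=\; \sum_{i=1}^{n} A_i\, e^{-\lambda_i t}, \qquad A_i > 0,\ \lambda_i > 0 .
```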
Crisis on campus: Eating disorder intervention from a developmental-ecological perspective.
Taylor, Julia V; Gibson, Donna M
2016-01-01
The purpose of this article is to review a crisis intervention using the developmental-ecological protocol (Collins and Collins, 2005) with a college student presenting with symptomatology of an active eating disorder. Participants included University Wellness Center employees responding to the crisis. Methods include an informal review of the crisis intervention response and application of the ABCDE developmental-ecological crisis model. Results reported include insight into crisis intervention when university counseling and health center resources are not available. Recommendations based on the ABCDE developmental-ecological model are included for university faculty and staff.
NASA standard: Trend analysis techniques
NASA Technical Reports Server (NTRS)
1988-01-01
This Standard presents descriptive and analytical techniques for NASA trend analysis applications. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. Use of this Standard is not mandatory; however, it should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend Analysis is neither a precise term nor a circumscribed methodology, but rather connotes, generally, quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data is time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this Standard. The document presents the basic ideas needed for qualitative and quantitative assessment of trends, together with relevant examples. A list of references provides additional sources of information.
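A minimal sketch of the kind of fitting the Standard describes (linear, quadratic, and exponential trend models on time-series data) is given below; the data are synthetic, and the comparison by residual sum of squares is illustrative rather than a prescription from the Standard.

```python
import numpy as np
from scipy.optimize import curve_fit

# Synthetic time series (e.g., a monthly metric) used only for illustration.
t = np.arange(24, dtype=float)
y = 5.0 * np.exp(0.08 * t) + np.random.default_rng(1).normal(0, 0.5, t.size)

lin = np.polyfit(t, y, 1)                          # linear trend
quad = np.polyfit(t, y, 2)                         # quadratic trend
exp_model = lambda t, a, b: a * np.exp(b * t)      # exponential trend
(a, b), _ = curve_fit(exp_model, t, y, p0=(1.0, 0.05))

rss = lambda pred: float(np.sum((y - pred) ** 2))  # residual sum of squares
fits = {"linear": rss(np.polyval(lin, t)),
        "quadratic": rss(np.polyval(quad, t)),
        "exponential": rss(exp_model(t, a, b))}
```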
Bayard, David S.; Neely, Michael
2016-01-01
An experimental design approach is presented for individualized therapy in the special case where the prior information is specified by a nonparametric (NP) population model. Here, a nonparametric model refers to a discrete probability model characterized by a finite set of support points and their associated weights. An important question arises as to how to best design experiments for this type of model. Many experimental design methods are based on Fisher Information or other approaches originally developed for parametric models. While such approaches have been used with some success across various applications, it is interesting to note that they largely fail to address the fundamentally discrete nature of the nonparametric model. Specifically, the problem of identifying an individual from a nonparametric prior is more naturally treated as a problem of classification, i.e., to find a support point that best matches the patient’s behavior. This paper studies the discrete nature of the NP experiment design problem from a classification point of view. Several new insights are provided including the use of Bayes Risk as an information measure, and new alternative methods for experiment design. One particular method, denoted as MMopt (Multiple-Model Optimal), will be examined in detail and shown to require minimal computation while having distinct advantages compared to existing approaches. Several simulated examples, including a case study involving oral voriconazole in children, are given to demonstrate the usefulness of MMopt in pharmacokinetics applications. PMID:27909942
Bayard, David S; Neely, Michael
2017-04-01
An experimental design approach is presented for individualized therapy in the special case where the prior information is specified by a nonparametric (NP) population model. Here, a NP model refers to a discrete probability model characterized by a finite set of support points and their associated weights. An important question arises as to how to best design experiments for this type of model. Many experimental design methods are based on Fisher information or other approaches originally developed for parametric models. While such approaches have been used with some success across various applications, it is interesting to note that they largely fail to address the fundamentally discrete nature of the NP model. Specifically, the problem of identifying an individual from a NP prior is more naturally treated as a problem of classification, i.e., to find a support point that best matches the patient's behavior. This paper studies the discrete nature of the NP experiment design problem from a classification point of view. Several new insights are provided including the use of Bayes Risk as an information measure, and new alternative methods for experiment design. One particular method, denoted as MMopt (multiple-model optimal), will be examined in detail and shown to require minimal computation while having distinct advantages compared to existing approaches. Several simulated examples, including a case study involving oral voriconazole in children, are given to demonstrate the usefulness of MMopt in pharmacokinetics applications.
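To make the classification view concrete, the sketch below estimates the misclassification (Bayes) risk of a candidate sampling design under a discrete prior by brute-force Monte Carlo; the one-compartment model, support points, weights, and noise level are all hypothetical, and the published MMopt method is reported to require far less computation than this brute-force estimate.

```python
import numpy as np

rng = np.random.default_rng(0)

def pk_model(theta, times, dose=100.0):
    """Hypothetical one-compartment model: C(t) = dose/V * exp(-(CL/V) * t)."""
    CL, V = theta
    return dose / V * np.exp(-(CL / V) * times)

def bayes_risk(times, support, weights, sigma=0.5, n_mc=2000):
    """Monte Carlo estimate of the probability of misclassifying the true support
    point under a sampling design `times`; a sketch of the idea, not MMopt itself."""
    preds = np.array([pk_model(th, times) for th in support])       # (J, n_times)
    risk = 0.0
    for j, (th, w) in enumerate(zip(support, weights)):
        y = preds[j] + sigma * rng.standard_normal((n_mc, len(times)))
        # Gaussian log-likelihood of every support point for each simulated dataset
        ll = -0.5 * ((y[:, None, :] - preds[None, :, :]) ** 2).sum(axis=2) / sigma**2
        risk += w * np.mean(ll.argmax(axis=1) != j)                  # misclassification
    return risk

# Hypothetical 3-point prior on (CL, V) and two candidate 2-sample designs (hours).
support = [(2.0, 20.0), (4.0, 20.0), (2.0, 40.0)]
weights = [0.4, 0.3, 0.3]
for design in ([1.0, 8.0], [0.5, 2.0]):
    print(design, bayes_risk(np.array(design), support, weights))
```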
NASA Astrophysics Data System (ADS)
Sawicka, K.; Breuer, L.; Houska, T.; Santabarbara Ruiz, I.; Heuvelink, G. B. M.
2016-12-01
Computer models have become a crucial tool in engineering and environmental sciences for simulating the behaviour of complex static and dynamic systems. However, while many models are deterministic, the uncertainty in their predictions needs to be estimated before they are used for decision support. Advances in uncertainty propagation analysis and assessment have been paralleled by a growing number of software tools for uncertainty analysis, but none has gained recognition for a universal applicability, including case studies with spatial models and spatial model inputs. Due to the growing popularity and applicability of the open source R programming language we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. In particular, the `spup' package provides functions for examining the uncertainty propagation starting from input data and model parameters, via the environmental model onto model predictions. The functions include uncertainty model specification, stochastic simulation and propagation of uncertainty using Monte Carlo techniques, as well as several uncertainty visualization functions. Here we will demonstrate that the 'spup' package is an effective and easy-to-use tool to be applied even in a very complex study case, and that it can be used in multi-disciplinary research and model-based decision support. As an example, we use the ecological LandscapeDNDC model to analyse propagation of uncertainties associated with spatial variability of the model driving forces such as rainfall, nitrogen deposition and fertilizer inputs. The uncertainty propagation is analysed for the prediction of emissions of N2O and CO2 for a German low mountainous, agriculturally developed catchment. The study tests the effect of spatial correlations on spatially aggregated model outputs, and could serve as an advice for developing best management practices and model improvement strategies.
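spup itself is an R package; purely to illustrate the Monte Carlo propagation idea it implements, the Python sketch below pushes sampled, uncertain inputs through a toy emission model and summarises the output distribution. The model form, input distributions, and units are assumptions, not those of LandscapeDNDC or the spup API.

```python
import numpy as np

rng = np.random.default_rng(42)

def toy_emission_model(rainfall, n_deposition, fertilizer):
    """Hypothetical stand-in for an environmental model; the real LandscapeDNDC
    model is far more complex and spatially distributed."""
    return 0.8 * np.sqrt(np.clip(rainfall, 0.0, None)) + 0.05 * n_deposition + 0.02 * fertilizer

# Uncertain inputs described by probability distributions (assumed values).
n_sim = 5000
rainfall = rng.normal(700.0, 70.0, n_sim)   # mm/yr
n_dep = rng.normal(15.0, 3.0, n_sim)        # kg N/ha/yr
fert = rng.normal(120.0, 20.0, n_sim)       # kg N/ha/yr

# Monte Carlo propagation: run the model per sampled input set, then summarise.
outputs = toy_emission_model(rainfall, n_dep, fert)
summary = {"mean": outputs.mean(),
           "sd": outputs.std(ddof=1),
           "p05": np.percentile(outputs, 5),
           "p95": np.percentile(outputs, 95)}
```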
A Framework for Validating Traffic Simulation Models at the Vehicle Trajectory Level
DOT National Transportation Integrated Search
2017-03-01
Based on current practices, traffic simulation models are calibrated and validated using macroscopic measures such as 15-minute averages of traffic counts or average point-to-point travel times. For an emerging number of applications, including conne...
Xiao, Bo; Imel, Zac E; Georgiou, Panayiotis; Atkins, David C; Narayanan, Shrikanth S
2016-05-01
Empathy is an important psychological process that facilitates human communication and interaction. Enhancement of empathy has profound significance in a range of applications. In this paper, we review emerging directions of research on computational analysis of empathy expression and perception as well as empathic interactions, including their simulation. We summarize the work on empathic expression analysis by the targeted signal modalities (e.g., text, audio, and facial expressions). We categorize empathy simulation studies into theory-based emotion space modeling or application-driven user and context modeling. We summarize challenges in computational study of empathy including conceptual framing and understanding of empathy, data availability, appropriate use and validation of machine learning techniques, and behavior signal processing. Finally, we propose a unified view of empathy computation and offer a series of open problems for future research.
Application of uniform design to improve dental implant system.
Cheng, Yung-Chang; Lin, Deng-Huei; Jiang, Cho-Pei
2015-01-01
This paper introduces the application of uniform experimental design to improve dental implant systems subjected to dynamic loads. The dynamic micromotion of the Zimmer dental implant system is calculated and illustrated by explicit dynamic finite element analysis. Endogenous and exogenous factors influence the success rate of dental implant systems. Endogenous factors include bone density, cortical bone thickness and osseointegration. Exogenous factors include thread pitch, thread depth, diameter of implant neck and body size. A dental implant system with a crest module was selected to simulate micromotion distribution and stress behavior under dynamic loads using conventional and proposed methods. Finally, the design that caused the minimum micromotion was chosen as the optimal design model. The micromotion of the improved model is 36.42 μm, an improvement of 15.34% compared to the original model.
Verification of a one-dimensional, unsteady-flow model for the Fox River in Illinois
Ishii, Audrey L.; Turner, Mary J.
1996-01-01
The previously-calibrated application of the Full EQuations (FEQ) model of one-dimensional, unsteady flow to a 30.7-mile reach of the Fox River in northeastern Illinois was verified with discharge, stage, and dye-transport data collected during a 12-day period in October-November 1990. The period included unsteady flow induced by the operation of a sluice gate dam located at the upstream end of the reach. The model flow field was input to the Branched Lagrangian Transport Model (BLTM) for the simulation of dye transport. The results of the FEQ and BLTM model simulations are compared with the measured data and sensitivity analyses of the model parameters for this application are presented.
Probabilistic wind/tornado/missile analyses for hazard and fragility evaluations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Park, Y.J.; Reich, M.
Detailed analysis procedures and examples are presented for the probabilistic evaluation of hazard and fragility against high wind, tornado, and tornado-generated missiles. In the tornado hazard analysis, existing risk models are modified to incorporate various uncertainties including modeling errors. A significant feature of this paper is the detailed description of the Monte-Carlo simulation analyses of tornado-generated missiles. A simulation procedure, which includes the wind field modeling, missile injection, solution of flight equations, and missile impact analysis, is described with application examples.
Modelling of capillary-driven flow for closed paper-based microfluidic channels
NASA Astrophysics Data System (ADS)
Songok, Joel; Toivakka, Martti
2017-06-01
Paper-based microfluidics is an emerging field focused on creating inexpensive devices, with simple fabrication methods for applications in various fields including healthcare, environmental monitoring and veterinary medicine. Understanding the flow of liquid is important in achieving consistent operation of the devices. This paper proposes capillary models to predict flow in paper-based microfluidic channels, which include a flow accelerating hydrophobic top cover. The models, which consider both non-absorbing and absorbing substrates, are in good agreement with the experimental results.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Keyes, D.; McInnes, L. C.; Woodward, C.
This report is an outcome of the workshop Multiphysics Simulations: Challenges and Opportunities, sponsored by the Institute of Computing in Science (ICiS). Additional information about the workshop, including relevant reading and presentations on multiphysics issues in applications, algorithms, and software, is available via https://sites.google.com/site/icismultiphysics2011/. We consider multiphysics applications from algorithmic and architectural perspectives, where 'algorithmic' includes both mathematical analysis and computational complexity and 'architectural' includes both software and hardware environments. Many diverse multiphysics applications can be reduced, en route to their computational simulation, to a common algebraic coupling paradigm. Mathematical analysis of multiphysics coupling in this form is not always practical for realistic applications, but model problems representative of applications discussed herein can provide insight. A variety of software frameworks for multiphysics applications have been constructed and refined within disciplinary communities and executed on leading-edge computer systems. We examine several of these, expose some commonalities among them, and attempt to extrapolate best practices to future systems. From our study, we summarize challenges and forecast opportunities. We also initiate a modest suite of test problems encompassing features present in many applications.
Methods for compressible multiphase flows and their applications
NASA Astrophysics Data System (ADS)
Kim, H.; Choe, Y.; Kim, H.; Min, D.; Kim, C.
2018-06-01
This paper presents an efficient and robust numerical framework to deal with multiphase real-fluid flows and their broad spectrum of engineering applications. A homogeneous mixture model incorporated with a real-fluid equation of state and a phase change model is considered to calculate complex multiphase problems. As robust and accurate numerical methods to handle multiphase shocks and phase interfaces over a wide range of flow speeds, the AUSMPW+_N and RoeM_N schemes with a system preconditioning method are presented. These methods are assessed by extensive validation problems with various types of equation of state and phase change models. Representative realistic multiphase phenomena, including the flow inside a thermal vapor compressor, pressurization in a cryogenic tank, and unsteady cavitating flow around a wedge, are then investigated as application problems. With appropriate physical modeling followed by robust and accurate numerical treatments, compressible multiphase flow physics such as phase changes, shock discontinuities, and their interactions are well captured, confirming the suitability of the proposed numerical framework to wide engineering applications.
Neural network applications in telecommunications
NASA Technical Reports Server (NTRS)
Alspector, Joshua
1994-01-01
Neural network capabilities include automatic and organized handling of complex information, quick adaptation to continuously changing environments, nonlinear modeling, and parallel implementation. This viewgraph presentation presents Bellcore work on applications, learning chip computational function, learning system block diagram, neural network equalization, broadband access control, calling-card fraud detection, software reliability prediction, and conclusions.
Consolidated Canadian Results to the HEU Round Robin Exercise
2004-11-01
Niemeyer S, Dudder GB. "Model action plan for nuclear forensics and nuclear attribution." Lawrence Livermore National Laboratory Report UCRL-TR... Defence R&D Canada - Ottawa, 3701 Carling Avenue, Ottawa, ON K1A 0Z4.
Supporting Collaborative Model and Data Service Development and Deployment with DevOps
NASA Astrophysics Data System (ADS)
David, O.
2016-12-01
Adopting DevOps practices for model service development and deployment enables a community to engage in service-oriented modeling and data management. The Cloud Services Integration Platform (CSIP), developed over the last 5 years at Colorado State University, provides for collaborative integration of environmental models into scalable model and data services as a micro-services platform with an API and deployment infrastructure. Originally developed to support USDA natural resource applications, it proved suitable for a wider range of applications in the environmental modeling domain. As its scope and visibility grew, it became apparent that community integration and adequate workflow support through the full model development and application cycle drove successful outcomes. DevOps provides best practices, tools, and organizational structures to optimize the transition from model service development to deployment by minimizing the (i) operational burden and (ii) turnaround time for modelers. We have developed and implemented a methodology to fully automate a suite of applications for application lifecycle management, version control, continuous integration, container management, and container scaling to enable model and data service developers in various institutions to collaboratively build, run, deploy, test, and scale services within minutes. To date, more than 160 model and data services are available for applications in hydrology (PRMS, Hydrotools, CFA, ESP), water and wind erosion prediction (WEPP, WEPS, RUSLE2), soil quality trends (SCI, STIR), water quality analysis (SWAT-CP, WQM, CFA, AgES-W), stream degradation assessment (SWAT-DEG), hydraulics (cross-section), and grazing management (GRAS). In addition, supporting data services include soil (SSURGO), ecological site (ESIS), climate (CLIGEN, WINDGEN), land management and crop rotations (LMOD), and pesticides (WQM), developed using this workflow automation and decentralized governance.
NASA Astrophysics Data System (ADS)
Galbraith, Eric D.; Dunne, John P.; Gnanadesikan, Anand; Slater, Richard D.; Sarmiento, Jorge L.; Dufour, Carolina O.; de Souza, Gregory F.; Bianchi, Daniele; Claret, Mariona; Rodgers, Keith B.; Marvasti, Seyedehsafoura Sedigh
2015-12-01
Earth System Models increasingly include ocean biogeochemistry models in order to predict changes in ocean carbon storage, hypoxia, and biological productivity under climate change. However, state-of-the-art ocean biogeochemical models include many advected tracers that significantly increase the computational resources required, forcing a trade-off with spatial resolution. Here, we compare a state-of-the-art model with 30 prognostic tracers (TOPAZ) with two reduced-tracer models, one with 6 tracers (BLING), and the other with 3 tracers (miniBLING). The reduced-tracer models employ parameterized, implicit biological functions, which nonetheless capture many of the most important processes resolved by TOPAZ. All three are embedded in the same coupled climate model. Despite the large difference in tracer number, the absence of tracers for living organic matter is shown to have a minimal impact on the transport of nutrient elements, and the three models produce similar mean annual preindustrial distributions of macronutrients, oxygen, and carbon. Significant differences do exist among the models, in particular in the seasonal cycle of biomass and export production, but it does not appear that these are necessary consequences of the reduced tracer number. With increasing CO2, changes in dissolved oxygen and anthropogenic carbon uptake are very similar across the different models. Thus, while the reduced-tracer models do not explicitly resolve the diversity and internal dynamics of marine ecosystems, we demonstrate that such models are applicable to a broad suite of major biogeochemical concerns, including anthropogenic change. These results are very promising for the further development and application of reduced-tracer biogeochemical models that incorporate "sub-ecosystem-scale" parameterizations.
Psikuta, Agnes; Koelblen, Barbara; Mert, Emel; Fontana, Piero; Annaheim, Simon
2017-12-07
Following the growing interest in the further development of manikins to simulate human thermal behaviour more adequately, thermo-physiological human simulators have been developed by coupling a thermal sweating manikin with a thermo-physiology model. Despite their availability and obvious advantages, the number of studies involving these devices is only marginal, which plausibly results from the high complexity of the development and evaluation process and the need for multi-disciplinary expertise. The aim of this paper is to present an integrated approach to develop, validate and operate such devices, including the technical challenges and limitations of thermo-physiological human simulators, their application and measurement protocol, a strategy for setting test scenarios, and the comparison to standard methods and human studies, including details which have not been published so far. A physical manikin controlled by a human thermoregulation model overcame the limitations of mathematical clothing models and provided a complementary method to investigate thermal interactions between the human body, protective clothing, and its environment. The opportunities of these devices include not only realistic assessment of protective clothing assemblies and equipment but also potential application in many research fields ranging from biometeorology, the automotive industry, environmental engineering, and urban climate to clinical and safety applications.
PSIKUTA, Agnes; KOELBLEN, Barbara; MERT, Emel; FONTANA, Piero; ANNAHEIM, Simon
2017-01-01
Following the growing interest in the further development of manikins to simulate human thermal behaviour more adequately, thermo-physiological human simulators have been developed by coupling a thermal sweating manikin with a thermo-physiology model. Despite their availability and obvious advantages, the number of studies involving these devices is only marginal, which plausibly results from the high complexity of the development and evaluation process and the need for multi-disciplinary expertise. The aim of this paper is to present an integrated approach to develop, validate and operate such devices, including the technical challenges and limitations of thermo-physiological human simulators, their application and measurement protocol, a strategy for setting test scenarios, and the comparison to standard methods and human studies, including details which have not been published so far. A physical manikin controlled by a human thermoregulation model overcame the limitations of mathematical clothing models and provided a complementary method to investigate thermal interactions between the human body, protective clothing, and its environment. The opportunities of these devices include not only realistic assessment of protective clothing assemblies and equipment but also potential application in many research fields ranging from biometeorology, the automotive industry, environmental engineering, and urban climate to clinical and safety applications. PMID:28966294
Multiphysics Simulations: Challenges and Opportunities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Keyes, David; McInnes, Lois C.; Woodward, Carol
2013-02-12
We consider multiphysics applications from algorithmic and architectural perspectives, where "algorithmic" includes both mathematical analysis and computational complexity, and "architectural" includes both software and hardware environments. Many diverse multiphysics applications can be reduced, en route to their computational simulation, to a common algebraic coupling paradigm. Mathematical analysis of multiphysics coupling in this form is not always practical for realistic applications, but model problems representative of applications discussed herein can provide insight. A variety of software frameworks for multiphysics applications have been constructed and refined within disciplinary communities and executed on leading-edge computer systems. We examine several of these, expose some commonalities among them, and attempt to extrapolate best practices to future systems. From our study, we summarize challenges and forecast opportunities.
Power Analysis for Complex Mediational Designs Using Monte Carlo Methods
ERIC Educational Resources Information Center
Thoemmes, Felix; MacKinnon, David P.; Reiser, Mark R.
2010-01-01
Applied researchers often include mediation effects in applications of advanced methods such as latent variable models and linear growth curve models. Guidance on how to estimate statistical power to detect mediation for these models is not yet available in the literature. We describe a general framework for power analyses for complex…
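The record is truncated, but the core idea, estimating power for an indirect (mediation) effect by repeatedly simulating data from an assumed model and testing the effect in each replicate, can be sketched as below. The effect sizes, sample size, and the normal-theory (Sobel-type) test are illustrative assumptions, not the authors' framework.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

def mediation_power(n=100, a=0.3, b=0.3, c_prime=0.1, n_rep=2000, alpha=0.05):
    """Monte Carlo power for the indirect effect a*b in a simple mediation model
    X -> M -> Y (direct path c'); residuals are standard normal."""
    z_crit = norm.ppf(1 - alpha / 2)
    hits = 0
    for _ in range(n_rep):
        x = rng.normal(size=n)
        m = a * x + rng.normal(size=n)
        y = b * m + c_prime * x + rng.normal(size=n)

        # OLS for M ~ X: slope estimate and its standard error
        a_hat = np.cov(x, m, ddof=1)[0, 1] / np.var(x, ddof=1)
        resid_m = m - (m.mean() - a_hat * x.mean()) - a_hat * x
        se_a = np.sqrt(resid_m @ resid_m / (n - 2) / np.sum((x - x.mean()) ** 2))

        # OLS for Y ~ 1 + M + X: slope for M and its standard error
        X = np.column_stack([np.ones(n), m, x])
        beta = np.linalg.lstsq(X, y, rcond=None)[0]
        resid_y = y - X @ beta
        cov_beta = (resid_y @ resid_y / (n - 3)) * np.linalg.inv(X.T @ X)
        b_hat, se_b = beta[1], np.sqrt(cov_beta[1, 1])

        # Sobel-type z test of the indirect effect a*b
        z = (a_hat * b_hat) / np.sqrt(a_hat**2 * se_b**2 + b_hat**2 * se_a**2)
        hits += abs(z) > z_crit
    return hits / n_rep

print(mediation_power(n=100))   # estimated power at these assumed effect sizes
```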
Mouse Xenograft Model for Mesothelioma | NCI Technology Transfer Center | TTC
The National Cancer Institute is seeking parties interested in collaborative research to co-develop, evaluate, or commercialize a new mouse model for monoclonal antibodies and immunoconjugates that target malignant mesotheliomas. Applications of the technology include models for screening compounds as potential therapeutics for mesothelioma and for studying the pathology of mesothelioma.
NASA Technical Reports Server (NTRS)
Arnold, Steven M. (Editor); Wong, Terry T. (Editor)
2011-01-01
Topics covered include: An Annotative Review of Multiscale Modeling and its Application to Scales Inherent in the Field of ICME; and A Multiscale, Nonlinear, Modeling Framework Enabling the Design and Analysis of Composite Materials and Structures.
Application of the MAGIC model to the Glacier Lakes catchments
John O. Reuss
1994-01-01
The MAGIC model (Cosby et al. 1985, 1986) was calibrated for East and West Glacier Lakes, two adjacent high-altitude (3200-3700 m) catchments in the Medicine Bow National Forest of southern Wyoming. This model uses catchment characteristics including weathering rates, soil chemical characteristics, hydrological parameters, and precipitation amounts and composition...
Alchemy and uncertainty: What good are models?
F.L. Bunnell
1989-01-01
Wildlife-habitat models are increasing in abundance, diversity, and use, but symptoms of failure are evident in their application, including misuse, disuse, failure to test, and litigation. Reasons for failure often relate to the different purposes managers and researchers have for using the models to predict and to aid understanding. This paper examines these two...
Families and Deinstitutionalization: An Application of Bronfenbrenner's Social Ecology Model.
ERIC Educational Resources Information Center
Berry, Judy O.
1995-01-01
Applied Bronfenbrenner's social ecology model to families that include a member with a developmental disability and who are making the transition from institution to community. Presents an overview of the model as well as a discussion of counselors' use of it in providing services to families in this situation. (RJM)
Urano, K; Tamaoki, N; Nomura, T
2012-01-01
Transgenic animal models have been used in small numbers in gene function studies in vivo for a period of time, but more recently, the use of a single transgenic animal model has been approved as a second species, 6-month alternative (to the routine 2-year, 2-animal model) used in short-term carcinogenicity studies for generating regulatory application data of new drugs. This article addresses many of the issues associated with the creation and use of one of these transgenic models, the rasH2 mouse, for regulatory science. The discussion includes strategies for mass producing mice with the same stable phenotype, including constructing the transgene, choosing a founder mouse, and controlling both the transgene and background genes; strategies for developing the model for regulatory science, including measurements of carcinogen susceptibility, stability of a large-scale production system, and monitoring for uniform carcinogenicity responses; and finally, efficient use of the transgenic animal model on study. Approximately 20% of mouse carcinogenicity studies for new drug applications in the United States currently use transgenic models, typically the rasH2 mouse. The rasH2 mouse could contribute to animal welfare by reducing the numbers of animals used as well as reducing the cost of carcinogenicity studies. A better understanding of the advantages and disadvantages of the transgenic rasH2 mouse will result in greater and more efficient use of this animal model in the future.
Decision Support Tool Evaluation Report for General NOAA Oil Modeling Environment(GNOME) Version 2.0
NASA Technical Reports Server (NTRS)
Spruce, Joseph P.; Hall, Callie; Zanoni, Vicki; Blonski, Slawomir; D'Sa, Eurico; Estep, Lee; Holland, Donald; Moore, Roxzana F.; Pagnutti, Mary; Terrie, Gregory
2004-01-01
NASA's Earth Science Applications Directorate evaluated the potential of NASA remote sensing data and modeling products to enhance the General NOAA Oil Modeling Environment (GNOME) decision support tool. NOAA's Office of Response and Restoration (OR&R) Hazardous Materials (HAZMAT) Response Division is interested in enhancing GNOME with near-realtime (NRT) NASA remote sensing products on oceanic winds and ocean circulation. The NASA SeaWinds sea surface wind and Jason-1 sea surface height NRT products have potential, as do sea surface temperature and reflectance products from the Moderate Resolution Imaging Spectroradiometer and sea surface reflectance products from Landsat and the Advanced Spaceborne Thermal Emission and Reflectance Radiometer. HAZMAT is also interested in the Advanced Circulation model and the Ocean General Circulation Model. Certain issues must be considered, including lack of data continuity, marginal data redundancy, and data formatting problems. Spatial resolution is an issue for near-shore GNOME applications. Additional work will be needed to incorporate NASA inputs into GNOME, including verification and validation of data products, algorithms, models, and NRT data.
Jet Noise Modeling for Supersonic Business Jet Application
NASA Technical Reports Server (NTRS)
Stone, James R.; Krejsa, Eugene A.; Clark, Bruce J.
2004-01-01
This document describes the development of an improved predictive model for coannular jet noise, including noise suppression modifications applicable to small supersonic-cruise aircraft such as the Supersonic Business Jet (SBJ), for NASA Langley Research Center (LaRC). For such aircraft a wide range of propulsion and integration options are under consideration. Thus there is a need for very versatile design tools, including a noise prediction model. The approach used is similar to that used with great success by the Modern Technologies Corporation (MTC) in developing a noise prediction model for two-dimensional mixer ejector (2DME) nozzles under the High Speed Research Program and in developing a more recent model for coannular nozzles over a wide range of conditions. If highly suppressed configurations are ultimately required, the 2DME model is expected to provide reasonable prediction for these smaller scales, although this has not been demonstrated. It is considered likely that more modest suppression approaches, such as dual stream nozzles featuring chevron or chute suppressors, perhaps in conjunction with inverted velocity profiles (IVP), will be sufficient for the SBJ.
Exploring a model-driven architecture (MDA) approach to health care information systems development.
Raghupathi, Wullianallur; Umar, Amjad
2008-05-01
The objective was to explore the potential of the model-driven architecture (MDA) in health care information systems development. An MDA is conceptualized and developed for a health clinic system to track patient information. A prototype of the MDA is implemented using an advanced MDA tool. The UML provides the underlying modeling support in the form of the class diagram. The PIM to PSM transformation rules are applied to generate the prototype application from the model. The result of the research is a complete MDA methodology for developing health care information systems. Additional insights gained include the development of transformation rules and documentation of the challenges in the application of MDA to health care. Design guidelines for future MDA applications are described. The model has the potential for generalizability. The overall approach supports limited interoperability and portability. The research demonstrates the applicability of the MDA approach to health care information systems development. When properly implemented, it has the potential to overcome the challenges of platform (vendor) dependency, lack of open standards, interoperability, portability, scalability, and the high cost of implementation.
Developing Information Power Grid Based Algorithms and Software
NASA Technical Reports Server (NTRS)
Dongarra, Jack
1998-01-01
This exploratory study initiated our effort to understand performance modeling on parallel systems. The basic goal of performance modeling is to understand and predict the performance of a computer program or set of programs on a computer system. Performance modeling has numerous applications, including evaluation of algorithms, optimization of code implementations, parallel library development, comparison of system architectures, parallel system design, and procurement of new systems. Our work lays the basis for the construction of parallel libraries that allow for the reconstruction of application codes on several distinct architectures so as to assure performance portability. Following our strategy, once the requirements of applications are well understood, one can then construct a library in a layered fashion. The top level of this library will consist of architecture-independent geometric, numerical, and symbolic algorithms that are needed by the sample of applications. These routines should be written in a language that is portable across the targeted architectures.
Nanopyroxene Grafting with β-Cyclodextrin Monomer for Wastewater Applications.
Nafie, Ghada; Vitale, Gerardo; Carbognani Ortega, Lante; Nassar, Nashaat N
2017-12-06
Emerging nanoparticle technology provides opportunities for environmentally friendly wastewater treatment applications, including those in the large liquid tailings containments in the Alberta oil sands. In this study, we synthesize β-cyclodextrin grafted nanopyroxenes to offer an ecofriendly platform for the selective removal of organic compounds typically present in these types of applications. We carry out computational modeling at the micro level, through molecular mechanics and molecular dynamics simulations, and laboratory experiments at the macro level to understand the interactions between the synthesized nanomaterials and two model naphthenic acid molecules (cyclopentanecarboxylic and trans-4-pentylcyclohexanecarboxylic acids) typically found in tailing ponds. The proof-of-concept computational modeling and experiments demonstrate that the monomer-grafted nanopyroxenes (nano-AE) of the sodium iron-silicate aegirine are promising candidates for the removal of polar organic compounds from wastewater, among other applications. These nano-AE offer new possibilities for treating tailing ponds generated by the oil sands industry.
Termites as targets and models for biotechnology.
Scharf, Michael E
2015-01-07
Termites have many unique evolutionary adaptations associated with their eusocial lifestyles. Recent omics research has created a wealth of new information in numerous areas of termite biology (e.g., caste polyphenism, lignocellulose digestion, and microbial symbiosis) with wide-ranging applications in diverse biotechnological niches. Termite biotechnology falls into two categories: (a) termite-targeted biotechnology for pest management purposes, and (b) termite-modeled biotechnology for use in various industrial applications. The first category includes several candidate termiticidal modes of action such as RNA interference, digestive inhibition, pathogen enhancement, antimicrobials, endocrine disruption, and primer pheromone mimicry. In the second category, termite digestomes are deep resources for host and symbiont lignocellulases and other enzymes with applications in a variety of biomass, industrial, and processing applications. Moving forward, one of the most important approaches for accelerating advances in both termite-targeted and termite-modeled biotechnology will be to consider host and symbiont together as a single functional unit.
MODTRAN6: a major upgrade of the MODTRAN radiative transfer code
NASA Astrophysics Data System (ADS)
Berk, Alexander; Conforti, Patrick; Kennett, Rosemary; Perkins, Timothy; Hawes, Frederick; van den Bosch, Jeannette
2014-06-01
The MODTRAN6 radiative transfer (RT) code is a major advancement over earlier versions of the MODTRAN atmospheric transmittance and radiance model. This version of the code incorporates modern software architecture including an application programming interface, enhanced physics features including a line-by-line algorithm, a supplementary physics toolkit, and new documentation. The application programming interface has been developed for ease of integration into user applications. The MODTRAN code has been restructured towards a modular, object-oriented architecture to simplify upgrades as well as facilitate integration with other developers' codes. MODTRAN now includes a line-by-line algorithm for high resolution RT calculations as well as coupling to optical scattering codes for easy implementation of custom aerosols and clouds.
SpectralNET – an application for spectral graph analysis and visualization
Forman, Joshua J; Clemons, Paul A; Schreiber, Stuart L; Haggarty, Stephen J
2005-01-01
Background Graph theory provides a computational framework for modeling a variety of datasets including those emerging from genomics, proteomics, and chemical genetics. Networks of genes, proteins, small molecules, or other objects of study can be represented as graphs of nodes (vertices) and interactions (edges) that can carry different weights. SpectralNET is a flexible application for analyzing and visualizing these biological and chemical networks. Results Available both as a standalone .NET executable and as an ASP.NET web application, SpectralNET was designed specifically with the analysis of graph-theoretic metrics in mind, a computational task not easily accessible using currently available applications. Users can choose either to upload a network for analysis using a variety of input formats, or to have SpectralNET generate an idealized random network for comparison to a real-world dataset. Whichever graph-generation method is used, SpectralNET displays detailed information about each connected component of the graph, including graphs of degree distribution, clustering coefficient by degree, and average distance by degree. In addition, extensive information about the selected vertex is shown, including degree, clustering coefficient, various distance metrics, and the corresponding components of the adjacency, Laplacian, and normalized Laplacian eigenvectors. SpectralNET also displays several graph visualizations, including a linear dimensionality reduction for uploaded datasets (Principal Components Analysis) and a non-linear dimensionality reduction that provides an elegant view of global graph structure (Laplacian eigenvectors). Conclusion SpectralNET provides an easily accessible means of analyzing graph-theoretic metrics for data modeling and dimensionality reduction. SpectralNET is publicly available as both a .NET application and an ASP.NET web application from . Source code is available upon request. PMID:16236170
SpectralNET--an application for spectral graph analysis and visualization.
Forman, Joshua J; Clemons, Paul A; Schreiber, Stuart L; Haggarty, Stephen J
2005-10-19
Graph theory provides a computational framework for modeling a variety of datasets including those emerging from genomics, proteomics, and chemical genetics. Networks of genes, proteins, small molecules, or other objects of study can be represented as graphs of nodes (vertices) and interactions (edges) that can carry different weights. SpectralNET is a flexible application for analyzing and visualizing these biological and chemical networks. Available both as a standalone .NET executable and as an ASP.NET web application, SpectralNET was designed specifically with the analysis of graph-theoretic metrics in mind, a computational task not easily accessible using currently available applications. Users can choose either to upload a network for analysis using a variety of input formats, or to have SpectralNET generate an idealized random network for comparison to a real-world dataset. Whichever graph-generation method is used, SpectralNET displays detailed information about each connected component of the graph, including graphs of degree distribution, clustering coefficient by degree, and average distance by degree. In addition, extensive information about the selected vertex is shown, including degree, clustering coefficient, various distance metrics, and the corresponding components of the adjacency, Laplacian, and normalized Laplacian eigenvectors. SpectralNET also displays several graph visualizations, including a linear dimensionality reduction for uploaded datasets (Principal Components Analysis) and a non-linear dimensionality reduction that provides an elegant view of global graph structure (Laplacian eigenvectors). SpectralNET provides an easily accessible means of analyzing graph-theoretic metrics for data modeling and dimensionality reduction. SpectralNET is publicly available as both a .NET application and an ASP.NET web application from http://chembank.broad.harvard.edu/resources/. Source code is available upon request.
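As a rough illustration of the graph-theoretic quantities listed in the abstract (per-component degree distribution, clustering coefficient, Laplacian spectrum), the Python sketch below computes them with networkx and numpy for an idealized random graph; it is unrelated to the SpectralNET code base and uses hypothetical parameter values.

```python
import networkx as nx
import numpy as np

# Idealized random network, analogous to the comparison graphs SpectralNET can generate.
g = nx.erdos_renyi_graph(n=200, p=0.05, seed=1)

for comp in nx.connected_components(g):
    sub = g.subgraph(comp)
    if sub.number_of_nodes() < 2:
        continue  # skip isolated vertices

    degrees = np.array([d for _, d in sub.degree()])
    mean_clust = np.mean(list(nx.clustering(sub).values()))

    # Laplacian spectrum; the second-smallest eigenvalue is the algebraic
    # connectivity, and its eigenvector (evecs[:, 1]) gives a spectral embedding.
    lap = nx.laplacian_matrix(sub).toarray().astype(float)
    evals, evecs = np.linalg.eigh(lap)

    print(f"size={sub.number_of_nodes()}  mean degree={degrees.mean():.2f}  "
          f"mean clustering={mean_clust:.3f}  algebraic connectivity={evals[1]:.3f}")
```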
NASA Astrophysics Data System (ADS)
Goodwin, Graham. C.; Medioli, Adrian. M.
2013-08-01
Model predictive control has been a major success story in process control. More recently, the methodology has been used in other contexts, including automotive engine control, power electronics and telecommunications. Most applications focus on set-point tracking and use single-sequence optimisation. Here we consider an alternative class of problems, motivated by the scheduling of emergency vehicles, in which disturbances are the dominant feature. We develop a novel closed-loop model predictive control strategy aimed at this class of problems. We motivate and illustrate the ideas via the problem of fluid deployment of ambulance resources.
Face Processing: Models For Recognition
NASA Astrophysics Data System (ADS)
Turk, Matthew A.; Pentland, Alexander P.
1990-03-01
The human ability to process faces is remarkable. We can identify perhaps thousands of faces learned throughout our lifetime and read facial expression to understand such subtle qualities as emotion. These skills are quite robust, despite sometimes large changes in the visual stimulus due to expression, aging, and distractions such as glasses or changes in hairstyle or facial hair. Computers which model and recognize faces will be useful in a variety of applications, including criminal identification, human-computer interface, and animation. We discuss models for representing faces and their applicability to the task of recognition, and present techniques for identifying faces and detecting eye blinks.
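The kind of appearance-based face representation discussed here can be illustrated with a short eigenface-style PCA sketch in Python; the image size, the number of components, and the nearest-neighbour matching rule are illustrative choices rather than the exact procedure of the paper.

```python
import numpy as np

def fit_eigenfaces(train_images, n_components=20):
    """PCA on vectorized face images: returns the mean face and top eigenvectors."""
    X = train_images.reshape(len(train_images), -1).astype(float)
    mean_face = X.mean(axis=0)
    # SVD of the centred data gives the principal directions ("eigenfaces").
    _, _, vt = np.linalg.svd(X - mean_face, full_matrices=False)
    return mean_face, vt[:n_components]

def project(images, mean_face, eigenfaces):
    X = images.reshape(len(images), -1).astype(float) - mean_face
    return X @ eigenfaces.T

def identify(probe, gallery_codes, gallery_labels, mean_face, eigenfaces):
    """Nearest neighbour in eigenface space (a simple recognition rule)."""
    code = project(probe[None], mean_face, eigenfaces)[0]
    dists = np.linalg.norm(gallery_codes - code, axis=1)
    return gallery_labels[int(np.argmin(dists))]

# Toy usage with random 32x32 "faces"; replace with real aligned face images.
rng = np.random.default_rng(0)
train = rng.random((50, 32, 32))
labels = np.arange(50)
mean_face, eigenfaces = fit_eigenfaces(train)
codes = project(train, mean_face, eigenfaces)
print(identify(train[3], codes, labels, mean_face, eigenfaces))  # -> 3
```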
Cloud-Based Tools to Support High-Resolution Modeling (Invited)
NASA Astrophysics Data System (ADS)
Jones, N.; Nelson, J.; Swain, N.; Christensen, S.
2013-12-01
The majority of watershed models developed to support decision-making by water management agencies are simple, lumped-parameter models. Maturity in research codes and advances in the computational power from multi-core processors on desktop machines, commercial cloud-computing resources, and supercomputers with thousands of cores have created new opportunities for employing more accurate, high-resolution distributed models for routine use in decision support. The barriers to using such models on a more routine basis include the massive amounts of spatial data that must be processed for each new scenario and a lack of efficient visualization tools. In this presentation we will review a current NSF-funded project called CI-WATER that is intended to overcome many of these roadblocks associated with high-resolution modeling. We are developing a suite of tools that will make it possible to deploy customized web-based apps for running custom scenarios for high-resolution models with minimal effort. These tools are based on a software stack that includes 52 North, MapServer, PostGIS, HT Condor, CKAN, and Python. This open source stack provides a simple scripting environment for quickly configuring new custom applications for running high-resolution models as geoprocessing workflows. The HT Condor component facilitates simple access to local distributed computers or commercial cloud resources when necessary for stochastic simulations. The CKAN framework provides a powerful suite of tools for hosting such workflows in a web-based environment that includes visualization tools and storage of model simulations in a database for archiving, querying, and sharing of model results. Prototype applications including land use change, snow melt, and burned area analysis will be presented. This material is based upon work supported by the National Science Foundation under Grant No. 1135482
75 FR 27560 - Proposed Information Collection Activity; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-17
... DEPARTMENT OF HEALTH AND HUMAN SERVICES Administration for Children and Families Proposed... LIHEAP and Detailed Model Plan. OMB No.: 0970-0075. Description: States, including the District of... annual application (Model Plan) that meets the LIHEAP statutory and regulatory requirements prior to...
Proof of Concept for the Trajectory-Level Validation Framework for Traffic Simulation Models
DOT National Transportation Integrated Search
2017-10-30
Based on current practices, traffic simulation models are calibrated and validated using macroscopic measures such as 15-minute averages of traffic counts or average point-to-point travel times. For an emerging number of applications, including conne...
Development of stable isotope mixing models in ecology - Dublin
More than 40 years ago, stable isotope analysis methods used in geochemistry began to be applied to ecological studies. One common application is using mathematical mixing models to sort out the proportional contributions of various sources to a mixture. Examples include contri...
Historical development of stable isotope mixing models in ecology
More than 40 years ago, stable isotope analysis methods used in geochemistry began to be applied to ecological studies. One common application is using mathematical mixing models to sort out the proportional contributions of various sources to a mixture. Examples include contri...
Development of stable isotope mixing models in ecology - Perth
More than 40 years ago, stable isotope analysis methods used in geochemistry began to be applied to ecological studies. One common application is using mathematical mixing models to sort out the proportional contributions of various sources to a mixture. Examples include contri...
Development of stable isotope mixing models in ecology - Fremantle
More than 40 years ago, stable isotope analysis methods used in geochemistry began to be applied to ecological studies. One common application is using mathematical mixing models to sort out the proportional contributions of various sources to a mixture. Examples include contri...
Development of stable isotope mixing models in ecology - Sydney
More than 40 years ago, stable isotope analysis methods used in geochemistry began to be applied to ecological studies. One common application is using mathematical mixing models to sort out the proportional contributions of various sources to a mixture. Examples include contri...
Coastal Modeling System: Mathematical Formulations and Numerical Methods
2014-03-01
sediment transport, and morphology change. The CMS was designed and developed for coastal inlets and navigation applications, including channel... numerical methods of hydrodynamic, salinity and sediment transport, and morphology change model CMS-Flow. The CMS-Flow uses the Finite Volume... and the influence of coastal structures. The implicit hydrodynamic model is coupled to a nonequilibrium transport model of multiple-sized total
Broad-band High-Frequency Sound Interaction With the Seafloor
1998-01-01
interface, propagation within and scattering from the seafloor. OBJECTIVES: Resolution of modeling issues through experimental measurement of acoustic... approximation, particularly the roughness scattering mechanism for propagating and evanescent waves, offer alternative models of the observed acoustic... applicability of each model and its relative merits. The candidate models of acoustic penetration include: 1. Biot slow wave; 2. Scattering of in-water
NASA Astrophysics Data System (ADS)
Chang, Ailian; Sun, HongGuang; Zheng, Chunmiao; Lu, Bingqing; Lu, Chengpeng; Ma, Rui; Zhang, Yong
2018-07-01
Fractional-derivative models have been developed recently to interpret various hydrologic dynamics, such as dissolved contaminant transport in groundwater. However, they have not been applied to quantify other fluid dynamics, such as gas transport through complex geological media. This study reviewed previous gas transport experiments conducted in laboratory columns and real-world oil-gas reservoirs and found that gas dynamics exhibit typical sub-diffusive behavior characterized by heavy late-time tailing in the gas breakthrough curves (BTCs), which cannot be effectively captured by classical transport models. Numerical tests and field applications of the time fractional convection-diffusion equation (fCDE) have shown that the fCDE model can capture the observed gas BTCs including their apparent positive skewness. Sensitivity analysis further revealed that the three parameters used in the fCDE model, including the time index, the convection velocity, and the diffusion coefficient, play different roles in interpreting the delayed gas transport dynamics. In addition, the model comparison and analysis showed that the time fCDE model is efficient in application. Therefore, the time fractional-derivative models can be conveniently extended to quantify gas transport through natural geological media such as complex oil-gas reservoirs.
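For context, a commonly used one-dimensional form of the time fractional convection-diffusion equation (the exact formulation and boundary conditions used in the study may differ) is

```latex
\frac{\partial^{\gamma} C(x,t)}{\partial t^{\gamma}}
  = -\,v\,\frac{\partial C(x,t)}{\partial x}
  + D\,\frac{\partial^{2} C(x,t)}{\partial x^{2}},
\qquad 0 < \gamma \le 1,
```

where C is the gas concentration, v the convection velocity, D the diffusion coefficient, and γ the time index (the fractional derivative is typically taken in the Caputo sense). Values γ < 1 produce the heavy late-time tails in the breakthrough curves noted above, and γ = 1 recovers the classical convection-diffusion equation.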
Kumar, N.; Voulgaris, G.; Warner, John C.
2011-01-01
Regional Ocean Modeling System (ROMS v 3.0), a three-dimensional numerical ocean model, was previously enhanced for shallow water applications by including wave-induced radiation stress forcing provided through coupling to wave propagation models (SWAN, REF/DIF). This enhancement made it suitable for surf zone applications as demonstrated using examples of obliquely incident waves on a planar beach and rip current formation in longshore bar trough morphology (Haas and Warner, 2009). In this contribution, we present an update to the coupled model which implements a wave roller model and also a modified method of the radiation stress term based on Mellor (2008, 2011a, b, in press) that includes a vertical distribution which better simulates non-conservative (i.e., wave breaking) processes and appears to be more appropriate for sigma coordinates in very shallow waters where wave breaking conditions dominate. The improvements of the modified model are shown through simulations of several cases that include: (a) obliquely incident spectral waves on a planar beach; (b) obliquely incident spectral waves on a natural barred beach (DUCK'94 experiment); (c) alongshore variable offshore wave forcing on a planar beach; (d) alongshore varying bathymetry with constant offshore wave forcing; and (e) nearshore barred morphology with rip-channels. Quantitative and qualitative comparisons to previous analytical, numerical, laboratory studies and field measurements show that the modified model replicates surf zone recirculation patterns (onshore drift at the surface and undertow at the bottom) more accurately than previous formulations based on radiation stress (Haas and Warner, 2009). The results of the model and test cases are further explored for identifying the forces operating in rip current development and the potential implication for sediment transport and rip channel development. Also, model analysis showed that rip current strength is higher when waves approach at angles of 5° to 10° in comparison to normally incident waves. © 2011 Elsevier B.V.
Method and system for dynamic probabilistic risk assessment
NASA Technical Reports Server (NTRS)
Dugan, Joanne Bechta (Inventor); Xu, Hong (Inventor)
2013-01-01
The DEFT methodology, system and computer readable medium extends the applicability of the PRA (Probabilistic Risk Assessment) methodology to computer-based systems, by allowing DFT (Dynamic Fault Tree) nodes as pivot nodes in the Event Tree (ET) model. DEFT includes a mathematical model and solution algorithm, supports all common PRA analysis functions and cutsets. Additional capabilities enabled by the DFT include modularization, phased mission analysis, sequence dependencies, and imperfect coverage.
NASA Astrophysics Data System (ADS)
Escalante, George
2017-05-01
Weak Value Measurements (WVMs) with pre- and post-selected quantum mechanical ensembles were proposed by Aharonov, Albert, and Vaidman in 1988 and have found numerous applications in both theoretical and applied physics. In the field of precision metrology, WVM techniques have been demonstrated and proven valuable as a means to shift, amplify, and detect signals and to make precise measurements of small effects in both quantum and classical systems, including particle spin, the Spin-Hall effect of light, optical beam deflections, frequency shifts, field gradients, and many others. In principle, WVM amplification techniques are also possible in radar and could be a valuable tool for precision measurements. However, relatively limited research has been done in this area. This article presents a quantum-inspired model of radar range and range-rate measurements of arbitrary strength, including standard and pre- and post-selected measurements. The model is used to extend WVM amplification theory to radar, with the receive filter performing the post-selection role. It is shown that the description of range and range-rate measurements based on the quantum-mechanical measurement model and formalism produces the same results as the conventional approach used in radar based on signal processing and filtering of the reflected signal at the radar receiver. Numerical simulation results using simple point scatterer configurations are presented, applying the quantum-inspired model of radar range and range-rate measurements that occur in the weak measurement regime. Potential applications and benefits of the quantum-inspired approach to radar measurements are presented, including improved range and Doppler measurement resolution.
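For reference, the weak value of an observable in the Aharonov-Albert-Vaidman formalism, for a system pre-selected in state |ψ_i⟩ and post-selected in state |ψ_f⟩, is

```latex
A_w = \frac{\langle \psi_f \,|\, \hat{A} \,|\, \psi_i \rangle}{\langle \psi_f \,|\, \psi_i \rangle},
```

which can lie far outside the eigenvalue range of the observable when the pre- and post-selected states are nearly orthogonal; this is the amplification mechanism the article carries over to radar, with the receive filter playing the post-selection role.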
NASA Technical Reports Server (NTRS)
1972-01-01
A user's manual is provided for the environmental computer model proposed for the Richmond-Cape Henry Environmental Laboratory (RICHEL) application project for coastal zone land use investigations and marine resources management. The model was developed around the hydrologic cycle and includes two data bases consisting of climate and land use variables. The main program is described, along with control parameters to be set and pertinent subroutines.
Bidirectional RNN for Medical Event Detection in Electronic Health Records.
Jagannatha, Abhyuday N; Yu, Hong
2016-06-01
Sequence labeling for extraction of medical events and their attributes from unstructured text in Electronic Health Record (EHR) notes is a key step towards semantic understanding of EHRs. It has important applications in health informatics, including pharmacovigilance and drug surveillance. The state-of-the-art supervised machine learning models in this domain are based on Conditional Random Fields (CRFs) with features calculated from fixed context windows. In this application, we explore recurrent neural network frameworks and show that they significantly outperformed the CRF models.
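As an illustration of the bidirectional recurrent architecture described (not the authors' exact model, features, or hyper-parameters), a minimal PyTorch sequence-labelling sketch:

```python
import torch
import torch.nn as nn

class BiLSTMTagger(nn.Module):
    """Minimal bidirectional LSTM for token-level labelling (e.g. medical events)."""

    def __init__(self, vocab_size, n_labels, emb_dim=100, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.lstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True,
                            bidirectional=True)
        self.out = nn.Linear(2 * hidden_dim, n_labels)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) integer word indices
        h, _ = self.lstm(self.embed(token_ids))   # (batch, seq_len, 2*hidden)
        return self.out(h)                        # per-token label scores

# Toy forward/backward pass with random data; real use needs EHR token and label vocabularies.
model = BiLSTMTagger(vocab_size=5000, n_labels=9)
tokens = torch.randint(1, 5000, (4, 25))
labels = torch.randint(0, 9, (4, 25))
loss = nn.CrossEntropyLoss()(model(tokens).reshape(-1, 9), labels.reshape(-1))
loss.backward()
print(float(loss))
```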
Elements of active vibration control for rotating machinery
NASA Technical Reports Server (NTRS)
Ulbrich, Heinz
1990-01-01
The success or failure of active vibration control is determined by the availability of suitable actuators, modeling of the entire system including all active elements, positioning of the actuators and sensors, and implementation of problem-adapted control concepts. All of these topics are outlined and their special problems are discussed in detail. Special attention is given to efficient modeling of systems, especially for considering the active elements. Finally, design methods for and the application of active vibration control on rotating machinery are demonstrated by several real applications.
Binomial test statistics using Psi functions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bowman, Kimiko O.
2007-01-01
For the negative binomial model with probability generating function (p + 1 - pt)^(-k), a logarithmic derivative is the Psi (digamma) function difference ψ(k + x) - ψ(k); this and its derivatives lead to a test statistic to decide on the validity of a specified model. The test statistic is computed from a data base, so a comparison between theory and application is available. Note that the test function is not dominated by outliers. Applications to (i) Fisher's tick data, (ii) accidents data, and (iii) Weldon's dice data are included.
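As a small, hedged illustration of the quantities involved (the report's actual test statistic is more elaborate), the digamma difference ψ(k + x) - ψ(k) enters the derivative of the negative binomial log-likelihood with respect to k, which can be evaluated with scipy:

```python
import numpy as np
from scipy.special import psi  # digamma function

def mean_score_k(data, k, p):
    """Average derivative of the negative binomial log-likelihood with respect
    to k, for the parametrization with pgf (p + 1 - p*t)**(-k) (mean = k*p).
    Each term contains the digamma difference psi(k + x) - psi(k)."""
    data = np.asarray(data, dtype=float)
    return np.mean(psi(k + data) - psi(k) - np.log1p(p))

# Simulated counts; at the true parameters the average score is close to zero,
# while a misspecified k shifts it away from zero.
rng = np.random.default_rng(1)
k_true, p_true = 2.0, 1.5
counts = rng.negative_binomial(k_true, 1.0 / (1.0 + p_true), size=5000)
print(mean_score_k(counts, k_true, p_true))
print(mean_score_k(counts, 4.0, p_true))
```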
Molina, Manuel; Mota, Manuel; Ramos, Alfonso
2015-01-01
This work deals with mathematical modeling through branching processes. We consider sexually reproducing animal populations where, in each generation, the number of progenitor couples is determined in a non-predictable environment. By using a class of two-sex branching processes, we describe their demographic dynamics and provide several probabilistic and inferential contributions. They include results about the extinction of the population and the estimation of the offspring distribution and its main moments. We also present an application to salmonid populations.
Synchronisation of chaos and its applications
NASA Astrophysics Data System (ADS)
Eroglu, Deniz; Lamb, Jeroen S. W.; Pereira, Tiago
2017-07-01
Dynamical networks are important models for the behaviour of complex systems, modelling physical, biological and societal systems, including the brain, food webs, epidemic disease in populations, power grids and many others. Such dynamical networks can exhibit behaviour in which deterministic chaos, with its unpredictability and disorder, coexists with synchronisation, a classical paradigm of order. We survey the main theory behind complete, generalised and phase synchronisation phenomena in simple as well as complex networks and discuss applications to secure communications, parameter estimation and the anticipation of chaos.
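As a concrete example of complete synchronisation, one of the regimes surveyed, the Python sketch below diffusively couples two chaotic Lorenz systems; the coupling strength and parameter values are illustrative choices (picked above the synchronisation threshold), not taken from the article.

```python
import numpy as np
from scipy.integrate import solve_ivp

SIGMA, RHO, BETA = 10.0, 28.0, 8.0 / 3.0

def coupled_lorenz(t, state, c):
    """Two Lorenz systems with diffusive coupling of strength c on the x variable."""
    x1, y1, z1, x2, y2, z2 = state
    dx1 = SIGMA * (y1 - x1) + c * (x2 - x1)
    dy1 = x1 * (RHO - z1) - y1
    dz1 = x1 * y1 - BETA * z1
    dx2 = SIGMA * (y2 - x2) + c * (x1 - x2)
    dy2 = x2 * (RHO - z2) - y2
    dz2 = x2 * y2 - BETA * z2
    return [dx1, dy1, dz1, dx2, dy2, dz2]

state0 = [1.0, 1.0, 1.0, -5.0, 0.0, 20.0]        # different initial conditions
sol = solve_ivp(coupled_lorenz, (0.0, 50.0), state0, args=(8.0,), max_step=0.01)

# The synchronisation error |x1 - x2| decays towards zero despite chaotic dynamics.
err = np.abs(sol.y[0] - sol.y[3])
print("initial error:", err[0], " final error:", err[-1])
```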
Application of differential transformation method for solving dengue transmission mathematical model
NASA Astrophysics Data System (ADS)
Ndii, Meksianis Z.; Anggriani, Nursanti; Supriatna, Asep K.
2018-03-01
The differential transformation method (DTM) is a semi-analytical numerical technique based on the Taylor series, with applications in many areas including biomathematics. The aim of this paper is to employ the DTM to solve the system of non-linear differential equations of a dengue transmission mathematical model. Analytical and numerical solutions are determined and the results are compared to those of the Runge-Kutta method. We found good agreement between the DTM and the Runge-Kutta method.
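For context, the differential transform of a function y(t) about t = 0 and the associated series reconstruction (standard DTM results, stated here as background rather than taken from the paper) are

```latex
Y(k) = \frac{1}{k!}\left[\frac{d^{k} y(t)}{dt^{k}}\right]_{t=0},
\qquad
y(t) = \sum_{k=0}^{\infty} Y(k)\,t^{k},
```

so that the transform of dy/dt is (k+1)Y(k+1) and the transform of a product u(t)v(t) is the convolution of U and V. Applying these rules to each equation of the transmission model turns the nonlinear ODE system into a recurrence for the series coefficients, which is summed to a chosen order to approximate the solution.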
Recent Updates to the System Advisor Model (SAM)
DOE Office of Scientific and Technical Information (OSTI.GOV)
DiOrio, Nicholas A
The System Advisor Model (SAM) is a mature suite of techno-economic models for many renewable energy technologies that can be downloaded for free as a desktop application or software development kit. SAM is used for system-level modeling, including generating performance pro... Recent additions include the release of the code as an open source project on GitHub. Other additions that will be covered include the ability to download data directly into SAM from the National Solar Radiation Database (NSRDB) and updates to a user-interface macro that assists with PV system sizing. A brief update on SAM's battery model and its integration with the detailed photovoltaic model will also be discussed. Finally, an outline of planned work for the next year will be presented, including the addition of a bifacial model, support for multiple MPPT inputs for detailed inverter modeling, and the addition of a model for inverter thermal behavior.
Multi-application controls: Robust nonlinear multivariable aerospace controls applications
NASA Technical Reports Server (NTRS)
Enns, Dale F.; Bugajski, Daniel J.; Carter, John; Antoniewicz, Bob
1994-01-01
This viewgraph presentation describes the general methodology used to apply Honeywell's Multi-Application Control (MACH) and the specific application to the F-18 High Angle-of-Attack Research Vehicle (HARV) including piloted simulation handling qualities evaluation. The general steps include insertion of modeling data for geometry and mass properties, aerodynamics, propulsion data and assumptions, requirements and specifications, e.g. definition of control variables, handling qualities, stability margins and statements for bandwidth, control power, priorities, position and rate limits. The specific steps include choice of independent variables for least squares fits to aerodynamic and propulsion data, modifications to the management of the controls with regard to integrator windup and actuation limiting and priorities, e.g. pitch priority over roll, and command limiting to prevent departures and/or undesirable inertial coupling or inability to recover to a stable trim condition. The HARV control problem is characterized by significant nonlinearities and multivariable interactions in the low speed, high angle-of-attack, high angular rate flight regime. Systematic approaches to the control of vehicle motions modeled with coupled nonlinear equations of motion have been developed. This paper will discuss the dynamic inversion approach which explicitly accounts for nonlinearities in the control design. Multiple control effectors (including aerodynamic control surfaces and thrust vectoring control) and sensors are used to control the motions of the vehicles in several degrees-of-freedom. Several maneuvers will be used to illustrate performance of MACH in the high angle-of-attack flight regime. Analytical methods for assessing the robust performance of the multivariable control system in the presence of math modeling uncertainty, disturbances, and commands have reached a high level of maturity. The structured singular value (mu) frequency response methodology is presented as a method for analyzing robust performance and the mu-synthesis method will be presented as a method for synthesizing a robust control system. The paper concludes with the author's expectations regarding future applications of robust nonlinear multivariable controls.
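The dynamic inversion idea referred to above can be summarised in generic control-affine form (a schematic statement of the technique, not the specific HARV equations or the MACH implementation):

```latex
\dot{x} = f(x) + g(x)\,u,
\qquad
u = g(x)^{-1}\bigl[\nu - f(x)\bigr],
```

so that the controlled variables follow the desired dynamics ν (typically produced by a command filter encoding the handling-qualities requirements), cancelling the nonlinearities f and g explicitly rather than linearising about a trim point; robustness of the resulting loop is then assessed with the structured singular value (mu) analysis mentioned in the abstract.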
Probabilistic arithmetic automata and their applications.
Marschall, Tobias; Herms, Inke; Kaltenbach, Hans-Michael; Rahmann, Sven
2012-01-01
We present a comprehensive review on probabilistic arithmetic automata (PAAs), a general model to describe chains of operations whose operands depend on chance, along with two algorithms to numerically compute the distribution of the results of such probabilistic calculations. PAAs provide a unifying framework to approach many problems arising in computational biology and elsewhere. We present five different applications, namely 1) pattern matching statistics on random texts, including the computation of the distribution of occurrence counts, waiting times, and clump sizes under hidden Markov background models; 2) exact analysis of window-based pattern matching algorithms; 3) sensitivity of filtration seeds used to detect candidate sequence alignments; 4) length and mass statistics of peptide fragments resulting from enzymatic cleavage reactions; and 5) read length statistics of 454 and IonTorrent sequencing reads. The diversity of these applications indicates the flexibility and unifying character of the presented framework. While the construction of a PAA depends on the particular application, we single out a frequently applicable construction method: We introduce deterministic arithmetic automata (DAAs) to model deterministic calculations on sequences, and demonstrate how to construct a PAA from a given DAA and a finite-memory random text model. This procedure is used for all five discussed applications and greatly simplifies the construction of PAAs. Implementations are available as part of the MoSDi package. Its application programming interface facilitates the rapid development of new applications based on the PAA framework.
Applications of the k – ω Model in Stellar Evolutionary Models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Yan, E-mail: ly@ynao.ac.cn
The k – ω model for turbulence was first proposed by Kolmogorov. A new k – ω model for stellar convection was developed by Li, which could reasonably describe turbulent convection not only in the convectively unstable zone, but also in the overshooting regions. We revised the k – ω model by improving several model assumptions (including the macro-length of turbulence, convective heat flux, and turbulent mixing diffusivity, etc.), making it applicable not only for convective envelopes, but also for convective cores. Eight parameters are introduced in the revised k – ω model. It should be noted that the Reynolds stress (turbulent pressure) is neglected in the equation of hydrostatic support. We applied it to solar models and 5 M⊙ stellar models to calibrate the eight model parameters, as well as to investigate the effects of the convective overshooting on the Sun and intermediate mass stellar models.
Computational Modeling of Space Physiology
NASA Technical Reports Server (NTRS)
Lewandowski, Beth E.; Griffin, Devon W.
2016-01-01
The Digital Astronaut Project (DAP), within NASA's Human Research Program, develops and implements computational modeling for use in the mitigation of human health and performance risks associated with long duration spaceflight. Over the past decade, DAP developed models to provide insights into space flight related changes to the central nervous system, cardiovascular system and the musculoskeletal system. Examples of the models and their applications include biomechanical models applied to advanced exercise device development, bone fracture risk quantification for mission planning, accident investigation, bone health standards development, and occupant protection. The International Space Station (ISS), in its role as a testing ground for long duration spaceflight, has been an important platform for obtaining human spaceflight data. DAP has used preflight, in-flight and post-flight data from short and long duration astronauts for computational model development and validation. Examples include preflight and post-flight bone mineral density data, muscle cross-sectional area, and muscle strength measurements. Results from computational modeling supplement space physiology research by informing experimental design. Using these computational models, DAP personnel can easily identify both important factors associated with a phenomenon and areas where data are lacking. This presentation will provide examples of DAP computational models, the data used in model development and validation, and applications of the model.
NASA Astrophysics Data System (ADS)
Wang, Mian
This thesis research consists of four chapters: biomimetic three-dimensional tissue engineered nanostructured bone model for breast cancer bone metastasis study (Chapter one), cold atmospheric plasma for selectively ablating metastatic breast cancer (Chapter two), design of biomimetic and bioactive cold plasma modified nanostructured scaffolds for enhanced osteogenic differentiation of bone marrow derived mesenchymal stem cells (Chapter three), and enhanced osteoblast and mesenchymal stem cell functions on titanium with hydrothermally treated nanocrystalline hydroxyapatite/magnetically treated carbon nanotubes for orthopedic applications (Chapter four). All of the thesis research is focused on nanomaterials and the use of the cold plasma technique for various biomedical applications.
NASA Astrophysics Data System (ADS)
Jensen, Kristoffer
2002-11-01
A timbre model is proposed for use in multiple applications. This model, which encompasses all voiced isolated musical instruments, has an intuitive parameter set, fixed size, and separates the sounds in dimensions akin to the timbre dimensions as proposed in timbre research. The analysis of the model parameters is fully documented, and it proposes, in particular, a method for the estimation of the difficult decay/release split-point. The main parameters of the model are the spectral envelope, the attack/release durations and relative amplitudes, and the inharmonicity and the shimmer and jitter (which provide both for the slow random variations of the frequencies and amplitudes, and also for additive noises). Some of the applications include synthesis, where a real-time application with an intuitive GUI is being developed; classification and search of sounds based on their content; and a further understanding of acoustic musical instrument behavior. In order to present the background of the model, this presentation will start with sinusoidal A/S and some timbre perception research, then present the timbre model, show its validity for individual musical instrument sounds, and finally introduce some expression additions to the model.
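A rough additive-synthesis sketch of some of the listed ingredients (spectral envelope, attack/release envelope, inharmonicity, jitter and shimmer) is given below; the parameter values, envelope shape and modulation scheme are illustrative assumptions, not the model described in the talk.

```python
import numpy as np

def synth_tone(f0=220.0, n_partials=10, dur=1.0, sr=44100,
               attack=0.05, release=0.3, inharmonicity=1e-4,
               jitter=0.002, shimmer=0.05, seed=0):
    """Additive synthesis of a quasi-harmonic tone with a piecewise-linear
    attack/release amplitude envelope, slight partial-frequency stretching
    (inharmonicity) and slow random frequency (jitter) and amplitude
    (shimmer) modulation of each partial."""
    rng = np.random.default_rng(seed)
    t = np.arange(int(dur * sr)) / sr
    env = np.clip(np.minimum(t / attack, (dur - t) / release), 0.0, 1.0)
    y = np.zeros_like(t)
    for k in range(1, n_partials + 1):
        amp = 1.0 / k                                   # simple spectral envelope
        fk = k * f0 * np.sqrt(1.0 + inharmonicity * k ** 2)  # stretched partial
        slow = np.cumsum(rng.standard_normal(t.size))   # slow random walk
        slow /= np.max(np.abs(slow)) + 1e-12
        inst_freq = fk * (1.0 + jitter * slow)          # jitter
        phase = 2 * np.pi * np.cumsum(inst_freq) / sr
        y += amp * (1.0 + shimmer * slow) * np.sin(phase)  # shimmer
    return (env * y / np.max(np.abs(y))).astype(np.float32)

if __name__ == "__main__":
    tone = synth_tone()
    print(tone.shape, tone.dtype)
```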
A case Study of Applying Object-Relational Persistence in Astronomy Data Archiving
NASA Astrophysics Data System (ADS)
Yao, S. S.; Hiriart, R.; Barg, I.; Warner, P.; Gasson, D.
2005-12-01
The NOAO Science Archive (NSA) team is developing a comprehensive domain model to capture the science data in the archive. Java and an object model derived from the domain model well address the application layer of the archive system. However, since RDBMS is the best proven technology for data management, the challenge is the paradigm mismatch between the object and the relational models. Transparent object-relational mapping (ORM) persistence is a successful solution to this challenge. In the data modeling and persistence implementation of NSA, we are using Hibernate, a well-accepted ORM tool, to bridge the object model in the business tier and the relational model in the database tier. Thus, the database is isolated from the Java application. The application queries directly on objects using a DBMS-independent object-oriented query API, which frees the application developers from the low level JDBC and SQL so that they can focus on the domain logic. We present the detailed design of the NSA R3 (Release 3) data model and object-relational persistence, including mapping, retrieving and caching. Persistence layer optimization and performance tuning will be analyzed. The system is being built on J2EE, so the integration of Hibernate into the EJB container and the transaction management are also explored.
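The pattern described here, domain objects persisted transparently to relational tables and queried without hand-written SQL, can be sketched with SQLAlchemy in Python, an ORM analogous to the Java/Hibernate stack actually used by NSA; the Exposure class, its columns and the in-memory SQLite database below are hypothetical examples, not part of the NSA data model.

```python
# Analogous object-relational mapping sketch in Python/SQLAlchemy
# (illustrative only; NSA itself uses Java and Hibernate).
from sqlalchemy import Column, Integer, String, Float, create_engine
from sqlalchemy.orm import declarative_base, sessionmaker

Base = declarative_base()

class Exposure(Base):
    """Hypothetical domain object persisted transparently to a table."""
    __tablename__ = "exposure"
    id = Column(Integer, primary_key=True)
    instrument = Column(String)
    exptime = Column(Float)

engine = create_engine("sqlite:///:memory:")   # the database tier stays isolated
Base.metadata.create_all(engine)
Session = sessionmaker(bind=engine)

with Session() as session:
    session.add(Exposure(instrument="example-camera", exptime=300.0))
    session.commit()
    # query on objects, not SQL: the ORM generates the low-level database calls
    long_exposures = session.query(Exposure).filter(Exposure.exptime > 100).all()
    print([e.instrument for e in long_exposures])
```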
Application of particle and lattice codes to simulation of hydraulic fracturing
NASA Astrophysics Data System (ADS)
Damjanac, Branko; Detournay, Christine; Cundall, Peter A.
2016-04-01
With the development of unconventional oil and gas reservoirs over the last 15 years, the understanding and capability to model the propagation of hydraulic fractures in inhomogeneous and naturally fractured reservoirs has become very important for the petroleum industry (but also for some other industries like mining and geothermal). Particle-based models provide advantages over other models and solutions for the simulation of fracturing of rock masses that cannot be assumed to be continuous and homogeneous. It has been demonstrated (Potyondy and Cundall Int J Rock Mech Min Sci Geomech Abstr 41:1329-1364, 2004) that particle models based on a simple force criterion for fracture propagation match theoretical solutions and scale effects derived using the principles of linear elastic fracture mechanics (LEFM). The challenge is how to apply these models effectively (i.e., with acceptable model sizes and computer run times) to the coupled hydro-mechanical problems of relevant time and length scales for practical field applications (i.e., reservoir scale and hours of injection time). A formulation of a fully coupled hydro-mechanical particle-based model and its application to the simulation of hydraulic treatment of unconventional reservoirs are presented. Model validation by comparing with available analytical asymptotic solutions (penny-shape crack) and some examples of field application (e.g., interaction with DFN) are also included.
Exchange interactions in transition metal oxides: the role of oxygen spin polarization.
Logemann, R; Rudenko, A N; Katsnelson, M I; Kirilyuk, A
2017-08-23
Magnetism of transition metal (TM) oxides is usually described in terms of the Heisenberg model, with orientation-independent interactions between the spins. However, the applicability of such a model is not fully justified for TM oxides because spin polarization of oxygen is usually ignored. In the conventional model based on the Anderson principle, oxygen effects are considered as a property of the TM ion and only TM interactions are relevant. Here, we perform a systematic comparison between two approaches for spin polarization on oxygen in typical TM oxides. To this end, we calculate the exchange interactions in NiO, MnO and hematite (Fe 2 O 3 ) for different magnetic configurations using the magnetic force theorem. We consider the full spin Hamiltonian including oxygen sites, and also derive an effective model where the spin polarization on oxygen renormalizes the exchange interactions between TM sites. Surprisingly, the exchange interactions in NiO depend on the magnetic state if spin polarization on oxygen is neglected, resulting in non-Heisenberg behavior. In contrast, the inclusion of spin polarization in NiO makes the Heisenberg model more applicable. Just the opposite, MnO behaves as a Heisenberg magnet when oxygen spin polarization is neglected, but shows strong non-Heisenberg effects when spin polarization on oxygen is included. In hematite, both models result in non-Heisenberg behavior. The general applicability of the magnetic force theorem as well as the Heisenberg model to TM oxides is discussed.
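Schematically, the two descriptions compared in the paper can be written as follows; this is a generic sketch of the model forms, with the site labels and exchange couplings shown only for illustration (the actual Hamiltonians and fitted parameters are those of the paper).

```latex
% Full spin Hamiltonian with explicit oxygen moments: TM sites i, j and O sites p, q
H_{\text{full}} = -\sum_{i>j} J_{ij}\,\mathbf{S}_i\cdot\mathbf{S}_j
                  -\sum_{i,p} J_{ip}\,\mathbf{S}_i\cdot\mathbf{s}_p
                  -\sum_{p>q} J_{pq}\,\mathbf{s}_p\cdot\mathbf{s}_q
% Effective model with the oxygen polarization absorbed into renormalized couplings
H_{\text{eff}} = -\sum_{i>j} \tilde{J}_{ij}\,\mathbf{S}_i\cdot\mathbf{S}_j
```

Non-Heisenberg behavior then shows up as a dependence of the extracted couplings (J or J-tilde) on the magnetic configuration used in the magnetic force theorem calculation.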
Xu, Hanfu; O'Brochta, David A.
2015-01-01
Genetic technologies based on transposon-mediated transgenesis along with several recently developed genome-editing technologies have become the preferred methods of choice for genetically manipulating many organisms. The silkworm, Bombyx mori, is a Lepidopteran insect of great economic importance because of its use in silk production and because it is a valuable model insect that has greatly enhanced our understanding of the biology of insects, including many agricultural pests. In the past 10 years, great advances have been achieved in the development of genetic technologies in B. mori, including transposon-based technologies that rely on piggyBac-mediated transgenesis and genome-editing technologies that rely on protein- or RNA-guided modification of chromosomes. The successful development and application of these technologies has not only facilitated a better understanding of B. mori and its use as a silk production system, but also provided valuable experiences that have contributed to the development of similar technologies in non-model insects. This review summarizes the technologies currently available for use in B. mori, their application to the study of gene function and their use in genetically modifying B. mori for biotechnology applications. The challenges, solutions and future prospects associated with the development and application of genetic technologies in B. mori are also discussed. PMID:26108630
VRML Industry: Microcosms in the Making.
ERIC Educational Resources Information Center
Brown, Eric
1998-01-01
Discusses VRML (Virtual Reality Modeling Language) technology and some of its possible applications, including creating three-dimensional images on the Web, advertising, and data visualization in computer-assisted design and computer-assisted manufacturing (CAD/CAM). Future improvements are discussed, including streaming, database support, and…
Section 3. The SPARROW Surface Water-Quality Model: Theory, Application and User Documentation
Schwarz, G.E.; Hoos, A.B.; Alexander, R.B.; Smith, R.A.
2006-01-01
SPARROW (SPAtially Referenced Regressions On Watershed attributes) is a watershed modeling technique for relating water-quality measurements made at a network of monitoring stations to attributes of the watersheds containing the stations. The core of the model consists of a nonlinear regression equation describing the non-conservative transport of contaminants from point and diffuse sources on land to rivers and through the stream and river network. The model predicts contaminant flux, concentration, and yield in streams and has been used to evaluate alternative hypotheses about the important contaminant sources and watershed properties that control transport over large spatial scales. This report provides documentation for the SPARROW modeling technique and computer software to guide users in constructing and applying basic SPARROW models. The documentation gives details of the SPARROW software, including the input data and installation requirements, and guidance in the specification, calibration, and application of basic SPARROW models, as well as descriptions of the model output and its interpretation. The documentation is intended for both researchers and water-resource managers with interest in using the results of existing models and developing and applying new SPARROW models. The documentation of the model is presented in two parts. Part 1 provides a theoretical and practical introduction to SPARROW modeling techniques, which includes a discussion of the objectives, conceptual attributes, and model infrastructure of SPARROW. Part 1 also includes background on the commonly used model specifications and the methods for estimating and evaluating parameters, evaluating model fit, and generating water-quality predictions and measures of uncertainty. Part 2 provides a user's guide to SPARROW, which includes a discussion of the software architecture and details of the model input requirements and output files, graphs, and maps. The text documentation and computer software are available on the Web at http://usgs.er.gov/sparrow/sparrow-mod/.
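As a rough, self-contained illustration of the kind of nonlinear source-attenuation regression at the core of SPARROW (not the documented SPARROW equations or software; the covariates, coefficients and synthetic data below are invented), the calibration step can be sketched as a least-squares fit of source coefficients and first-order loss rates to monitored loads:

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(1)

# Synthetic watershed data (illustrative only): two source terms per catchment,
# a land-to-water delivery covariate, and in-stream travel time to the monitor.
n = 200
sources = rng.uniform(0, 100, size=(n, 2))   # e.g. diffuse input, point-source load
delivery = rng.uniform(0, 5, size=n)         # land-to-water delivery covariate
travel_time = rng.uniform(0, 10, size=n)     # days of in-stream travel

def predicted_load(theta, sources, delivery, travel_time):
    b1, b2, alpha, k = theta
    land_to_water = np.exp(-alpha * delivery)   # diffuse-source attenuation
    in_stream = np.exp(-k * travel_time)        # first-order in-stream decay
    return (b1 * sources[:, 0] + b2 * sources[:, 1]) * land_to_water * in_stream

true_theta = np.array([0.8, 1.5, 0.25, 0.05])
observed = predicted_load(true_theta, sources, delivery, travel_time)
observed *= rng.lognormal(sigma=0.2, size=n)    # multiplicative sampling error

def residuals(theta):
    # fit in log space, as is common for strictly positive loads
    return np.log(observed) - np.log(predicted_load(theta, sources, delivery, travel_time))

fit = least_squares(residuals, x0=[1.0, 1.0, 0.1, 0.1], bounds=(1e-6, np.inf))
print("estimated coefficients:", np.round(fit.x, 3))
```

In the actual model, as described above, the source, land-to-water delivery and stream/reservoir attenuation terms are defined over a full watershed network and the estimation uses the monitoring-station network documented in the report.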
On domain modelling of the service system with its application to enterprise information systems
NASA Astrophysics Data System (ADS)
Wang, J. W.; Wang, H. F.; Ding, J. L.; Furuta, K.; Kanno, T.; Ip, W. H.; Zhang, W. J.
2016-01-01
Information systems are a kind of service system and they are present throughout every element of a modern industrial and business system, much like blood in our body. Types of information systems are heterogeneous because of extreme uncertainty in changes in modern industrial and business systems. To effectively manage information systems, modelling of the work domain (or domain) of information systems is necessary. In this paper, a domain modelling framework for the service system is proposed and its application to the enterprise information system is outlined. The framework is defined based on application of a general domain modelling tool called function-context-behaviour-principle-state-structure (FCBPSS). The FCBPSS is based on a set of core concepts, namely: function, context, behaviour, principle, state and structure and system decomposition. Different from many other applications of FCBPSS in systems engineering, the FCBPSS is applied to both infrastructure and substance systems, which is novel and effective for modelling service systems, including enterprise information systems. It is to be noted that domain modelling of systems (e.g. enterprise information systems) is a key to integration of heterogeneous systems and to coping with unanticipated situations facing such systems.
NASA Astrophysics Data System (ADS)
Wrożyna, Andrzej; Pernach, Monika; Kuziak, Roman; Pietrzyk, Maciej
2016-04-01
Due to their exceptional strength properties combined with good workability, the Advanced High-Strength Steels (AHSS) are commonly used in the automotive industry. Manufacturing of these steels is a complex process which requires precise control of technological parameters during thermo-mechanical treatment. Design of these processes can be significantly improved by numerical models of phase transformations. The objective of the paper was to evaluate the predictive capabilities of such models as far as their applicability to the simulation of thermal cycles for AHSS is concerned. Two models were considered. The former was an upgrade of the JMAK equation while the latter was an upgrade of the Leblond model. The models can be applied to any AHSS though the examples quoted in the paper refer to the Dual Phase (DP) steel. Three series of experimental simulations were performed. The first included various thermal cycles going beyond the limitations of the continuous annealing lines. The objective was to validate the models' behavior in more complex cooling conditions. The second set of tests included experimental simulations of the thermal cycle characteristic of continuous annealing lines. The capability of the models to properly describe phase transformations in this process was evaluated. The third set included data from an industrial continuous annealing line. Validation and verification of the models confirmed their good predictive capabilities. Since it does not require application of the additivity rule, the upgrade of the Leblond model was selected as the better one for simulation of industrial processes in AHSS production.
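For reference, the classical isothermal JMAK (Johnson-Mehl-Avrami-Kolmogorov) relation that the first model builds on can be written as below; the upgraded forms used in the paper, and the Leblond-type rate formulation, are not reproduced here.

```latex
X(t) = 1 - \exp\!\left(-k(T)\,t^{\,n}\right), \qquad
k(T) = k_0 \exp\!\left(-\frac{Q}{RT}\right)
```

Here X is the transformed volume fraction, n the Avrami exponent and k(T) a thermally activated rate constant; applying this isothermal form to arbitrary cooling paths normally requires the additivity rule, which is precisely the step the Leblond-type upgrade avoids, as noted above.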
The Open Microscopy Environment: open image informatics for the biological sciences
NASA Astrophysics Data System (ADS)
Blackburn, Colin; Allan, Chris; Besson, Sébastien; Burel, Jean-Marie; Carroll, Mark; Ferguson, Richard K.; Flynn, Helen; Gault, David; Gillen, Kenneth; Leigh, Roger; Leo, Simone; Li, Simon; Lindner, Dominik; Linkert, Melissa; Moore, Josh; Moore, William J.; Ramalingam, Balaji; Rozbicki, Emil; Rustici, Gabriella; Tarkowska, Aleksandra; Walczysko, Petr; Williams, Eleanor; Swedlow, Jason R.
2016-07-01
Despite significant advances in biological imaging and analysis, major informatics challenges remain unsolved: file formats are proprietary, storage and analysis facilities are lacking, as are standards for sharing image data and results. While the open FITS file format is ubiquitous in astronomy, astronomical imaging shares many challenges with biological imaging, including the need to share large image sets using secure, cross-platform APIs, and the need for scalable applications for processing and visualization. The Open Microscopy Environment (OME) is an open-source software framework developed to address these challenges. OME tools include: an open data model for multidimensional imaging (OME Data Model); an open file format (OME-TIFF) and library (Bio-Formats) enabling free access to images (5D+) written in more than 145 formats from many imaging domains, including FITS; and a data management server (OMERO). The Java-based OMERO client-server platform comprises an image metadata store, an image repository, visualization and analysis by remote access, allowing sharing and publishing of image data. OMERO provides a means to manage the data through a multi-platform API. OMERO's model-based architecture has enabled its extension into a range of imaging domains, including light and electron microscopy, high content screening, digital pathology and recently into applications using non-image data from clinical and genomic studies. This is made possible using the Bio-Formats library. The current release includes a single mechanism for accessing image data of all types, regardless of original file format, via Java, C/C++ and Python and a variety of applications and environments (e.g. ImageJ, Matlab and R).
Optimizing Tsunami Forecast Model Accuracy
NASA Astrophysics Data System (ADS)
Whitmore, P.; Nyland, D. L.; Huang, P. Y.
2015-12-01
Recent tsunamis provide a means to determine the accuracy that can be expected of real-time tsunami forecast models. Forecast accuracy using two different tsunami forecast models are compared for seven events since 2006 based on both real-time application and optimized, after-the-fact "forecasts". Lessons learned by comparing the forecast accuracy determined during an event to modified applications of the models after-the-fact provide improved methods for real-time forecasting for future events. Variables such as source definition, data assimilation, and model scaling factors are examined to optimize forecast accuracy. Forecast accuracy is also compared for direct forward modeling based on earthquake source parameters versus accuracy obtained by assimilating sea level data into the forecast model. Results show that including assimilated sea level data into the models increases accuracy by approximately 15% for the events examined.
Aerospace Applications of Magnetic Suspension Technology, part 2
NASA Technical Reports Server (NTRS)
Groom, Nelson J. (Editor); Britcher, Colin P. (Editor)
1991-01-01
In order to examine the state of technology of all areas of magnetic suspension with potential aerospace applications, and to review related recent developments in sensors and control approaches, superconducting technology, and design/implementation practices, a workshop was held at NASA-Langley. Areas of concern are pointing and isolation systems, microgravity and vibration isolation, bearing applications, wind tunnel model suspension systems, large gap magnetic suspension systems, controls, rotating machinery, science and applications of superconductivity, and sensors. Papers presented are included.
TDPAC and β-NMR applications in chemistry and biochemistry
NASA Astrophysics Data System (ADS)
Jancso, Attila; Correia, Joao G.; Gottberg, Alexander; Schell, Juliana; Stachura, Monika; Szunyogh, Dániel; Pallada, Stavroula; Lupascu, Doru C.; Kowalska, Magdalena; Hemmingsen, Lars
2017-06-01
Time differential perturbed angular correlation (TDPAC) of γ-rays spectroscopy has been applied in chemistry and biochemistry for decades. Herein we aim to present a comprehensive review of chemical and biochemical applications of TDPAC spectroscopy conducted at ISOLDE over the past 15 years, including elucidation of metal site structure and dynamics in proteins and model systems. β-NMR spectroscopy is well established in nuclear physics, solid state physics, and materials science, but only a limited number of applications in chemistry have appeared. Current endeavors at ISOLDE advancing applications of β-NMR towards chemistry and biochemistry are presented, including the first experiment on 31Mg2+ in an ionic liquid solution. Both techniques require the production of radioisotopes combined with advanced spectroscopic instrumentation present at ISOLDE.
Nanomechanics of carbon nanotubes
NASA Astrophysics Data System (ADS)
Ramasamy, Mouli; Kumar, Prashanth S.; Varadan, Vijay K.
2017-04-01
This review focuses on introducing the mechanics of carbon nanotubes (CNT) and the major applications of CNT and its composites in biomedicine. It emphasizes the nanomechanics of these materials by reviewing the widely followed experimental methods, theoretical models, simulations, classification, segregation and applications of the aforementioned materials. First, several mechanical properties contributing to the classification of CNT for various biomedicine applications are discussed in detail to provide a cursory glance at the uses of CNT. The mechanics of CNT discussed in this paper include elasticity, stress, tension, compression, and nano-scale mechanics. In addition to these basic properties, a brief introduction to nanoscale composites is given. Second, a brief review of some of the major applications of CNT in biomedicine, including drug delivery, therapeutics, diagnostics and regenerative medicine, is given.
Kent, Dea J
2010-01-01
I compared the effects of a just-in-time educational intervention (educational materials for dressing application attached to the manufacturer's dressing package) to traditional wound care education on reported confidence and dressing application in a simulated model. Nurses from a variety of backgrounds were recruited for this study. The nurses possessed all levels of education ranging from licensed practical nurse to master of science in nursing. Both novice and seasoned nurses were included, with no stipulations regarding years of nursing experience. Exclusion criteria included nurses who spent less than 50% of their time in direct patient care and nurses with advanced wound care training and/or certification (CWOCN, CWON). Study settings included community-based acute care facilities, critical access hospitals, long-term care facilities, long-term acute care facilities, and home care agencies. No level 1 trauma centers were included in the study for geographical reasons. Participants were randomly allocated to control or intervention groups. Each participant completed the Kent Dressing Confidence Assessment tool. Subjects were then asked to apply the dressing to a wound model under the observation of either the principal investigator or a trained observer, who scored the accuracy of dressing application according to established criteria. None of the 139 nurses who received traditional dressing packaging were able to apply the dressing to a wound model correctly. In contrast, 88% of the nurses who received the package with the educational guide attached to it were able to apply the dressing to a wound model correctly (χ2 = 107.22, df = 1, P = .0001). Nurses who received the dressing package with the attached educational guide agreed that this feature gave them confidence to correctly apply the dressing (88%), while no nurse agreed that the traditional package gave him or her the confidence to apply the dressing correctly (χ2 = 147.47, df = 4, P < .0001). A just-in-time education intervention improved nurses' confidence when applying an unfamiliar dressing and accuracy of application when applying the dressing to a simulated model compared to traditional wound care education.
GeoSciML v3.0 - a significant upgrade of the CGI-IUGS geoscience data model
NASA Astrophysics Data System (ADS)
Raymond, O.; Duclaux, G.; Boisvert, E.; Cipolloni, C.; Cox, S.; Laxton, J.; Letourneau, F.; Richard, S.; Ritchie, A.; Sen, M.; Serrano, J.-J.; Simons, B.; Vuollo, J.
2012-04-01
GeoSciML version 3.0 (http://www.geosciml.org), released in late 2011, is the latest version of the CGI-IUGS* Interoperability Working Group geoscience data interchange standard. The new version is a significant upgrade and refactoring of GeoSciML v2 which was released in 2008. GeoSciML v3 has already been adopted by several major international interoperability initiatives, including OneGeology, the EU INSPIRE program, and the US Geoscience Information Network, as their standard data exchange format for geoscience data. GeoSciML v3 makes use of recently upgraded versions of several Open Geospatial Consortium (OGC) and ISO data transfer standards, including GML v3.2, SWE Common v2.0, and Observations and Measurements v2 (ISO 19156). The GeoSciML v3 data model has been refactored from a single large application schema with many packages, into a number of smaller, but related, application schema modules with individual namespaces. This refactoring allows the use and future development of modules of GeoSciML (e.g. GeologicUnit, GeologicStructure, GeologicAge, Borehole) in smaller, more manageable units. As a result of this refactoring and the integration with new OGC and ISO standards, GeoSciML v3 is not backward compatible with previous GeoSciML versions. The scope of GeoSciML has been extended in version 3.0 to include new models for geomorphological data (a Geomorphology application schema), and for geological specimens, geochronological interpretations, and metadata for geochemical and geochronological analyses (a LaboratoryAnalysis-Specimen application schema). In addition, there is better support for borehole data, and the PhysicalProperties model now supports a wider range of petrophysical measurements. The previously used CGI_Value data type has been superseded in favour of externally governed data types provided by OGC's SWE Common v2 and GML v3.2 data standards. The GeoSciML v3 release includes worked examples of best practice in delivering geochemical analytical data using the Observations and Measurements (ISO19156) and SWE Common v2 models. The GeoSciML v3 data model does not include vocabularies to support the data model. However, it does provide a standard pattern to reference controlled vocabulary concepts using HTTP-URIs. The international GeoSciML community has developed distributed RDF-based geoscience vocabularies that can be accessed by GeoSciML web services using the standard pattern recommended in GeoSciML v3. GeoSciML v3 is the first version of GeoSciML that will be accompanied by web service validation tools using Schematron rules. For example, these validation tools may check for compliance of a web service to a particular profile of GeoSciML, or for logical consistency of data content that cannot be enforced by the application schemas. This validation process will support accreditation of GeoSciML services and a higher degree of semantic interoperability. * International Union of Geological Sciences Commission for Management and Application of Geoscience Information (CGI-IUGS)
NASA Astrophysics Data System (ADS)
Nelson, Jonathan M.; Shimizu, Yasuyuki; Abe, Takaaki; Asahi, Kazutake; Gamou, Mineyuki; Inoue, Takuya; Iwasaki, Toshiki; Kakinuma, Takaharu; Kawamura, Satomi; Kimura, Ichiro; Kyuka, Tomoko; McDonald, Richard R.; Nabi, Mohamed; Nakatsugawa, Makoto; Simões, Francisco R.; Takebayashi, Hiroshi; Watanabe, Yasunori
2016-07-01
This paper describes a new, public-domain interface for modeling flow, sediment transport and morphodynamics in rivers and other geophysical flows. The interface is named after the International River Interface Cooperative (iRIC), the group that constructed the interface and many of the current solvers included in iRIC. The interface is entirely free to any user and currently houses thirteen models ranging from simple one-dimensional models through three-dimensional large-eddy simulation models. Solvers are only loosely coupled to the interface so it is straightforward to modify existing solvers or to introduce other solvers into the system. Six of the most widely-used solvers are described in detail including example calculations to serve as an aid for users choosing what approach might be most appropriate for their own applications. The example calculations range from practical computations of bed evolution in natural rivers to highly detailed predictions of the development of small-scale bedforms on an initially flat bed. The remaining solvers are also briefly described. Although the focus of most solvers is coupled flow and morphodynamics, several of the solvers are also specifically aimed at providing flood inundation predictions over large spatial domains. Potential users can download the application, solvers, manuals, and educational materials including detailed tutorials at www.i-ric.org. The iRIC development group encourages scientists and engineers to use the tool and to consider adding their own methods to the iRIC suite of tools.
Nelson, Jonathan M.; Shimizu, Yasuyuki; Abe, Takaaki; Asahi, Kazutake; Gamou, Mineyuki; Inoue, Takuya; Iwasaki, Toshiki; Kakinuma, Takaharu; Kawamura, Satomi; Kimura, Ichiro; Kyuka, Tomoko; McDonald, Richard R.; Nabi, Mohamed; Nakatsugawa, Makoto; Simoes, Francisco J.; Takebayashi, Hiroshi; Watanabe, Yasunori
2016-01-01
This paper describes a new, public-domain interface for modeling flow, sediment transport and morphodynamics in rivers and other geophysical flows. The interface is named after the International River Interface Cooperative (iRIC), the group that constructed the interface and many of the current solvers included in iRIC. The interface is entirely free to any user and currently houses thirteen models ranging from simple one-dimensional models through three-dimensional large-eddy simulation models. Solvers are only loosely coupled to the interface so it is straightforward to modify existing solvers or to introduce other solvers into the system. Six of the most widely-used solvers are described in detail including example calculations to serve as an aid for users choosing what approach might be most appropriate for their own applications. The example calculations range from practical computations of bed evolution in natural rivers to highly detailed predictions of the development of small-scale bedforms on an initially flat bed. The remaining solvers are also briefly described. Although the focus of most solvers is coupled flow and morphodynamics, several of the solvers are also specifically aimed at providing flood inundation predictions over large spatial domains. Potential users can download the application, solvers, manuals, and educational materials including detailed tutorials at www.i-ric.org. The iRIC development group encourages scientists and engineers to use the tool and to consider adding their own methods to the iRIC suite of tools.
Uncertainty and variability in computational and mathematical models of cardiac physiology.
Mirams, Gary R; Pathmanathan, Pras; Gray, Richard A; Challenor, Peter; Clayton, Richard H
2016-12-01
Mathematical and computational models of cardiac physiology have been an integral component of cardiac electrophysiology since its inception, and are collectively known as the Cardiac Physiome. We identify and classify the numerous sources of variability and uncertainty in model formulation, parameters and other inputs that arise from both natural variation in experimental data and lack of knowledge. The impact of uncertainty on the outputs of Cardiac Physiome models is not well understood, and this limits their utility as clinical tools. We argue that incorporating variability and uncertainty should be a high priority for the future of the Cardiac Physiome. We suggest investigating the adoption of approaches developed in other areas of science and engineering while recognising unique challenges for the Cardiac Physiome; it is likely that novel methods will be necessary that require engagement with the mathematics and statistics community. The Cardiac Physiome effort is one of the most mature and successful applications of mathematical and computational modelling for describing and advancing the understanding of physiology. After five decades of development, physiological cardiac models are poised to realise the promise of translational research via clinical applications such as drug development and patient-specific approaches as well as ablation, cardiac resynchronisation and contractility modulation therapies. For models to be included as a vital component of the decision process in safety-critical applications, rigorous assessment of model credibility will be required. This White Paper describes one aspect of this process by identifying and classifying sources of variability and uncertainty in models as well as their implications for the application and development of cardiac models. We stress the need to understand and quantify the sources of variability and uncertainty in model inputs, and the impact of model structure and complexity and their consequences for predictive model outputs. We propose that the future of the Cardiac Physiome should include a probabilistic approach to quantify the relationship of variability and uncertainty of model inputs and outputs. © 2016 The Authors. The Journal of Physiology published by John Wiley & Sons Ltd on behalf of The Physiological Society.
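A minimal sketch of the kind of forward uncertainty propagation advocated here is shown below, using a toy surrogate in place of a real cardiac cell model; the input distributions, the surrogate formula and the parameter names are illustrative assumptions only.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(42)

def toy_apd_model(g_kr, g_cal):
    """Toy surrogate relating two 'conductance' inputs to an action potential
    duration-like output; stands in for an expensive cardiac cell model."""
    return 300.0 * (g_cal / (g_cal + 0.5)) * (1.0 / (0.5 + g_kr))

# Input variability/uncertainty expressed as distributions (illustrative choices)
n = 10_000
g_kr = rng.lognormal(mean=0.0, sigma=0.2, size=n)
g_cal = rng.lognormal(mean=0.0, sigma=0.3, size=n)

# Monte Carlo propagation of input distributions to the model output
apd = toy_apd_model(g_kr, g_cal)
print(f"median output: {np.median(apd):.1f}")
print(f"95% interval : {np.percentile(apd, 2.5):.1f} - {np.percentile(apd, 97.5):.1f}")

# Crude global sensitivity indicator: rank correlation of each input with the output
for name, x in [("g_kr", g_kr), ("g_cal", g_cal)]:
    rho = spearmanr(x, apd).correlation
    print(f"Spearman rho({name}, output) = {rho:+.2f}")
```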
Integral equations in the study of polar and ionic interaction site fluids
Howard, Jesse J.
2011-01-01
In this review article we consider some of the current integral equation approaches and their application to model polar liquid mixtures. We consider the use of multidimensional integral equations and in particular progress on the theory and applications of three-dimensional integral equations. The IEs we consider may be derived from equilibrium statistical mechanical expressions incorporating a classical Hamiltonian description of the system. We give examples including salt solutions, inhomogeneous solutions and systems including proteins and nucleic acids. PMID:22383857
NASA Astrophysics Data System (ADS)
Aubé, M.; Simoneau, A.
2018-05-01
Illumina is one of the most physically detailed artificial night sky brightness models to date. It has been in continuous development since 2005 [1]. In 2016-17, many improvements were made to the Illumina code, including an overhead cloud scheme, an improved blocking scheme for subgrid obstacles (trees and buildings), and most importantly, a full hyperspectral modeling approach. Code optimization resulted in a significant reduction in execution time, enabling users to run the model on standard personal computers for some applications. After describing the new schemes introduced in the model, we give some examples of applications for a peri-urban and a rural site, both located inside the International Dark Sky Reserve of Mont-Mégantic (QC, Canada).
NASA Astrophysics Data System (ADS)
Shephard, Adam M.; Thomas, Benjamin R.; Coble, Jamie B.; Wood, Houston G.
2018-05-01
This paper presents a development related to the use of minor isotope safeguards techniques (MIST) and the MSTAR cascade model as it relates to the application of international nuclear safeguards at gas centrifuge enrichment plants (GCEPs). The product of this paper is a derivation of the universal and dimensionless MSTAR cascade model. The new model can be used to calculate the minor uranium isotope concentrations in GCEP product and tails streams or to analyze, visualize, and interpret GCEP process data as part of MIST. Applications of the new model include the detection of undeclared feed and withdrawal streams at GCEPs when used in conjunction with UF6 sampling and/or other isotopic measurement techniques.
NASA Astrophysics Data System (ADS)
Carton, Andrew; Driver, Cormac; Jackson, Andrew; Clarke, Siobhán
Theme/UML is an existing approach to aspect-oriented modelling that supports the modularisation and composition of concerns, including crosscutting ones, in design. To date, its lack of integration with model-driven engineering (MDE) techniques has limited its benefits across the development lifecycle. Here, we describe our work on facilitating the use of Theme/UML as part of an MDE process. We have developed a transformation tool that adopts model-driven architecture (MDA) standards. It defines a concern composition mechanism, implemented as a model transformation, to support the enhanced modularisation features of Theme/UML. We evaluate our approach by applying it to the development of mobile, context-aware applications: an application area characterised by many non-functional requirements that manifest themselves as crosscutting concerns.
Townsend, Molly T; Sarigul-Klijn, Nesrin
2016-01-01
Simplified material models are commonly used in computational simulation of biological soft tissue as an approximation of the complicated material response and to minimize computational resources. However, the simulation of complex loadings, such as long-duration tissue swelling, necessitates complex models that are not easy to formulate. This paper strives to offer a comprehensive updated Lagrangian formulation procedure for various non-linear material models for application in finite element analysis of biological soft tissues, including a definition of the Cauchy stress and the spatial tangential stiffness. The relationships between water content, osmotic pressure, ionic concentration and the pore pressure stress of the tissue are discussed, along with the merits of these models and their applications.
Novel application of ALMANAC: Modelling a functional group, exotic warm-season perennial grasses
USDA-ARS?s Scientific Manuscript database
Introduced perennial C4 grasses such as buffelgrass (Pennisetum ciliare [(L.) Link]) and old world bluestems (OWB), including genera such as Bothriochloa Kuntze, Capillipedium Stapf, and Dichanthium Willemet, have the potential to dominate landscapes. A process-based model that realistically simulates ...
The methods used for simulating aerosol physical and chemical processes in a new air pollution modeling system are discussed and analyzed. Such processes include emissions, nucleation, coagulation, reversible chemistry, condensation, dissolution, evaporation, irreversible chem...
Users manual for a one-dimensional Lagrangian transport model
Schoellhamer, D.H.; Jobson, H.E.
1986-01-01
A Users Manual for the Lagrangian Transport Model (LTM) is presented. The LTM uses Lagrangian calculations that are based on a reference frame moving with the river flow. The Lagrangian reference frame eliminates the need to numerically solve the convective term of the convection-diffusion equation and provides significant numerical advantages over the more commonly used Eulerian reference frame. When properly applied, the LTM can simulate riverine transport and decay processes within the accuracy required by most water quality studies. The LTM is applicable to steady or unsteady one-dimensional unidirectional flows in fixed channels with tributary and lateral inflows. Application of the LTM is relatively simple and optional capabilities improve the model's convenience. Appendices give file formats and three example LTM applications that include the incorporation of the QUAL II water quality model's reaction kinetics into the LTM. (Author's abstract)
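A minimal sketch of the Lagrangian idea underlying the LTM (not the LTM code itself, and ignoring dispersion, tributary inflows and the QUAL II kinetics) is given below: parcels are simply moved with the flow, so only the reaction term, here first-order decay, has to be evaluated along each parcel's path, and no numerical advection term is solved.

```python
import numpy as np

def lagrangian_decay_transport(u, c_in, k, dt, n_steps):
    """Track parcels moving with a steady 1-D velocity u; each parcel carries a
    concentration that decays at first-order rate k. A fresh parcel enters at
    the upstream boundary every time step."""
    x = np.array([0.0])          # parcel positions
    c = np.array([c_in])         # parcel concentrations
    for _ in range(n_steps):
        x = x + u * dt           # move every parcel with the flow
        c = c * np.exp(-k * dt)  # exact first-order decay along the path
        x = np.append(x, 0.0)    # inject a new parcel upstream
        c = np.append(c, c_in)
    return x, c

if __name__ == "__main__":
    x, c = lagrangian_decay_transport(u=0.5, c_in=10.0, k=1e-4, dt=600.0, n_steps=48)
    # concentration of the oldest parcel after 8 hours of travel
    print(f"oldest parcel: x = {x[0]:.0f} m, c = {c[0]:.2f} mg/L")
```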
An Overview of R in Health Decision Sciences.
Jalal, Hawre; Pechlivanoglou, Petros; Krijkamp, Eline; Alarid-Escudero, Fernando; Enns, Eva; Hunink, M G Myriam
2017-10-01
As the complexity of health decision science applications increases, high-level programming languages are increasingly adopted for statistical analyses and numerical computations. These programming languages facilitate sophisticated modeling, model documentation, and analysis reproducibility. Among the high-level programming languages, the statistical programming framework R is gaining increased recognition. R is freely available, cross-platform compatible, and open source. A large community of users who have generated an extensive collection of well-documented packages and functions supports it. These functions facilitate applications of health decision science methodology as well as the visualization and communication of results. Although R's popularity is increasing among health decision scientists, methodological extensions of R in the field of decision analysis remain isolated. The purpose of this article is to provide an overview of existing R functionality that is applicable to the various stages of decision analysis, including model design, input parameter estimation, and analysis of model outputs.
NASA Astrophysics Data System (ADS)
Bai, Hailong; Montési, Laurent G. J.; Behn, Mark D.
2017-01-01
MeltMigrator is a MATLAB®-based melt migration software developed to process three-dimensional mantle temperature and velocity data from user-supplied numerical models of mid-ocean ridges, calculate melt production and melt migration trajectories in the mantle, estimate melt flux along plate boundaries, and predict crustal thickness distribution on the seafloor. MeltMigrator is also capable of calculating compositional evolution depending on the choice of petrologic melting model. Programmed in modules, MeltMigrator is highly customizable and can be expanded to a wide range of applications. We have applied it to complex mid-ocean ridge model settings, including transform faults, oblique segments, ridge migration, asymmetrical spreading, background mantle flow, and ridge-plume interaction. In this technical report, we include an example application to a segmented mid-ocean ridge. MeltMigrator is available as a supplement to this paper, and it is also available from GitHub and the University of Maryland Geodynamics Group website.
NASA Technical Reports Server (NTRS)
Burgy, R. H.
1972-01-01
Data relating to hydrologic and water resource systems and subsystems management are reported. Systems models, user application, and remote sensing technology are covered. Parameters governing water resources include evapotranspiration, vegetation, precipitation, streams and estuaries, reservoirs and lakes, and unsaturated and saturated soil zones.
Composite load spectra for select space propulsion structural components
NASA Technical Reports Server (NTRS)
Newell, James F.; Ho, Hing W.
1991-01-01
This report summarizes the development of: (1) correlation fields; (2) applications to the liquid oxygen post; (3) models for pressure fluctuations and vibration load fluctuations; (4) additions to expert systems; and (5) scaling criteria. Implementation in computer code is also described. Demonstration sample cases are included, with additional applications to an engine duct and a pipe bend.
NASA Astrophysics Data System (ADS)
Barbosa, N. A.; da Rosa, L. A. R.; Facure, A.; Braz, D.
2014-02-01
Concave eye applicators with 90Sr/90Y and 106Ru/106Rh beta-ray sources are usually used in brachytherapy for the treatment of superficial intraocular tumors such as uveal melanoma with thickness up to 5 mm. The aim of this work was to use the Monte Carlo code MCNPX to calculate the 3D dose distribution in a mathematical model of the human eye, considering 90Sr/90Y and 106Ru/106Rh beta-ray eye applicators, in order to treat a posterior uveal melanoma with a thickness of 3.8 mm from the choroid surface. Mathematical models were developed for the two ophthalmic applicators, CGD produced by the BEBIG Company and SIA.6 produced by the Amersham Company, with activities of 1 mCi and 4.23 mCi, respectively. Both have a concave form. These applicators' mathematical models were attached to the eye model and the dose distributions were calculated using the MCNPX *F8 tally. The average dose rates were determined in all regions of the eye model. The *F8 tally results showed that the deposited energy due to the applicator with the radionuclide 106Ru/106Rh is higher in all eye regions, including the tumor. However, the average dose rate in the tumor region is higher for the applicator with 90Sr/90Y, due to its higher activity. Due to the dosimetric characteristics of these applicators, the PDD value at 3 mm depth in water is 73% for the 106Ru/106Rh applicator and 60% for the 90Sr/90Y applicator. For a better choice of applicator type and radionuclide, it is important to know the thickness of the tumor and its location.
NASA Astrophysics Data System (ADS)
Fernandez-del-Rincon, A.; Garcia, P.; Diez-Ibarbia, A.; de-Juan, A.; Iglesias, M.; Viadero, F.
2017-02-01
Gear transmissions remain one of the most complex mechanical systems from the point of view of noise and vibration behavior. Research on gear modeling aimed at obtaining models capable of accurately reproducing the dynamic behavior of real gear transmissions has spread over the last decades. Most of these models, although useful for design stages, often include simplifications that impede their application for condition monitoring purposes. Trying to fill this gap, the model presented in this paper allows us to simulate gear transmission dynamics including most of those features usually neglected by state-of-the-art models. This work presents a model capable of considering simultaneously the internal excitations due to the variable meshing stiffness (including the coupling among successive tooth pairs in contact, the non-linearity linked with the contacts between surfaces and the dissipative effects), and those excitations consequence of the bearing variable compliance (including clearances or pre-loads). The model can also simulate gear dynamics in a realistic torque-dependent scenario. The proposed model combines a hybrid formulation for the calculation of meshing forces with a non-linear variable compliance approach for bearings. Meshing forces are obtained by means of a double approach which combines numerical and analytical aspects. The methodology used provides a detailed description of the meshing forces, allowing their calculation even when the gear center distance is modified due to shaft and bearing flexibilities, which are unavoidable in real transmissions. On the other hand, forces at bearing level were obtained considering a variable number of supporting rolling elements, depending on the applied load and clearances. Both formulations have been developed and applied to the simulation of the vibration of a sample transmission, focusing the attention on the transmitted load, friction meshing forces and bearing preloads.
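A drastically reduced sketch of the variable-meshing-stiffness excitation discussed above is shown below: a single-degree-of-freedom model of the dynamic transmission error driven by a mesh stiffness that alternates between one and two tooth pairs in contact. All numerical values are invented for illustration, and the paper's model is far more detailed (multi-tooth coupling, contact non-linearity, friction and bearing compliance are omitted here).

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative parameters for a single-DOF gear-mesh model (not from the paper)
m_e = 0.01              # equivalent mass, kg
c = 50.0                # mesh damping, N s/m
k1, k2 = 2.0e8, 3.5e8   # stiffness with 1 and 2 tooth pairs in contact, N/m
f_mesh = 800.0          # gear meshing frequency, Hz
ratio_2pairs = 0.6      # fraction of each mesh cycle with two pairs in contact
F = 500.0               # static mesh force from the transmitted torque, N

def k_mesh(t):
    """Rectangular-wave mesh stiffness alternating with the meshing cycle."""
    phase = (t * f_mesh) % 1.0
    return k2 if phase < ratio_2pairs else k1

def rhs(t, y):
    x, v = y  # dynamic transmission error and its rate
    return [v, (F - c * v - k_mesh(t) * x) / m_e]

sol = solve_ivp(rhs, (0.0, 0.1), [F / k1, 0.0], max_step=1e-5)
x = sol.y[0]
print(f"mean deflection {x.mean()*1e6:.2f} um, "
      f"peak-to-peak {(x.max() - x.min())*1e6:.2f} um")
```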
Survey of methods for soil moisture determination
NASA Technical Reports Server (NTRS)
Schmugge, T. J.; Jackson, T. J.; Mckim, H. L.
1979-01-01
Existing and proposed methods for soil moisture determination are discussed. These include: (1) in situ investigations including gravimetric, nuclear, and electromagnetic techniques; (2) remote sensing approaches that use the reflected solar, thermal infrared, and microwave portions of the electromagnetic spectrum; and (3) soil physics models that track the behavior of water in the soil in response to meteorological inputs (precipitation) and demands (evapotranspiration). The capacities of these approaches to satisfy various user needs for soil moisture information vary from application to application, but a conceptual scheme for merging these approaches into integrated systems to provide soil moisture information is proposed that has the potential for meeting various application requirements.
SOCIB applications for oceanographic data management
NASA Astrophysics Data System (ADS)
Troupin, Charles; Pau Beltran, Joan; Frontera, Biel; Gómara, Sonia; Lora, Sebastian; March, David; Sebastian, Kristian; Tintoré, Joaquin
2015-04-01
The Balearic Islands Coastal Ocean Observing and Forecasting System (SOCIB, http://www.socib.es), is a multi-platform Marine Research Infrastructure that provides free, open and quality-controlled data from near-shore to the open sea. To collect the necessary data, the SOCIB system is made up of: a research vessel, a high-frequency (HF) radar system, weather stations, tide gauges, moorings, drifting buoys, ARGO profilers, and gliders (autonomous underwater vehicles). In addition, the system has recently begun incorporating oceanographic sensors attached to sea turtles. High-resolution numerical models provide forecasts for hydrodynamics (ROMS) and waves (SAPO). According to SOCIB principles, data have to be: discoverable and accessible; freely available; interoperable, quality-controlled and standardized. The Data Centre (DC) manages the different steps of data processing, including: acquisition using SOCIB platforms (gliders, drifters, HF radar, ...), numerical models (hydrodynamics, waves, ...) or information generated by other data sources; distribution through dedicated web and mobile applications; and dynamic visualisation. The SOCIB DC constitutes an example of marine information systems within the framework of new coastal ocean observatories. In this work we present some of the applications developed for specific types of users, as well as the technologies used for their implementation: DAPP (Deployments application, http://apps.socib.es/dapp/), a web application to display information related to mobile platform trajectories. LW4NC2 (http://thredds.socib.es/lw4nc2), a web application for multidimensional (grid) data from NetCDF files (numerical models, HF radar). SACOSTA (http://gis.socib.es/sacosta), a viewer for cartographic data such as environmental sensitivity of the coastline. SEABOARD (http://seaboard.socib.es), a tool to disseminate SOCIB real-time data to different types of users. Smart-phone apps to access data, platform trajectories and forecasts in real time. In keeping with the objective of bringing relevant data to all kinds of users in a free and easy way, our future plans include the redesign of the applications to improve the user experience, along with the creation of applications specific to different groups of users, including tourists, sailors, surfers, and others.
2015-01-01
The Energy Information Administration (EIA) is investigating the potential benefits of incorporating interval electricity data into its residential energy end use models. This includes interval smart meter and submeter data from utility assets and systems. It is expected that these data will play a significant role in informing residential energy efficiency policies in the future. Therefore, a long-term strategy for improving the RECS end-use models will not be complete without an investigation of the current state of affairs of submeter data, including their potential for use in the context of residential building energy modeling.
Ground Operations Autonomous Control and Integrated Health Management
NASA Technical Reports Server (NTRS)
Daniels, James
2014-01-01
The Ground Operations Autonomous Control and Integrated Health Management plays a key role for future ground operations at NASA. The software that is integrated into this system is called G2 2011 Gensym. The purpose of this report is to describe the Ground Operations Autonomous Control and Integrated Health Management with the use of the G2 Gensym software and the G2 NASA toolkit for Integrated System Health Management (ISHM), which is a Computer Software Configuration Item (CSCI). The decision rationale for the use of the G2 platform is to develop a modular capability for ISHM and AC. Toolkit modules include knowledge bases that are generic and can be applied in any application domain module. That way, reusability, maintainability, systematic evolution, portability, and scalability are maximized. Engine modules are generic, while application modules represent the domain model of a specific application. Furthermore, the NASA toolkit, developed since 2006 (a set of modules), makes it possible to create application domain models quickly, using pre-defined objects that include sensors and components libraries for typical fluid, electrical, and mechanical systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
W. Lynn Watney; John H. Doveton
GEMINI (Geo-Engineering Modeling through Internet Informatics) is a public-domain web application focused on analysis and modeling of petroleum reservoirs and plays (http://www.kgs.ukans.edu/Gemini/index.html). GEMINI creates a virtual project by "on-the-fly" assembly and analysis of on-line data either from the Kansas Geological Survey or uploaded from the user. GEMINI's suite of geological and engineering web applications for reservoir analysis include: (1) petrofacies-based core and log modeling using an interactive relational rock catalog and log analysis modules; (2) a well profile module; (3) interactive cross sections to display "marked" wireline logs; (4) deterministic gridding and mapping of petrophysical data; (5) calculation and mapping of layer volumetrics; (6) material balance calculations; (7) PVT calculator; (8) DST analyst; (9) automated hydrocarbon association navigator (KHAN) for database mining; and (10) tutorial and help functions. The Kansas Hydrocarbon Association Navigator (KHAN) utilizes petrophysical databases to estimate hydrocarbon pay or other constituent at a play- or field-scale. Databases analyzed and displayed include digital logs, core analysis and photos, DST, and production data. GEMINI accommodates distant collaborations using secure password protection and authorized access. Assembled data, analyses, charts, and maps can readily be moved to other applications. GEMINI's target audience includes small independents and consultants seeking to find, quantitatively characterize, and develop subtle and bypassed pays by leveraging the growing base of digital data resources. Participating companies involved in the testing and evaluation of GEMINI included Anadarko, BP, Conoco-Phillips, Lario, Mull, Murfin, and Pioneer Resources.
Validation and Trustworthiness of Multiscale Models of Cardiac Electrophysiology
Pathmanathan, Pras; Gray, Richard A.
2018-01-01
Computational models of cardiac electrophysiology have a long history in basic science applications and device design and evaluation, but have significant potential for clinical applications in all areas of cardiovascular medicine, including functional imaging and mapping, drug safety evaluation, disease diagnosis, patient selection, and therapy optimisation or personalisation. For all stakeholders to be confident in model-based clinical decisions, cardiac electrophysiological (CEP) models must be demonstrated to be trustworthy and reliable. Credibility, that is, the belief in the predictive capability, of a computational model is primarily established by performing validation, in which model predictions are compared to experimental or clinical data. However, there are numerous challenges to performing validation for highly complex multi-scale physiological models such as CEP models. As a result, credibility of CEP model predictions is usually founded upon a wide range of distinct factors, including various types of validation results, underlying theory, evidence supporting model assumptions, evidence from model calibration, all at a variety of scales from ion channel to cell to organ. Consequently, it is often unclear, or a matter for debate, the extent to which a CEP model can be trusted for a given application. The aim of this article is to clarify potential rationale for the trustworthiness of CEP models by reviewing evidence that has been (or could be) presented to support their credibility. We specifically address the complexity and multi-scale nature of CEP models which makes traditional model evaluation difficult. In addition, we make explicit some of the credibility justification that we believe is implicitly embedded in the CEP modeling literature. Overall, we provide a fresh perspective to CEP model credibility, and build a depiction and categorisation of the wide-ranging body of credibility evidence for CEP models. This paper also represents a step toward the extension of model evaluation methodologies that are currently being developed by the medical device community, to physiological models. PMID:29497385
Classical least squares multivariate spectral analysis
Haaland, David M.
2002-01-01
An improved classical least squares (CLS) multivariate spectral analysis method is described that adds spectral shapes describing non-calibrated components and system effects (other than baseline corrections) present in the analyzed mixture to the prediction phase of the method. These improvements decrease or eliminate many of the restrictions of CLS-type methods and greatly extend their capabilities, accuracy, and precision. One new application of the prediction-augmented classical least squares (PACLS) method is the ability to accurately predict unknown sample concentrations when new unmodeled spectral components are present in the unknown samples. Other applications of PACLS include the incorporation of spectrometer drift into the quantitative multivariate model and the maintenance of a calibration on a drifting spectrometer. Finally, the ability of PACLS to transfer a multivariate model between spectrometers is demonstrated.
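As a rough illustration of the CLS/PACLS idea described above (a minimal numpy sketch, not the patented implementation; the function name, the Gaussian test spectra, and the linear drift shape are invented for this example), concentrations are obtained by ordinary least squares against the calibration spectra, and extra spectral shapes can be stacked onto the calibration matrix at prediction time:

import numpy as np

def cls_predict(pure_spectra, measured, augment=None):
    # pure_spectra : (n_components, n_wavelengths) calibration spectra
    # measured     : (n_wavelengths,) spectrum of the unknown sample
    # augment      : optional (n_extra, n_wavelengths) spectral shapes added only
    #                at prediction time (the PACLS idea): drift shapes, spectra of
    #                unmodeled interferents, etc.
    K = pure_spectra if augment is None else np.vstack([pure_spectra, augment])
    coeffs, *_ = np.linalg.lstsq(K.T, measured, rcond=None)  # measured ~ K.T @ coeffs
    return coeffs[:pure_spectra.shape[0]]  # keep only the calibrated components

# Toy usage: two Gaussian "pure" spectra plus an unmodeled linear drift shape.
x = np.linspace(0.0, 1.0, 200)
pure = np.vstack([np.exp(-(x - 0.3) ** 2 / 0.01), np.exp(-(x - 0.7) ** 2 / 0.01)])
drift = x[np.newaxis, :]
sample = 0.8 * pure[0] + 0.2 * pure[1] + 0.5 * x
print(cls_predict(pure, sample))                  # biased: drift is not modeled
print(cls_predict(pure, sample, augment=drift))   # recovers approximately (0.8, 0.2)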
bioWidgets: data interaction components for genomics.
Fischer, S; Crabtree, J; Brunk, B; Gibson, M; Overton, G C
1999-10-01
The presentation of genomics data in a perspicuous visual format is critical for its rapid interpretation and validation. Relatively few public database developers have the resources to implement sophisticated front-end user interfaces themselves. Accordingly, these developers would benefit from a reusable toolkit of user interface and data visualization components. We have designed the bioWidget toolkit as a set of JavaBean components. It includes a wide array of user interface components and defines an architecture for assembling applications. The toolkit is founded on established software engineering design patterns and principles, including componentry, Model-View-Controller, factored models and schema neutrality. As a proof of concept, we have used the bioWidget toolkit to create three extendible applications: AnnotView, BlastView and AlignView.
James, Andrew J. A.; Konik, Robert M.; Lecheminant, Philippe; ...
2018-02-26
We review two important non-perturbative approaches for extracting the physics of low-dimensional strongly correlated quantum systems. Firstly, we start by providing a comprehensive review of non-Abelian bosonization. This includes an introduction to the basic elements of conformal field theory as applied to systems with a current algebra, and we orient the reader by presenting a number of applications of non-Abelian bosonization to models with large symmetries. We then tie this technique into recent advances in the ability of cold atomic systems to realize complex symmetries. Secondly, we discuss truncated spectrum methods for the numerical study of systems in one and two dimensions. For one-dimensional systems we provide the reader with considerable insight into the methodology by reviewing canonical applications of the technique to the Ising model (and its variants) and the sine-Gordon model. Following this we review recent work on the development of renormalization groups, both numerical and analytical, that alleviate the effects of truncating the spectrum. Using these technologies, we consider a number of applications to one-dimensional systems: properties of carbon nanotubes, quenches in the Lieb-Liniger model, 1+1D quantum chromodynamics, as well as Landau-Ginzburg theories. In the final part we move our attention to consider truncated spectrum methods applied to two-dimensional systems. This involves combining truncated spectrum methods with matrix product state algorithms. Lastly, we describe applications of this method to two-dimensional systems of free fermions and the quantum Ising model, including their non-equilibrium dynamics.
NASA Astrophysics Data System (ADS)
James, Andrew J. A.; Konik, Robert M.; Lecheminant, Philippe; Robinson, Neil J.; Tsvelik, Alexei M.
2018-04-01
We review two important non-perturbative approaches for extracting the physics of low-dimensional strongly correlated quantum systems. Firstly, we start by providing a comprehensive review of non-Abelian bosonization. This includes an introduction to the basic elements of conformal field theory as applied to systems with a current algebra, and we orient the reader by presenting a number of applications of non-Abelian bosonization to models with large symmetries. We then tie this technique into recent advances in the ability of cold atomic systems to realize complex symmetries. Secondly, we discuss truncated spectrum methods for the numerical study of systems in one and two dimensions. For one-dimensional systems we provide the reader with considerable insight into the methodology by reviewing canonical applications of the technique to the Ising model (and its variants) and the sine-Gordon model. Following this we review recent work on the development of renormalization groups, both numerical and analytical, that alleviate the effects of truncating the spectrum. Using these technologies, we consider a number of applications to one-dimensional systems: properties of carbon nanotubes, quenches in the Lieb–Liniger model, 1 + 1D quantum chromodynamics, as well as Landau–Ginzburg theories. In the final part we move our attention to consider truncated spectrum methods applied to two-dimensional systems. This involves combining truncated spectrum methods with matrix product state algorithms. We describe applications of this method to two-dimensional systems of free fermions and the quantum Ising model, including their non-equilibrium dynamics.
New developments in the theoretical treatment of low dimensional strongly correlated systems.
James, Andrew J A; Konik, Robert M; Lecheminant, Philippe; Robinson, Neil; Tsvelik, Alexei M
2017-10-09
We review two important non-perturbative approaches for extracting the physics of low-dimensional strongly correlated quantum systems. Firstly, we start by providing a comprehensive review of non-Abelian bosonization. This includes an introduction to the basic elements of conformal field theory as applied to systems with a current algebra, and we orient the reader by presenting a number of applications of non-Abelian bosonization to models with large symmetries. We then tie this technique into recent advances in the ability of cold atomic systems to realize complex symmetries. Secondly, we discuss truncated spectrum methods for the numerical study of systems in one and two dimensions. For one-dimensional systems we provide the reader with considerable insight into the methodology by reviewing canonical applications of the technique to the Ising model (and its variants) and the sine-Gordon model. Following this we review recent work on the development of renormalization groups, both numerical and analytical, that alleviate the effects of truncating the spectrum. Using these technologies, we consider a number of applications to one-dimensional systems: properties of carbon nanotubes, quenches in the Lieb-Liniger model, 1+1D quantum chromodynamics, as well as Landau-Ginzburg theories. In the final part we move our attention to consider truncated spectrum methods applied to two-dimensional systems. This involves combining truncated spectrum methods with matrix product state algorithms. We describe applications of this method to two-dimensional systems of free fermions and the quantum Ising model, including their non-equilibrium dynamics. © 2017 IOP Publishing Ltd.
James, Andrew J A; Konik, Robert M; Lecheminant, Philippe; Robinson, Neil J; Tsvelik, Alexei M
2018-02-26
We review two important non-perturbative approaches for extracting the physics of low-dimensional strongly correlated quantum systems. Firstly, we start by providing a comprehensive review of non-Abelian bosonization. This includes an introduction to the basic elements of conformal field theory as applied to systems with a current algebra, and we orient the reader by presenting a number of applications of non-Abelian bosonization to models with large symmetries. We then tie this technique into recent advances in the ability of cold atomic systems to realize complex symmetries. Secondly, we discuss truncated spectrum methods for the numerical study of systems in one and two dimensions. For one-dimensional systems we provide the reader with considerable insight into the methodology by reviewing canonical applications of the technique to the Ising model (and its variants) and the sine-Gordon model. Following this we review recent work on the development of renormalization groups, both numerical and analytical, that alleviate the effects of truncating the spectrum. Using these technologies, we consider a number of applications to one-dimensional systems: properties of carbon nanotubes, quenches in the Lieb-Liniger model, 1 + 1D quantum chromodynamics, as well as Landau-Ginzburg theories. In the final part we move our attention to consider truncated spectrum methods applied to two-dimensional systems. This involves combining truncated spectrum methods with matrix product state algorithms. We describe applications of this method to two-dimensional systems of free fermions and the quantum Ising model, including their non-equilibrium dynamics.
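The truncated spectrum idea reviewed above can be illustrated on a toy problem far simpler than the field theories treated in the paper: perturb an exactly solvable model, build the Hamiltonian in a finite number of its eigenstates, and diagonalize numerically. The sketch below is such a toy (a quartic anharmonic oscillator in a truncated harmonic-oscillator basis); the choice of model, the coupling value, and the truncation levels are assumptions made for illustration, not the methods or systems of the review:

import numpy as np

def anharmonic_spectrum(n_trunc, omega=1.0, lam=0.2, n_levels=4):
    # Truncated-spectrum toy for H = p^2/2 + omega^2 x^2/2 + lam * x^4:
    # build H in the first n_trunc eigenstates of the solvable (harmonic) part,
    # then diagonalize numerically.
    n = np.arange(n_trunc)
    a = np.diag(np.sqrt(n[1:]), k=1)           # annihilation operator in the number basis
    x = (a + a.T) / np.sqrt(2.0 * omega)       # position operator (hbar = m = 1)
    h0 = np.diag(omega * (n + 0.5))            # solvable part, already diagonal
    h = h0 + lam * np.linalg.matrix_power(x, 4)
    return np.linalg.eigvalsh(h)[:n_levels]

# Increasing the truncation level shows the convergence such methods rely on.
for n_trunc in (10, 20, 40):
    print(n_trunc, anharmonic_spectrum(n_trunc))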
Unified constitutive models for high-temperature structural applications
NASA Technical Reports Server (NTRS)
Lindholm, U. S.; Chan, K. S.; Bodner, S. R.; Weber, R. M.; Walker, K. P.
1988-01-01
Unified constitutive models are characterized by the use of a single inelastic strain rate term for treating all aspects of inelastic deformation, including plasticity, creep, and stress relaxation under monotonic or cyclic loading. The structure of this class of constitutive theory pertinent for high temperature structural applications is first outlined and discussed. The effectiveness of the unified approach for representing high temperature deformation of Ni-base alloys is then evaluated by extensive comparison of experimental data and predictions of the Bodner-Partom and the Walker models. The use of the unified approach for hot section structural component analyses is demonstrated by applying the Walker model in finite element analyses of a benchmark notch problem and a turbine blade problem.
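For readers unfamiliar with the structure of such models, the sketch below integrates a generic unified viscoplastic law through a uniaxial stress-relaxation test. The single power-law overstress expression, the drag-stress hardening rule, and all parameter values are illustrative assumptions chosen for numerical stability; this is not the Bodner-Partom or Walker formulation evaluated in the paper:

import numpy as np

def relaxation_test(total_strain=0.004, e_modulus=150.0e3, drag0=300.0,
                    hardening=2.0e3, rate_exp=5.0, dt=1.0e-3, t_end=50.0):
    # Uniaxial stress relaxation with a generic "unified" viscoplastic law:
    # one power-law overstress expression supplies the inelastic strain rate for
    # all inelastic behaviour (plasticity, creep, relaxation), with a single
    # internal drag-stress variable.  Explicit Euler time stepping.
    stress = e_modulus * total_strain          # strain is held fixed after loading
    drag = drag0                               # internal state variable (MPa)
    for step in range(int(t_end / dt)):
        inelastic_rate = 1.0e-3 * np.sign(stress) * (abs(stress) / drag) ** rate_exp
        drag += hardening * abs(inelastic_rate) * dt    # isotropic hardening of the drag stress
        stress -= e_modulus * inelastic_rate * dt       # total strain rate is zero
    return stress

print(relaxation_test())   # relaxed stress after 50 time units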
Regression analysis of current-status data: an application to breast-feeding.
Grummer-Strawn, L M
1993-09-01
"Although techniques for calculating mean survival time from current-status data are well known, their use in multiple regression models is somewhat troublesome. Using data on current breast-feeding behavior, this article considers a number of techniques that have been suggested in the literature, including parametric, nonparametric, and semiparametric models as well as the application of standard schedules. Models are tested in both proportional-odds and proportional-hazards frameworks....I fit [the] models to current status data on breast-feeding from the Demographic and Health Survey (DHS) in six countries: two African (Mali and Ondo State, Nigeria), two Asian (Indonesia and Sri Lanka), and two Latin American (Colombia and Peru)." excerpt
New generation of elastic network models.
López-Blanco, José Ramón; Chacón, Pablo
2016-04-01
The intrinsic flexibility of proteins and nucleic acids can be grasped from remarkably simple mechanical models of particles connected by springs. In recent decades, Elastic Network Models (ENMs) combined with Normal Mode Analysis have widely confirmed their ability to predict biologically relevant motions of biomolecules and soon became a popular methodology for revealing large-scale dynamics in multiple structural biology scenarios. The simplicity, robustness, low computational cost, and relatively high accuracy are the reasons behind the success of ENMs. This review focuses on recent advances in the development and application of ENMs, paying particular attention to combinations with experimental data. Successful application scenarios include large macromolecular machines, structural refinement, docking, and evolutionary conservation. Copyright © 2015 Elsevier Ltd. All rights reserved.
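As a concrete and deliberately minimal illustration of how such a model works, the sketch below builds an anisotropic elastic-network Hessian from a set of C-alpha coordinates using uniform springs within a distance cutoff and extracts the lowest non-trivial normal modes. The cutoff, spring constant, and random test coordinates are assumptions for the example, not values taken from the review or from any specific ENM package:

import numpy as np

def anm_modes(coords, cutoff=15.0, gamma=1.0, n_modes=5):
    # Build the 3N x 3N Hessian of a harmonic spring network (uniform spring
    # constant gamma, contacts within `cutoff`) from C-alpha coordinates and
    # return the lowest non-trivial normal modes; the six zero-frequency
    # rigid-body modes are skipped.
    n = len(coords)
    hessian = np.zeros((3 * n, 3 * n))
    for i in range(n):
        for j in range(i + 1, n):
            rij = coords[j] - coords[i]
            d2 = rij @ rij
            if d2 > cutoff ** 2:
                continue
            block = -gamma * np.outer(rij, rij) / d2       # off-diagonal super-element
            hessian[3*i:3*i+3, 3*j:3*j+3] = block
            hessian[3*j:3*j+3, 3*i:3*i+3] = block
            hessian[3*i:3*i+3, 3*i:3*i+3] -= block         # accumulate diagonal super-elements
            hessian[3*j:3*j+3, 3*j:3*j+3] -= block
    eigenvalues, eigenvectors = np.linalg.eigh(hessian)
    return eigenvalues[6:6 + n_modes], eigenvectors[:, 6:6 + n_modes]

# Toy usage on a random compact "structure" of 50 pseudo-atoms.
rng = np.random.default_rng(1)
mode_eigenvalues, modes = anm_modes(10.0 * rng.random((50, 3)))
print(mode_eigenvalues)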
Error Propagation in a System Model
NASA Technical Reports Server (NTRS)
Schloegel, Kirk (Inventor); Bhatt, Devesh (Inventor); Oglesby, David V. (Inventor); Madl, Gabor (Inventor)
2015-01-01
Embodiments of the present subject matter can enable the analysis of signal value errors for system models. In an example, signal value errors can be propagated through the functional blocks of a system model to analyze possible effects as the signal value errors impact incident functional blocks. This propagation of the errors can be applicable to many models of computation including avionics models, synchronous data flow, and Kahn process networks.
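A minimal sketch of the general idea follows; it uses hypothetical block names, treats each block as a simple gain, and applies a worst-case additive error rule over an acyclic diagram, so it illustrates the notion of propagating signal value errors through functional blocks rather than the patented method itself:

from collections import defaultdict

def propagate_errors(blocks, connections, input_errors):
    # blocks       : {name: gain} -- each functional block modeled as a simple gain
    # connections  : list of (source_block, destination_block) signal edges
    # input_errors : {block: error bound injected at that block's input}
    # Walks an acyclic block diagram in topological order and accumulates a
    # worst-case additive error bound at every block's output.
    downstream = defaultdict(list)
    indegree = {name: 0 for name in blocks}
    for src, dst in connections:
        downstream[src].append(dst)
        indegree[dst] += 1
    incoming = defaultdict(float, input_errors)
    output_error = {}
    ready = [name for name in blocks if indegree[name] == 0]
    while ready:
        block = ready.pop()
        output_error[block] = abs(blocks[block]) * incoming[block]
        for nxt in downstream[block]:
            incoming[nxt] += output_error[block]   # worst case: error contributions add
            indegree[nxt] -= 1
            if indegree[nxt] == 0:
                ready.append(nxt)
    return output_error

# Toy avionics-style chain: sensor -> filter -> controller.
blocks = {"sensor": 1.0, "filter": 0.5, "controller": 2.0}
edges = [("sensor", "filter"), ("filter", "controller")]
print(propagate_errors(blocks, edges, {"sensor": 0.1}))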
Rioland, Guillaume; Dutournié, Patrick; Faye, Delphine; Daou, T Jean; Patarin, Joël
2016-01-01
Zeolite pellets containing 5 wt % of binder (methylcellulose or sodium metasilicate) were formed with a hydraulic press. This paper describes a mathematical model to predict the mechanical properties (uniaxial and diametric compression) of these pellets for arbitrary dimensions (height and diameter) using a design of experiments (DOE) methodology. A second-degree polynomial equation including interactions was used to approximate the experimental results. This leads to an empirical model for the estimation of the mechanical properties of zeolite pellets with 5 wt % of binder. The model was verified by additional experimental tests including pellets of different dimensions created with different applied pressures. The optimum dimensions were found to be a diameter of 10-23 mm, a height of 1-3.5 mm and an applied pressure higher than 200 MPa. These pellets are promising for technological uses in molecular decontamination for aerospace-based applications.
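The kind of empirical model described, a second-degree polynomial with interaction terms fitted to design-of-experiments runs, can be sketched in a few lines of least-squares fitting. The run matrix and strength values below are hypothetical placeholders within the dimension ranges mentioned above, not data from the paper:

import numpy as np

def fit_quadratic_response(factors, response):
    # factors  : (n_runs, 2) array of factor settings, e.g. pellet diameter and height
    # response : (n_runs,) measured property, e.g. crushing strength
    # Model: y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
    x1, x2 = factors[:, 0], factors[:, 1]
    design = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1 ** 2, x2 ** 2])
    coefficients, *_ = np.linalg.lstsq(design, response, rcond=None)
    return coefficients

# Hypothetical runs: (diameter mm, height mm) -> strength in arbitrary units.
runs = np.array([[10.0, 1.0], [10.0, 3.5], [23.0, 1.0], [23.0, 3.5],
                 [16.5, 2.2], [16.5, 2.2]])
strength = np.array([4.1, 3.2, 5.0, 3.9, 4.4, 4.5])
print(fit_quadratic_response(runs, strength))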
Xiao, Bo; Imel, Zac E.; Georgiou, Panayiotis; Atkins, David C.; Narayanan, Shrikanth S.
2017-01-01
Empathy is an important psychological process that facilitates human communication and interaction. Enhancement of empathy has profound significance in a range of applications. In this paper, we review emerging directions of research on computational analysis of empathy expression and perception as well as empathic interactions, including their simulation. We summarize the work on empathic expression analysis by the targeted signal modalities (e.g., text, audio, facial expressions). We categorize empathy simulation studies into theory-based emotion space modeling or application-driven user and context modeling. We summarize challenges in computational study of empathy including conceptual framing and understanding of empathy, data availability, appropriate use and validation of machine learning techniques, and behavior signal processing. Finally, we propose a unified view of empathy computation, and offer a series of open problems for future research. PMID:27017830
Alterations to the relativistic Love-Franey model and their application to inelastic scattering
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zeile, J.R.
The fictitious axial-vector and tensor mesons for the real part of the relativistic Love-Franey interaction are removed. In an attempt to make up for this loss, derivative couplings are used for the π and ρ mesons. Such derivative couplings require the introduction of axial-vector and tensor contact term corrections. Meson parameters are then fit to free nucleon-nucleon scattering data. The resulting fits are comparable to those of the relativistic Love-Franey model provided that the contact term corrections are included and the fits are weighted over the physically significant quantity of twice the tensor minus the axial-vector Lorentz invariants. Failure to include contact term corrections leads to poor fits at higher energies. The off-shell behavior of this model is then examined by looking at several applications from inelastic proton-nucleus scattering.
NASA Astrophysics Data System (ADS)
Gao, M.; Huang, S. T.; Wang, P.; Zhao, Y. A.; Wang, H. B.
2016-11-01
The geological disposal of high-level radioactive waste (hereinafter "geological disposal") is a long-term, complex, and systematic scientific undertaking. The data and information resources produced during research and development (R&D) provide significant support for R&D of the disposal system and lay the foundation for the long-term stability and safety assessment of the repository site. However, the data generated by research and engineering work during repository siting are complicated (multi-source, multi-dimensional, and changeable), and the requirements for data accuracy and comprehensive application have become much higher than before, so the design of the data model for the repository geo-information database faces serious challenges. In this paper, the data resources of the pre-selected repository areas are comprehensively collated and systematically analysed. Based on a detailed understanding of the application requirements, the work solves the key technical problems of a reasonable classification system for multi-source data entities, their complex logical relations, and effective physical storage structures. The solution moves beyond the data classification and conventional spatial data organization models used in traditional practice, and organizes and integrates the data around data entities and spatial relationships that are independent, complete, and directly relevant to applications in HLW geological disposal. Reasonable, feasible, and flexible conceptual, logical, and physical data models have been established to ensure effective integration and to facilitate application development for multi-source data in the pre-selected areas for geological disposal.
A quantitative model of application slow-down in multi-resource shared systems
Lim, Seung-Hwan; Kim, Youngjae
2016-12-26
Scheduling multiple jobs onto a platform enhances system utilization by sharing resources. The benefits of higher resource utilization include reduced cost to construct, operate, and maintain a system, which often includes energy consumption. Maximizing these benefits comes at a price: resource contention among jobs increases job completion time. In this study, we analyze the slow-downs of jobs due to contention for multiple resources in a system, referred to as the dilation factor. We observe that multiple-resource contention creates non-linear dilation factors of jobs. From this observation, we establish a general quantitative model for dilation factors of jobs in multi-resource systems. A job is characterized by vector-valued loading statistics, and the dilation factors of a job set are given by a quadratic function of their loading vectors. We demonstrate how to systematically characterize a job, maintain the data structure used to calculate the dilation factor (the loading matrix), and calculate the dilation factor of each job. We validate the accuracy of the model with multiple processes running on a native Linux server, on virtualized servers, and with multiple MapReduce workloads co-scheduled in a cluster. Evaluation with measured data shows that the D-factor model has an error margin of less than 16%. We extended the D-factor model to capture the slow-down of applications when multiple identical resources exist, such as multi-core and multi-disk environments. Finally, validation results of the extended D-factor model with HPC checkpoint applications on parallel file systems show that the D-factor accurately captures the slow-down of concurrent applications in such environments.
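A hypothetical sketch of such a quadratic slow-down model is given below: each job's dilation factor is computed from a quadratic form coupling its own resource-loading vector to the aggregate load of its co-runners. The loading vectors, the interaction matrix, and the exact functional form are illustrative assumptions; the paper's D-factor formulation may differ:

import numpy as np

def dilation_factors(loading, interaction):
    # loading     : (n_jobs, n_resources) -- each row is a job's loading vector,
    #               e.g. its demand on CPU, disk and network
    # interaction : (n_resources, n_resources) contention-penalty matrix
    # Each job's dilation factor is 1 (no slow-down) plus a quadratic form that
    # couples the job's own loading vector to the aggregate load of its co-runners.
    total_load = loading.sum(axis=0)
    factors = []
    for job_load in loading:
        co_runner_load = total_load - job_load
        factors.append(1.0 + job_load @ interaction @ co_runner_load)
    return np.array(factors)

# Toy example: two CPU-heavy jobs and one disk-heavy job on (CPU, disk).
loading = np.array([[0.8, 0.1],
                    [0.7, 0.2],
                    [0.1, 0.9]])
penalty = np.array([[0.5, 0.1],
                    [0.1, 0.8]])
print(dilation_factors(loading, penalty))   # the CPU-heavy jobs slow each other most here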
The Search for Efficiency in Arboreal Ray Tracing Applications
NASA Astrophysics Data System (ADS)
van Leeuwen, M.; Disney, M.; Chen, J. M.; Gomez-Dans, J.; Kelbe, D.; van Aardt, J. A.; Lewis, P.
2016-12-01
Forest structure significantly impacts a range of abiotic conditions, including humidity and the radiation regime, all of which affect the rate of net and gross primary productivity. Current forest productivity models typically consider abstract media to represent the transfer of radiation within the canopy. Examples include the representation of forest structure via a layered canopy model, where leaf area and inclination angles are stratified with canopy depth, or as turbid media where leaves are randomly distributed within space or within confined geometric solids such as blocks, spheres or cones. While these abstract models are known to produce accurate estimates of primary productivity at the stand level, their limited geometric resolution restricts applicability at fine spatial scales, such as the cell, leaf or shoot levels, and so does not realize the full potential of assimilating laboratory and field measurements with remote sensing data. Recent research efforts have explored the use of laser scanning to capture detailed tree morphology at millimeter accuracy. These data can subsequently be used to combine ray tracing with primary productivity models, providing an ability to explore trade-offs among different morphological traits or to assimilate data across spatial scales, spanning the leaf to the stand level. Ray tracing has the major advantage of allowing the most accurate structural description of the canopy, and can directly exploit new 3D structural measurements, e.g., from laser scanning. However, the biggest limitation of ray tracing models is their high computational cost, which currently limits their use for large-scale applications. In this talk, we explore ways to more efficiently exploit ray tracing simulations and capture this information in a readily computable form for future evaluation, thus potentially enabling large-scale first-principles forest growth modelling applications.
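For contrast with explicit ray tracing over laser-scanned geometry, the abstract canopy representations mentioned above can be as simple as Beer-Lambert extinction through stacked layers. The sketch below computes the fraction of radiation absorbed in each layer of such a layered "turbid medium" canopy; the layer leaf-area values and the extinction coefficient are assumptions for illustration, not values from the abstract:

import numpy as np

def layered_canopy_absorption(lai_per_layer, k=0.5):
    # lai_per_layer : leaf area index of each canopy layer, listed top to bottom
    # k             : extinction coefficient (~0.5 for a spherical leaf angle
    #                 distribution) -- an assumed value
    # Returns the fraction of incoming radiation absorbed in each layer under
    # Beer-Lambert extinction through a layered "turbid medium" canopy.
    transmitted = 1.0
    absorbed = []
    for lai in lai_per_layer:
        passed = transmitted * np.exp(-k * lai)
        absorbed.append(transmitted - passed)
        transmitted = passed
    return np.array(absorbed)

print(layered_canopy_absorption([0.5, 1.0, 1.5, 1.0]))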