Sample records for performance model developed

  1. Development, Testing, and Validation of a Model-Based Tool to Predict Operator Responses in Unexpected Workload Transitions

    NASA Technical Reports Server (NTRS)

    Sebok, Angelia; Wickens, Christopher; Sargent, Robert

    2015-01-01

    One human factors challenge is predicting operator performance in novel situations. Approaches such as drawing on relevant previous experience, and developing computational models to predict operator performance in complex situations, offer potential methods to address this challenge. A few concerns with modeling operator performance are that models need to be realistic, and they need to be tested empirically and validated. In addition, many existing human performance modeling tools are complex and require that an analyst gain significant experience to be able to develop models for meaningful data collection. This paper describes an effort to address these challenges by developing an easy-to-use model-based tool, using models that were developed from a review of existing human performance literature and targeted experimental studies, and performing an empirical validation of key model predictions.

  2. Ku-Band rendezvous radar performance computer simulation model

    NASA Technical Reports Server (NTRS)

    Magnusson, H. G.; Goff, M. F.

    1984-01-01

    All work performed on the Ku-band rendezvous radar performance computer simulation model program since the release of the preliminary final report is summarized. Developments on the program fall into three distinct categories: (1) modifications to the existing Ku-band radar tracking performance computer model; (2) the addition of a highly accurate, nonrealtime search and acquisition performance computer model to the total software package developed on this program; and (3) development of radar cross section (RCS) computation models for three additional satellites. All changes in the tracking model involved improvements in the automatic gain control (AGC) and the radar signal strength (RSS) computer models. Although the search and acquisition computer models were developed under the auspices of the Hughes Aircraft Company Ku-Band Integrated Radar and Communications Subsystem program office, they have been supplied to NASA as part of the Ku-band radar performance computer model package. Their purpose is to predict Ku-band acquisition performance for specific satellite targets on specific missions. The RCS models were developed for three satellites: the Long Duration Exposure Facility (LDEF) spacecraft, the Solar Maximum Mission (SMM) spacecraft, and the Space Telescopes.

  3. Ku-Band rendezvous radar performance computer simulation model

    NASA Astrophysics Data System (ADS)

    Magnusson, H. G.; Goff, M. F.

    1984-06-01

    All work performed on the Ku-band rendezvous radar performance computer simulation model program since the release of the preliminary final report is summarized. Developments on the program fall into three distinct categories: (1) modifications to the existing Ku-band radar tracking performance computer model; (2) the addition of a highly accurate, nonrealtime search and acquisition performance computer model to the total software package developed on this program; and (3) development of radar cross section (RCS) computation models for three additional satellites. All changes in the tracking model involved improvements in the automatic gain control (AGC) and the radar signal strength (RSS) computer models. Although the search and acquisition computer models were developed under the auspices of the Hughes Aircraft Company Ku-Band Integrated Radar and Communications Subsystem program office, they have been supplied to NASA as part of the Ku-band radar performance computer model package. Their purpose is to predict Ku-band acquisition performance for specific satellite targets on specific missions. The RCS models were developed for three satellites: the Long Duration Exposure Facility (LDEF) spacecraft, the Solar Maximum Mission (SMM) spacecraft, and the Space Telescopes.

  4. Research and development on performance models of thermal imaging systems

    NASA Astrophysics Data System (ADS)

    Wang, Ji-hui; Jin, Wei-qi; Wang, Xia; Cheng, Yi-nan

    2009-07-01

    Traditional ACQUIRE models perform the discrimination tasks of detection, target orientation, recognition, and identification for military targets based upon the minimum resolvable temperature difference (MRTD) and the Johnson criteria for thermal imaging systems (TIS). The Johnson criteria are generally pessimistic for performance prediction of sampled imagers, given the development of focal plane array (FPA) detectors and digital image processing technology. The triangle orientation discrimination threshold (TOD) model, the minimum temperature difference perceived (MTDP)/thermal range model (TRM3), and the target task performance (TTP) metric have been developed to predict the performance of sampled imagers; in particular, the TTP metric can provide better accuracy than the Johnson criteria. In this paper, the performance models above are described; channel width metrics are presented to describe overall performance, including modulation transfer function (MTF) channel width for high signal-to-noise ratio (SNR) optoelectronic imaging systems and MRTD channel width for low-SNR TIS; the unresolved questions in performance assessment of TIS are indicated; and, last, development directions for TIS performance models are discussed.

  5. Electric Propulsion System Modeling for the Proposed Prometheus 1 Mission

    NASA Technical Reports Server (NTRS)

    Fiehler, Douglas; Dougherty, Ryan; Manzella, David

    2005-01-01

    The proposed Prometheus 1 spacecraft would utilize nuclear electric propulsion to propel the spacecraft to its ultimate destination where it would perform its primary mission. As part of the Prometheus 1 Phase A studies, system models were developed for each of the spacecraft subsystems that were integrated into one overarching system model. The Electric Propulsion System (EPS) model was developed using data from the Prometheus 1 electric propulsion technology development efforts. This EPS model was then used to provide both performance and mass information to the Prometheus 1 system model for total system trades. Development of the EPS model is described, detailing both the performance calculations as well as its evolution over the course of Phase A through three technical baselines. Model outputs are also presented, detailing the performance of the model and its direct relationship to the Prometheus 1 technology development efforts. These EP system model outputs are also analyzed chronologically showing the response of the model development to the four technical baselines during Prometheus 1 Phase A.

  6. The Development of Web-Based Collaborative Training Model for Enhancing Human Performances on ICT for Students in Banditpattanasilpa Institute

    ERIC Educational Resources Information Center

    Pumipuntu, Natawut; Kidrakarn, Pachoen; Chetakarn, Somchock

    2015-01-01

    This research aimed to develop a Web-based Collaborative (WBC) Training model for enhancing human performance on ICT for students in Banditpattanasilpa Institute. The research is divided into three phases: 1) investigating students' and teachers' training needs on ICT web-based contents and performance, 2) developing a web-based…

  7. Development of a model to assess environmental performance, concerning HSE-MS principles.

    PubMed

    Abbaspour, M; Hosseinzadeh Lotfi, F; Karbassi, A R; Roayaei, E; Nikoomaram, H

    2010-06-01

    The main objective of the present study was to develop a valid and appropriate model to evaluate companies' efficiency and environmental performance with respect to health, safety, and environmental management system principles. The proposed model overcomes the shortcomings of the previous models developed in this area. This model has been designed on the basis of a mathematical method known as Data Envelopment Analysis (DEA). In order to differentiate high-performing companies from weak ones, one of the nonradial DEA models, the enhanced Russell graph efficiency measure, has been applied. Since some of the environmental performance indicators cannot be controlled by companies' managers, it was necessary to develop the model in a way that it could be applied when discretionary and/or nondiscretionary factors were involved. The model was then applied to a real case that comprised 12 oil and gas general contractors. The results showed the relative efficiency, inefficiency sources, and the rank of contractors.
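
    As a rough illustration of the DEA machinery this record describes, the sketch below scores 12 hypothetical contractors with a basic input-oriented CCR model via scipy. This is a simplified radial stand-in, not the paper's nonradial enhanced Russell graph measure, and it ignores the discretionary/nondiscretionary distinction; all data are invented.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    def dea_ccr_efficiency(X, Y, j0):
        """Input-oriented CCR efficiency of DMU j0.
        X: (m, n) inputs, Y: (s, n) outputs; columns are DMUs."""
        m, n = X.shape
        s = Y.shape[0]
        c = np.zeros(1 + n)
        c[0] = 1.0                        # minimize theta
        A_ub = np.zeros((m + s, 1 + n))
        b_ub = np.zeros(m + s)
        A_ub[:m, 0] = -X[:, j0]           # sum_j lambda_j*x_ij <= theta*x_i,j0
        A_ub[:m, 1:] = X
        A_ub[m:, 1:] = -Y                 # sum_j lambda_j*y_rj >= y_r,j0
        b_ub[m:] = -Y[:, j0]
        res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                      bounds=[(0, None)] * (1 + n), method="highs")
        return res.x[0]                   # 1.0 = on the efficient frontier

    rng = np.random.default_rng(0)        # 12 invented contractors
    X = rng.uniform(1, 10, size=(2, 12))  # two inputs each
    Y = rng.uniform(1, 10, size=(1, 12))  # one output each
    print([round(dea_ccr_efficiency(X, Y, j), 3) for j in range(12)])
    ```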

  8. Development of cost-effective pavement treatment selection and treatment performance models : [tech summary].

    DOT National Transportation Integrated Search

    2015-09-01

    The overall objective of this study was to develop pavement treatment performance models in support of cost-effective selection of pavement treatment type, project boundaries, and time of treatment. The development of the proposed models was ba...

  9. Risk prediction models of breast cancer: a systematic review of model performances.

    PubMed

    Anothaisintawee, Thunyarat; Teerawattananon, Yot; Wiratkapun, Chollathip; Kasamesup, Vijj; Thakkinstian, Ammarin

    2012-05-01

    An increasing number of risk prediction models have been developed for estimating breast cancer risk in individual women; however, the performance of these models is questionable. We therefore conducted a study with the aim of systematically reviewing previous risk prediction models. The results of this review help to identify the most reliable model and indicate the strengths and weaknesses of each model to guide future model development. We searched MEDLINE (PubMed) from 1949 and EMBASE (Ovid) from 1974 until October 2010. Observational studies which constructed models using regression methods were selected. Information about model development and performance was extracted. Twenty-five out of 453 studies were eligible. Of these, 18 developed prediction models and 7 validated existing prediction models. Up to 13 variables were included in the models, and sample sizes for each study ranged from 550 to 2,404,636. Internal validation was performed in four models, while five models had external validation. The Gail and Rosner and Colditz models were the significant models that were subsequently modified by other scholars. Calibration performance of most models was fair to good (expected/observed ratio: 0.87-1.12), but discriminatory accuracy was poor to fair both in internal validation (concordance statistic: 0.53-0.66) and in external validation (concordance statistic: 0.56-0.63). Most models yielded relatively poor discrimination in both internal and external validation. This poor discriminatory accuracy of existing models might be because of a lack of knowledge about risk factors, heterogeneous subtypes of breast cancer, and different distributions of risk factors across populations. In addition, the concordance statistic itself is insensitive for measuring improvements in discrimination. Therefore, newer methods such as the net reclassification index should be considered to evaluate improvements in the performance of newly developed models.
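
    Since this review leans on two performance measures, a minimal sketch of both may help: the concordance (C) statistic for discrimination and the expected/observed ratio for calibration. Function names and the example data are illustrative, not from the review.

    ```python
    import numpy as np

    def concordance_statistic(y, p):
        """Probability that a randomly chosen case has a higher predicted
        risk than a randomly chosen non-case (ties count half)."""
        diff = p[y == 1][:, None] - p[y == 0][None, :]
        return (np.sum(diff > 0) + 0.5 * np.sum(diff == 0)) / diff.size

    def expected_observed_ratio(y, p):
        """Calibration-in-the-large: predicted events over observed events."""
        return p.sum() / y.sum()

    y = np.array([0, 0, 1, 0, 1, 1, 0, 1])            # observed outcomes
    p = np.array([.1, .2, .3, .25, .6, .4, .15, .5])  # predicted risks
    print(concordance_statistic(y, p))    # 0.5 = chance, 1.0 = perfect
    print(expected_observed_ratio(y, p))  # 1.0 would mean well calibrated
    ```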

  10. Prognostic models for complete recovery in ischemic stroke: a systematic review and meta-analysis.

    PubMed

    Jampathong, Nampet; Laopaiboon, Malinee; Rattanakanokchai, Siwanon; Pattanittum, Porjai

    2018-03-09

    Prognostic models have been increasingly developed to predict complete recovery in ischemic stroke. However, questions arise about the performance characteristics of these models. The aim of this study was to systematically review and synthesize performance of existing prognostic models for complete recovery in ischemic stroke. We searched journal publications indexed in PUBMED, SCOPUS, CENTRAL, ISI Web of Science and OVID MEDLINE from inception until 4 December, 2017, for studies designed to develop and/or validate prognostic models for predicting complete recovery in ischemic stroke patients. Two reviewers independently examined titles and abstracts, and assessed whether each study met the pre-defined inclusion criteria and also independently extracted information about model development and performance. We evaluated validation of the models by medians of the area under the receiver operating characteristic curve (AUC) or c-statistic and calibration performance. We used a random-effects meta-analysis to pool AUC values. We included 10 studies with 23 models developed from elderly patients with a moderately severe ischemic stroke, mainly in three high income countries. Sample sizes for each study ranged from 75 to 4441. Logistic regression was the only analytical strategy used to develop the models. The number of various predictors varied from one to 11. Internal validation was performed in 12 models with a median AUC of 0.80 (95% CI 0.73 to 0.84). One model reported good calibration. Nine models reported external validation with a median AUC of 0.80 (95% CI 0.76 to 0.82). Four models showed good discrimination and calibration on external validation. The pooled AUC of the two validation models of the same developed model was 0.78 (95% CI 0.71 to 0.85). The performance of the 23 models found in the systematic review varied from fair to good in terms of internal and external validation. Further models should be developed with internal and external validation in low and middle income countries.
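
    The review pools AUC values with a random-effects meta-analysis; a compact sketch of the standard DerSimonian-Laird pooling is below. The AUCs and standard errors are invented, and pooling on the logit scale is a common alternative not shown here.

    ```python
    import numpy as np

    def pool_auc_random_effects(auc, se):
        """DerSimonian-Laird random-effects pooled AUC with a 95% CI."""
        auc, se = np.asarray(auc), np.asarray(se)
        w = 1.0 / se**2                       # fixed-effect weights
        mu_fe = np.sum(w * auc) / np.sum(w)
        q = np.sum(w * (auc - mu_fe)**2)      # Cochran's Q heterogeneity
        c = np.sum(w) - np.sum(w**2) / np.sum(w)
        tau2 = max(0.0, (q - (len(auc) - 1)) / c)  # between-study variance
        w_re = 1.0 / (se**2 + tau2)           # random-effects weights
        mu = np.sum(w_re * auc) / np.sum(w_re)
        half = 1.96 * np.sqrt(1.0 / np.sum(w_re))
        return mu, (mu - half, mu + half)

    # Two invented validation studies of the same developed model
    print(pool_auc_random_effects([0.74, 0.81], [0.04, 0.03]))
    ```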

  11. Development of cost-effective pavement treatment selection and treatment performance models : research project capsule.

    DOT National Transportation Integrated Search

    2010-09-10

    The overall goal of this study is to develop pavement treatment performance models in support of the cost-effective selection of pavement treatment types, project boundaries, and time of treatment. The development of the proposed models will be b...

  12. Integrated performance and reliability specification for digital avionics systems

    NASA Technical Reports Server (NTRS)

    Brehm, Eric W.; Goettge, Robert T.

    1995-01-01

    This paper describes an automated tool for performance and reliability assessment of digital avionics systems, called the Automated Design Tool Set (ADTS). ADTS is based on an integrated approach to design assessment that unifies traditional performance and reliability views of system designs, and that addresses interdependencies between performance and reliability behavior via exchange of parameters and results between mathematical models of each type. A multi-layer tool set architecture has been developed for ADTS that separates the concerns of system specification, model generation, and model solution. Performance and reliability models are generated automatically as a function of candidate system designs, and model results are expressed within the system specification. The layered approach helps deal with the inherent complexity of the design assessment process, and preserves long-term flexibility to accommodate a wide range of models and solution techniques within the tool set structure. ADTS research and development to date has focused on development of a language for specification of system designs as a basis for performance and reliability evaluation. A model generation and solution framework has also been developed for ADTS that will ultimately encompass an integrated set of analytic and simulation-based techniques for performance, reliability, and combined design assessment.

  13. Development and Integration of Control System Models

    NASA Technical Reports Server (NTRS)

    Kim, Young K.

    1998-01-01

    The computer simulation tool, TREETOPS, has been upgraded and used at NASA/MSFC to model various complicated mechanical systems and to perform their dynamics and control analysis with pointing control systems. A TREETOPS model of the Advanced X-ray Astrophysics Facility - Imaging (AXAF-I) dynamics and control system was developed to evaluate the AXAF-I pointing performance for Normal Pointing Mode. An optical model of the Shooting Star Experiment (SSE) was also developed, and its optical performance analysis was done using the MACOS software.

  14. CFD Code Development for Combustor Flows

    NASA Technical Reports Server (NTRS)

    Norris, Andrew

    2003-01-01

    During the lifetime of this grant, work has been performed in the areas of model development, code development, code validation, and code application. Model development has included the PDF combustion module, chemical kinetics based on thermodynamics, neural network storage of chemical kinetics, ILDM chemical kinetics, and assumed-PDF work. Many of these models were then implemented in the code, and in addition many improvements were made to the code, including the addition of new chemistry integrators, property evaluation schemes, new chemistry models, and turbulence-chemistry interaction methodology. Validation of all new models and code improvements was also performed, and the code was applied to the ZCET program and to the NPSS GEW combustor program. Several important items remain under development, including the NOx post-processing, assumed-PDF model development, and chemical kinetics development. It is expected that this work will continue under the new grant.

  15. Development of task network models of human performance in microgravity

    NASA Technical Reports Server (NTRS)

    Diaz, Manuel F.; Adam, Susan

    1992-01-01

    This paper discusses the utility of task-network modeling for quantifying human performance variability in microgravity. The data are gathered for: (1) improving current methodologies for assessing human performance and workload in the operational space environment; (2) developing tools for assessing alternative system designs; and (3) developing an integrated set of methodologies for the evaluation of performance degradation during extended duration spaceflight. The evaluation entailed an analysis of the Remote Manipulator System payload-grapple task performed on many shuttle missions. Task-network modeling can be used as a tool for assessing and enhancing human performance in man-machine systems, particularly for modeling long-duration manned spaceflight. Task-network modeling can be directed toward improving system efficiency by increasing the understanding of basic capabilities of the human component in the system and the factors that influence these capabilities.

  16. Mathematical Models of Elementary Mathematics Learning and Performance. Final Report.

    ERIC Educational Resources Information Center

    Suppes, Patrick

    This project was concerned with the development of mathematical models of elementary mathematics learning and performance. Probabilistic finite automata and register machines with a finite number of registers were developed as models and extensively tested with data arising from the elementary-mathematics strand curriculum developed by the…

  17. University Library Strategy Development: A Conceptual Model of Researcher Performance to Inform Service Delivery

    ERIC Educational Resources Information Center

    Maddox, Alexia; Zhao, Linlin

    2017-01-01

    This case study presents a conceptual model of researcher performance developed by Deakin University Library, Australia. The model aims to organize research performance data into meaningful researcher profiles, referred to as researcher typologies, which support the demonstration of research impact and value. Three dimensions shaping researcher…

  18. Modeling Human Steering Behavior During Path Following in Teleoperation of Unmanned Ground Vehicles.

    PubMed

    Mirinejad, Hossein; Jayakumar, Paramsothy; Ersal, Tulga

    2018-04-01

    This paper presents a behavioral model representing human steering performance in teleoperated unmanned ground vehicles (UGVs). Human steering performance in teleoperation is considerably different from performance in regular onboard driving situations due to significant communication delays in teleoperation systems and the limited information human teleoperators receive from the vehicle sensory system. Mathematical models capturing teleoperation performance are key to making the development and evaluation of teleoperated UGV technologies fully simulation-based and thus more rapid and cost-effective. However, driver models developed for the typical onboard driving case do not readily address this need. To fill the gap, this paper adopts a cognitive model that was originally developed for a typical highway driving scenario and develops a tuning strategy that adjusts the model parameters in the absence of human data to reflect the effect of various latencies and UGV speeds on driver performance in a teleoperated path-following task. Based on data collected from a human subject test study, it is shown that the tuned model can predict both the trend of changes in driver performance for different driving conditions and the best steering performance of human subjects in all driving conditions considered. The proposed model with the tuning strategy performs satisfactorily in predicting human steering behavior in the task of teleoperated path following of UGVs. The established model is a suitable candidate to be used in place of human drivers for simulation-based studies of UGV mobility in teleoperation systems.

  19. Model Performance Evaluation and Scenario Analysis (MPESA)

    EPA Pesticide Factsheets

    Model Performance Evaluation and Scenario Analysis (MPESA) assesses the performance with which models predict time series data. The tool was developed for the Hydrological Simulation Program-Fortran (HSPF) and the Stormwater Management Model (SWMM).
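
    The record does not list MPESA's metrics, but goodness-of-fit scores such as the Nash-Sutcliffe efficiency are standard for judging HSPF/SWMM time-series predictions; the sketch below is an assumed example of the kind of statistic such a tool computes, not MPESA's actual output.

    ```python
    import numpy as np

    def nash_sutcliffe(obs, sim):
        """Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 matches the
        mean of the observations, negative is worse than the mean."""
        obs, sim = np.asarray(obs, float), np.asarray(sim, float)
        return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

    observed = [2.1, 3.4, 5.0, 4.2, 3.1]   # e.g., daily streamflow (invented)
    simulated = [2.4, 3.1, 4.6, 4.5, 3.0]  # model output (invented)
    print(nash_sutcliffe(observed, simulated))
    ```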

  20. Grizzly Status Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spencer, Benjamin; Zhang, Yongfeng; Chakraborty, Pritam

    2014-09-01

    This report summarizes work during FY 2014 to develop capabilities to predict embrittlement of reactor pressure vessel steel, and to assess the response of embrittled reactor pressure vessels to postulated accident conditions. This work has been conducted at three length scales. At the engineering scale, 3D fracture mechanics capabilities have been developed to calculate stress intensities and fracture toughnesses, to perform a deterministic assessment of whether a crack would propagate at the location of an existing flaw. This capability has been demonstrated on several types of flaws in a generic reactor pressure vessel model. Models have been developed at the scale of fracture specimens to develop a capability to determine how irradiation affects the fracture toughness of material. Verification work has been performed on a previously developed model to determine the sensitivity of the model to specimen geometry and size effects, and the effects of irradiation on the parameters of this model have been investigated. At lower length scales, work has continued in an ongoing effort to understand how irradiation and thermal aging affect the microstructure and mechanical properties of reactor pressure vessel steel. Previously developed atomistic kinetic Monte Carlo models have been further developed and benchmarked against experimental data. Initial work has been performed to develop models of nucleation in a phase field model. Additional modeling work has also been performed to improve the fundamental understanding of the formation mechanisms and stability of matrix defects.

  1. A model for evaluating the social performance of construction waste management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yuan Hongping, E-mail: hpyuan2005@gmail.com

    Highlights: • Scant attention is paid to the social performance of construction waste management (CWM). • We develop a model for assessing the social performance of CWM. • With the model, the social performance of CWM can be quantitatively simulated. - Abstract: Existing literature shows that considerable research effort has been devoted to the economic performance of construction waste management (CWM), while less attention has been paid to its social performance. This study therefore attempts to develop a model for quantitatively evaluating the social performance of CWM by using a system dynamics (SD) approach. Firstly, major variables affecting the social performance of CWM are identified, and a holistic system for assessing the social performance of CWM is formulated in line with the feedback relationships underlying these variables. The developed system is then converted into an SD model through the software iThink. An empirical case study is finally conducted to demonstrate application of the model. Results of model validation indicate that the model is robust and reasonable in reflecting the situation of the real system under study. Findings of the case study offer helpful insights into effectively promoting the social performance of CWM for the project investigated. Furthermore, the model exhibits great potential to function as an experimental platform for dynamically evaluating the effects of management measures on improving the social performance of CWM in construction projects.

  2. Development of index based pavement performance models for pavement management system (PMS) of LADOTD.

    DOT National Transportation Integrated Search

    2009-03-01

    This report focuses on pavement performance and treatment models for the Louisiana Department of Transportation and Development (LADOTD) and is in continuation of Louisiana Transportation Research Center (LTRC) Report No. 430, Development of Unifor...

  3. The development of performance prediction models for Virginia's interstate highway system.

    DOT National Transportation Integrated Search

    1995-01-01

    Performance prediction models are a key component of any well-designed pavement management system. In this study, data compiled from the condition surveys conducted annually on Virginia's pavement network were used to develop prediction models for mo...

  4. Rotorcraft performance data for AEDT : Methods of using the NASA Design and Analysis of Rotorcraft tool for developing data for AEDT's Rotorcraft Performance Model

    DOT National Transportation Integrated Search

    2016-09-01

    This report documents use of the NASA Design and Analysis of Rotorcraft (NDARC) helicopter performance software tool in developing data for the FAA's Aviation Environmental Design Tool (AEDT). These data support the Rotorcraft Performance Model (RP...

  5. Designing Performance Measurement For Supply Chain's Actors And Regulator Using Scale Balanced Scorecard And Data Envelopment Analysis

    NASA Astrophysics Data System (ADS)

    Kusrini, Elisa; Subagyo; Aini Masruroh, Nur

    2016-01-01

    This research is a sequel to the authors' earlier research in the field of designing integrated performance measurement between supply chain actors and a regulator. In the previous paper, the design of performance measurement was done by combining the Balanced Scorecard - Supply Chain Operation Reference - Regulator Contribution model with Data Envelopment Analysis. This model is referred to as the B-S-Rc-DEA model. The combination has the disadvantage that all performance variables have the same weight. This paper investigates whether giving weights to performance variables produces a performance measurement that is more sensitive in detecting performance improvement. Therefore, this paper discusses the development of the B-S-Rc-DEA model by giving weights to its performance variables. This model is referred to as the Scale B-S-Rc-DEA model. To illustrate the model development, some samples from the small and medium enterprises of the leather craft industry supply chain in the province of Yogyakarta, Indonesia are used in this research. It is found that the Scale B-S-Rc-DEA model is more sensitive in detecting performance improvement than the B-S-Rc-DEA model.

  6. Effects of thermal blooming on systems comprised of tiled subapertures

    NASA Astrophysics Data System (ADS)

    Leakeas, Charles L.; Bartell, Richard J.; Krizo, Matthew J.; Fiorino, Steven T.; Cusumano, Salvatore J.; Whiteley, Matthew R.

    2010-04-01

    Laser weapon systems comprised of tiled subapertures are rapidly emerging in the directed energy community. The Air Force Institute of Technology Center for Directed Energy (AFIT/CDE), under sponsorship of the HEL Joint Technology Office, has developed performance models of such laser weapon system configurations consisting of tiled arrays of both slab and fiber subapertures. These performance models are based on the results of detailed wave-optics analyses conducted using WaveTrain. Previous performance model versions developed in this effort represent system characteristics such as subaperture shape, aperture fill factor, subaperture intensity profile, subaperture placement in the primary aperture, subaperture mutual coherence (piston), subaperture differential jitter (tilt), and the beam-quality wave-front error associated with each subaperture. The current work is a prerequisite for the development of robust performance models for turbulence and thermal blooming effects for tiled systems. Emphasis is placed on low-altitude tactical scenarios. The enhanced performance model developed will be added to AFIT/CDE's HELEEOS parametric one-on-one engagement-level model via the Scaling for High Energy Laser and Relay Engagement (SHaRE) toolbox.

  7. Electrochemical carbon dioxide concentrator subsystem math model. [for manned space station

    NASA Technical Reports Server (NTRS)

    Marshall, R. D.; Carlson, J. N.; Schubert, F. H.

    1974-01-01

    A steady-state computer simulation model has been developed to describe the performance of a total six-man, self-contained electrochemical carbon dioxide concentrator subsystem built for the space station prototype. The math model combines expressions describing the performance of the electrochemical depolarized carbon dioxide concentrator (EDC) cells and modules previously developed with expressions describing the performance of the other major CS-6 components. The model is capable of accurately predicting CS-6 performance over EDC operating ranges, and the computer simulation results agree with experimental data obtained over the prediction range.

  8. Model for Predicting the Performance of Planetary Suit Hip Bearing Designs

    NASA Technical Reports Server (NTRS)

    Cowley, Matthew S.; Margerum, Sarah; Harvill, Lauren; Rajulu, Sudhakar

    2012-01-01

    Designing a space suit is very complex and often requires difficult trade-offs between performance, cost, mass, and system complexity. During the development period of the suit, numerous design iterations need to occur before the hardware meets human performance requirements. Using computer models early in the design phase of hardware development is advantageous because it allows virtual prototyping to take place. A virtual design environment allows designers to think creatively, exhaust design possibilities, and study design impacts on suit and human performance. A model of the rigid components of the Mark III Technology Demonstrator Suit (a planetary-type space suit) and a human manikin were created and tested in a virtual environment. The performance of the Mark III hip bearing model was first developed and evaluated virtually by comparing the differences in mobility performance between the nominal bearing configurations and modified bearing configurations. Suited human performance was then simulated with the model and compared to actual suited human performance data using the same bearing configurations. The Mark III hip bearing model was able to visually represent complex bearing rotations and the theoretical volumetric ranges of motion in three dimensions. The model was also able to predict suited human hip flexion and abduction maximums to within 10% of the actual suited human subject data, except for one modified bearing condition in hip flexion which was off by 24%. Differences between the model predictions and the human subject performance data were attributed to the lack of joint moment limits in the model, human subject fitting issues, and the limited suit experience of some of the subjects. The results demonstrate that modeling space suit rigid segments is a feasible design tool for evaluating and optimizing suited human performance. Keywords: space suit, design, modeling, performance

  9. Risk assessment model for development of advanced age-related macular degeneration.

    PubMed

    Klein, Michael L; Francis, Peter J; Ferris, Frederick L; Hamon, Sara C; Clemons, Traci E

    2011-12-01

    To design a risk assessment model for development of advanced age-related macular degeneration (AMD) incorporating phenotypic, demographic, environmental, and genetic risk factors. We evaluated longitudinal data from 2846 participants in the Age-Related Eye Disease Study. At baseline, these individuals had all levels of AMD, ranging from none to unilateral advanced AMD (neovascular or geographic atrophy). Follow-up averaged 9.3 years. We performed a Cox proportional hazards analysis with demographic, environmental, phenotypic, and genetic covariates and constructed a risk assessment model for development of advanced AMD. Performance of the model was evaluated using the C statistic and the Brier score and externally validated in participants in the Complications of Age-Related Macular Degeneration Prevention Trial. The final model included the following independent variables: age, smoking history, family history of AMD (first-degree member), phenotype based on a modified Age-Related Eye Disease Study simple scale score, and genetic variants CFH Y402H and ARMS2 A69S. The model did well on performance measures, with very good discrimination (C statistic = 0.872) and excellent calibration and overall performance (Brier score at 5 years = 0.08). Successful external validation was performed, and a risk assessment tool was designed for use with or without the genetic component. We constructed a risk assessment model for development of advanced AMD. The model performed well on measures of discrimination, calibration, and overall performance and was successfully externally validated. This risk assessment tool is available for online use.
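
    As a sketch of the modeling pipeline this record describes (a Cox proportional hazards fit scored by the C statistic and a Brier score), the snippet below uses the lifelines library on invented data. Column names mirror covariates named in the abstract (genetic variants omitted), the 500-subject cohort is synthetic, and the 5-year Brier score ignores censoring for brevity.

    ```python
    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(1)
    n = 500                                      # invented AREDS-like cohort
    df = pd.DataFrame({
        "years":    rng.uniform(0.5, 12.0, n),   # follow-up time
        "advanced": rng.integers(0, 2, n),       # progressed to advanced AMD
        "age":      rng.uniform(55, 80, n),
        "smoker":   rng.integers(0, 2, n),
        "famhist":  rng.integers(0, 2, n),
        "scale":    rng.integers(0, 5, n),       # simple scale score
    })

    cph = CoxPHFitter()
    cph.fit(df, duration_col="years", event_col="advanced")
    print(cph.concordance_index_)                # discrimination (C statistic)

    # Crude 5-year Brier score, ignoring censoring for brevity
    risk5 = 1.0 - cph.predict_survival_function(df, times=[5.0]).T[5.0]
    observed = ((df["advanced"] == 1) & (df["years"] <= 5.0)).astype(float)
    print(np.mean((risk5.values - observed.values) ** 2))
    ```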

  10. Review of numerical models to predict cooling tower performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, B.M.; Nomura, K.K.; Bartz, J.A.

    1987-01-01

    Four state-of-the-art computer models developed to predict the thermal performance of evaporative cooling towers are summarized. The formulation of these models, STAR and TEFERI (developed in Europe) and FACTS and VERA2D (developed in the U.S.), is summarized. A fifth code, based on Merkel analysis, is also discussed. Principal features of the codes, computation time and storage requirements are described. A discussion of model validation is also provided.
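
    For context on the fifth code's Merkel analysis, the sketch below evaluates the classic Merkel integral numerically for an invented operating point. The saturated-air enthalpy polynomial is a rough curve fit, and all operating numbers are illustrative.

    ```python
    import numpy as np

    def h_sat(T):
        """Approximate enthalpy of saturated air, kJ/kg dry air (T in C);
        a rough polynomial fit, adequate only for this illustration."""
        return 4.7926 + 2.568 * T - 0.029834 * T**2 + 0.0016657 * T**3

    def merkel_number(t_hot, t_cold, h_air_in, L_over_G, n=200):
        """Merkel integral Me = int cpw dT / (h_sat - h_air) over the tower."""
        cpw = 4.186                      # kJ/(kg K), water specific heat
        T = np.linspace(t_cold, t_hot, n)
        # Air enthalpy rises with water temperature via the energy balance
        h_air = h_air_in + L_over_G * cpw * (T - t_cold)
        return np.trapz(cpw / (h_sat(T) - h_air), T)

    # Invented operating point: water cooled 40 C -> 30 C, L/G = 1.2
    print(merkel_number(40.0, 30.0, h_air_in=80.0, L_over_G=1.2))
    ```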

  11. Development and Integration of an Advanced Stirling Convertor Linear Alternator Model for a Tool Simulating Convertor Performance and Creating Phasor Diagrams

    NASA Technical Reports Server (NTRS)

    Metscher, Jonathan F.; Lewandowski, Edward J.

    2013-01-01

    A simple model of the Advanced Stirling Convertors (ASC) linear alternator and an AC bus controller has been developed and combined with a previously developed thermodynamic model of the convertor for a more complete simulation and analysis of the system performance. The model was developed using Sage, a 1-D thermodynamic modeling program that now includes electro-magnetic components. The convertor, consisting of a free-piston Stirling engine combined with a linear alternator, has sufficiently sinusoidal steady-state behavior to allow for phasor analysis of the forces and voltages acting in the system. A MATLAB graphical user interface (GUI) has been developed to interface with the Sage software for simplified use of the ASC model, calculation of forces, and automated creation of phasor diagrams. The GUI allows the user to vary convertor parameters while fixing different input or output parameters and observe the effect on the phasor diagrams or system performance. The new ASC model and GUI help create a better understanding of the relationship between the electrical component voltages and mechanical forces. This allows better insight into the overall convertor dynamics and performance.

  12. [Development and Application of a Performance Prediction Model for Home Care Nursing Based on a Balanced Scorecard using the Bayesian Belief Network].

    PubMed

    Noh, Wonjung; Seomun, Gyeongae

    2015-06-01

    This study was conducted to develop key performance indicators (KPIs) for home care nursing (HCN) based on a balanced scorecard, and to construct a performance prediction model of strategic objectives using the Bayesian Belief Network (BBN). This methodological study included four steps: establishment of KPIs, performance prediction modeling, development of a performance prediction model using BBN, and simulation of a suggested nursing management strategy. An HCN expert group and a staff group participated. The content validity index was analyzed using STATA 13.0, and BBN was analyzed using HUGIN 8.0. We generated a list of KPIs composed of 4 perspectives, 10 strategic objectives, and 31 KPIs. In the validity test of the performance prediction model, the factor with the greatest variance for increasing profit was maximum cost reduction of HCN services. The factor with the smallest variance for increasing profit was a minimum image improvement for HCN. During sensitivity analysis, the probability of the expert group did not affect the sensitivity. Furthermore, simulation of a 10% image improvement predicted the most effective way to increase profit. KPIs of HCN can estimate financial and non-financial performance. The performance prediction model for HCN will be useful to improve performance.
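
    To make the BBN idea concrete, here is a toy network in the spirit of the paper's model, with an invented structure rather than the actual 31 KPIs, assuming pgmpy's API. Setting evidence on "Image" mimics the paper's simulation of an image-improvement strategy.

    ```python
    from pgmpy.models import BayesianNetwork
    from pgmpy.factors.discrete import TabularCPD
    from pgmpy.inference import VariableElimination

    # Toy structure: image improvement drives service volume; volume and
    # cost reduction drive profit. All probabilities are invented.
    model = BayesianNetwork([("Image", "Volume"),
                             ("Volume", "Profit"),
                             ("Cost", "Profit")])
    model.add_cpds(
        TabularCPD("Image", 2, [[0.6], [0.4]]),
        TabularCPD("Cost", 2, [[0.5], [0.5]]),
        TabularCPD("Volume", 2, [[0.7, 0.3], [0.3, 0.7]],
                   evidence=["Image"], evidence_card=[2]),
        TabularCPD("Profit", 2,
                   [[0.9, 0.6, 0.5, 0.1],   # P(Profit=low | Cost, Volume)
                    [0.1, 0.4, 0.5, 0.9]],  # P(Profit=high | Cost, Volume)
                   evidence=["Cost", "Volume"], evidence_card=[2, 2]),
    )
    assert model.check_model()

    infer = VariableElimination(model)
    # Simulate an image-improvement strategy and read off the profit shift
    print(infer.query(["Profit"], evidence={"Image": 1}))
    ```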

  13. Models of Marine Fish Biodiversity: Assessing Predictors from Three Habitat Classification Schemes.

    PubMed

    Yates, Katherine L; Mellin, Camille; Caley, M Julian; Radford, Ben T; Meeuwig, Jessica J

    2016-01-01

    Prioritising biodiversity conservation requires knowledge of where biodiversity occurs. Such knowledge, however, is often lacking. New technologies for collecting biological and physical data coupled with advances in modelling techniques could help address these gaps and facilitate improved management outcomes. Here we examined the utility of environmental data, obtained using different methods, for developing models of both uni- and multivariate biodiversity metrics. We tested which biodiversity metrics could be predicted best and evaluated the performance of predictor variables generated from three types of habitat data: acoustic multibeam sonar imagery, predicted habitat classification, and direct observer habitat classification. We used boosted regression trees (BRT) to model metrics of fish species richness, abundance and biomass, and multivariate regression trees (MRT) to model biomass and abundance of fish functional groups. We compared model performance using different sets of predictors and estimated the relative influence of individual predictors. Models of total species richness and total abundance performed best; those developed for endemic species performed worst. Abundance models performed substantially better than corresponding biomass models. In general, BRT and MRTs developed using predicted habitat classifications performed less well than those using multibeam data. The most influential individual predictor was the abiotic categorical variable from direct observer habitat classification and models that incorporated predictors from direct observer habitat classification consistently outperformed those that did not. Our results show that while remotely sensed data can offer considerable utility for predictive modelling, the addition of direct observer habitat classification data can substantially improve model performance. Thus it appears that there are aspects of marine habitats that are important for modelling metrics of fish biodiversity that are not fully captured by remotely sensed data. As such, the use of remotely sensed data to model biodiversity represents a compromise between model performance and data availability.
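
    A minimal boosted-regression-tree sketch in the spirit of the study's species-richness models is below, using scikit-learn's gradient boosting in place of the authors' toolchain. Predictors and the response are synthetic stand-ins for multibeam and habitat-class variables.

    ```python
    import numpy as np
    from sklearn.ensemble import GradientBoostingRegressor
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(42)
    n = 300
    X = np.column_stack([
        rng.uniform(5, 50, n),       # multibeam depth (m), invented
        rng.uniform(0, 1, n),        # rugosity index, invented
        rng.integers(0, 4, n),       # observer habitat class (coded)
    ])
    richness = 5 + 0.2 * X[:, 0] + 10 * X[:, 1] + rng.poisson(2, n)

    brt = GradientBoostingRegressor(n_estimators=500, learning_rate=0.01,
                                    max_depth=3, subsample=0.75)
    print(cross_val_score(brt, X, richness, cv=5, scoring="r2").mean())
    brt.fit(X, richness)
    print(brt.feature_importances_)  # relative influence of each predictor
    ```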

  14. Models of Marine Fish Biodiversity: Assessing Predictors from Three Habitat Classification Schemes

    PubMed Central

    Yates, Katherine L.; Mellin, Camille; Caley, M. Julian; Radford, Ben T.; Meeuwig, Jessica J.

    2016-01-01

    Prioritising biodiversity conservation requires knowledge of where biodiversity occurs. Such knowledge, however, is often lacking. New technologies for collecting biological and physical data coupled with advances in modelling techniques could help address these gaps and facilitate improved management outcomes. Here we examined the utility of environmental data, obtained using different methods, for developing models of both uni- and multivariate biodiversity metrics. We tested which biodiversity metrics could be predicted best and evaluated the performance of predictor variables generated from three types of habitat data: acoustic multibeam sonar imagery, predicted habitat classification, and direct observer habitat classification. We used boosted regression trees (BRT) to model metrics of fish species richness, abundance and biomass, and multivariate regression trees (MRT) to model biomass and abundance of fish functional groups. We compared model performance using different sets of predictors and estimated the relative influence of individual predictors. Models of total species richness and total abundance performed best; those developed for endemic species performed worst. Abundance models performed substantially better than corresponding biomass models. In general, BRT and MRTs developed using predicted habitat classifications performed less well than those using multibeam data. The most influential individual predictor was the abiotic categorical variable from direct observer habitat classification and models that incorporated predictors from direct observer habitat classification consistently outperformed those that did not. Our results show that while remotely sensed data can offer considerable utility for predictive modelling, the addition of direct observer habitat classification data can substantially improve model performance. Thus it appears that there are aspects of marine habitats that are important for modelling metrics of fish biodiversity that are not fully captured by remotely sensed data. As such, the use of remotely sensed data to model biodiversity represents a compromise between model performance and data availability. PMID:27333202

  15. Final Report - IHLW PCT, Spinel T1%, Electrical Conductivity, and Viscosity Model Development, VSL-07R1240-4

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kruger, Albert A.; Piepel, Gregory F.; Landmesser, S. M.

    2013-11-13

    This report is the last in a series of currently scheduled reports that present the results from the High Level Waste (HLW) glass formulation development and testing work performed at the Vitreous State Laboratory (VSL) of the Catholic University of America (CUA), and the development of IHLW property-composition models performed jointly by Pacific Northwest National Laboratory (PNNL) and VSL for the River Protection Project-Waste Treatment and Immobilization Plant (RPP-WTP). Specifically, this report presents results of glass testing at VSL and model development at PNNL for the Product Consistency Test (PCT), one-percent crystal fraction temperature (T1%), electrical conductivity (EC), and viscosity of HLW glasses. The models presented in this report may be augmented and additional validation work performed during any future immobilized HLW (IHLW) model development work. Completion of the test objectives is addressed.

  16. Thermal performance modeling of NASA's scientific balloons

    NASA Astrophysics Data System (ADS)

    Franco, H.; Cathey, H.

    The flight performance of a scientific balloon is highly dependent on the interaction between the balloon and its environment. The balloon is a thermal vehicle. Modeling a scientific balloon's thermal performance has proven to be a difficult analytical task. Most previous thermal models have attempted these analyses by using either a bulk thermal model approach or simplified representations of the balloon. These approaches have to date provided reasonable, but not very accurate, results. Improvements have been made in recent years using thermal analysis tools developed for the thermal modeling of spacecraft and other sophisticated heat transfer problems. These tools, which now allow for accurate modeling of highly transmissive materials, have been applied to the thermal analysis of NASA's scientific balloons. A research effort has been started that utilizes the "Thermal Desktop" addition to AutoCAD. This paper will discuss the development of thermal models for both conventional and Ultra Long Duration super-pressure balloons. This research effort has focused on incremental analysis stages of development to assess the accuracy of the tool and the model resolution required to produce usable data. The first-stage balloon thermal analyses started with simple spherical balloon models with a limited number of nodes and expanded the number of nodes to determine the required model resolution. These models were then modified to include additional details such as load tapes. The second-stage analyses looked at natural-shaped Zero Pressure balloons. Load tapes were then added to these shapes, again with the goal of determining the required modeling accuracy by varying the number of gores. The third stage, following the same steps as the Zero Pressure balloon efforts, was directed at modeling super-pressure pumpkin-shaped balloons. The results were then used to develop analysis guidelines and an approach for modeling balloons for both simple first-order estimates and detailed full models. The development of the radiative environment and program input files, the development of the modeling techniques for balloons, and the development of appropriate data output handling techniques for both the raw data and data plots will be discussed. A general guideline to match predicted balloon performance with known flight data will also be presented. One long-term goal of this effort is to develop simplified approaches and techniques to include results in performance codes being developed.
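
    The "bulk thermal model" stage mentioned above reduces the balloon to a single film node with an energy balance; a sketch under invented film properties and fluxes is below. A real analysis resolves many nodes, view factors, and transmissive-film effects.

    ```python
    from scipy.integrate import solve_ivp

    sigma = 5.67e-8                     # Stefan-Boltzmann, W/(m^2 K^4)
    A_surf, A_proj = 1000.0, 250.0      # film / projected areas, m^2 (invented)
    alpha_s, eps_ir = 0.2, 0.8          # solar absorptance, IR emittance
    m_film, c_film = 80.0, 1300.0       # film mass (kg), specific heat (J/kg K)
    q_sun, q_earth = 1360.0, 240.0      # solar and upwelling IR fluxes, W/m^2
    h_conv, T_air = 2.0, 216.0          # convection coefficient, ambient K

    def film_temp_rate(t, y):
        """Single-node energy balance for the balloon film."""
        T = y[0]
        absorbed = (alpha_s * q_sun + eps_ir * q_earth) * A_proj
        emitted = eps_ir * sigma * A_surf * T**4
        convected = h_conv * A_surf * (T - T_air)
        return [(absorbed - emitted - convected) / (m_film * c_film)]

    sol = solve_ivp(film_temp_rate, (0.0, 7200.0), [230.0])
    print(sol.y[0, -1])                 # film temperature nearing equilibrium
    ```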

  17. Performance Models for the Spike Banded Linear System Solver

    DOE PAGES

    Manguoglu, Murat; Saied, Faisal; Sameh, Ahmed; ...

    2011-01-01

    With the availability of large-scale parallel platforms comprised of tens-of-thousands of processors and beyond, there is significant impetus for the development of scalable parallel sparse linear system solvers and preconditioners. An integral part of this design process is the development of performance models capable of predicting performance and providing accurate cost models for the solvers and preconditioners. There has been some work in the past on characterizing performance of the iterative solvers themselves. In this paper, we investigate the problem of characterizing performance and scalability of banded preconditioners. Recent work has demonstrated the superior convergence properties and robustness of banded preconditioners, compared to the state-of-the-art ILU family of preconditioners as well as algebraic multigrid preconditioners. Furthermore, when used in conjunction with efficient banded solvers, banded preconditioners are capable of significantly faster time-to-solution. Our banded solver, the Truncated Spike algorithm, is specifically designed for parallel performance and tolerance to deep memory hierarchies. Its regular structure is also highly amenable to accurate performance characterization. Using these characteristics, we derive the following results in this paper: (i) we develop parallel formulations of the Truncated Spike solver, (ii) we develop a highly accurate pseudo-analytical parallel performance model for our solver, and (iii) we show excellent prediction capabilities of our model, based on which we argue the high scalability of our solver. Our pseudo-analytical performance model is based on analytical performance characterization of each phase of our solver. These analytical models are then parameterized using actual runtime information on target platforms. An important consequence of our performance models is that they reveal underlying performance bottlenecks in both serial and parallel formulations. All of our results are validated on diverse heterogeneous multiclusters, platforms for which performance prediction is particularly challenging. Finally, we predict the scalability of the Spike algorithm up to 65,536 cores using our model. This paper extends the results presented at the Ninth International Symposium on Parallel and Distributed Computing.
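
    The "pseudo-analytical" approach described here writes each solver phase as an analytical cost term and fits the machine constants to measured runtimes. The sketch below mimics that pattern with invented cost terms and timings; the real model's terms are specific to the Truncated Spike phases and target platforms.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def spike_time(p, t_flop, t_byte, t_lat, n=1e7, b=65):
        """Illustrative per-phase cost model for a banded solve on p cores."""
        factor = t_flop * n * b**2 / p               # local factorization work
        reduced = t_flop * p * b**3                  # reduced-system solve
        comm = (t_lat + t_byte * b**2) * np.log2(p)  # tree communication
        return factor + reduced + comm

    # Invented timing runs used to parameterize the machine constants
    p_runs = np.array([64.0, 128.0, 256.0, 512.0, 1024.0])
    t_runs = np.array([4.1, 2.2, 1.3, 0.9, 0.8])
    consts, _ = curve_fit(spike_time, p_runs, t_runs,
                          p0=[1e-9, 1e-9, 1e-3], bounds=(0.0, np.inf))
    print(spike_time(65536.0, *consts))   # extrapolated prediction
    ```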

  18. Testing algorithms for a passenger train braking performance model.

    DOT National Transportation Integrated Search

    2011-09-01

    "The Federal Railroad Administrations Office of Research and Development funded a project to establish performance model to develop, analyze, and test positive train control (PTC) braking algorithms for passenger train operations. With a good brak...

  19. Analysis of high vacuum systems using SINDA'85

    NASA Technical Reports Server (NTRS)

    Spivey, R. A.; Clanton, S. E.; Moore, J. D.

    1993-01-01

    The theory, algorithms, and test data correlation analysis of a math model developed to predict performance of the Space Station Freedom Vacuum Exhaust System are presented. The theory used to predict the flow characteristics of viscous, transition, and molecular flow is presented in detail. Development of user subroutines which predict the flow characteristics in conjunction with the SINDA'85/FLUINT analysis software is discussed. The resistance-capacitance network approach with application to vacuum system analysis is demonstrated, and results from the model are correlated with test data. The model was developed to predict the performance of the Space Station Freedom Vacuum Exhaust System. However, the unique use of the user subroutines developed in this model and written into the SINDA'85/FLUINT thermal analysis model provides a powerful tool that can be used to predict the transient performance of vacuum systems and gas flow in tubes of virtually any geometry. This can be accomplished using a resistance-capacitance (R-C) method very similar to the methods used to perform thermal analyses.
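
    The R-C analogy described here treats conductances as resistors and volumes as capacitors; a one-node pump-down sketch is below. The conductance formula is the standard molecular-flow expression for air in a long round tube, applied throughout for brevity even though a full model (like the one described) switches correlations between viscous, transition, and molecular regimes; all dimensions are invented.

    ```python
    from scipy.integrate import solve_ivp

    V = 100.0                           # chamber volume, liters
    d, L = 2.0, 100.0                   # tube diameter and length, cm
    C = 12.1 * d**3 / L                 # molecular-flow conductance, L/s (air)
    S = 50.0                            # pump speed, L/s
    S_eff = 1.0 / (1.0 / C + 1.0 / S)   # series "resistances" 1/C and 1/S add

    def dP_dt(t, P):
        """Capacity V discharges through the effective conductance S_eff."""
        return [-(S_eff / V) * P[0]]

    sol = solve_ivp(dP_dt, (0.0, 3600.0), [760.0], dense_output=True)
    print(sol.sol(600.0)[0])            # chamber pressure (torr) at t = 10 min
    ```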

  20. Performance-Based Service Quality Model: An Empirical Study on Japanese Universities

    ERIC Educational Resources Information Center

    Sultan, Parves; Wong, Ho

    2010-01-01

    Purpose: This paper aims to develop and empirically test a performance-based higher education service quality model. Design/methodology/approach: The study develops a 67-item instrument for measuring performance-based service quality with a particular focus on the higher education sector. Scale reliability is confirmed using Cronbach's alpha.…

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rainer, Leo I.; Hoeschele, Marc A.; Apte, Michael G.

    This report addresses the results of detailed monitoring completed under Program Element 6 of Lawrence Berkeley National Laboratory's High Performance Commercial Building Systems (HPCBS) PIER program. The purpose of the Energy Simulations and Projected State-Wide Energy Savings project is to develop reasonable energy performance and cost models for high performance relocatable classrooms (RCs) across California climates. A key objective of the energy monitoring was to validate DOE2 simulations for comparison to initial DOE2 performance projections. The validated DOE2 model was then used to develop statewide savings projections by modeling base case and high performance RC operation in the 16 California climate zones. The primary objective of this phase of work was to utilize detailed field monitoring data to modify DOE2 inputs and generate performance projections based on a validated simulation model. Additional objectives include the following: (1) Obtain comparative performance data on base case and high performance HVAC systems to determine how they are operated, how they perform, and how the occupants respond to the advanced systems. This was accomplished by installing both HVAC systems side-by-side (i.e., one per module of a standard two-module, 24 ft by 40 ft RC) on the study RCs and switching HVAC operating modes on a weekly basis. (2) Develop projected statewide energy and demand impacts based on the validated DOE2 model. (3) Develop cost-effectiveness projections for the high performance HVAC system in the 16 California climate zones.

  2. Models and techniques for evaluating the effectiveness of aircraft computing systems

    NASA Technical Reports Server (NTRS)

    Meyer, J. F.

    1978-01-01

    Progress in the development of system models and techniques for the formulation and evaluation of aircraft computer system effectiveness is reported. Topics covered include: analysis of functional dependence; a prototype software package, METAPHOR, developed to aid the evaluation of performability; and a comprehensive performability modeling and evaluation exercise involving the SIFT computer.

  3. An Evaluation Model for Competency Based Teacher Preparatory Programs.

    ERIC Educational Resources Information Center

    Denton, Jon J.

    This discussion describes an evaluation model designed to complement a curriculum development project, the primary goal of which is to structure a performance-based program for preservice teachers. Data collected from the implementation of this four-phase model can be used to make decisions for developing and changing performance objectives and…

  4. Structural-Thermal-Optical-Performance (STOP) Model Development and Analysis of a Field-widened Michelson Interferometer

    NASA Technical Reports Server (NTRS)

    Scola, Salvatore J.; Osmundsen, James F.; Murchison, Luke S.; Davis, Warren T.; Fody, Joshua M.; Boyer, Charles M.; Cook, Anthony L.; Hostetler, Chris A.; Seaman, Shane T.; Miller, Ian J.

    2014-01-01

    An integrated Structural-Thermal-Optical-Performance (STOP) model was developed for a field-widened Michelson interferometer which is being built and tested for the High Spectral Resolution Lidar (HSRL) project at NASA Langley Research Center (LaRC). The performance of the interferometer is highly sensitive to thermal expansion, changes in refractive index with temperature, temperature gradients, and deformation due to mounting stresses. Hand calculations can only predict system performance for uniform temperature changes, under the assumption that coefficient of thermal expansion (CTE) mismatch effects are negligible. An integrated STOP model was developed to investigate the effects of design modifications on the performance of the interferometer in detail, including CTE mismatch and other three-dimensional effects. The model will be used to improve the design for a future spaceflight version of the interferometer. The STOP model was developed using the Comet SimApp™ Authoring Workspace, which performs automated integration between Pro-Engineer®, Thermal Desktop®, MSC Nastran™, SigFit™, Code V™, and MATLAB®. This is the first flight project for which LaRC has utilized Comet, and it allows a larger trade space to be studied in a shorter time than would be possible in a traditional STOP analysis. This paper describes the development of the STOP model, presents a comparison of STOP results for simple cases with hand calculations, and presents results of the effort to correlate the model with bench-top testing of the interferometer. A trade study conducted with the STOP model, which demonstrates a few simple design changes that can improve the performance seen in the lab, is also presented.

  5. Development of the NASA Digital Astronaut Project Muscle Model

    NASA Technical Reports Server (NTRS)

    Lewandowski, Beth E.; Pennline, James A.; Thompson, W. K.; Humphreys, B. T.; Ryder, J. W.; Ploutz-Snyder, L. L.; Mulugeta, L.

    2015-01-01

    This abstract describes development work performed on the NASA Digital Astronaut Project Muscle Model. Muscle atrophy is a known physiological response to exposure to a low gravity environment. The DAP muscle model computationally predicts the change in muscle structure and function vs. time in a reduced gravity environment. The spaceflight muscle model can then be used in biomechanical models of exercise countermeasures and spaceflight tasks to: 1) develop site specific bone loading input to the DAP bone adaptation model over the course of a mission; 2) predict astronaut performance of spaceflight tasks; 3) inform effectiveness of new exercise countermeasures concepts.

  6. Evaluating Internal Model Strength and Performance of Myoelectric Prosthesis Control Strategies.

    PubMed

    Shehata, Ahmed W; Scheme, Erik J; Sensinger, Jonathon W

    2018-05-01

    On-going developments in myoelectric prosthesis control have provided prosthesis users with an assortment of control strategies that vary in reliability and performance. Many studies have focused on improving performance by providing feedback to the user but have overlooked the effect of this feedback on internal model development, which is key to improving long-term performance. In this paper, the strength of internal models developed for two commonly used myoelectric control strategies, raw control with raw feedback (using a regression-based approach) and filtered control with filtered feedback (using a classifier-based approach), were evaluated using two psychometric measures: trial-by-trial adaptation and just-noticeable difference. The performance of both strategies was also evaluated using a Schmidt's style target acquisition task. Results obtained from 24 able-bodied subjects showed that although filtered control with filtered feedback had better short-term performance in path efficiency, raw control with raw feedback resulted in stronger internal model development, which may lead to better long-term performance. Despite inherent noise in the control signals of the regression controller, these findings suggest that rich feedback associated with regression control may be used to improve human understanding of the myoelectric control system.
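
    The abstract does not spell out how its psychometric measures are computed, but trial-by-trial adaptation is commonly estimated from how strongly each trial's correction tracks the previous trial's error. A hedged sketch of one such estimate on synthetic data; the function name, regression formulation, and error values are illustrative assumptions, not necessarily the authors' exact procedure:

    ```python
    import numpy as np

    def adaptation_rate(target_error):
        """Estimate trial-by-trial adaptation as the negative slope of the
        trial-to-trial change in error regressed on the previous trial's
        error; values near 1 indicate near-complete correction, i.e. a
        strong internal model."""
        e = np.asarray(target_error, dtype=float)
        slope = np.polyfit(e[:-1], np.diff(e), 1)[0]
        return -slope

    # Synthetic per-trial aiming errors (degrees) for one subject
    errors = [4.0, 2.6, 1.9, 1.1, 0.9, 0.5, 0.6, 0.3]
    print(f"adaptation rate: {adaptation_rate(errors):.2f}")
    ```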

  7. A Multiscale Virtual Fabrication and Lattice Modeling Approach for the Fatigue Performance Prediction of Asphalt Concrete

    NASA Astrophysics Data System (ADS)

    Dehghan Banadaki, Arash

    Predicting the ultimate performance of asphalt concrete under realistic loading conditions is key to developing better-performing materials, designing long-lasting pavements, and performing reliable lifecycle analysis for pavements. The fatigue performance of asphalt concrete depends on the mechanical properties of the constituent materials, namely asphalt binder and aggregate. The link between performance and mechanical properties is extremely complex, and experimental techniques are often used to try to characterize the performance of hot mix asphalt. However, given the seemingly uncountable number of mixture designs and loading conditions, it is simply not economical to try to understand and characterize the material behavior solely by experimentation. It is well known that analytical and computational modeling methods can be combined with experimental techniques to reduce the costs associated with understanding and characterizing the mechanical behavior of the constituent materials. This study aims to develop a multiscale micromechanical lattice-based model to predict cracking in asphalt concrete using component material properties. The proposed algorithm, while capturing different phenomena at different scales, also minimizes the need for laboratory experiments. The developed methodology builds on a previously developed lattice model and the viscoelastic continuum damage model to link the component material properties to the mixture fatigue performance. The resulting lattice model is applied to predict the dynamic modulus mastercurves for different scales. A framework for capturing the so-called structuralization effects is introduced that significantly improves the accuracy of the modulus prediction. Furthermore, air voids are added to the model to help capture this important micromechanical feature, which affects the fatigue performance of asphalt concrete as well as the modulus value. The effects of rate dependency are captured by implementing the viscoelastic fracture criterion. Finally, an efficient cyclic loading framework is developed to evaluate the damage accumulation in the material caused by long-sustained cyclic loads.
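
    Since the abstract mentions predicting dynamic modulus mastercurves, a brief hedged illustration may help: the sketch below fits the standard sigmoidal mastercurve form log|E*| = delta + alpha / (1 + exp(beta + gamma·log f_r)) to synthetic shifted data. This common functional form is an assumption for illustration, not the study's lattice-based prediction itself:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def sigmoid_mastercurve(log_fr, delta, alpha, beta, gamma):
        """Sigmoidal dynamic-modulus mastercurve: returns log10|E*| as a
        function of log10 reduced frequency."""
        return delta + alpha / (1.0 + np.exp(beta + gamma * log_fr))

    # Hypothetical shifted |E*| data (log10 reduced frequency, log10 MPa)
    log_fr = np.linspace(-5, 5, 25)
    log_e = sigmoid_mastercurve(log_fr, 1.5, 2.8, -0.6, -0.5)
    log_e += np.random.default_rng(3).normal(0.0, 0.02, log_fr.size)

    params, _ = curve_fit(sigmoid_mastercurve, log_fr, log_e,
                          p0=[1.0, 3.0, 0.0, -0.5])
    print("fitted [delta, alpha, beta, gamma]:", params)
    ```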

  8. Development of a Stochastically-driven, Forward Predictive Performance Model for PEMFCs

    NASA Astrophysics Data System (ADS)

    Harvey, David Benjamin Paul

    A one-dimensional, multi-scale, coupled, transient, and mechanistic performance model for a PEMFC membrane electrode assembly has been developed. The model explicitly includes each of the five layers within a membrane electrode assembly and solves for the transport of charge, heat, mass, species, dissolved water, and liquid water. Key features of the model include a multi-step implementation of the hydrogen oxidation reaction (HOR) on the anode, agglomerate catalyst sub-models for both the anode and cathode catalyst layers, a unique approach that links the composition of the catalyst layer to key properties within the agglomerate model, and the implementation of a stochastic input-based approach for component material properties. The model employs a new methodology for validation using statistically varying input parameters and statistically based experimental performance data; this represents the first stochastic-input-driven unit cell performance model. The stochastic-input-driven performance model was used to identify optimal ionomer content within the cathode catalyst layer, demonstrate the role of material variation in potentially low-performing MEA materials, provide an explanation for the performance of low-Pt-loaded MEAs, and investigate the validity of transient-sweep experimental diagnostic methods.

  9. Assessing the feasibility, cost, and utility of developing models of human performance in aviation

    NASA Technical Reports Server (NTRS)

    Stillwell, William

    1990-01-01

    The purpose of the effort outlined in this briefing was to determine whether models exist or can be developed that can be used to address aviation automation issues. A multidisciplinary team has been assembled to undertake this effort, including experts in human performance, team/crew, and aviation system modeling, and aviation data used as input to such models. The project consists of two phases, a requirements assessment phase that is designed to determine the feasibility and utility of alternative modeling efforts, and a model development and evaluation phase that will seek to implement the plan (if a feasible cost effective development effort is found) that results from the first phase. Viewgraphs are given.

  10. Loss model for off-design performance analysis of radial turbines with pivoting-vane, variable-area stators

    NASA Technical Reports Server (NTRS)

    Meitner, P. L.; Glassman, A. J.

    1980-01-01

    An off-design performance loss model is developed for variable-area (pivoted vane) radial turbines. The variation in stator loss with stator area is determined by a viscous loss model while the variation in rotor loss due to stator area variation (for no stator end-clearance gap) is determined through analytical matching of experimental data. An incidence loss model is also based on matching of the experimental data. A stator vane end-clearance leakage model is developed and sample calculations are made to show the predicted effects of stator vane end-clearance leakage on performance.

  11. An Overview of Models of Speaking Performance and Its Implications for the Development of Procedural Framework for Diagnostic Speaking Tests

    ERIC Educational Resources Information Center

    Zhao, Zhongbao

    2013-01-01

    This paper aims at developing a procedural framework for the development and validation of diagnostic speaking tests. The researcher reviews the current available models of speaking performance, analyzes the distinctive features and then points out the implications for the development of a procedural framework for diagnostic speaking tests. On…

  12. A Unified Model of Performance: Validation of its Predictions across Different Sleep/Wake Schedules

    PubMed Central

    Ramakrishnan, Sridhar; Wesensten, Nancy J.; Balkin, Thomas J.; Reifman, Jaques

    2016-01-01

    Study Objectives: Historically, mathematical models of human neurobehavioral performance developed on data from one sleep study were limited to predicting performance in similar studies, restricting their practical utility. We recently developed a unified model of performance (UMP) to predict the effects of the continuum of sleep loss—from chronic sleep restriction (CSR) to total sleep deprivation (TSD) challenges—and validated it using data from two studies of one laboratory. Here, we significantly extended this effort by validating the UMP predictions across a wide range of sleep/wake schedules from different studies and laboratories. Methods: We developed the UMP on psychomotor vigilance task (PVT) lapse data from one study encompassing four different CSR conditions (7 d of 3, 5, 7, and 9 h of sleep/night), and predicted performance in five other studies (from four laboratories), including different combinations of TSD (40 to 88 h), CSR (2 to 6 h of sleep/night), control (8 to 10 h of sleep/night), and nap (nocturnal and diurnal) schedules. Results: The UMP accurately predicted PVT performance trends across 14 different sleep/wake conditions, yielding average prediction errors between 7% and 36%, with the predictions lying within 2 standard errors of the measured data 87% of the time. In addition, the UMP accurately predicted performance impairment (average error of 15%) for schedules (TSD and naps) not used in model development. Conclusions: The unified model of performance can be used as a tool to help design sleep/wake schedules to optimize the extent and duration of neurobehavioral performance and to accelerate recovery after sleep loss. Citation: Ramakrishnan S, Wesensten NJ, Balkin TJ, Reifman J. A unified model of performance: validation of its predictions across different sleep/wake schedules. SLEEP 2016;39(1):249–262. PMID:26518594
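
    As a hedged illustration of the validation metrics quoted above (average prediction error and the fraction of predictions lying within 2 standard errors of the data), the sketch below computes both on hypothetical PVT lapse values; the arrays and function name are illustrative assumptions, not the study's data:

    ```python
    import numpy as np

    def prediction_error_metrics(measured, predicted, se):
        """Summarize model fit the way the UMP validation does: average
        percent prediction error, and the fraction of points where the
        prediction lies within 2 standard errors of the measured data.
        All inputs are 1-D arrays over measurement sessions."""
        measured, predicted, se = map(np.asarray, (measured, predicted, se))
        avg_pct_error = np.mean(np.abs(predicted - measured) / measured) * 100.0
        coverage = np.mean(np.abs(predicted - measured) <= 2.0 * se)
        return avg_pct_error, coverage

    # Hypothetical PVT lapse data for one sleep-restriction condition
    measured = np.array([4.0, 6.5, 9.0, 11.5, 13.0])    # observed lapses/session
    predicted = np.array([4.5, 6.0, 9.8, 10.9, 14.1])   # model predictions
    se = np.array([0.8, 0.9, 1.0, 1.1, 1.2])            # standard errors of the data

    err, cov = prediction_error_metrics(measured, predicted, se)
    print(f"average prediction error: {err:.1f}%  coverage within 2 SE: {cov:.0%}")
    ```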

  13. Modeling and optimum time performance for concurrent processing

    NASA Technical Reports Server (NTRS)

    Mielke, Roland R.; Stoughton, John W.; Som, Sukhamoy

    1988-01-01

    The development of a new graph theoretic model for describing the relation between a decomposed algorithm and its execution in a data flow environment is presented. Called ATAMM, the model consists of a set of Petri net marked graphs useful for representing decision-free algorithms having large-grained, computationally complex primitive operations. Performance time measures which determine computing speed and throughput capacity are defined, and the ATAMM model is used to develop lower bounds for these times. A concurrent processing operating strategy for achieving optimum time performance is presented and illustrated by example.

  14. Models and techniques for evaluating the effectiveness of aircraft computing systems

    NASA Technical Reports Server (NTRS)

    Meyer, J. F.

    1977-01-01

    Models, measures and techniques were developed for evaluating the effectiveness of aircraft computing systems. The concept of effectiveness involves aspects of system performance, reliability and worth. Specifically, a detailed development of the model hierarchy at the mission, functional task, and computational task levels was carried out. An appropriate class of stochastic models was investigated which served as bottom-level models in the hierarchical scheme. A unified measure of effectiveness called 'performability' was defined and formulated.

  15. Compensator development and examination of performance and robustness

    NASA Technical Reports Server (NTRS)

    1985-01-01

    This research focuses on the development of compensators to control the mean square surface error of a wraprib antenna. The methodology is as follows: A model of appropriate size and structure is developed by looking at the convergence of functional gains for control and estimation. Then an LQG compensator is designed using this model. Finally, the compensator is simplified using balanced realization theory. In the conventional approach for compensator design, there is no mechanism for ensuring that the model is adequate for designing a compensator which will achieve the desired level of performance. It is shown here that both the model order and compensator order are directly related to the closed loop performance requirements for the system.

  16. Longitudinal predictors of aerobic performance in adolescent soccer players.

    PubMed

    Valente-dos-Santos, João; Coelho-e-Silva, Manuel J; Duarte, João; Figueiredo, António J; Liparotti, João R; Sherar, Lauren B; Elferink-Gemser, Marije T; Malina, Robert M

    2012-01-01

    The importance of aerobic performance in youth soccer is well established. The aim of the present study was to evaluate the contributions of chronological age (CA), skeletal age (SA), body size, and training to the longitudinal development of aerobic performance in youth male soccer players aged 10 to 18 years. Players (n=83) were followed annually for 5 years, resulting in an average of 4.4 observations per player. Decimal CA was calculated, and SA, stature, body weight, and aerobic performance were measured once per year. Fat-free mass (FFM) was estimated from age- and gender-specific anthropometric formulas, and annual training volume was recorded. After testing for multicollinearity, multilevel regression modeling was used to analyze the longitudinal data aligned by CA and SA (Models 1 and 2, respectively) and to develop aerobic performance scores. The following equations estimate aerobic performance for young soccer players: ŷ(Model 1) = 57.75 + 9.06 × centered CA − 0.57 × centered CA² + 0.03 × annual training volume (deviance from the null model = 388.50; P < 0.01), and ŷ(Model 2) = 13.03 + 4.04 × centered SA − 0.12 × centered SA² + 0.99 × FFM + 0.03 × annual training volume (deviance from the null model = 327.98; P < 0.01). The development of aerobic performance in young soccer players was found to be significantly related to CA, biological development, and volume of training.
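
    The two fitted equations above can be evaluated directly. A minimal sketch, assuming "centered" means the predictor minus its sample mean (a convention the abstract does not restate) and using hypothetical player values:

    ```python
    def aerobic_model1(centered_ca, annual_volume):
        """Fixed-effects part of Model 1 (CA-aligned) from the abstract:
        yhat = 57.75 + 9.06*CA_c - 0.57*CA_c**2 + 0.03*training volume."""
        return 57.75 + 9.06 * centered_ca - 0.57 * centered_ca**2 + 0.03 * annual_volume

    def aerobic_model2(centered_sa, ffm, annual_volume):
        """Fixed-effects part of Model 2 (SA-aligned):
        yhat = 13.03 + 4.04*SA_c - 0.12*SA_c**2 + 0.99*FFM + 0.03*training volume."""
        return 13.03 + 4.04 * centered_sa - 0.12 * centered_sa**2 + 0.99 * ffm + 0.03 * annual_volume

    # Hypothetical player: slightly older than the sample mean in both CA and
    # SA, 48 kg fat-free mass, 4000 min of annual training
    print(aerobic_model1(centered_ca=0.5, annual_volume=4000))
    print(aerobic_model2(centered_sa=0.8, ffm=48.0, annual_volume=4000))
    ```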

  17. Modeling of nitrate concentration in groundwater using artificial intelligence approach--a case study of Gaza coastal aquifer.

    PubMed

    Alagha, Jawad S; Said, Md Azlin Md; Mogheir, Yunes

    2014-01-01

    Nitrate concentration in groundwater is influenced by complex and interrelated variables, leading to great difficulty during the modeling process. The objectives of this study are (1) to evaluate the performance of two artificial intelligence (AI) techniques, namely artificial neural networks and support vector machine, in modeling groundwater nitrate concentration using scant input data, as well as (2) to assess the effect of data clustering as a pre-modeling technique on the developed models' performance. The AI models were developed using data from 22 municipal wells of the Gaza coastal aquifer in Palestine from 2000 to 2010. Results indicated high simulation performance, with the correlation coefficient and the mean absolute percentage error of the best model reaching 0.996 and 7 %, respectively. The variables that strongly influenced groundwater nitrate concentration were previous nitrate concentration, groundwater recharge, and on-ground nitrogen load of each land use land cover category in the well's vicinity. The results also demonstrated the merit of performing clustering of input data prior to the application of AI models. With their high performance and simplicity, the developed AI models can be effectively utilized to assess the effects of future management scenarios on groundwater nitrate concentration, leading to more reasonable groundwater resources management and decision-making.
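
    A minimal sketch of the clustering-before-modeling step the abstract credits: cluster the wells, then fit one small ANN per cluster and route new wells to their cluster's model. The data, cluster count, and network size are illustrative assumptions, and scikit-learn stands in for whatever tooling the authors used:

    ```python
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)
    # Hypothetical well records: [previous nitrate, recharge, on-ground N load]
    X = rng.random((200, 3))
    y = 40 * X[:, 0] + 10 * X[:, 1] + 25 * X[:, 2] + rng.normal(0.0, 2.0, 200)

    # Pre-modeling step: cluster the wells, then train one ANN per cluster
    km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
    models = {}
    for k in range(3):
        idx = km.labels_ == k
        models[k] = MLPRegressor(hidden_layer_sizes=(8,), max_iter=3000,
                                 random_state=0).fit(X[idx], y[idx])

    # Route a new well to its cluster's model before predicting
    x_new = np.array([[0.5, 0.2, 0.7]])
    k_new = int(km.predict(x_new)[0])
    print(f"cluster {k_new}, predicted nitrate: {models[k_new].predict(x_new)[0]:.1f}")
    ```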

  18. The Audience Performs: A Phenomenological Model for Criticism of Oral Interpretation Performance.

    ERIC Educational Resources Information Center

    Langellier, Kristin M.

    Richard Lanigan's phenomenology of human communication is applicable to the development of a model for critiquing oral interpretation performance. This phenomenological model takes conscious experience of the relationship of a person and the lived-world as its data base, and assumes a phenomenology of performance which creates text in the triadic…

  19. On Lack of Robustness in Hydrological Model Development Due to Absence of Guidelines for Selecting Calibration and Evaluation Data: Demonstration for Data-Driven Models

    NASA Astrophysics Data System (ADS)

    Zheng, Feifei; Maier, Holger R.; Wu, Wenyan; Dandy, Graeme C.; Gupta, Hoshin V.; Zhang, Tuqiao

    2018-02-01

    Hydrological models are used for a wide variety of engineering purposes, including streamflow forecasting and flood-risk estimation. To develop such models, it is common to allocate the available data to calibration and evaluation data subsets. Surprisingly, the issue of how this allocation can affect model evaluation performance has been largely ignored in the research literature. This paper discusses the evaluation performance bias that can arise from how available data are allocated to calibration and evaluation subsets. As a first step to assessing this issue in a statistically rigorous fashion, we present a comprehensive investigation of the influence of data allocation on the development of data-driven artificial neural network (ANN) models of streamflow. Four well-known formal data splitting methods are applied to 754 catchments from Australia and the U.S. to develop 902,483 ANN models. Results clearly show that the choice of the method used for data allocation has a significant impact on model performance, particularly for runoff data that are more highly skewed, highlighting the importance of considering the impact of data splitting when developing hydrological models. The statistical behavior of the data splitting methods investigated is discussed and guidance is offered on the selection of the most appropriate data splitting methods to achieve representative evaluation performance for streamflow data with different statistical properties. Although our results are obtained for data-driven models, they highlight the fact that this issue is likely to have a significant impact on all types of hydrological models, especially conceptual rainfall-runoff models.
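
    To make the allocation issue concrete, the sketch below compares a purely random calibration/evaluation split with a simple rank-based systematic split on a skewed synthetic runoff series; the systematic routine is a crude stand-in for the formal splitting schemes the paper evaluates, not a reproduction of them:

    ```python
    import numpy as np
    from scipy.stats import skew

    rng = np.random.default_rng(1)
    flows = rng.lognormal(mean=1.0, sigma=1.2, size=1000)   # skewed runoff series

    def random_split(y, frac=0.7, seed=0):
        """Allocate a random frac of the data to calibration."""
        idx = np.random.default_rng(seed).permutation(len(y))
        n = int(frac * len(y))
        return y[idx[:n]], y[idx[n:]]

    def systematic_split(y, frac=0.7):
        """Rank the data and deal roughly every k-th value to evaluation so
        both subsets span the full distribution."""
        order = np.argsort(y)
        k = round(1 / (1 - frac))
        eval_mask = np.zeros(len(y), bool)
        eval_mask[order[::k]] = True
        return y[~eval_mask], y[eval_mask]

    for name, (cal, ev) in {"random": random_split(flows),
                            "systematic": systematic_split(flows)}.items():
        print(f"{name:>10}: calib skew {skew(cal):5.2f}  eval skew {skew(ev):5.2f}")
    ```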

  20. Planetary Suit Hip Bearing Model for Predicting Design vs. Performance

    NASA Technical Reports Server (NTRS)

    Cowley, Matthew S.; Margerum, Sarah; Harvil, Lauren; Rajulu, Sudhakar

    2011-01-01

    Designing a planetary suit is very complex and often requires difficult trade-offs between performance, cost, mass, and system complexity. In order to verifying that new suit designs meet requirements, full prototypes must eventually be built and tested with human subjects. Using computer models early in the design phase of new hardware development can be advantageous, allowing virtual prototyping to take place. Having easily modifiable models of the suit hard sections may reduce the time it takes to make changes to the hardware designs and then to understand their impact on suit and human performance. A virtual design environment gives designers the ability to think outside the box and exhaust design possibilities before building and testing physical prototypes with human subjects. Reductions in prototyping and testing may eventually reduce development costs. This study is an attempt to develop computer models of the hard components of the suit with known physical characteristics, supplemented with human subject performance data. Objectives: The primary objective was to develop an articulating solid model of the Mark III hip bearings to be used for evaluating suit design performance of the hip joint. Methods: Solid models of a planetary prototype (Mark III) suit s hip bearings and brief section were reverse-engineered from the prototype. The performance of the models was then compared by evaluating the mobility performance differences between the nominal hardware configuration and hardware modifications. This was accomplished by gathering data from specific suited tasks. Subjects performed maximum flexion and abduction tasks while in a nominal suit bearing configuration and in three off-nominal configurations. Performance data for the hip were recorded using state-of-the-art motion capture technology. Results: The results demonstrate that solid models of planetary suit hard segments for use as a performance design tool is feasible. From a general trend perspective, the suited performance trends were comparable between the model and the suited subjects. With the three off-nominal bearing configurations compared to the nominal bearing configurations, human subjects showed decreases in hip flexion of 64%, 6%, and 13% and in hip abduction of 59%, 2%, and 20%. Likewise the solid model showed decreases in hip flexion of 58%, 1%, and 25% and in hip abduction of 56%, 0%, and 30%, under the same condition changes from the nominal configuration. Differences seen between the model predictions and the human subject performance data could be attributed to the model lacking dynamic elements and performing kinematic analysis only, the level of fit of the subjects with the suit, the levels of the subject s suit experience.

  1. External validation of a prediction model for surgical site infection after thoracolumbar spine surgery in a Western European cohort.

    PubMed

    Janssen, Daniël M C; van Kuijk, Sander M J; d'Aumerie, Boudewijn B; Willems, Paul C

    2018-05-16

    A prediction model for surgical site infection (SSI) after spine surgery was developed in 2014 by Lee et al. This model was developed to compute an individual estimate of the probability of SSI after spine surgery based on the patient's comorbidity profile and invasiveness of surgery. Before any prediction model can be validly implemented in daily medical practice, it should be externally validated to assess how the prediction model performs in patients sampled independently from the derivation cohort. We included 898 consecutive patients who underwent instrumented thoracolumbar spine surgery. Overall performance was quantified using Nagelkerke's R² statistic, and discriminative ability was quantified as the area under the receiver operating characteristic curve (AUC). We computed the calibration slope of the calibration plot to judge prediction accuracy. Sixty patients developed an SSI. The overall performance of the prediction model in our population was poor: Nagelkerke's R² was 0.01. The AUC was 0.61 (95% confidence interval (CI) 0.54-0.68). The estimated slope of the calibration plot was 0.52. The previously published prediction model showed poor performance in our academic external validation cohort. To predict SSI after instrumented thoracolumbar spine surgery for the present population, a better fitting prediction model should be developed.
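
    A hedged sketch of the three reported metrics (AUC, calibration slope, and Nagelkerke's R²) computed for an external cohort using their standard formulas; the synthetic outcomes and risks below are illustrative, not the study's data:

    ```python
    import numpy as np
    from sklearn.metrics import roc_auc_score
    import statsmodels.api as sm

    def external_validation(y, p):
        """y: observed 0/1 outcomes; p: probabilities from the previously
        published model. Returns AUC, calibration slope, Nagelkerke R^2."""
        auc = roc_auc_score(y, p)

        # Calibration slope: logistic regression of outcomes on the logit
        lp = np.log(p / (1 - p))
        slope = sm.Logit(y, sm.add_constant(lp)).fit(disp=0).params[1]

        # Nagelkerke R^2 from model vs null log-likelihoods
        n = len(y)
        ll_model = np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))
        p0 = y.mean()
        ll_null = n * (p0 * np.log(p0) + (1 - p0) * np.log(1 - p0))
        r2_cs = 1 - np.exp(2 * (ll_null - ll_model) / n)
        r2_nagelkerke = r2_cs / (1 - np.exp(2 * ll_null / n))
        return auc, slope, r2_nagelkerke

    # Synthetic external cohort: predicted risks deliberately miscalibrated
    rng = np.random.default_rng(2)
    p = rng.uniform(0.02, 0.4, 898)
    y = rng.binomial(1, 0.6 * p)
    print(external_validation(y, p))
    ```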

  2. Simulating the role of visual selective attention during the development of perceptual completion

    PubMed Central

    Schlesinger, Matthew; Amso, Dima; Johnson, Scott P.

    2014-01-01

    We recently proposed a multi-channel, image-filtering model for simulating the development of visual selective attention in young infants (Schlesinger, Amso & Johnson, 2007). The model not only captures the performance of 3-month-olds on a visual search task, but also implicates two cortical regions that may play a role in the development of visual selective attention. In the current simulation study, we used the same model to simulate 3-month-olds’ performance on a second measure, the perceptual unity task. Two parameters in the model – corresponding to areas in the occipital and parietal cortices – were systematically varied while the gaze patterns produced by the model were recorded and subsequently analyzed. Three key findings emerged from the simulation study. First, the model successfully replicated the performance of 3-month-olds on the unity perception task. Second, the model also helps to explain the improved performance of 2-month-olds when the size of the occluder in the unity perception task is reduced. Third, in contrast to our previous simulation results, variation in only one of the two cortical regions simulated (i.e. recurrent activity in posterior parietal cortex) resulted in a performance pattern that matched 3-month-olds. These findings provide additional support for our hypothesis that the development of perceptual completion in early infancy is promoted by progressive improvements in visual selective attention and oculomotor skill. PMID:23106728

  3. Simulating the role of visual selective attention during the development of perceptual completion.

    PubMed

    Schlesinger, Matthew; Amso, Dima; Johnson, Scott P

    2012-11-01

    We recently proposed a multi-channel, image-filtering model for simulating the development of visual selective attention in young infants (Schlesinger, Amso & Johnson, 2007). The model not only captures the performance of 3-month-olds on a visual search task, but also implicates two cortical regions that may play a role in the development of visual selective attention. In the current simulation study, we used the same model to simulate 3-month-olds' performance on a second measure, the perceptual unity task. Two parameters in the model - corresponding to areas in the occipital and parietal cortices - were systematically varied while the gaze patterns produced by the model were recorded and subsequently analyzed. Three key findings emerged from the simulation study. First, the model successfully replicated the performance of 3-month-olds on the unity perception task. Second, the model also helps to explain the improved performance of 2-month-olds when the size of the occluder in the unity perception task is reduced. Third, in contrast to our previous simulation results, variation in only one of the two cortical regions simulated (i.e. recurrent activity in posterior parietal cortex) resulted in a performance pattern that matched 3-month-olds. These findings provide additional support for our hypothesis that the development of perceptual completion in early infancy is promoted by progressive improvements in visual selective attention and oculomotor skill. © 2012 Blackwell Publishing Ltd.

  4. A Public-Private Partnership Develops and Externally Validates a 30-Day Hospital Readmission Risk Prediction Model

    PubMed Central

    Choudhry, Shahid A.; Li, Jing; Davis, Darcy; Erdmann, Cole; Sikka, Rishi; Sutariya, Bharat

    2013-01-01

    Introduction: Preventing the occurrence of hospital readmissions is needed to improve quality of care and foster population health across the care continuum. Hospitals are being held accountable for improving transitions of care to avert unnecessary readmissions. Advocate Health Care in Chicago and Cerner (ACC) collaborated to develop all-cause, 30-day hospital readmission risk prediction models to identify patients that need interventional resources. Ideally, prediction models should encompass several qualities: they should have high predictive ability; use reliable and clinically relevant data; use vigorous performance metrics to assess the models; be validated in populations where they are applied; and be scalable in heterogeneous populations. However, a systematic review of prediction models for hospital readmission risk determined that most performed poorly (average C-statistic of 0.66) and efforts to improve their performance are needed for widespread usage. Methods: The ACC team incorporated electronic health record data, utilized a mixed-method approach to evaluate risk factors, and externally validated their prediction models for generalizability. Inclusion and exclusion criteria were applied on the patient cohort and then split for derivation and internal validation. Stepwise logistic regression was performed to develop two predictive models: one for admission and one for discharge. The prediction models were assessed for discrimination ability, calibration, overall performance, and then externally validated. Results: The ACC Admission and Discharge Models demonstrated modest discrimination ability during derivation, internal and external validation post-recalibration (C-statistic of 0.76 and 0.78, respectively), and reasonable model fit during external validation for utility in heterogeneous populations. Conclusions: The ACC Admission and Discharge Models embody the design qualities of ideal prediction models. The ACC plans to continue its partnership to further improve and develop valuable clinical models. PMID:24224068

  5. Performance Testing of a Trace Contaminant Control Subassembly for the International Space Station

    NASA Technical Reports Server (NTRS)

    Perry, J. L.; Curtis, R. E.; Alexandre, K. L.; Ruggiero, L. L.; Shtessel, N.

    1998-01-01

    As part of the International Space Station (ISS) Trace Contaminant Control Subassembly (TCCS) development, a performance test has been conducted to provide reference data for flight verification analyses. This test, which used the U.S. Habitation Module (U.S. Hab) TCCS as the test article, was designed to add to the existing database on TCCS performance. Included in this database are results obtained during ISS development testing; testing of functionally similar TCCS prototype units; and bench scale testing of activated charcoal, oxidation catalyst, and granular lithium hydroxide (LiOH). The present database has served as the basis for the development and validation of a computerized TCCS process simulation model. This model serves as the primary means for verifying the ISS TCCS performance. In order to mitigate risk associated with this verification approach, the U.S. Hab TCCS performance test provides an additional set of data which serve to anchor both the process model and previously-obtained development test data to flight hardware performance. The following discussion provides relevant background followed by a summary of the test hardware, objectives, requirements, and facilities. Facility and test article performance during the test is summarized, test results are presented, and the TCCS's performance relative to past test experience is discussed. Performance predictions made with the TCCS process model are compared with the U.S. Hab TCCS test results to demonstrate its validation.

  6. Current practices in pavement performance modeling project 08-03 (C07) : task 4 report final summary of findings.

    DOT National Transportation Integrated Search

    2010-02-26

    In anticipation of developing pavement performance models as part of a proposed pavement management : system, the Pennsylvania Department of Transportation (PennDOT) initiated a study in 2009 to investigate : performance modeling activities and condi...

  7. ESPVI 4.0 ELECTROSTATIC PRECIPITATOR V-I AND PERFORMANCE MODEL: USER'S MANUAL

    EPA Science Inventory

    The manual is the companion document for the microcomputer program ESPVI 4.0, Electrostatic Precipitator V-I and Performance Model. The program was developed to provide a user-friendly interface to an advanced model of electrostatic precipitation (ESP) performance. The program i...

  8. Validating Human Performance Models of the Future Orion Crew Exploration Vehicle

    NASA Technical Reports Server (NTRS)

    Wong, Douglas T.; Walters, Brett; Fairey, Lisa

    2010-01-01

    NASA's Orion Crew Exploration Vehicle (CEV) will provide transportation for crew and cargo to and from destinations in support of the Constellation Architecture Design Reference Missions. Discrete Event Simulation (DES) is one of the design methods NASA employs to model crew performance of the CEV. During the early development of the CEV, NASA and its prime Orion contractor Lockheed Martin (LM) strove to find an effective low-cost method for developing and validating human performance DES models. This paper focuses on the method developed while creating a DES model for the CEV Rendezvous, Proximity Operations, and Docking (RPOD) task to the International Space Station. Our approach to validation was to attack the problem from several fronts. First, we began the development of the model early in the CEV design stage. Second, we adhered strictly to M&S development standards. Third, we involved the stakeholders, NASA astronauts, subject matter experts, and NASA's modeling and simulation development community throughout. Fourth, we applied standard and easy-to-conduct methods to ensure the model's accuracy. Lastly, we reviewed the data from an earlier human-in-the-loop RPOD simulation that had different objectives, which provided us an additional means to estimate the model's confidence level. The results revealed that a majority of the DES model was a reasonable representation of the current CEV design.

  9. LOX/hydrocarbon rocket engine analytical design methodology development and validation. Volume 1: Executive summary and technical narrative

    NASA Technical Reports Server (NTRS)

    Pieper, Jerry L.; Walker, Richard E.

    1993-01-01

    During the past three decades, an enormous amount of resources was expended in the design and development of Liquid Oxygen/Hydrocarbon and Hydrogen (LOX/HC and LOX/H2) rocket engines. A significant portion of these resources was used to develop and demonstrate the performance and combustion stability of each new engine. During these efforts, many analytical and empirical models were developed that characterize design parameters and combustion processes that influence performance and stability. Many of these models are suitable as design tools, but they have not been assembled into an industry-wide usable analytical design methodology. The objective of this program was to assemble existing performance and combustion stability models into a usable methodology capable of producing high performing and stable LOX/hydrocarbon and LOX/hydrogen propellant booster engines.

  10. Modeling of NASA's 30/20 GHz satellite communications system

    NASA Technical Reports Server (NTRS)

    Kwatra, S. C.; Maples, B. W.; Stevens, G. A.

    1984-01-01

    NASA is in the process of developing technology for a 30/20 GHz satellite communications link. Currently hardware is being assembled for a test transponder. A simulation package is being developed to study the link performance in the presence of interference and noise. This requires developing models for the components of the system. This paper describes techniques used to model the components for which data is available. Results of experiments performed using these models are described. A brief overview of NASA's 30/20 GHz communications satellite program is also included.

  11. Phase Two Feasibility Study for Software Safety Requirements Analysis Using Model Checking

    NASA Technical Reports Server (NTRS)

    Turgeon, Gregory; Price, Petra

    2010-01-01

    A feasibility study was performed on a representative aerospace system to determine the following: (1) the benefits and limitations of using SCADE, a commercially available tool for model checking, in comparison to using a proprietary tool that was studied previously [1], and (2) metrics for performing the model checking and for assessing the findings. This study was performed independently of the development task by a group unfamiliar with the system, providing a fresh, external perspective free from development bias.

  12. A model for evaluating the social performance of construction waste management.

    PubMed

    Yuan, Hongping

    2012-06-01

    Existing literature shows that considerable research effort has been devoted to the economic performance of construction waste management (CWM), while less attention has been paid to the social performance of CWM. This study therefore attempts to develop a model for quantitatively evaluating the social performance of CWM by using a system dynamics (SD) approach. First, major variables affecting the social performance of CWM are identified and a holistic system for assessing the social performance of CWM is formulated in line with the feedback relationships underlying these variables. The developed system is then converted into an SD model through the software iThink. An empirical case study is finally conducted to demonstrate application of the model. Results of model validation indicate that the model is robust and reasonable to reflect the situation of the real system under study. Findings of the case study offer helpful insights into effectively promoting the social performance of CWM of the project investigated. Furthermore, the model exhibits great potential to function as an experimental platform for dynamically evaluating the effects of management measures on improving the social performance of CWM of construction projects. Copyright © 2012 Elsevier Ltd. All rights reserved.
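
    For readers unfamiliar with system dynamics, the sketch below shows the bare mechanics of the kind of stock-flow simulation iThink automates: stocks updated each time step by flows that depend on other stocks. The two-stock structure and every number are invented for illustration and are not taken from the paper's CWM model:

    ```python
    import numpy as np

    def simulate_social_performance(steps=120, dt=0.25,
                                    sorting_rate=0.6, awareness_gain=0.02):
        """A deliberately tiny stock-flow sketch: one stock for public
        awareness of CWM and one for social performance, coupled by a
        reinforcing feedback loop and integrated with Euler steps."""
        awareness, performance = 0.2, 0.3        # initial stock levels (0..1)
        history = []
        for _ in range(steps):
            # Better performance raises awareness; awareness raises
            # waste-sorting effort, which in turn raises performance.
            d_awareness = awareness_gain * performance * (1 - awareness)
            d_performance = 0.05 * sorting_rate * awareness * (1 - performance)
            awareness += d_awareness * dt
            performance += d_performance * dt
            history.append(performance)
        return np.array(history)

    print(f"final social-performance level: {simulate_social_performance()[-1]:.3f}")
    ```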

  13. Estuarine modeling: Does a higher grid resolution improve model performance?

    EPA Science Inventory

    Ecological models are useful tools to explore cause-effect relationships, test hypotheses and perform management scenarios. A mathematical model, the Gulf of Mexico Dissolved Oxygen Model (GoMDOM), has been developed and applied to the Louisiana continental shelf of the northern ...

  14. Nonequilibrium radiation and chemistry models for aerocapture vehicle flowfields

    NASA Technical Reports Server (NTRS)

    Carlson, Leland A.

    1991-01-01

    The primary tasks performed are: (1) the development of a second order local thermodynamic nonequilibrium (LTNE) model for atoms; (2) the continued development of vibrational nonequilibrium models; and (3) the development of a new multicomponent diffusion model. In addition, studies comparing these new models with previous models and results were conducted and reported.

  15. FNAS/summer faculty fellowship research continuation program. Task 6: Integrated model development for liquid fueled rocket propulsion systems. Task 9: Aspects of model-based rocket engine condition monitoring and control

    NASA Technical Reports Server (NTRS)

    Santi, L. Michael; Helmicki, Arthur J.

    1993-01-01

    The objective of Phase I of this research effort was to develop an advanced mathematical-empirical model of SSME steady-state performance. Task 6 of Phase I was to develop a component-specific modification strategy for baseline case influence coefficient matrices. This report describes the background of SSME performance characteristics and provides a description of the control variable basis of three different gains models. The procedure used to establish influence coefficients for each of these three models is also described. Gains model analysis results are compared to Rocketdyne's power balance model (PBM).
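
    A minimal sketch of how influence coefficients can be extracted from any steady-state balance model by finite differences: perturb one control variable at a time and record the normalized response of each performance variable. The toy model below is a stand-in; the SSME power balance model itself is not reproduced here:

    ```python
    import numpy as np

    def influence_coefficients(balance_model, x0, rel_step=1e-3):
        """Finite-difference influence-coefficient matrix: entry (i, j) is
        the percent change in dependent variable i per percent change in
        independent variable j, evaluated at the baseline point x0.
        `balance_model` maps control variables to performance variables."""
        x0 = np.asarray(x0, float)
        y0 = np.asarray(balance_model(x0), float)
        K = np.empty((len(y0), len(x0)))
        for j in range(len(x0)):
            x = x0.copy()
            dx = rel_step * x0[j]
            x[j] += dx
            y = np.asarray(balance_model(x), float)
            K[:, j] = ((y - y0) / y0) / (dx / x0[j])
        return K

    # Toy two-input, two-output engine balance (illustrative only)
    toy = lambda x: [x[0]**1.2 * x[1]**0.3, x[1] / x[0]**0.1]
    print(influence_coefficients(toy, [1.0, 1.0]))   # approx [[1.2, 0.3], [-0.1, 1.0]]
    ```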

  16. Contribution to the modelling and analysis of logistics system performance by Petri nets and simulation models: Application in a supply chain

    NASA Astrophysics Data System (ADS)

    Azougagh, Yassine; Benhida, Khalid; Elfezazi, Said

    2016-02-01

    In this paper, the focus is on studying the performance of complex systems in a supply chain context by developing a structured modelling approach based on the ASDI methodology (Analysis, Specification, Design and Implementation), combining Petri net modelling with simulation in ARENA. The linear approach typically followed for this kind of problem runs into modelling difficulties because of the complexity and the number of parameters involved. The approach used in this work therefore structures the modelling so as to cover all aspects of the performance study. The structured modelling approach is first introduced and then applied to an industrial system in the phosphate industry. Results for the performance indicators obtained from the developed models made it possible to test the behaviour and fluctuations of this system and to develop improved models of the current situation. In addition, this paper shows how the Arena software can be used to simulate complex systems effectively. The method in this research can be applied to investigate various improvement scenarios and their consequences before implementing them in reality.

  17. A critical evaluation of various turbulence models as applied to internal fluid flows

    NASA Technical Reports Server (NTRS)

    Nallasamy, M.

    1985-01-01

    Models employed in the computation of turbulent flows are described and their application to internal flows is evaluated by examining the predictions of various turbulence models in selected flow configurations. The main conclusions are: (1) the k-epsilon model is used in a majority of all the two-dimensional flow calculations reported in the literature; (2) modified forms of the k-epsilon model improve the performance for flows with streamline curvature and heat transfer; (3) for flows with swirl, the k-epsilon model performs rather poorly; the algebraic stress model performs better in this case; and (4) for flows with regions of secondary flow (noncircular duct flows), the algebraic stress model performs fairly well for fully developed flow, for developing flow, the algebraic stress model performance is not good; a Reynolds stress model should be used. False diffusion and inlet boundary conditions are discussed. Countergradient transport and its implications in turbulence modeling is mentioned. Two examples of recirculating flow predictions obtained using PHOENICS code are discussed. The vortex method, large eddy simulation (modeling of subgrid scale Reynolds stresses), and direct simulation, are considered. Some recommendations for improving the model performance are made. The need for detailed experimental data in flows with strong curvature is emphasized.

  18. Autonomous Aerobraking: Thermal Analysis and Response Surface Development

    NASA Technical Reports Server (NTRS)

    Dec, John A.; Thornblom, Mark N.

    2011-01-01

    A high-fidelity thermal model of the Mars Reconnaissance Orbiter was developed for use in an autonomous aerobraking simulation study. Response surface equations were derived from the high-fidelity thermal model and integrated into the autonomous aerobraking simulation software. The high-fidelity thermal model was developed using the Thermal Desktop software and used in all phases of the analysis. The exclusive use of Thermal Desktop represented a change from previously developed aerobraking thermal analysis methodologies. Comparisons were made between the Thermal Desktop solutions and those developed for the previous aerobraking thermal analyses performed on the Mars Reconnaissance Orbiter during aerobraking operations. A variable sensitivity screening study was performed to reduce the number of variables carried in the response surface equations. Thermal analysis and response surface equation development were performed for autonomous aerobraking missions at Mars and Venus.

  19. Systems cost/performance analysis (study 2.3). Volume 2: Systems cost/performance model. [unmanned automated payload programs and program planning

    NASA Technical Reports Server (NTRS)

    Campbell, B. H.

    1974-01-01

    A methodology for the balanced design of spacecraft subsystems, which interrelates cost, performance, safety, and schedule considerations, was refined. The methodology consists of a two-step process: the first step is one of selecting all hardware designs which satisfy the given performance and safety requirements; the second step is one of estimating the cost and schedule required to design, build, and operate each spacecraft design. Using this methodology to develop a systems cost/performance model allows the user of such a model to establish specific designs and the related costs and schedule. The user is able to determine the sensitivity of design, costs, and schedules to changes in requirements. The resulting systems cost/performance model is described and implemented as a digital computer program.
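
    A minimal sketch of the two-step process described above: filter candidate designs against performance and safety requirements, then estimate cost for the survivors. The Design fields, thresholds, and cost relation are all invented for illustration; the actual model's estimating relationships are not reproduced in this report:

    ```python
    from dataclasses import dataclass

    @dataclass
    class Design:
        name: str
        pointing_error_deg: float
        power_w: float
        redundancy: int
        unit_cost_musd: float

    # Step 1: keep only designs meeting performance and safety requirements
    candidates = [
        Design("A", 0.05, 320, 1, 14.0),
        Design("B", 0.02, 410, 2, 22.0),
        Design("C", 0.08, 250, 2, 11.0),
    ]
    feasible = [d for d in candidates
                if d.pointing_error_deg <= 0.05 and d.redundancy >= 1]

    # Step 2: estimate program cost for each surviving design; a simple
    # parametric relation stands in for the model's estimating relationships
    def program_cost(d: Design) -> float:
        nonrecurring = 2.5 * d.unit_cost_musd
        power_penalty = 0.01 * max(0.0, d.power_w - 300)
        return nonrecurring + d.unit_cost_musd + power_penalty

    for d in sorted(feasible, key=program_cost):
        print(d.name, round(program_cost(d), 1), "M$")
    ```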

  20. Comparison of DNQ/novolac resists for e-beam exposure

    NASA Astrophysics Data System (ADS)

    Fedynyshyn, Theodore H.; Doran, Scott P.; Lind, Michele L.; Lyszczarz, Theodore M.; DiNatale, William F.; Lennon, Donna; Sauer, Charles A.; Meute, Jeff

    1999-12-01

    We have surveyed the commercial resist market with the dual purpose of identifying diazoquinone/novolac based resists that have potential for use as e-beam mask making resists and baselining these resists for comparison against future mask making resist candidates. For completeness, this survey would require that each resist be compared with an optimized developer and development process. To accomplish this task in an acceptable time period, e-beam lithography modeling was employed to quickly identify the resist and developer combinations that lead to superior resist performance. We describe the verification of a method to quickly screen commercial i-line resists with different developers, by determining modeling parameters for i-line resists from e-beam exposures, modeling the resist performance, and comparing predicted performance versus actual performance. We determined the lithographic performance of several DNQ/novolac resists whose modeled performance suggests that sensitivities of less than 40 µC/cm² coupled with less than 10-nm CD change per percent change in dose are possible for target 600-nm features. This was accomplished by performing a series of statistically designed experiments on the leading resist candidates to optimize processing variables, followed by comparing experimentally determined resist sensitivities, latitudes, and profiles of the DNQ/novolac resists at their optimized process.

  1. The Talent Development Middle School Model: Context, Components, and Initial Impacts on Students' Performance and Attendance

    ERIC Educational Resources Information Center

    Herlihy, Corinne M.; Kemple, James J.

    2004-01-01

    The Talent Development Middle School model was created to make a difference in struggling urban middle schools. The model is part of a trend in school improvement strategies whereby whole-school reform projects aim to improve performance and attendance outcomes for students through the use of major changes in both the organizational structure and…

  2. Applying model abstraction techniques to optimize monitoring networks for detecting subsurface contaminant transport

    USDA-ARS?s Scientific Manuscript database

    Improving strategies for monitoring subsurface contaminant transport includes performance comparison of competing models, developed independently or obtained via model abstraction. Model comparison and parameter discrimination involve specific performance indicators selected to better understand s...

  3. A Unified Model of Performance: Validation of its Predictions across Different Sleep/Wake Schedules.

    PubMed

    Ramakrishnan, Sridhar; Wesensten, Nancy J; Balkin, Thomas J; Reifman, Jaques

    2016-01-01

    Historically, mathematical models of human neurobehavioral performance developed on data from one sleep study were limited to predicting performance in similar studies, restricting their practical utility. We recently developed a unified model of performance (UMP) to predict the effects of the continuum of sleep loss-from chronic sleep restriction (CSR) to total sleep deprivation (TSD) challenges-and validated it using data from two studies of one laboratory. Here, we significantly extended this effort by validating the UMP predictions across a wide range of sleep/wake schedules from different studies and laboratories. We developed the UMP on psychomotor vigilance task (PVT) lapse data from one study encompassing four different CSR conditions (7 d of 3, 5, 7, and 9 h of sleep/night), and predicted performance in five other studies (from four laboratories), including different combinations of TSD (40 to 88 h), CSR (2 to 6 h of sleep/night), control (8 to 10 h of sleep/night), and nap (nocturnal and diurnal) schedules. The UMP accurately predicted PVT performance trends across 14 different sleep/wake conditions, yielding average prediction errors between 7% and 36%, with the predictions lying within 2 standard errors of the measured data 87% of the time. In addition, the UMP accurately predicted performance impairment (average error of 15%) for schedules (TSD and naps) not used in model development. The unified model of performance can be used as a tool to help design sleep/wake schedules to optimize the extent and duration of neurobehavioral performance and to accelerate recovery after sleep loss. © 2016 Associated Professional Sleep Societies, LLC.

  4. The impact of a freshman academy on science performance of first-time ninth-grade students at one Georgia high school

    NASA Astrophysics Data System (ADS)

    Daniel, Vivian Summerour

    The purpose of this within-group experimental study was to find out to what extent ninth-grade students improved their science performance beyond their middle school science performance at one Georgia high school utilizing a freshman academy model. Freshman academies have been recognized as a useful tool for increasing academic performance among ninth-grade students because they address a range of academic support initiatives tailored to improve academic performance among ninth-grade students. The talent development model developed by Legters, Balfanz, Jordan, and McPartland (2002) has served as a foundational standard for many ninth grade academy programs. A cornerstone feature of this model is the creation of small learning communities used to increase ninth-grade student performance. Another recommendation was to offer credit recovery opportunities for ninth graders along with creating parent and community involvement activities to increase academic success among ninth-grade students. While the site's program included some of the initiatives outlined by the talent development model, it did not utilize all of them. The study concluded that the academy did not show a definitive increase in academic performance among ninth-grade students since most students stayed within their original performance category.

  5. Administrator self-ratings of organization capacity and performance of healthy community development projects in Taiwan.

    PubMed

    Chen, Ching-Min; Hong, Mei-Chu; Hsu, Yu-Hsien

    2007-01-01

    To examine the relationship between the capacities of various community organizations and their performance scores for healthy community development. This cross-sectional study was conducted by examining all community organizations involved in the Taiwan national healthy community development project. Of 213 administrators contacted, 195 (a return rate of 91.6%) completed a self-administered questionnaire between October and November 2003. The research instrument was self-developed and based on the Donabedian model. It examined the capacity of the community organizations and their performance in developing a healthy community. The average overall healthy community development performance score was 5.0 on a 7-point semantic differential scale, with the structure variable rated as the lowest among the 3 subscales. Community organization capacities in the areas of funding, resources committed, citizen participation, and certain aspects of organizational leadership were found to be significantly related to healthy community development performance. Each of the regression models showed a different set of capacities for the community organization domains and explained between 25% and 33% of the variance in performance. The study validates the theoretical relationships among the concepts identified in the Donabedian model. Nursing interventions tailored to enhance resident citizen participation in order to promote community coalitions are strongly supported.

  6. Independent external validation of predictive models for urinary dysfunction following external beam radiotherapy of the prostate: Issues in model development and reporting.

    PubMed

    Yahya, Noorazrul; Ebert, Martin A; Bulsara, Max; Kennedy, Angel; Joseph, David J; Denham, James W

    2016-08-01

    Most predictive models are not sufficiently validated for prospective use. We performed independent external validation of published predictive models for urinary dysfunctions following radiotherapy of the prostate. Multivariable models developed to predict atomised and generalised urinary symptoms, both acute and late, were considered for validation using a dataset representing 754 participants from the TROG 03.04-RADAR trial. Endpoints and features were harmonised to match the predictive models. The overall performance, calibration and discrimination were assessed. 14 models from four publications were validated. The discrimination of the predictive models in an independent external validation cohort, measured using the area under the receiver operating characteristic (ROC) curve, ranged from 0.473 to 0.695, generally lower than in internal validation. Four models had an area under the ROC curve >0.6. Shrinkage was required for all predictive models' coefficients, ranging from -0.309 (prediction probability was inverse to observed proportion) to 0.823. Predictive models which include baseline symptoms as a feature produced the highest discrimination. Two models produced a predicted probability of 0 and 1 for all patients. Predictive models vary in performance and transferability, illustrating the need for improvements in model development and reporting. Several models showed reasonable potential, but efforts should be increased to improve performance. Baseline symptoms should always be considered as potential features for predictive models. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  7. Scaling Equations for Ballistic Modeling of Solid Rocket Motor Case Breach

    NASA Technical Reports Server (NTRS)

    McMillin, Joshua E.

    2006-01-01

    This paper explores the development of a series of scaling equations that can take a known nominal motor performance and scale it for small and growing case failures. This model was developed for the Malfunction-Turn Study as part of Return to Flight activities for the Space Shuttle program. To verify the model, data from the Challenger accident (STS-51L) were used. The model is able to predict the motor performance beyond the last recorded Challenger data and show how the failed right-hand booster would have performed if the vehicle had remained intact.
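
    As a hedged sketch of what such scaling can look like, the code below applies a textbook solid-rocket internal-ballistics relation in which a breach acts as added throat area and chamber pressure rebalances accordingly; this simple relation and all numbers are assumptions for illustration, not the study's actual scaling equations:

    ```python
    def breached_motor_state(pc_nom_psi, at_nozzle_in2, a_breach_in2, n=0.35):
        """Steady-state scaling for a case breach: with the breach treated
        as added throat area, chamber pressure rebalances as
        Pc = Pc_nom * (At / (At + Ab))**(1 / (1 - n)),
        where n is the propellant burn-rate exponent. Also returns the
        fraction of total mass flow escaping through the breach."""
        ratio = at_nozzle_in2 / (at_nozzle_in2 + a_breach_in2)
        pc = pc_nom_psi * ratio ** (1.0 / (1.0 - n))
        breach_flow_frac = a_breach_in2 / (at_nozzle_in2 + a_breach_in2)
        return pc, breach_flow_frac

    for ab in (0.0, 5.0, 20.0):   # growing breach area, in^2
        pc, frac = breached_motor_state(630.0, 1400.0, ab)
        print(f"breach {ab:5.1f} in^2: Pc = {pc:6.1f} psi, breach flow = {frac:.1%}")
    ```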

  8. Development of an Implantable WBAN Path-Loss Model for Capsule Endoscopy

    NASA Astrophysics Data System (ADS)

    Aoyagi, Takahiro; Takizawa, Kenichi; Kobayashi, Takehiko; Takada, Jun-Ichi; Hamaguchi, Kiyoshi; Kohno, Ryuji

    An implantable WBAN path-loss model for capsule endoscopy, which is used for examining digestive organs, is developed by conducting simulations and experiments. First, we performed FDTD simulations of implant WBAN propagation using a numerical human model. Second, we performed FDTD simulations on a vessel that represents the human body. Third, we performed experiments using a vessel of the same dimensions as that used in the simulations. On the basis of the results of these simulations and experiments, we propose gradient and intercept parameters for a simple path-loss model of in-body propagation.
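
    The "gradient and intercept parameters" language suggests the standard log-distance form PL(d) = PL(d0) + 10·n·log10(d/d0). A minimal sketch of fitting that form by least squares, on made-up depth/loss pairs rather than the paper's FDTD or vessel measurements:

    ```python
    import numpy as np

    def fit_log_distance_pathloss(d_mm, pl_db, d0_mm=10.0):
        """Least-squares fit of PL(d) = PL(d0) + 10*n*log10(d/d0),
        returning the intercept PL(d0) in dB and the gradient n."""
        x = 10.0 * np.log10(np.asarray(d_mm, float) / d0_mm)
        n, pl0 = np.polyfit(x, pl_db, 1)
        return pl0, n

    depths = [20, 40, 60, 80, 100]            # implant depth, mm (illustrative)
    losses = [38.1, 46.0, 51.2, 54.9, 57.8]   # measured path loss, dB (illustrative)
    pl0, n = fit_log_distance_pathloss(depths, losses)
    print(f"PL(d0) = {pl0:.1f} dB, gradient n = {n:.2f}")
    ```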

  9. Performance modeling of the effects of aperture phase error, turbulence, and thermal blooming on tiled subaperture systems

    NASA Astrophysics Data System (ADS)

    Leakeas, Charles L.; Capehart, Shay R.; Bartell, Richard J.; Cusumano, Salvatore J.; Whiteley, Matthew R.

    2011-06-01

    Laser weapon systems comprised of tiled subapertures are rapidly emerging in importance in the directed energy community. Performance models of these laser weapon systems have been developed from numerical simulations of a high-fidelity wave-optics code called WaveTrain, developed by MZA Associates. System characteristics such as mutual coherence, differential jitter, and beam quality rms wavefront error are defined for a focused beam on the target. Engagement scenarios are defined for various platform and target altitudes, speeds, headings, and slant ranges along with the natural wind speed and heading. Inputs to the performance model include platform and target height and velocities, Fried coherence length, Rytov number, isoplanatic angle, thermal blooming distortion number, Greenwood and Tyler frequencies, and atmospheric transmission. The performance model is fit by matching power-in-the-bucket (PIB) values against the PIB from the simulation results, with the vacuum diffraction-limited spot size as the bucket. The goal is to develop robust performance models for aperture phase error, turbulence, and thermal blooming effects in tiled subaperture systems.

  10. Marine atmospheric effects on electro-optical systems performance

    NASA Astrophysics Data System (ADS)

    Richter, Juergen H.; Hughes, Herbert G.

    1990-09-01

    For the past twelve years, a coordinated tri-service effort has been underway in the United States Department of Defense to provide an atmospheric effects assessment capability for existing and planned electro-optical (EO) systems. This paper reviews the exploratory development effort in the US Navy. A key responsibility for the Navy was the development of marine aerosol models. An initial model, the Navy Aerosol Model (NAM), was developed, tested, and transitioned into LOWTRAN 6. A more comprehensive model, the Navy Oceanic Vertical Aerosol Model (NOVAM), has been formulated and is presently undergoing extensive evaluation and testing. Marine aerosols and their extinction properties are only one important factor in EO systems performance assessment. For many EO systems applications, an accurate knowledge of marine background radiances is required in addition to considering the effects of the intervening atmosphere. Accordingly, a capability was developed to estimate the apparent sea surface radiance for different sea states and meteorological conditions. Also, an empirical relationship was developed which directly relates apparent mean sea temperature to calculated mean sky temperature. In situ measurements of relevant environmental parameters are essential for real-time EO systems performance assessment. Direct measurement of slant-path extinction would be most desirable. This motivated a careful investigation of lidar (light detection and ranging) techniques, including improvements to single-ended lidar profile inversion algorithms and development of new lidar techniques such as double-ended and dual-angle configurations. It was concluded that single-ended, single-frequency lidars cannot be used to infer slant-path extinction with the accuracy necessary to make meaningful performance assessments. Other lidar configurations may find limited application in model validation and research efforts. No technique has emerged yet which could be considered ready for shipboard implementation. A shipboard real-time performance assessment system was developed and named PREOS (Performance and Range for EO Systems). PREOS has been incorporated into the Navy's Tactical Environmental Support System (TESS). The present version of PREOS is a first step in accomplishing the complex task of real-time systems performance assessment. Improved target and background models are under development and will be incorporated into TESS when tested and validated. A reliable assessment capability can be used to develop Tactical Decision Aids (TDAs). TDAs permit optimum selection or combination of sensors and estimation of a ship's own vulnerability against hostile systems.

  11. ATAMM enhancement and multiprocessor performance evaluation

    NASA Technical Reports Server (NTRS)

    Stoughton, John W.; Mielke, Roland R.; Som, Sukhamoy; Obando, Rodrigo; Malekpour, Mahyar R.; Jones, Robert L., III; Mandala, Brij Mohan V.

    1991-01-01

    ATAMM (Algorithm To Architecture Mapping Model) enhancement and multiprocessor performance evaluation are discussed. The following topics are included: the ATAMM model; ATAMM enhancement; ADM (Advanced Development Model) implementation of ATAMM; and ATAMM support tools.

  12. Team deliberate practice in medicine and related domains: a consideration of the issues.

    PubMed

    Harris, Kevin R; Eccles, David W; Shatzer, John H

    2017-03-01

    A better understanding of the factors influencing medical team performance, and of what accounts for expert medical team performance, should benefit medical practice. Therefore, the aim here is to highlight key issues with using deliberate practice to improve medical team performance, especially given the success of deliberate practice for developing individual expert performance in medicine and other domains. Highlighting these issues will inform the development of training for medical teams. The authors first describe team coordination and its critical role in medical teams. Presented next are the cognitive mechanisms that allow expert performers to accurately interpret the current situation via the creation of an accurate mental "model" of the current situation, known as a situation model. Following this, the authors propose that effective team performance depends at least in part on team members having similar models of the situation, known as a shared situation model. The authors then propose guiding principles for implementing team deliberate practice in medicine and describe how team deliberate practice can be used in an attempt to reduce barriers, inherent in medical teams, to the development of shared situation models. The paper concludes with consideration of limitations and future research directions concerning the implementation of team deliberate practice within medicine.

  13. Reducing hydrologic model uncertainty in monthly streamflow predictions using multimodel combination

    NASA Astrophysics Data System (ADS)

    Li, Weihua; Sankarasubramanian, A.

    2012-12-01

    Model errors are inevitable in any prediction exercise. One approach that is currently gaining attention for reducing model errors is combining multiple models to develop improved predictions. The rationale behind this approach primarily lies in the premise that optimal weights can be derived for each model so that the resulting multimodel predictions are improved. A new dynamic approach (MM-1) is proposed that combines multiple hydrological models by evaluating their performance/skill contingent on the predictor state. We combine two hydrological models, the "abcd" model and the variable infiltration capacity (VIC) model, to develop multimodel streamflow predictions. To quantify precisely under what conditions multimodel combination results in improved predictions, we compare the multimodel scheme MM-1 with an optimal model combination scheme (MM-O) by employing both to predict streamflow generated from a known hydrologic model (the "abcd" model or the VIC model) with heteroscedastic error variance, as well as from a hydrologic model whose structure differs from that of the candidate models. Results from the study show that streamflow estimated from single models performed better than multimodels under almost no measurement error. However, under increased measurement errors and model structural misspecification, both multimodel schemes (MM-1 and MM-O) consistently performed better than the single-model predictions. Overall, MM-1 performs better than MM-O in predicting the monthly flow values as well as in predicting extreme monthly flows. Comparison of the weights obtained for each candidate model reveals that as measurement errors increase, MM-1 assigns weights equally across all the models, whereas MM-O always assigns higher weights to the candidate model that performed best during the calibration period. Applying the multimodel algorithms to predict streamflows over four different sites revealed that MM-1 performs better than all single models and the optimal model combination scheme, MM-O, in predicting the monthly flows as well as the flows during wetter months.
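
    A minimal sketch of the contrast between the two schemes, under the simplifying assumption that state-contingent weights are inverse mean-square errors computed over the nearest predictor states (synthetic data and toy "models", not the study's code):

    ```python
    # State-contingent multimodel weighting: score each model only on the
    # k past cases whose predictor state resembles the current one.
    import numpy as np

    rng = np.random.default_rng(1)
    x = rng.uniform(0, 1, 300)                      # predictor state (e.g., moisture)
    truth = np.sin(2 * np.pi * x)
    pred_a = truth + rng.normal(0, 0.1 + 0.4 * x)   # model A degrades for large x
    pred_b = truth + rng.normal(0, 0.5 - 0.4 * x)   # model B degrades for small x

    def state_weights(x0, k=30):
        near = np.argsort(np.abs(x - x0))[:k]       # k most similar past states
        inv_mse = [1.0 / np.mean((p[near] - truth[near]) ** 2) for p in (pred_a, pred_b)]
        return np.array(inv_mse) / np.sum(inv_mse)

    for x0 in (0.1, 0.5, 0.9):                      # weights shift with the state
        w = state_weights(x0)
        print(f"x={x0:.1f}: weight_A={w[0]:.2f}, weight_B={w[1]:.2f}")
    ```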

  14. Theory of constraints for publicly funded health systems.

    PubMed

    Sadat, Somayeh; Carter, Michael W; Golden, Brian

    2013-03-01

    Originally developed in the context of publicly traded for-profit companies, theory of constraints (TOC) improves system performance through leveraging the constraint(s). While the theory seems to be a natural fit for resource-constrained publicly funded health systems, there is a lack of literature addressing the modifications required to adopt TOC and define the goal and performance measures. This paper develops a system dynamics representation of the classical TOC's system-wide goal and performance measures for publicly traded for-profit companies, which forms the basis for developing a similar model for publicly funded health systems. The model is then expanded to include some of the factors that affect system performance, providing a framework to apply TOC's process of ongoing improvement in publicly funded health systems. Future research is required to more accurately define the factors affecting system performance and populate the model with evidence-based estimates for various parameters in order to use the model to guide TOC's process of ongoing improvement.

  15. Real-time individualization of the unified model of performance.

    PubMed

    Liu, Jianbo; Ramakrishnan, Sridhar; Laxminarayan, Srinivas; Balkin, Thomas J; Reifman, Jaques

    2017-12-01

    Existing mathematical models for predicting neurobehavioural performance are not suited for mobile computing platforms because they cannot adapt model parameters automatically in real time to reflect individual differences in the effects of sleep loss. We used an extended Kalman filter to develop a computationally efficient algorithm that continually adapts the parameters of the recently developed Unified Model of Performance (UMP) to an individual. The algorithm accomplishes this in real time as new performance data for the individual become available. We assessed the algorithm's performance by simulating real-time model individualization for 18 subjects who underwent 64 h of total sleep deprivation (TSD) and 7 days of chronic sleep restriction (CSR) with 3 h of time in bed per night, using psychomotor vigilance task (PVT) data collected every 2 h during wakefulness. This UMP individualization process produced parameter estimates that progressively approached the solution produced by a post-hoc fitting of model parameters using all data. The minimum number of PVT measurements needed to individualize the model parameters depended upon the type of sleep-loss challenge, with ~30 required for TSD and ~70 for CSR. However, model individualization depended upon the overall duration of data collection, yielding increasingly accurate model parameters with a greater number of days. Interestingly, reducing the PVT sampling frequency by a factor of two did not notably hamper model individualization. The proposed algorithm facilitates real-time learning of an individual's trait-like responses to sleep loss and enables the development of individualized performance prediction models for use in a mobile computing platform. © 2017 European Sleep Research Society.
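
    For illustration, a minimal sketch of Kalman-filter parameter individualization using a toy linear performance model in place of the UMP; the trait parameter, noise levels, and sampling schedule are assumptions (the model is linear in the parameter, so the plain Kalman update applies; the EKF generalizes via the Jacobian):

    ```python
    # Toy individualization: the unknown trait theta scales performance decline
    # with time awake; each new PVT observation updates the estimate in real time.
    import numpy as np

    rng = np.random.default_rng(2)
    theta_true, baseline = 0.8, 5.0
    theta_hat, P = 0.3, 1.0            # initial guess and its variance
    Q, R = 1e-4, 4.0                   # process (drift) and measurement noise

    for t in range(2, 64, 2):          # PVT every 2 h of wakefulness
        y = baseline + theta_true * t + rng.normal(0, np.sqrt(R))  # observed lapses
        P += Q                         # predict: parameter modeled as a random walk
        H = float(t)                   # observation sensitivity dy/dtheta
        K = P * H / (H * P * H + R)    # Kalman gain
        theta_hat += K * (y - (baseline + theta_hat * t))          # update
        P *= (1 - K * H)
    print(f"true theta = {theta_true}, estimate = {theta_hat:.3f}")
    ```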

  16. Enhancing Job Performance

    ERIC Educational Resources Information Center

    Devlin, Patricia

    2011-01-01

    The impact of the Self-Determined Career Development Model (hereafter called the Self-Determined Career Model) on the job performance of four adults with moderate intellectual disability employed in competitive work settings was examined. Employees learned to set work-related goals, develop an action plan, implement the plan, and adjust their…

  17. Shared Mental Models on the Performance of e-Learning Content Development Teams

    ERIC Educational Resources Information Center

    Jo, Il-Hyun

    2012-01-01

    The primary purpose of the study was to investigate team-based e-Learning content development projects from the perspective of the shared mental model (SMM) theory. The researcher conducted a study of 79 e-Learning content development teams in Korea to examine the relationship between taskwork and teamwork SMMs and the performance of the teams.…

  18. Technical Manual for the SAM Physical Trough Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wagner, M. J.; Gilman, P.

    2011-06-01

    NREL, in conjunction with Sandia National Lab and the U.S. Department of Energy, developed the System Advisor Model (SAM) analysis tool for renewable energy system performance and economic analysis. This paper documents the technical background and engineering formulation for one of the two parabolic trough system models in SAM. The Physical Trough model calculates performance relationships based on physical first principles where possible, allowing the modeler to predict electricity production for a wider range of component geometries than is possible in the Empirical Trough model. This document describes the major parabolic trough plant subsystems in detail, including the solar field, power block, thermal storage, piping, auxiliary heating, and control systems. The model makes use of both existing subsystem performance modeling approaches and new approaches developed specifically for SAM.

  19. Capabilities and performance of the new generation ice-sheet model Elmer/Ice

    NASA Astrophysics Data System (ADS)

    Gagliardini, O.; Zwinger, T.; Durand, G.; Favier, L.; de Fleurian, B.; Gillet-chaulet, F.; Seddik, H.; Greve, R.; Mallinen, M.; Martin, C.; Raback, P.; Ruokolainen, J.; Schäfer, M.; Thies, J.

    2012-12-01

    Since the Fourth IPCC Assessment Report, and its conclusion about the inability of ice-sheet flow models to forecast the current increase of polar ice sheet discharge and the associated contribution to sea-level rise, a huge development effort has been undertaken by the glaciological community. All around the world, models have been improved and, interestingly, a significant number of new ice-sheet models have emerged. Among them, the parallel finite-element model Elmer/Ice (based on the open-source multi-physics code Elmer) was one of the first full-Stokes models used to make projections of the future of the whole Greenland ice sheet for the coming two centuries. Originally developed to solve dedicated local ice flow problems of high mechanical and physical complexity, Elmer/Ice has today reached the maturity to solve larger scale problems, earning the status of an ice-sheet model. In this presentation, we summarise the almost 10 years of development performed by different groups. We present the components already included in Elmer/Ice, its numerical performance, selected applications, as well as developments planned for the future.

  20. Capabilities and performance of Elmer/Ice, a new generation ice-sheet model

    NASA Astrophysics Data System (ADS)

    Gagliardini, O.; Zwinger, T.; Gillet-Chaulet, F.; Durand, G.; Favier, L.; de Fleurian, B.; Greve, R.; Malinen, M.; Martín, C.; Råback, P.; Ruokolainen, J.; Sacchettini, M.; Schäfer, M.; Seddik, H.; Thies, J.

    2013-03-01

    The Fourth IPCC Assessment Report concluded that ice-sheet flow models are unable to forecast the current increase of polar ice sheet discharge and the associated contribution to sea-level rise. Since then, the glaciological community has undertaken a huge effort to develop and improve a new generation of ice-flow models, and as a result, a significant number of new ice-sheet models have emerged. Among them is the parallel finite-element model Elmer/Ice, based on the open-source multi-physics code Elmer. It was one of the first full-Stokes models used to make projections for the evolution of the whole Greenland ice sheet for the coming two centuries. Originally developed to solve local ice flow problems of high mechanical and physical complexity, Elmer/Ice has today reached the maturity to solve larger scale problems, earning the status of an ice-sheet model. Here, we summarise almost 10 yr of development performed by different groups. We present the components already included in Elmer/Ice, its numerical performance, selected applications, as well as developments planned for the future.

  1. A modified F-test for evaluating model performance by including both experimental and simulation uncertainties

    USDA-ARS?s Scientific Manuscript database

    Experimental and simulation uncertainties have not been included in many of the statistics used in assessing agricultural model performance. The objectives of this study were to develop an F-test that can be used to evaluate model performance considering experimental and simulation uncertainties, an...
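
    As context, a minimal sketch of the classical variance-ratio F-test such a modification builds on: does the model-versus-measurement deviation exceed what the experimental replicate variance alone would allow? The data are illustrative and the degrees of freedom deliberately simplistic; this is not the modified test itself:

    ```python
    # Variance-ratio F-test: model deviation variance vs pooled replicate variance.
    import numpy as np
    from scipy import stats

    obs_reps = np.array([[5.1, 4.9, 5.3],
                         [6.2, 6.0, 6.4],
                         [7.1, 6.8, 7.0]])          # replicated measurements per treatment
    sim = np.array([5.4, 6.5, 6.6])                 # model output per treatment

    s2_exp = obs_reps.var(axis=1, ddof=1).mean()            # pooled experimental variance
    s2_dev = np.mean((sim - obs_reps.mean(axis=1)) ** 2)    # model deviation variance

    F = s2_dev / s2_exp
    df1, df2 = len(sim), obs_reps.shape[0] * (obs_reps.shape[1] - 1)
    p = stats.f.sf(F, df1, df2)
    print(f"F = {F:.2f}, p = {p:.3f}")   # large F: model error beyond experimental noise
    ```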

  2. Fuel-efficient cruise performance model for general aviation piston engine airplanes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parkinson, R.C.H.

    1982-01-01

    The uses and limitations of typical Pilot Operating Handbook cruise performance data for constructing cruise performance models suitable for maximizing specific range are first examined. These data are found to be inadequate for constructing such models. A new model of General Aviation piston-prop airplane cruise performance is then developed. This model consists of two subsystem models: the airframe-propeller-atmosphere subsystem model and the engine-atmosphere subsystem model. The new model facilitates maximizing specific range and, by virtue of its simplicity and low data-storage requirements, appears suitable for airborne microprocessor implementation.
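
    For illustration, a minimal sketch of the optimization such a model supports, using a textbook parabolic drag polar with made-up numbers rather than the report's subsystem models. Specific range is distance flown per unit fuel, V divided by fuel flow:

    ```python
    # Specific range vs cruise speed for a toy piston-prop airplane.
    import numpy as np

    W, S, rho = 10000.0, 16.0, 1.0          # weight [N], wing area [m^2], air density
    CD0, k = 0.025, 0.055                   # parasite drag and induced-drag factor
    bsfc_over_eta = 1.0e-7                  # fuel mass per joule of thrust work (assumed)

    V = np.linspace(30, 90, 601)            # candidate cruise speeds [m/s]
    CL = W / (0.5 * rho * V**2 * S)
    D = 0.5 * rho * V**2 * S * (CD0 + k * CL**2)
    fuel_flow = bsfc_over_eta * D * V       # kg/s, proportional to drag power
    specific_range = V / fuel_flow          # m per kg of fuel

    best = np.argmax(specific_range)
    print(f"best cruise speed ~ {V[best]:.1f} m/s, "
          f"specific range ~ {specific_range[best] / 1000:.0f} km/kg")
    ```

    With fuel flow proportional to drag power, specific range reduces to 1/(c·D), so the optimum falls at the minimum-drag speed; a richer engine model shifts it.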

  3. POPEYE: A production rule-based model of multitask supervisory control (POPCORN)

    NASA Technical Reports Server (NTRS)

    Townsend, James T.; Kadlec, Helena; Kantowitz, Barry H.

    1988-01-01

    Recent studies of relationships between subjective ratings of mental workload, performance, and human operator and task characteristics have indicated that these relationships are quite complex. In order to study the various relationships and place subjective mental workload within a theoretical framework, we developed a production system model for the performance component of the complex supervisory task called POPCORN. The production system model is represented by a hierarchical structure of goals and subgoals, and the information flow is controlled by a set of condition-action rules. The implementation of this production system, called POPEYE, generates computer-simulated data under different task-difficulty conditions which are comparable to those of human operators performing the task. This model is the performance aspect of an overall dynamic psychological model which we are developing to examine and quantify relationships between performance and psychological aspects in a complex environment.
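
    A minimal sketch of a production system in this style: a working memory of goals plus condition-action rules fired in priority order. This is a generic toy, not the POPEYE rule base:

    ```python
    # Tiny production-rule interpreter: first matching rule fires each cycle.
    state = {"goal": "handle_event", "event_queue": ["alarm", "tracking"], "handled": []}

    rules = [  # (name, condition, action), in priority order
        ("pop_event",
         lambda s: s["goal"] == "handle_event" and s["event_queue"],
         lambda s: s.update(current=s["event_queue"].pop(0), goal="respond")),
        ("respond",
         lambda s: s["goal"] == "respond",
         lambda s: (s["handled"].append(s.pop("current")),
                    s.update(goal="handle_event"))),
        ("idle",
         lambda s: not s["event_queue"],
         lambda s: s.update(goal="done")),
    ]

    while state["goal"] != "done":
        for name, cond, act in rules:
            if cond(state):
                act(state)
                print(f"fired {name}: {state}")
                break
    ```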

  4. Performance measurement for people with multiple chronic conditions: conceptual model.

    PubMed

    Giovannetti, Erin R; Dy, Sydney; Leff, Bruce; Weston, Christine; Adams, Karen; Valuck, Tom B; Pittman, Aisha T; Blaum, Caroline S; McCann, Barbara A; Boyd, Cynthia M

    2013-10-01

    Improving quality of care for people with multiple chronic conditions (MCCs) requires performance measures reflecting the heterogeneity and scope of their care. Since most existing measures are disease specific, performance measures must be refined and new measures must be developed to address the complexity of care for those with MCCs. To describe development of the Performance Measurement for People with Multiple Chronic Conditions (PM-MCC) conceptual model. Framework development and a national stakeholder panel. We used reviews of existing conceptual frameworks of performance measurement, review of the literature on MCCs, input from experts in the multistakeholder Steering Committee, and public comment. The resulting model centers on the patient and family goals and preferences for care in the context of multiple care sites and providers, the type of care they are receiving, and the national priority domains for healthcare quality measurement. This model organizes measures into a comprehensive framework and identifies areas where measures are lacking. In this context, performance measures can be prioritized and implemented at different levels, in the context of patients' overall healthcare needs.

  5. Varying execution discipline to increase performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Campbell, P.L.; Maccabe, A.B.

    1993-12-22

    This research investigates the relationship between execution discipline and performance. The hypothesis has two parts: (1) different execution disciplines exhibit different performance for different computations, and (2) these differences can be effectively predicted by heuristics. A machine model is developed that can vary its execution discipline; that is, the model can execute a given program using either the control-driven, data-driven, or demand-driven execution discipline. This model is referred to as a "variable-execution-discipline" machine. The instruction set for the model is the Program Dependence Web (PDW). The first part of the hypothesis will be tested by simulating the execution of the machine model on a suite of computations based on the Livermore Fortran Kernel (LFK) Test (a.k.a. the Livermore Loops), using all three execution disciplines. Heuristics are developed to predict relative performance. These heuristics predict (a) the execution time under each discipline for one iteration of each loop and (b) the number of iterations taken by that loop; the heuristics then use those predictions to estimate the execution time of the entire loop. Similar calculations are performed for branch statements. The second part of the hypothesis will be tested by comparing the results of the simulated execution with the predictions produced by the heuristics. If the hypothesis is supported, then the door is open for the development of machines that can vary execution discipline to increase performance.

  6. Charge to Road Map Development Sessions

    NASA Technical Reports Server (NTRS)

    Barth, Janet

    2004-01-01

    Develop a road map for new standard radiation belt models. Model applications: spacecraft and instruments. Goals: reduce risk, reduce cost, improve performance, increase system lifetime, and reduce risk to astronauts.

  7. Local spatio-temporal analysis in vision systems

    NASA Astrophysics Data System (ADS)

    Geisler, Wilson S.; Bovik, Alan; Cormack, Lawrence; Ghosh, Joydeep; Gildeen, David

    1994-07-01

    The aims of this project are the following: (1) develop a physiologically and psychophysically based model of low-level human visual processing (a key component of which are local frequency coding mechanisms); (2) develop image models and image-processing methods based upon local frequency coding; (3) develop algorithms for performing certain complex visual tasks based upon local frequency representations, (4) develop models of human performance in certain complex tasks based upon our understanding of low-level processing; and (5) develop a computational testbed for implementing, evaluating and visualizing the proposed models and algorithms, using a massively parallel computer. Progress has been substantial on all aims. The highlights include the following: (1) completion of a number of psychophysical and physiological experiments revealing new, systematic and exciting properties of the primate (human and monkey) visual system; (2) further development of image models that can accurately represent the local frequency structure in complex images; (3) near completion in the construction of the Texas Active Vision Testbed; (4) development and testing of several new computer vision algorithms dealing with shape-from-texture, shape-from-stereo, and depth-from-focus; (5) implementation and evaluation of several new models of human visual performance; and (6) evaluation, purchase and installation of a MasPar parallel computer.

  8. Development and application of theoretical models for Rotating Detonation Engine flowfields

    NASA Astrophysics Data System (ADS)

    Fievisohn, Robert

    As turbine and rocket engine technology matures, performance increases between successive generations of engine development are becoming smaller. One means of accomplishing significant gains in thermodynamic performance and power density is to use detonation-based heat release instead of deflagration. This work is focused on developing and applying theoretical models to aid in the design and understanding of Rotating Detonation Engines (RDEs). In an RDE, a detonation wave travels circumferentially along the bottom of an annular chamber where continuous injection of fresh reactants sustains the detonation wave. RDEs are currently being designed, tested, and studied as a viable option for developing a new generation of turbine and rocket engines that make use of detonation heat release. One of the main challenges in the development of RDEs is to understand the complex flowfield inside the annular chamber. While simplified models are desirable for obtaining timely performance estimates for design analysis, one-dimensional models may not be adequate as they do not provide flow structure information. In this work, a two-dimensional physics-based model is developed, which is capable of modeling the curved oblique shock wave, exit swirl, counter-flow, detonation inclination, and varying pressure along the inflow boundary. This is accomplished by using a combination of shock-expansion theory, Chapman-Jouguet detonation theory, the Method of Characteristics (MOC), and other compressible flow equations to create a shock-fitted numerical algorithm and generate an RDE flowfield. This novel approach provides a numerically efficient model that can provide performance estimates as well as details of the large-scale flow structures in seconds on a personal computer. Results from this model are validated against high-fidelity numerical simulations that may require a high-performance computing framework to provide similar performance estimates. This work provides a designer a new tool to conduct large-scale parametric studies to optimize a design space before conducting computationally-intensive, high-fidelity simulations that may be used to examine additional effects. The work presented in this thesis not only bridges the gap between simple one-dimensional models and high-fidelity full numerical simulations, but it also provides an effective tool for understanding and exploring RDE flow processes.
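
    As a sketch of the Chapman-Jouguet ingredient named above (an ideal one-gamma gas with illustrative values; real RDE analyses use equilibrium thermochemistry), the CJ detonation Mach number follows in closed form from the dimensionless heat release:

    ```python
    # Ideal-gas CJ detonation: M_CJ = sqrt(1 + a) + sqrt(a), where
    # a = (gamma^2 - 1) * q / (2 * gamma * R * T1).
    import math

    gamma, R, T1 = 1.4, 290.0, 300.0   # ratio of specific heats, gas constant, inlet T (assumed)
    q = 3.0e6                          # heat release per unit mass [J/kg] (assumed)

    a = (gamma**2 - 1.0) * q / (2.0 * gamma * R * T1)
    M_cj = math.sqrt(1.0 + a) + math.sqrt(a)
    D_cj = M_cj * math.sqrt(gamma * R * T1)   # detonation speed = M_CJ * sound speed
    print(f"M_CJ = {M_cj:.2f}, D_CJ = {D_cj:.0f} m/s")
    ```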

  9. A three-step approach for the derivation and validation of high-performing predictive models using an operational dataset: congestive heart failure readmission case study.

    PubMed

    AbdelRahman, Samir E; Zhang, Mingyuan; Bray, Bruce E; Kawamoto, Kensaku

    2014-05-27

    The aim of this study was to propose an analytical approach to develop high-performing predictive models for congestive heart failure (CHF) readmission using an operational dataset with incomplete records and changing data over time. Our analytical approach involves three steps: pre-processing, systematic model development, and risk factor analysis. For pre-processing, variables that were absent in >50% of records were removed. Moreover, the dataset was divided into a validation dataset and derivation datasets, which were separated into three temporal subsets based on changes to the data over time. For systematic model development, using the different temporal datasets and the remaining explanatory variables, the models were developed by combining the use of various (i) statistical analyses to explore the relationships between the validation and the derivation datasets; (ii) adjustment methods for handling missing values; (iii) classifiers; (iv) feature selection methods; and (v) discretization methods. We then selected the best derivation dataset and the models with the highest predictive performance. For risk factor analysis, factors in the highest-performing predictive models were analyzed and ranked using (i) statistical analyses of the best derivation dataset, (ii) feature rankers, and (iii) a newly developed algorithm to categorize risk factors as being strong, regular, or weak. The analysis dataset consisted of 2,787 CHF hospitalizations at University of Utah Health Care from January 2003 to June 2013. In this study, we used the complete-case analysis and mean-based imputation adjustment methods; the wrapper subset feature selection method; and four ranking strategies based on information gain, gain ratio, symmetrical uncertainty, and wrapper subset feature evaluators. The best-performing models resulted from the use of a complete-case analysis derivation dataset combined with the Class-Attribute Contingency Coefficient discretization method and a voting classifier which averaged the results of multinomial logistic regression and voting feature intervals classifiers. Of 42 final model risk factors, discharge disposition, discretized age, and indicators of anemia were the most significant. This model achieved a c-statistic of 86.8%. The proposed three-step analytical approach enhanced predictive model performance for CHF readmissions. It could potentially be leveraged to improve predictive model performance in other areas of clinical medicine.
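
    A minimal sketch of the final ensemble step using scikit-learn stand-ins: soft voting over logistic regression and naive Bayes, with naive Bayes substituting for the Weka-style voting-feature-intervals classifier, on synthetic data rather than the CHF dataset:

    ```python
    # Soft-voting ensemble averaging predicted probabilities of two classifiers.
    from sklearn.datasets import make_classification
    from sklearn.ensemble import VotingClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.naive_bayes import GaussianNB
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import roc_auc_score

    X, y = make_classification(n_samples=2787, n_features=42, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

    ens = VotingClassifier(
        estimators=[("lr", LogisticRegression(max_iter=1000)), ("nb", GaussianNB())],
        voting="soft",                       # average predicted probabilities
    )
    ens.fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, ens.predict_proba(X_te)[:, 1])
    print(f"c-statistic on held-out data: {auc:.3f}")
    ```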

  10. Performance and cost benefits associated with nonimaging secondary concentrators used in point-focus dish solar thermal applications

    NASA Astrophysics Data System (ADS)

    O'Gallagher, J.; Winston, R.

    1987-09-01

    Using nonimaging secondary concentrators in point-focus applications may permit the development of more cost-effective concentrator systems by either improving performance or reducing costs. Secondaries may also increase design flexibility. The major objective of this study was to develop as complete an understanding as possible of the quantitative performance and cost effects associated with deploying nonimaging secondary concentrators at the focal zone of point-focus solar thermal concentrators. A performance model was developed that uses a Monte Carlo ray-trace procedure to determine the focal plane distribution of a paraboloidal primary as a function of optical parameters. It then calculates the corresponding optimized concentration and thermal efficiency as a function of temperature with and without the secondary. To examine the potential cost benefits associated with secondaries, a preliminary model for the rational optimization of performance versus cost trade-offs was developed. This model suggests a possible 10 to 20 percent reduction in the cost of delivered energy when secondaries are used. This is a lower limit, and the benefits may even be greater if using a secondary permits the development of inexpensive primary technologies for which the performance would not otherwise be viable.
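
    For illustration of why a secondary helps, a minimal sketch using the standard sine-law concentration bound and the classic focal-plane concentration of a paraboloidal primary. The effective source half-angle is an assumption; this is not the study's Monte Carlo model:

    ```python
    # Sine-law limit vs paraboloid focal-plane concentration.
    import math

    theta_s = math.radians(0.29)         # effective source half-angle incl. optical errors (assumed)

    C_limit = 1.0 / math.sin(theta_s) ** 2            # 3-D thermodynamic maximum

    def C_primary(phi_deg: float) -> float:
        phi = math.radians(phi_deg)
        # classic result: C = (sin(phi) * cos(phi) / sin(theta_s))^2
        return (math.sin(phi) * math.cos(phi) / math.sin(theta_s)) ** 2

    C1 = C_primary(45.0)                 # rim angle 45 deg maximizes the primary
    print(f"limit {C_limit:.0f}x, primary alone {C1:.0f}x, "
          f"ideal secondary gain up to ~{C_limit / C1:.1f}x")
    ```

    At a 45-degree rim angle the primary reaches only a quarter of the sine-law limit, which is the headroom an ideal nonimaging secondary can reclaim.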

  11. A new framework to enhance the interpretation of external validation studies of clinical prediction models.

    PubMed

    Debray, Thomas P A; Vergouwe, Yvonne; Koffijberg, Hendrik; Nieboer, Daan; Steyerberg, Ewout W; Moons, Karel G M

    2015-03-01

    It is widely acknowledged that the performance of diagnostic and prognostic prediction models should be assessed in external validation studies with independent data from "different but related" samples as compared with that of the development sample. We developed a framework of methodological steps and statistical methods for analyzing and enhancing the interpretation of results from external validation studies of prediction models. We propose to quantify the degree of relatedness between development and validation samples on a scale ranging from reproducibility to transportability by evaluating their corresponding case-mix differences. We subsequently assess the models' performance in the validation sample and interpret the performance in view of the case-mix differences. Finally, we may adjust the model to the validation setting. We illustrate this three-step framework with a prediction model for diagnosing deep venous thrombosis using three validation samples with varying case mix. While one external validation sample merely assessed the model's reproducibility, two other samples rather assessed model transportability. The performance in all validation samples was adequate, and the model did not require extensive updating to correct for miscalibration or poor fit to the validation settings. The proposed framework enhances the interpretation of findings at external validation of prediction models. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  12. Test techniques for model development of repetitive service energy storage capacitors

    NASA Astrophysics Data System (ADS)

    Thompson, M. C.; Mauldin, G. H.

    1984-03-01

    The performance of the Sandia perfluorocarbon family of energy storage capacitors was evaluated. The capacitors have a much lower charge noise signature, creating new instrumentation performance goals. Thermal response to power loading, and the importance of average and spot heating in the bulk regions, require technical advancements in real-time temperature measurement. Reduction and interpretation of thermal data are crucial to the accurate development of an intelligent thermal transport model. The thermal model is of prime interest in high-repetition-rate, high-average-power applications of power conditioning capacitors. The accurate identification of device parasitic parameters has ramifications for both average power-loss mechanisms and peak current delivery. Methods to determine the parasitic characteristics, their nonlinearities, and terminal effects are considered. Meaningful interpretations for model development, performance history, facility development, instrumentation, plans for the future, and present data are discussed.

  13. Scripting MODFLOW model development using Python and FloPy

    USGS Publications Warehouse

    Bakker, Mark; Post, Vincent E. A.; Langevin, Christian D.; Hughes, Joseph D.; White, Jeremy; Starn, Jeffrey; Fienen, Michael N.

    2016-01-01

    Graphical user interfaces (GUIs) are commonly used to construct and postprocess numerical groundwater flow and transport models. Scripting model development with the programming language Python is presented here as an alternative approach. One advantage of Python is that there are many packages available to facilitate the model development process, including packages for plotting, array manipulation, optimization, and data analysis. For MODFLOW-based models, the FloPy package was developed by the authors to construct model input files, run the model, and read and plot simulation results. Use of Python with the available scientific packages and FloPy facilitates data exploration, alternative model evaluations, and model analyses that can be difficult to perform with GUIs. Furthermore, Python scripts are a complete, transparent, and repeatable record of the modeling process. The approach is introduced with a simple FloPy example to create and postprocess a MODFLOW model. A more complicated capture-fraction analysis with a real-world model is presented to demonstrate the types of analyses that can be performed using Python and FloPy.
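
    A minimal sketch of scripted model construction with FloPy's classic MODFLOW-2005 interface; exact arguments vary with FloPy version, and running the model requires the mf2005 executable on the path:

    ```python
    # Build and write a small steady-state MODFLOW model from a script.
    import numpy as np
    import flopy

    m = flopy.modflow.Modflow(modelname="demo", exe_name="mf2005")
    flopy.modflow.ModflowDis(m, nlay=1, nrow=10, ncol=10,
                             delr=100.0, delc=100.0, top=10.0, botm=0.0)

    ibound = np.ones((1, 10, 10), dtype=int)
    ibound[0, :, 0] = ibound[0, :, -1] = -1          # fixed heads on west/east edges
    strt = np.full((1, 10, 10), 10.0)
    strt[0, :, -1] = 9.0                             # drive west-to-east flow
    flopy.modflow.ModflowBas(m, ibound=ibound, strt=strt)

    flopy.modflow.ModflowLpf(m, hk=10.0)             # hydraulic conductivity
    flopy.modflow.ModflowPcg(m)                      # solver
    flopy.modflow.ModflowOc(m)                       # output control
    m.write_input()                                  # writes the MODFLOW input files
    # m.run_model() executes MODFLOW if the mf2005 binary is available
    ```

    The script itself then serves as the complete, repeatable record of the modeling process that the abstract emphasizes.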

  14. Human Performance Models of Pilot Behavior

    NASA Technical Reports Server (NTRS)

    Foyle, David C.; Hooey, Becky L.; Byrne, Michael D.; Deutsch, Stephen; Lebiere, Christian; Leiden, Ken; Wickens, Christopher D.; Corker, Kevin M.

    2005-01-01

    Five modeling teams from industry and academia were chosen by the NASA Aviation Safety and Security Program to develop human performance models (HPM) of pilots performing taxi operations and runway instrument approaches with and without advanced displays. One representative from each team will serve as a panelist to discuss their team's model architecture, augmentations and advancements to HPMs, and aviation-safety-related lessons learned. Panelists will discuss how modeling results are influenced by a model's architecture and structure, the role of the external environment, specific modeling advances, and future directions and challenges for human performance modeling in aviation.

  15. Computer-Assisted Decision Support for Student Admissions Based on Their Predicted Academic Performance.

    PubMed

    Muratov, Eugene; Lewis, Margaret; Fourches, Denis; Tropsha, Alexander; Cox, Wendy C

    2017-04-01

    Objective. To develop predictive computational models forecasting the academic performance of students in the didactic-rich portion of a doctor of pharmacy (PharmD) curriculum as admission-assisting tools. Methods. All PharmD candidates over three admission cycles were divided into two groups: those who completed the PharmD program with a GPA ≥ 3, and the remaining candidates. The Random Forest machine learning technique was used to develop a binary classification model based on 11 pre-admission parameters. Results. Robust and externally predictive models were developed that had a particularly high overall accuracy of 77% for candidates with high or low academic performance. These multivariate models were more accurate in predicting these groups than models using undergraduate GPA and composite PCAT scores only. Conclusion. The models developed in this study can be used to improve the admission process as preliminary filters and thus quickly identify candidates who are likely to be successful in the PharmD curriculum.
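
    For illustration, a minimal sketch of the modeling step described, with synthetic stand-in data in place of the 11 pre-admission parameters:

    ```python
    # Random Forest binary classifier with cross-validated accuracy.
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    X, y = make_classification(n_samples=300, n_features=11, n_informative=5,
                               random_state=0)   # y=1 ~ finished with GPA >= 3
    clf = RandomForestClassifier(n_estimators=500, random_state=0)
    acc = cross_val_score(clf, X, y, cv=5, scoring="accuracy").mean()
    print(f"cross-validated accuracy: {acc:.2f}")
    ```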

  16. Marital Conflict, Allostatic Load, and the Development of Children's Fluid Cognitive Performance

    PubMed Central

    Hinnant, J. Benjamin; El-Sheikh, Mona; Keiley, Margaret; Buckhalt, Joseph A.

    2013-01-01

    Relations between marital conflict, children's respiratory sinus arrhythmia (RSA), and fluid cognitive performance were examined over three years to assess allostatic processes. Participants were 251 children who reported on marital conflict; baseline RSA and RSA reactivity (RSA-R) to a lab challenge were recorded, and fluid cognitive performance was measured using the Woodcock-Johnson III. A cross-lagged model showed that higher levels of marital conflict at age 8 predicted weaker RSA-R at age 9 for children with lower baseline RSA. A growth model showed that lower baseline RSA in conjunction with weaker RSA-R predicted the slowest development of fluid cognitive performance. Findings suggest that stress may affect the development of physiological systems regulating attention, which are tied to the development of fluid cognitive performance. PMID:23534537

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mou, J.I.; King, C.

    The focus of this study is to develop a sensor-fused process modeling and control methodology to model, assess, and then enhance the performance of a hexapod machine for precision product realization. Deterministic modeling techniques were used to derive models for machine performance assessment and enhancement. A sensor fusion methodology was adopted to identify the parameters of the derived models. Empirical models and computational algorithms were also derived and implemented to model, assess, and then enhance the machine performance. The developed sensor fusion algorithms can be implemented on a PC-based open-architecture controller to receive information from various sensors, assess the status of the process, determine the proper action, and deliver the command to actuators for task execution. This will enhance a hexapod machine's capability to produce workpieces within the imposed dimensional tolerances.

  18. Wave and Wind Model Performance Metrics Tools

    NASA Astrophysics Data System (ADS)

    Choi, J. K.; Wang, D. W.

    2016-02-01

    Continual improvements and upgrades of Navy ocean wave and wind models are essential to the assurance of battlespace environment predictability of ocean surface wave and surf conditions in support of Naval global operations. Thus, constant verification and validation of model performance is equally essential to assure the progress of model developments and maintain confidence in the predictions. Global and regional scale model evaluations may require large areas and long periods of time. For observational data to compare against, altimeter winds and waves along the tracks of past and current operational satellites, as well as moored/drifting buoys, can be used for global and regional coverage. Using data and model runs from previous trials, such as the planned experiment the Dynamics of the Adriatic in Real Time (DART), we demonstrated the use of altimeter wind and wave data accumulated over several years to obtain an objective evaluation of the performance of the SWAN (Simulating Waves Nearshore) model running in the Adriatic Sea. The assessment provided a detailed picture of wind and wave model performance, using maps of cell-averaged statistical variables with spatial statistics including slope, correlation, and scatter index to summarize model performance. Such a methodology is easily generalized to other regions and to global scales. Operational technology currently used by subject matter experts evaluating the Navy Coastal Ocean Model and the Hybrid Coordinate Ocean Model can be expanded to evaluate wave and wind models using tools developed for ArcMap, a GIS application developed by ESRI. Recent inclusion of altimeter and buoy data, processed through the Naval Oceanographic Office's (NAVOCEANO) quality-control system into the netCDF standards applicable to all model output, makes possible the fusion of these data and direct model verification. Also, procedures were developed for accumulating match-ups of modelled and observed parameters to form a database from which statistics are readily calculated, for the short or long term. Such a system has potential for a quick transition to operations at NAVOCEANO.
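
    A minimal sketch of the matchup statistics named above (bias, scatter index, correlation, regression slope) for paired model and observed wave heights; the arrays are synthetic stand-ins for altimeter matchups:

    ```python
    # Standard model-vs-observation skill metrics for matched pairs.
    import numpy as np

    rng = np.random.default_rng(3)
    obs = rng.gamma(2.0, 1.0, 1000)              # observed significant wave height [m]
    mod = 0.9 * obs + rng.normal(0, 0.3, 1000)   # model with bias and noise

    bias = np.mean(mod - obs)
    rmse = np.sqrt(np.mean((mod - obs) ** 2))
    scatter_index = np.sqrt(np.mean(((mod - obs) - bias) ** 2)) / np.mean(obs)
    corr = np.corrcoef(mod, obs)[0, 1]
    slope = np.sum(mod * obs) / np.sum(obs ** 2)     # regression through the origin
    print(f"bias={bias:+.2f} m, RMSE={rmse:.2f} m, "
          f"SI={scatter_index:.2f}, r={corr:.2f}, slope={slope:.2f}")
    ```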

  19. Developing Statistical Models to Assess Transplant Outcomes Using National Registries: The Process in the United States.

    PubMed

    Snyder, Jon J; Salkowski, Nicholas; Kim, S Joseph; Zaun, David; Xiong, Hui; Israni, Ajay K; Kasiske, Bertram L

    2016-02-01

    Created by the US National Organ Transplant Act in 1984, the Scientific Registry of Transplant Recipients (SRTR) is obligated to publicly report data on transplant program and organ procurement organization performance in the United States. These reports include risk-adjusted assessments of graft and patient survival, and programs performing worse or better than expected are identified. The SRTR currently maintains 43 risk adjustment models for assessing posttransplant patient and graft survival and, in collaboration with the SRTR Technical Advisory Committee, has developed and implemented a new systematic process for model evaluation and revision. Patient cohorts for the risk adjustment models are identified, and single-organ and multiorgan transplants are defined, then each risk adjustment model is developed following a prespecified set of steps. Model performance is assessed, the model is refit to a more recent cohort before each evaluation cycle, and then it is applied to the evaluation cohort. The field of solid organ transplantation is unique in the breadth of the standardized data that are collected. These data allow for quality assessment across all transplant providers in the United States. A standardized process of risk model development using data from national registries may enhance the field.

  20. A parallel calibration utility for WRF-Hydro on high performance computers

    NASA Astrophysics Data System (ADS)

    Wang, J.; Wang, C.; Kotamarthi, V. R.

    2017-12-01

    A successful modeling of complex hydrological processes comprises establishing an integrated hydrological model which simulates the hydrological processes in each water regime, calibrating and validating the model performance based on observation data, and estimating the uncertainties from different sources, especially those associated with parameters. Such a model system requires large computing resources and often has to be run on High Performance Computers (HPC). The recently developed WRF-Hydro modeling system provides a significant advancement in the capability to simulate regional water cycles more completely. The WRF-Hydro model has a large range of parameters, such as those in the input table files — GENPARM.TBL, SOILPARM.TBL and CHANPARM.TBL — and several distributed scaling factors such as OVROUGHRTFAC. These parameters affect the behavior and outputs of the model and thus may need to be calibrated against observations in order to obtain good modeling performance. A parameter calibration tool dedicated to automated calibration and uncertainty estimation for the WRF-Hydro model can provide significant convenience for the modeling community. In this study, we developed a customized tool using the parallel version of the model-independent parameter estimation and uncertainty analysis tool, PEST, enabling it to run on HPC with the PBS and SLURM workload managers and job schedulers. We also developed a series of PEST input file templates specifically for WRF-Hydro model calibration and uncertainty analysis. Here we present a flood case study that occurred in April 2013 over the Midwest. The sensitivities and uncertainties are analyzed using the customized PEST tool we developed.

  1. Modeling Students' Problem Solving Performance in the Computer-Based Mathematics Learning Environment

    ERIC Educational Resources Information Center

    Lee, Young-Jin

    2017-01-01

    Purpose: The purpose of this paper is to develop a quantitative model of problem solving performance of students in the computer-based mathematics learning environment. Design/methodology/approach: Regularized logistic regression was used to create a quantitative model of problem solving performance of students that predicts whether students can…

  2. Performance Technology--Not a One-Size-Fits-All Profession

    ERIC Educational Resources Information Center

    Dierkes, Sunda V.

    2012-01-01

    The current debate over whether to choose just one universal human performance technology (HPT) model, in particular Langdon's language of work (LOW) model, promises a shared understanding among HPT professionals, credibility for the HPT profession, and a return on investment of time and effort in developing performance models over more than 70…

  3. Managing the "Performance" in Performance Management.

    ERIC Educational Resources Information Center

    Repinski, Marilyn; Bartsch, Maryjo

    1996-01-01

    Describes a five-step approach to performance management which includes (1) redefining tasks; (2) identifying skills; (3) determining what development tools are necessary; (4) prioritizing skills development; and (5) developing an action plan. Presents a hiring model that includes job analysis, job description, selection, goal setting, evaluation,…

  4. Thermal model of attic systems with radiant barriers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilkes, K.E.

    This report summarizes the first phase of a project to model the thermal performance of radiant barriers. The objective of this phase of the project was to develop a refined model for the thermal performance of residential house attics, with and without radiant barriers, and to verify the model by comparing its predictions against selected existing experimental thermal performance data. Models for the thermal performance of attics with and without radiant barriers have been developed and implemented on an IBM PC/AT computer. The validity of the models has been tested by comparing their predictions with ceiling heat fluxes measured in a number of laboratory and field experiments on attics with and without radiant barriers. Cumulative heat flows predicted by the models were usually within about 5 to 10 percent of measured values. In future phases of the project, the models for attic/radiant barrier performance will be coupled with a whole-house model and further comparisons with experimental data will be made. Following this, the models will be utilized to provide an initial assessment of the energy savings potential of radiant barriers in various configurations and under various climatic conditions. 38 refs., 14 figs., 22 tabs.
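
    As a sketch of the physics a radiant barrier exploits (textbook parallel-plate gray-body exchange with assumed temperatures and emissivities, not the report's attic model): lowering one surface emissivity from about 0.9 to about 0.05 cuts radiant transfer sharply:

    ```python
    # Net radiant exchange between two large parallel gray surfaces.
    SIGMA = 5.670e-8          # Stefan-Boltzmann constant [W/m^2 K^4]

    def radiant_flux(t_hot_k, t_cold_k, e_hot, e_cold):
        return SIGMA * (t_hot_k**4 - t_cold_k**4) / (1 / e_hot + 1 / e_cold - 1)

    roof, ceiling = 330.0, 300.0                      # hot roof deck vs attic floor [K] (assumed)
    plain = radiant_flux(roof, ceiling, 0.9, 0.9)     # ordinary building materials
    barrier = radiant_flux(roof, ceiling, 0.9, 0.05)  # low-emissivity foil barrier
    print(f"{plain:.0f} W/m^2 without barrier, "
          f"{barrier:.0f} W/m^2 with ({barrier / plain:.0%})")
    ```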

  5. Neurocognitive predictors of financial capacity in traumatic brain injury.

    PubMed

    Martin, Roy C; Triebel, Kristen; Dreer, Laura E; Novack, Thomas A; Turner, Crystal; Marson, Daniel C

    2012-01-01

    To develop cognitive models of financial capacity (FC) in patients with traumatic brain injury (TBI). Longitudinal design. Inpatient brain injury rehabilitation unit. Twenty healthy controls and 24 adults with moderate-to-severe TBI were assessed at baseline (30 days postinjury) and 6 months postinjury. The FC instrument (FCI) and a neuropsychological test battery. Univariate correlation and multiple regression procedures were employed to develop cognitive models of FCI performance in the TBI group at baseline and at 6-month follow-up. Three cognitive predictor models of FC were developed. At baseline, measures of mental arithmetic/working memory and immediate verbal memory predicted baseline FCI performance (R = 0.72). At 6-month follow-up, measures of executive function and mental arithmetic/working memory predicted 6-month FCI performance (R = 0.79), and a third model found that these 2 measures at baseline predicted 6-month FCI performance (R = 0.71). Multiple cognitive functions are associated with initial impairment and partial recovery of FC in moderate-to-severe TBI patients. In particular, arithmetic, working memory, and executive function skills appear critical to the recovery of FC in TBI. The study results represent an initial step toward developing a neurocognitive model of FC in patients with TBI.

  6. Terahertz standoff imaging testbed design and performance for concealed weapon and device identification model development

    NASA Astrophysics Data System (ADS)

    Franck, Charmaine C.; Lee, Dave; Espinola, Richard L.; Murrill, Steven R.; Jacobs, Eddie L.; Griffin, Steve T.; Petkie, Douglas T.; Reynolds, Joe

    2007-04-01

    This paper describes the design and performance of the U.S. Army RDECOM CERDEC Night Vision and Electronic Sensors Directorate's (NVESD) active 0.640-THz imaging testbed, developed in support of the Defense Advanced Research Projects Agency's (DARPA) Terahertz Imaging Focal-Plane Technology (TIFT) program. The laboratory measurements and standoff images were acquired during the development of an NVESD and Army Research Laboratory terahertz imaging performance model. The imaging testbed is based on a 12-inch-diameter Off-Axis Elliptical (OAE) mirror designed with one focal length at 1 m and the other at 10 m. This paper describes the design considerations of the OAE-mirror, dual-capability, active imaging testbed, as well as measurement and imaging results used to further develop the model.

  7. River Devices to Recover Energy with Advanced Materials (River DREAM)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McMahon, Daniel P.

    2013-07-03

    The purpose of this project is to develop a generator called a Galloping Hydroelectric Energy Extraction Device (GHEED). It uses a galloping prism to convert water flow into linear motion. This motion is converted into electricity via a dielectric elastomer generator (DEG). The galloping mechanism and the DEG are combined to create a system to effectively generate electricity. This project has three research objectives: (1) oscillator development and design: characterize galloping behavior, evaluate the effect of control-surface shape change on oscillator performance, and demonstrate shape change with changing water flow; (2) DEG characterization and modeling: characterize and model the performance of the DEG based on the oscillator design; and (3) GHEED system modeling and integration: create numerical models for construction of a system performance model and define operating capabilities for this approach. Accomplishing these three objectives will result in the creation of a model that can be used to fully define the operating parameters and performance capabilities of a generator based on the GHEED design. This information will be used in the next phase of product development, the creation of an integrated laboratory-scale generator to confirm model predictions.

  8. Modeling Long-term Creep Performance for Welded Nickel-base Superalloy Structures for Power Generation Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shen, Chen; Gupta, Vipul; Huang, Shenyan

    The goal of this project is to model long-term creep performance for nickel-base superalloy weldments in high-temperature power generation systems. The project uses physics-based modeling methodologies and algorithms for predicting alloy properties in heterogeneous material structures. The modeling methodology is demonstrated on a gas turbine combustor liner weldment of Haynes 282 precipitate-strengthened nickel-base superalloy. The major developments are: (1) microstructure-property relationships under creep conditions and microstructure characterization; (2) modeling of inhomogeneous microstructure in superalloy welds; (3) modeling of mesoscale plastic deformation in superalloy welds; and (4) a constitutive creep model that accounts for weld and base metal microstructure and their long-term evolution. The developed modeling technology is aimed at providing a more efficient and accurate assessment of a material's long-term performance compared with current testing and extrapolation methods. This modeling technology will also accelerate development and qualification of new materials in advanced power generation systems. This document is a final technical report for the project, covering efforts conducted from October 2014 to December 2016.

  9. Post2 End-to-End Descent and Landing Simulation for ALHAT Design Analysis Cycle 2

    NASA Technical Reports Server (NTRS)

    Davis, Jody L.; Striepe, Scott A.; Maddock, Robert W.; Johnson, Andrew E.; Paschall, Stephen C., II

    2010-01-01

    The ALHAT project is an agency-level program involving NASA centers, academia, and industry, with a primary goal to develop a safe, autonomous, precision-landing system for robotic and crew-piloted lunar and planetary descent vehicles. POST2 is used as the 6DOF descent and landing trajectory simulation for determining integrated system performance of ALHAT landing-system models and lunar environment models. This paper presents updates in the development of the ALHAT POST2 simulation, as well as preliminary system performance analysis for ALDAC-2, used for the testing and assessment of ALHAT system models. The ALDAC-2 POST2 Monte Carlo simulation results have been generated and focus on HRN model performance with the fully integrated system, as well as performance improvements of the AGNC and TSAR models since the previous design analysis cycle.

  10. Experimental and analytical studies of advanced air cushion landing systems

    NASA Technical Reports Server (NTRS)

    Lee, E. G. S.; Boghani, A. B.; Captain, K. M.; Rutishauser, H. J.; Farley, H. L.; Fish, R. B.; Jeffcoat, R. L.

    1981-01-01

    Several concepts are developed for air cushion landing systems (ACLS) which have the potential for improving performance characteristics (roll stiffness, heave damping, and trunk flutter), and reducing fabrication cost and complexity. After an initial screening, the following five concepts were evaluated in detail: damped trunk, filled trunk, compartmented trunk, segmented trunk, and roll feedback control. The evaluation was based on tests performed on scale models. An ACLS dynamic simulation developed earlier is updated so that it can be used to predict the performance of full-scale ACLS incorporating these refinements. The simulation was validated through scale-model tests. A full-scale ACLS based on the segmented trunk concept was fabricated and installed on the NASA ACLS test vehicle, where it is used to support advanced system development. A geometrically-scaled model (one third full scale) of the NASA test vehicle was fabricated and tested. This model, evaluated by means of a series of static and dynamic tests, is used to investigate scaling relationships between reduced and full-scale models. The analytical model developed earlier is applied to simulate both the one third scale and the full scale response.

  11. An Introduction to the Partial Credit Model for Developing Nursing Assessments.

    ERIC Educational Resources Information Center

    Fox, Christine

    1999-01-01

    Demonstrates how the partial credit model, a variation of the Rasch Measurement Model, can be used to develop performance-based assessments for nursing education. Applies the model using the Practical Knowledge Inventory for Nurses. (SK)

  12. Model development and system performance optimization for staring infrared search and track (IRST) sensors

    NASA Astrophysics Data System (ADS)

    Olson, Craig; Theisen, Michael; Pace, Teresa; Halford, Carl; Driggers, Ronald

    2016-05-01

    The mission of an Infrared Search and Track (IRST) system is to detect and locate (sometimes called find and fix) enemy aircraft at significant ranges. Two extreme opposite examples of IRST applications are (1) long-range offensive aircraft detection when electronic warfare equipment is jammed, compromised, or intentionally turned off, and (2) distributed aperture systems where enemy aircraft may be in the proximity of the host aircraft. Past IRST systems have been primarily long-range offensive systems based on the LWIR second-generation thermal imager. The new IRST systems are primarily based on staring infrared focal planes and sensors. In the same manner that FLIR92 did not work well in the design of staring infrared cameras (NVTherm was developed to address staring infrared sensor performance), current modeling techniques do not adequately describe the performance of a staring IRST sensor. There are no standard military IRST models (per AFRL and NAVAIR), and each program appears to perform its own modeling. For this reason, L-3 has decided to develop a corporate model, working with AFRL and NAVAIR, for the analysis, design, and evaluation of IRST concepts, programs, and solutions. This paper provides some of the first analyses in the L-3 IRST model development program for the optimization of staring IRST sensors.

  13. Model performance evaluation (validation and calibration) in model-based studies of therapeutic interventions for cardiovascular diseases : a review and suggested reporting framework.

    PubMed

    Haji Ali Afzali, Hossein; Gray, Jodi; Karnon, Jonathan

    2013-04-01

    Decision analytic models play an increasingly important role in the economic evaluation of health technologies. Given uncertainties around the assumptions used to develop such models, several guidelines have been published to identify and assess 'best practice' in the model development process, including general modelling approach (e.g., time horizon), model structure, input data and model performance evaluation. This paper focuses on model performance evaluation. In the absence of a sufficient level of detail around model performance evaluation, concerns regarding the accuracy of model outputs, and hence the credibility of such models, are frequently raised. Following presentation of its components, a review of the application and reporting of model performance evaluation is presented. Taking cardiovascular disease as an illustrative example, the review investigates the use of face validity, internal validity, external validity, and cross-model validity. As a part of the performance evaluation process, model calibration is also discussed and its use in applied studies investigated. The review found that the application and reporting of model performance evaluation across 81 studies of treatment for cardiovascular disease was variable. Cross-model validation was reported in 55% of the reviewed studies, though the level of detail provided varied considerably. We found that very few studies documented other types of validity, and only 6% of the reviewed articles reported a calibration process. Considering the above findings, we propose a comprehensive model performance evaluation framework (checklist), informed by a review of best-practice guidelines. This framework provides a basis for more accurate and consistent documentation of model performance evaluation. This will improve the peer review process and the comparability of modelling studies. Recognising the fundamental role of decision analytic models in informing public funding decisions, the proposed framework should usefully inform guidelines for preparing submissions to reimbursement bodies.

  14. Unified Performance and Power Modeling of Scientific Workloads

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Song, Shuaiwen; Barker, Kevin J.; Kerbyson, Darren J.

    2013-11-17

    It is expected that scientific applications executing on future large-scale HPC systems must be optimized not only in terms of performance, but also in terms of power consumption. As power and energy become increasingly constrained resources, researchers and developers must have access to tools that allow for accurate prediction of both performance and power consumption. Reasoning about performance and power consumption in concert will be critical for achieving maximum utilization of limited resources on future HPC systems. To this end, we present a unified performance and power model for the Nek-Bone mini-application developed as part of the DOE's CESAR Exascale Co-Design Center. Our models consider the impact of computation, point-to-point communication, and collective communication.

  15. The Social Responsibility Performance Outcomes Model: Building Socially Responsible Companies through Performance Improvement Outcomes.

    ERIC Educational Resources Information Center

    Hatcher, Tim

    2000-01-01

    Considers the role of performance improvement professionals and human resources development professionals in helping organizations realize the ethical and financial power of corporate social responsibility. Explains the social responsibility performance outcomes model, which incorporates the concepts of societal needs and outcomes. (LRW)

  16. Design and Testing of a Liquid Nitrous Oxide and Ethanol Fueled Rocket Engine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Youngblood, Stewart

    A small-scale, bi-propellant, liquid fueled rocket engine and supporting test infrastructure were designed and constructed at the Energetic Materials Research and Testing Center (EMRTC). This facility was used to evaluate liquid nitrous oxide and ethanol as potential rocket propellants. Thrust and pressure measurements, along with high-speed digital imaging of the rocket exhaust plume, were made. This experimental data was used for validation of a computational model developed of the rocket engine tested. The developed computational model was utilized to analyze rocket engine performance across a range of operating pressures, fuel-oxidizer mixture ratios, and outlet nozzle configurations. A comparative study of the modeling of a liquid rocket engine was performed using NASA CEA and Cantera, an open-source equilibrium code capable of being interfaced with MATLAB. One goal of this modeling was to demonstrate the ability of Cantera to accurately model the basic chemical equilibrium, thermodynamics, and transport properties for varied fuel and oxidizer operating conditions. Once validated for basic equilibrium, an expanded MATLAB code, referencing Cantera, was advanced beyond CEA's capabilities to predict rocket engine performance as a function of supplied propellant flow rate and rocket engine nozzle dimensions. Cantera was found to compare favorably to CEA for making equilibrium calculations, supporting its use as an alternative to CEA. The developed rocket engine performs as predicted, demonstrating that the developed MATLAB rocket engine model was successful in predicting real-world rocket engine performance. Finally, nitrous oxide and ethanol were shown to perform well as rocket propellants, with specific impulses experimentally recorded in the range of 250 to 260 seconds.
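
    As an illustration of the kind of Cantera-based equilibrium calculation described above, the following Python sketch computes the adiabatic chamber temperature for a stoichiometric ethanol/nitrous-oxide mixture. The mechanism file ethanol_n2o.yaml is hypothetical: GRI-Mech 3.0, which ships with Cantera, includes N2O but not ethanol, so a user-supplied mechanism with C2H5OH chemistry is assumed.

      import cantera as ct

      # Hypothetical mechanism containing both C2H5OH and N2O chemistry
      gas = ct.Solution('ethanol_n2o.yaml')
      gas.TP = 298.15, 2.0e6                           # 298 K propellants, 2 MPa chamber
      gas.set_equivalence_ratio(1.0, 'C2H5OH', 'N2O')  # stoichiometric mixture

      # Constant-enthalpy, constant-pressure equilibrium yields the adiabatic
      # chamber temperature and equilibrium composition.
      gas.equilibrate('HP')
      print(f"Chamber temperature: {gas.T:.0f} K")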

  17. Trajectories of Infants' Biobehavioral Development: Timing and Rate of A-Not-B Performance Gains and EEG Maturation.

    PubMed

    MacNeill, Leigha A; Ram, Nilam; Bell, Martha Ann; Fox, Nathan A; Pérez-Edgar, Koraly

    2018-05-01

    This study examined how timing (i.e., relative maturity) and rate (i.e., how quickly infants attain proficiency) of A-not-B performance were related to changes in brain activity from age 6 to 12 months. A-not-B performance and resting EEG (electroencephalography) were measured monthly from age 6 to 12 months in 28 infants and were modeled using logistic and linear growth curve models. Infants with faster performance rates reached performance milestones earlier. Infants with faster rates of increase in A-not-B performance had lower occipital power at 6 months and greater linear increases in occipital power. The results underscore the importance of considering nonlinear change processes for studying infants' cognitive development as well as how these changes are related to trajectories of EEG power. © 2018 The Authors. Child Development © 2018 Society for Research in Child Development, Inc.
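
    A logistic growth curve of the kind used here can be fit with standard tools; the sketch below uses scipy with invented monthly A-not-B proportions, where the fitted rate parameter captures how quickly proficiency is attained and the midpoint captures relative maturity.

      import numpy as np
      from scipy.optimize import curve_fit

      def logistic(t, upper, rate, t_mid):
          # Performance rises toward `upper`, changing fastest at age t_mid.
          return upper / (1.0 + np.exp(-rate * (t - t_mid)))

      ages = np.array([6.0, 7, 8, 9, 10, 11, 12])                  # months
      perf = np.array([0.10, 0.18, 0.40, 0.65, 0.82, 0.90, 0.94])  # invented data

      (upper, rate, t_mid), _ = curve_fit(logistic, ages, perf, p0=[1.0, 1.0, 9.0])
      print(f"rate = {rate:.2f}/month, midpoint = {t_mid:.1f} months")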

  18. PHYSICAL COAL-CLEANING/FLUE GAS DESULFURIZATION COMPUTER MODEL

    EPA Science Inventory

    The model consists of four programs: (1) one, initially developed by Battelle-Columbus Laboratories, obtained from Versar, Inc.; (2) one developed by TVA; and (3,4) two developed by TVA and Bechtel National, Inc. The model produces design performance criteria and estimates of capi...

  19. Development of Novel PEM Membrane and Multiphase CD Modeling of PEM Fuel Cell

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    K. J. Berry; Susanta Das

    2009-12-30

    To better understand heat and water management phenomena under operating conditions in a proton exchange membrane fuel cell (PEMFC), a three-dimensional, two-phase computational fluid dynamic (CFD) flow model has been developed and simulated for a complete PEMFC. Both liquid and gas phases are considered in the model by taking into account the gas flow, diffusion, charge transfer, change of phase, electro-osmosis, and electrochemical reactions to understand the overall dynamic behaviors of species within an operating PEMFC. The CFD model is solved numerically under different parametric conditions in terms of water management issues in order to improve cell performance. The results obtained from the CFD two-phase flow model simulations show improvement in cell performance as well as water management under PEMFC operational conditions as compared to the results of a single-phase flow model available in the literature. The quantitative information obtained from the two-phase model simulation results helped to develop a CFD control algorithm for low-temperature PEM fuel cell stacks, which opens up a route to designing improved PEMFCs with better operational efficiency and performance.

  20. The development and testing of a skin tear risk assessment tool.

    PubMed

    Newall, Nelly; Lewin, Gill F; Bulsara, Max K; Carville, Keryln J; Leslie, Gavin D; Roberts, Pam A

    2017-02-01

    The aim of the present study is to develop a reliable and valid skin tear risk assessment tool. The six characteristics identified in a previous case control study as constituting the best risk model for skin tear development were used to construct a risk assessment tool. The ability of the tool to predict skin tear development was then tested in a prospective study. Between August 2012 and September 2013, 1466 tertiary hospital patients were assessed at admission and followed up for 10 days to see if they developed a skin tear. The predictive validity of the tool was assessed using receiver operating characteristic (ROC) analysis. When the tool was found not to have performed as well as hoped, secondary analyses were performed to determine whether a potentially better performing risk model could be identified. The tool was found to have high sensitivity but low specificity and therefore have inadequate predictive validity. Secondary analysis of the combined data from this and the previous case control study identified an alternative better performing risk model. The tool developed and tested in this study was found to have inadequate predictive validity. The predictive validity of an alternative, more parsimonious model now needs to be tested. © 2015 Medicalhelplines.com Inc and John Wiley & Sons Ltd.
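
    Predictive validity of a risk tool is typically quantified with the ROC analysis described; a minimal scikit-learn sketch (the scores and outcomes below are invented, not the study's data):

      import numpy as np
      from sklearn.metrics import roc_auc_score, roc_curve

      # Invented tool scores (number of risk items present) and outcomes
      scores = np.array([1, 4, 2, 6, 3, 5, 0, 2, 5, 1])
      tears = np.array([0, 1, 0, 1, 0, 0, 0, 1, 1, 0])

      auc = roc_auc_score(tears, scores)
      fpr, tpr, thresholds = roc_curve(tears, scores)
      # High sensitivity (tpr) paired with low specificity (high fpr) at a
      # chosen threshold is the pattern the study reports for its tool.
      print(f"AUC = {auc:.2f}")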

  1. Development of an algorithm to model an aircraft equipped with a generic CDTI display

    NASA Technical Reports Server (NTRS)

    Driscoll, W. C.; Houck, J. A.

    1986-01-01

    A model of human pilot performance of a tracking task using a generic Cockpit Display of Traffic Information (CDTI) display is developed from experimental data. The tracking task is to use the CDTI display to track a leading aircraft at a nominal separation of three nautical miles over a prescribed trajectory in space. The analysis of the data resulting from a factorial design of experiments reveals that tracking task performance depends on the pilot and his experience at performing the task. Performance was not strongly affected by the type of control system used (velocity vector control wheel steering versus 3D automatic flight path guidance and control). The model that is developed and verified results in state trajectories whose difference from the experimental state trajectories is small compared to the variation due to the pilot and experience factors.

  2. Development of the Transportation Revenue Estimator and Needs Determination System (TRENDS) forecasting model : MPO sub-models and maintenance.

    DOT National Transportation Integrated Search

    2011-11-01

    This report summarizes the technical work performed developing and incorporating Metropolitan Planning : Organization sub-models into the existing Texas Revenue Estimator and Needs Determination System : (TRENDS) model. Additionally, this report expl...

  3. Automating Risk Analysis of Software Design Models

    PubMed Central

    Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P.

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance. PMID:25136688

  4. Automating risk analysis of software design models.

    PubMed

    Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.

  5. Indonesian Private University Lecturer Performance Improvement Model to Improve a Sustainable Organization Performance

    ERIC Educational Resources Information Center

    Suryaman

    2018-01-01

    Lecturer performance will affect the quality and carrying capacity of the sustainability of an organization, in this case the university. There are many models developed to measure the performance of teachers, but not much to discuss the influence of faculty performance itself towards sustainability of an organization. This study was conducted in…

  6. The effect of a loss of model structural detail due to network skeletonization on contamination warning system design: case studies.

    PubMed

    Davis, Michael J; Janke, Robert

    2018-01-04

    The effect of limitations in the structural detail available in a network model on contamination warning system (CWS) design was examined in case studies using the original and skeletonized network models for two water distribution systems (WDSs). The skeletonized models were used as proxies for incomplete network models. CWS designs were developed by optimizing sensor placements for worst-case and mean-case contamination events. Designs developed using the skeletonized network models were transplanted into the original network model for evaluation. CWS performance was defined as the number of people who ingest more than some quantity of a contaminant in tap water before the CWS detects the presence of contamination. Lack of structural detail in a network model can result in CWS designs that (1) provide considerably less protection against worst-case contamination events than that obtained when a more complete network model is available and (2) yield substantial underestimates of the consequences associated with a contamination event. Nevertheless, CWSs developed using skeletonized network models can provide useful reductions in consequences for contaminants whose effects are not localized near the injection location. Mean-case designs can yield worst-case performances similar to those for worst-case designs when there is uncertainty in the network model. Improvements in network models for WDSs have the potential to yield significant improvements in CWS designs as well as more realistic evaluations of those designs. Although such improvements would be expected to yield improved CWS performance, the expected improvements in CWS performance have not been quantified previously. The results presented here should be useful to those responsible for the design or implementation of CWSs, particularly managers and engineers in water utilities, and encourage the development of improved network models.

  7. The effect of a loss of model structural detail due to network skeletonization on contamination warning system design: case studies

    NASA Astrophysics Data System (ADS)

    Davis, Michael J.; Janke, Robert

    2018-05-01

    The effect of limitations in the structural detail available in a network model on contamination warning system (CWS) design was examined in case studies using the original and skeletonized network models for two water distribution systems (WDSs). The skeletonized models were used as proxies for incomplete network models. CWS designs were developed by optimizing sensor placements for worst-case and mean-case contamination events. Designs developed using the skeletonized network models were transplanted into the original network model for evaluation. CWS performance was defined as the number of people who ingest more than some quantity of a contaminant in tap water before the CWS detects the presence of contamination. Lack of structural detail in a network model can result in CWS designs that (1) provide considerably less protection against worst-case contamination events than that obtained when a more complete network model is available and (2) yield substantial underestimates of the consequences associated with a contamination event. Nevertheless, CWSs developed using skeletonized network models can provide useful reductions in consequences for contaminants whose effects are not localized near the injection location. Mean-case designs can yield worst-case performances similar to those for worst-case designs when there is uncertainty in the network model. Improvements in network models for WDSs have the potential to yield significant improvements in CWS designs as well as more realistic evaluations of those designs. Although such improvements would be expected to yield improved CWS performance, the expected improvements in CWS performance have not been quantified previously. The results presented here should be useful to those responsible for the design or implementation of CWSs, particularly managers and engineers in water utilities, and encourage the development of improved network models.
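
    The sensor-placement optimization at the heart of a CWS design can be illustrated with a simple greedy heuristic for the mean-case objective. The impact matrix below is an invented stand-in for consequences that would be computed from a network model:

      import numpy as np

      def greedy_placement(impact, n_sensors):
          # impact[e, s]: consequence of contamination event e when a sensor
          # at candidate location s is the first to detect it. The worst
          # observed consequence serves as the no-detection baseline (a
          # simplification of the real formulation).
          n_events, n_sites = impact.shape
          chosen = []
          best = np.full(n_events, impact.max())
          for _ in range(n_sensors):
              means = [np.minimum(best, impact[:, s]).mean() for s in range(n_sites)]
              s_star = int(np.argmin(means))
              chosen.append(s_star)
              best = np.minimum(best, impact[:, s_star])
          return chosen, best.mean()

      rng = np.random.default_rng(0)
      impact = rng.integers(10, 10_000, size=(200, 30)).astype(float)  # invented
      sites, mean_consequence = greedy_placement(impact, n_sensors=5)
      print(sites, mean_consequence)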

  8. A computationally efficient modelling of laminar separation bubbles

    NASA Technical Reports Server (NTRS)

    Dini, Paolo; Maughmer, Mark D.

    1989-01-01

    The goal is to accurately predict the characteristics of the laminar separation bubble and its effects on airfoil performance. Toward this end, a computational model of the separation bubble was developed and incorporated into the Eppler and Somers airfoil design and analysis program. Thus far, the focus of the research has been limited to the development of a model which can accurately predict situations in which the interaction between the bubble and the inviscid velocity distribution is weak, the so-called short bubble. A summary of the research performed in the past nine months is presented. The bubble model in its present form is then described. Lastly, the performance of this model in predicting bubble characteristics is shown for a few cases.

  9. Modeling synchronous voltage source converters in transmission system planning studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kosterev, D.N.

    1997-04-01

    A Voltage Source Converter (VSC) can be beneficial to power utilities in many ways. To evaluate the VSC performance in potential applications, the device has to be represented appropriately in planning studies. This paper addresses VSC modeling for EMTP, powerflow, and transient stability studies. First, the VSC operating principles are overviewed, and the device model for EMTP studies is presented. The ratings of VSC components are discussed, and the device operating characteristics are derived based on these ratings. A powerflow model is presented and various control modes are proposed. A detailed stability model is developed, and its step-by-step initialization procedure is described. A simplified stability model is also derived under stated assumptions. Finally, validation studies are performed to demonstrate the performance of the developed stability models and to compare it with EMTP simulations.

  10. Development of a Solid-Oxide Fuel Cell/Gas Turbine Hybrid System Model for Aerospace Applications

    NASA Technical Reports Server (NTRS)

    Freeh, Joshua E.; Pratt, Joseph W.; Brouwer, Jacob

    2004-01-01

    Recent interest in fuel cell-gas turbine hybrid applications for the aerospace industry has led to the need for accurate computer simulation models to aid in system design and performance evaluation. To meet this requirement, solid oxide fuel cell (SOFC) and fuel processor models have been developed and incorporated into the Numerical Propulsion Systems Simulation (NPSS) software package. The SOFC and reformer models solve systems of equations governing steady-state performance using common theoretical and semi-empirical terms. An example hybrid configuration is presented that demonstrates the new capability as well as the interaction with pre-existing gas turbine and heat exchanger models. Finally, a comparison of calculated SOFC performance with experimental data is presented to demonstrate model validity. Keywords: Solid Oxide Fuel Cell, Reformer, System Model, Aerospace, Hybrid System, NPSS

  11. Measuring nursing competencies in the operating theatre: instrument development and psychometric analysis using Item Response Theory.

    PubMed

    Nicholson, Patricia; Griffin, Patrick; Gillis, Shelley; Wu, Margaret; Dunning, Trisha

    2013-09-01

    The process of identifying the underlying competencies that contribute to effective nursing performance has been debated, with a lack of consensus surrounding an approved measurement instrument for assessing clinical performance. Although a number of methodologies are noted in the development of competency-based assessment measures, these studies are not without criticism. The primary aim of the study was to develop and validate a Performance Based Scoring Rubric, which included both analytical and holistic scales. The aim included examining the validity and reliability of the rubric, which was designed to measure clinical competencies in the operating theatre. The fieldwork observations of 32 nurse educators and preceptors assessing the performance of 95 instrument nurses in the operating theatre were used in the calibration of the rubric. The Rasch model, a particular model among Item Response Models, was used in the calibration of each item in the rubric to improve the measurement properties of the scale. This is done by establishing the 'fit' of the data to the conditions demanded by the Rasch model. Acceptable reliability estimates, specifically a high Cronbach's alpha reliability coefficient (0.940), as well as empirical support for construct and criterion validity of the rubric, were achieved. Calibration of the Performance Based Scoring Rubric using the Rasch model revealed that the fit statistics for most items were acceptable. The use of the Rasch model offers a number of features in developing and refining healthcare competency-based assessments, improving confidence in measuring clinical performance. The Rasch model was shown to be useful in developing and validating a competency-based assessment for measuring the competence of the instrument nurse in the operating theatre, with implications for use in other areas of nursing practice. Crown Copyright © 2012. Published by Elsevier Ltd. All rights reserved.

  12. Modeling hurricane evacuation traffic : development of a time-dependent hurricane evacuation demand model.

    DOT National Transportation Integrated Search

    2008-04-01

    The objective of this research is to develop alternative time-dependent travel demand models of hurricane evacuation travel and to compare the performance of these models with each other and with the state-of-the-practice models in current use. Speci...

  13. Models for evaluating the performability of degradable computing systems

    NASA Technical Reports Server (NTRS)

    Wu, L. T.

    1982-01-01

    Recent advances in multiprocessor technology established the need for unified methods to evaluate computing systems performance and reliability. In response to this modeling need, a general modeling framework that permits the modeling, analysis and evaluation of degradable computing systems is considered. Within this framework, several user oriented performance variables are identified and shown to be proper generalizations of the traditional notions of system performance and reliability. Furthermore, a time varying version of the model is developed to generalize the traditional fault tree reliability evaluation methods of phased missions.

  14. A systematic review of breast cancer incidence risk prediction models with meta-analysis of their performance.

    PubMed

    Meads, Catherine; Ahmed, Ikhlaaq; Riley, Richard D

    2012-04-01

    A risk prediction model is a statistical tool for estimating the probability that a currently healthy individual with specific risk factors will develop a condition in the future such as breast cancer. Reliably accurate prediction models can inform future disease burdens, health policies and individual decisions. Breast cancer prediction models containing modifiable risk factors, such as alcohol consumption, BMI or weight, condom use, exogenous hormone use and physical activity, are of particular interest to women who might be considering how to reduce their risk of breast cancer and clinicians developing health policies to reduce population incidence rates. We performed a systematic review to identify and evaluate the performance of prediction models for breast cancer that contain modifiable factors. A protocol was developed and a sensitive search in databases including MEDLINE and EMBASE was conducted in June 2010. Extensive use was made of reference lists. Included were any articles proposing or validating a breast cancer prediction model in a general female population, with no language restrictions. Duplicate data extraction and quality assessment were conducted. Results were summarised qualitatively, and where possible meta-analysis of model performance statistics was undertaken. The systematic review found 17 breast cancer models, each containing a different but often overlapping set of modifiable and other risk factors, combined with an estimated baseline risk that was also often different. Quality of reporting was generally poor, with characteristics of included participants and fitted model results often missing. Only four models received independent validation in external data, most notably the 'Gail 2' model with 12 validations. None of the models demonstrated consistently outstanding ability to accurately discriminate between those who did and those who did not develop breast cancer. For example, random-effects meta-analyses of the performance of the 'Gail 2' model showed the average C statistic was 0.63 (95% CI 0.59-0.67), and the expected/observed ratio of events varied considerably across studies (95% prediction interval for E/O ratio when the model was applied in practice was 0.75-1.19). There is a need for models with better predictive performance but, given the large amount of work already conducted, further improvement of existing models based on conventional risk factors is perhaps unlikely. Research to identify new risk factors with large additional predictive ability is therefore needed, alongside clearer reporting and continual validation of new models as they develop.
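
    The random-effects pooling used for performance statistics such as the C statistic can be written compactly. Below is a DerSimonian-Laird sketch with invented per-study values, not the review's data:

      import numpy as np

      def dersimonian_laird(effects, variances):
          # Random-effects pooling (DerSimonian-Laird) of per-study estimates.
          w = 1.0 / variances
          fixed = np.sum(w * effects) / np.sum(w)
          q = np.sum(w * (effects - fixed) ** 2)          # heterogeneity statistic
          c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
          tau2 = max(0.0, (q - (len(effects) - 1)) / c)   # between-study variance
          w_re = 1.0 / (variances + tau2)
          pooled = np.sum(w_re * effects) / np.sum(w_re)
          se = np.sqrt(1.0 / np.sum(w_re))
          return pooled, se

      # Hypothetical C statistics and within-study variances from validations:
      c_stats = np.array([0.61, 0.66, 0.59, 0.64])
      variances = np.array([0.0004, 0.0009, 0.0006, 0.0005])
      print(dersimonian_laird(c_stats, variances))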

  15. The Role of Multimodel Combination in Improving Streamflow Prediction

    NASA Astrophysics Data System (ADS)

    Arumugam, S.; Li, W.

    2008-12-01

    Model errors are an inevitable part of any prediction exercise. One approach currently gaining attention for reducing model errors is to optimally combine multiple models to develop improved predictions. The rationale behind this approach primarily rests on the premise that optimal weights could be derived for each model so that the developed multimodel predictions will result in improved predictability. In this study, we present a new approach to combining multiple hydrological models by evaluating their predictability contingent on the predictor state. We combine two hydrological models, the 'abcd' model and the Variable Infiltration Capacity (VIC) model, with each model's parameters estimated by two different objective functions, to develop multimodel streamflow predictions. The performance of multimodel predictions is compared with individual model predictions using correlation, root mean square error, and the Nash-Sutcliffe coefficient. To quantify precisely under what conditions the multimodel predictions result in improved predictions, we evaluate the proposed algorithm by testing it against streamflow generated from a known model (the 'abcd' model or the VIC model) with errors being homoscedastic or heteroscedastic. Results from the study show that streamflow simulated from individual models performed better than multimodels under almost no model error. Under increased model error, the multimodel consistently performed better than the single-model prediction in terms of all performance measures. The study also evaluates the proposed algorithm for streamflow predictions in two humid river basins in North Carolina as well as in two arid basins in Arizona. Through detailed validation at these four sites, the study shows that the multimodel approach better predicts the observed streamflow in comparison to single-model predictions.
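
    A static-weight version of multimodel combination (simpler than the predictor-state-contingent weighting the study proposes) illustrates the idea; the data and model outputs below are invented:

      import numpy as np

      def optimal_weight(pred_a, pred_b, obs):
          # Least-squares weight for combining two model predictions:
          # solves min_w ||w*pred_a + (1-w)*pred_b - obs||^2 in closed form.
          d = pred_a - pred_b
          w = np.dot(obs - pred_b, d) / np.dot(d, d)
          return float(np.clip(w, 0.0, 1.0))

      # Hypothetical calibration-period streamflows:
      obs = np.array([10.0, 14.0, 9.0, 20.0, 16.0])
      abcd = np.array([11.0, 13.0, 10.0, 18.0, 15.0])   # 'abcd' model output
      vic = np.array([9.0, 15.5, 7.5, 22.0, 17.5])      # VIC model output

      w = optimal_weight(abcd, vic, obs)
      combined = w * abcd + (1 - w) * vic
      print(w, combined)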

  16. Development and Validation of High Precision Thermal, Mechanical, and Optical Models for the Space Interferometry Mission

    NASA Technical Reports Server (NTRS)

    Lindensmith, Chris A.; Briggs, H. Clark; Beregovski, Yuri; Feria, V. Alfonso; Goullioud, Renaud; Gursel, Yekta; Hahn, Inseob; Kinsella, Gary; Orzewalla, Matthew; Phillips, Charles

    2006-01-01

    SIM PlanetQuest (SIM) is a large optical interferometer for making microarcsecond measurements of the positions of stars and for detecting Earth-sized planets around nearby stars. To achieve this precision, SIM requires stability of optical components to tens of picometers per hour. The combination of SIM's large size (9 meter baseline) and the high stability requirement makes it difficult and costly to measure all aspects of system performance on the ground. To reduce risks and costs, and to allow for a design with fewer intermediate testing stages, the SIM project is developing an integrated thermal, mechanical and optical modeling process that will allow predictions of the system performance to be made at the required high precision. This modeling process uses commercial, off-the-shelf tools and has been validated against experimental results at the precision of the SIM performance requirements. This paper presents a description of the model development, some of the models, and their validation in the Thermo-Opto-Mechanical (TOM3) testbed, which includes full-scale brassboard optical components and the metrology to test them at the SIM performance requirement levels.

  17. Rotor design for maneuver performance

    NASA Technical Reports Server (NTRS)

    Berry, John D.; Schrage, Daniel

    1986-01-01

    A method of determining the sensitivity of helicopter maneuver performance to changes in basic rotor design parameters is developed. Maneuver performance is measured by the time required, based on a simplified rotor/helicopter performance model, to perform a series of specified maneuvers. Because of the inherent simplicity of the rotor performance model used, this method quickly identifies parameter values which result in minimum time. For the specific case studied, this method predicts that the minimum time required is obtained with a low disk loading and a relatively high rotor solidity. The method was developed as part of the winning design effort for the American Helicopter Society student design competition for 1984/1985.

  18. A prediction model for early death in non-small cell lung cancer patients following curative-intent chemoradiotherapy.

    PubMed

    Jochems, Arthur; El-Naqa, Issam; Kessler, Marc; Mayo, Charles S; Jolly, Shruti; Matuszak, Martha; Faivre-Finn, Corinne; Price, Gareth; Holloway, Lois; Vinod, Shalini; Field, Matthew; Barakat, Mohamed Samir; Thwaites, David; de Ruysscher, Dirk; Dekker, Andre; Lambin, Philippe

    2018-02-01

    Early death after a treatment can be seen as a therapeutic failure. Accurate prediction of patients at risk for early mortality is crucial for avoiding unnecessary harm and reducing costs. The goal of our work is twofold: first, to evaluate the performance of a previously published model for early death in our cohorts; second, to develop a prognostic model for early death prediction following radiotherapy. Patients with NSCLC treated with chemoradiotherapy or radiotherapy alone were included in this study. Four different cohorts from different countries were available for this work (N = 1540). The previous model used age, gender, performance status, tumor stage, income deprivation, no previous treatment given (yes/no) and body mass index to make predictions. A random forest model was developed by learning on the Maastro cohort (N = 698). The new model used performance status, age, gender, T and N stage, total tumor volume (cc), total tumor dose (Gy) and chemotherapy timing (none, sequential, concurrent) to make predictions. Death within 4 months of receiving the first radiotherapy fraction was used as the outcome. Early death rates ranged from 6 to 11% within the four cohorts. The previous model performed with AUC values ranging from 0.54 to 0.64 on the validation cohorts. Our newly developed model had improved AUC values ranging from 0.62 to 0.71 on the validation cohorts. Using advanced machine learning methods and informative variables, prognostic models for early mortality can be developed. Development of accurate prognostic tools for early mortality is important to inform patients about treatment options and optimize care.
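
    A random forest prognostic model of the general shape described can be sketched with scikit-learn. All predictors and labels below are synthetic noise (so the AUC will hover near 0.5, unlike the 0.62-0.71 reported for the real cohorts); none of this reproduces the study's fitted model:

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.metrics import roc_auc_score
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(1)
      n = 700
      # Synthetic predictors standing in for: performance status, age, sex,
      # T stage, N stage, tumor volume (cc), total dose (Gy), chemo timing.
      X = np.column_stack([
          rng.integers(0, 3, n), rng.normal(68, 9, n), rng.integers(0, 2, n),
          rng.integers(1, 5, n), rng.integers(0, 4, n),
          rng.lognormal(4.0, 0.8, n), rng.normal(60, 6, n), rng.integers(0, 3, n),
      ])
      y = rng.binomial(1, 0.09, n)   # ~9% early-death rate, as in the cohorts

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0, stratify=y)
      clf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_tr, y_tr)
      print(roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]))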

  19. Using Modeling and Simulation to Predict Operator Performance and Automation-Induced Complacency With Robotic Automation: A Case Study and Empirical Validation.

    PubMed

    Wickens, Christopher D; Sebok, Angelia; Li, Huiyang; Sarter, Nadine; Gacy, Andrew M

    2015-09-01

    The aim of this study was to develop and validate a computational model of the automation complacency effect as operators work on a robotic arm task supported by three different degrees of automation. Some computational models of complacency in human-automation interaction exist, but those are formed and validated within the context of fairly simplified monitoring failures. This research extends model validation to a much more complex task, so that system designers can establish, without the need for human-in-the-loop (HITL) experimentation, the merits and shortcomings of different degrees of automation. We developed a realistic simulation of a space-based robotic arm task that could be carried out with three different levels of trajectory visualization and execution automation support. Using this simulation, we performed HITL testing. Complacency was induced via several trials of correctly performing automation and then was assessed on trials when automation failed. Following a cognitive task analysis of the robotic arm operation, we developed a multicomponent model of the robotic operator and his or her reliance on automation, based in part on visual scanning. The comparison of model predictions with empirical results revealed that the model accurately predicted routine performance and predicted the responses to these failures after complacency developed. However, the scanning models do not account for the entire attention allocation effects of complacency. Complacency modeling can provide a useful tool for predicting the effects of different types of imperfect automation. The results from this research suggest that focus should be given to supporting situation awareness in automation development. © 2015, Human Factors and Ergonomics Society.

  20. Artificial Intelligence Techniques for Predicting and Mapping Daily Pan Evaporation

    NASA Astrophysics Data System (ADS)

    Arunkumar, R.; Jothiprakash, V.; Sharma, Kirty

    2017-09-01

    In this study, Artificial Intelligence techniques such as Artificial Neural Network (ANN), Model Tree (MT) and Genetic Programming (GP) are used to develop daily pan evaporation time-series (TS) prediction and cause-effect (CE) mapping models. Ten years of observed daily meteorological data such as maximum temperature, minimum temperature, relative humidity, sunshine hours, dew point temperature and pan evaporation are used for developing the models. For each technique, several models are developed by changing the number of inputs and other model parameters. The performance of each model is evaluated using standard statistical measures such as Mean Square Error, Mean Absolute Error, Normalized Mean Square Error and correlation coefficient (R). The results showed that the daily TS-GP (4) model predicted better than the other TS models, with a correlation coefficient of 0.959. Among the various CE models, CE-ANN (6-10-1) performed better than the MT and GP models, with a correlation coefficient of 0.881. Because of the complex non-linear inter-relationships among the various meteorological variables, the CE mapping models could not achieve the performance of the TS models. From this study, it was found that GP performs better for recognizing a single pattern (time-series modelling), whereas ANN is better for modelling multiple patterns (cause-effect modelling) in the data.

  1. System cost/performance analysis (study 2.3). Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    Kazangey, T.

    1973-01-01

    The relationships between performance, safety, cost, and schedule parameters were identified and quantified in support of an overall effort to generate program models and methodology that provide insight into a total space vehicle program. A specific space vehicle system, the attitude control system (ACS), was used, and a modeling methodology was selected that develops a consistent set of quantitative relationships among performance, safety, cost, and schedule, based on the characteristics of the components utilized in candidate mechanisms. These descriptive equations were developed for a three-axis, earth-pointing, mass expulsion ACS. A data base describing typical candidate ACS components was implemented, along with a computer program to perform sample calculations. This approach, implemented on a computer, is capable of determining the effect of a change in functional requirements to the ACS mechanization and the resulting cost and schedule. By a simple extension of this modeling methodology to the other systems in a space vehicle, a complete space vehicle model can be developed. Study results and recommendations are presented.

  2. Development Of A Parallel Performance Model For The THOR Neutral Particle Transport Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yessayan, Raffi; Azmy, Yousry; Schunert, Sebastian

    The THOR neutral particle transport code enables simulation of complex geometries for various problems, from reactor simulations to nuclear non-proliferation. It is undergoing thorough verification and validation (V&V), which requires computational efficiency. This has motivated various improvements, including angular parallelization, outer iteration acceleration, and development of peripheral tools. To guide future improvements to the code's efficiency, better characterization of its parallel performance is useful. A parallel performance model (PPM) can be used to evaluate the benefits of modifications and to identify performance bottlenecks. Using INL's Falcon HPC, the PPM development incorporates an evaluation of network communication behavior over heterogeneous links and a functional characterization of the per-cell/angle/group runtime of each major code component. After evaluation of several possible sources of variability, this work resulted in a communication model and a parallel-portion model. The former's accuracy is bounded by the variability of communication on Falcon, while the latter has an error on the order of 1%.
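
    In spirit, a parallel performance model of this kind combines a per-cell/angle/group compute term with a latency/bandwidth communication term. The toy sketch below uses made-up constants, not fitted THOR/Falcon values:

      def model_runtime(n_cells, n_angles, n_groups, n_ranks,
                        t_cell=1.2e-6,        # seconds per cell-angle-group (made up)
                        latency=2.0e-6,       # per-message latency, seconds (made up)
                        inv_bw=8.0e-10,       # seconds per byte (made up)
                        msg_bytes=4096, n_msgs=64):
          # Parallel-portion model: total work divided across ranks.
          compute = n_cells * n_angles * n_groups * t_cell / n_ranks
          # Communication model: latency plus bandwidth cost of sweep messages.
          comm = n_msgs * (latency + msg_bytes * inv_bw)
          return compute + comm

      for p in (64, 256, 1024):
          print(p, model_runtime(2_000_000, 80, 16, p))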

  3. High Performance, Robust Control of Flexible Space Structures: MSFC Center Director's Discretionary Fund

    NASA Technical Reports Server (NTRS)

    Whorton, M. S.

    1998-01-01

    Many spacecraft systems have ambitious objectives that place stringent requirements on control systems. Achievable performance is often limited because of the difficulty of obtaining accurate models for flexible space structures. Achieving sufficiently high performance to accomplish mission objectives may require the ability to refine the control design model based on closed-loop test data and to tune the controller based on the refined model. A control system design procedure is developed based on mixed H2/H(infinity) optimization to synthesize a set of controllers explicitly trading between nominal performance and robust stability. A homotopy algorithm is presented which generates a trajectory of gains that may be implemented to determine maximum achievable performance for a given model error bound. Examples show that a better balance between robustness and performance is obtained using the mixed H2/H(infinity) design method than with either H2 or mu-synthesis control design. A second contribution is a new procedure for closed-loop system identification which refines parameters of a control design model in a canonical realization. Examples demonstrate convergence of the parameter estimation and improved performance realized by using the refined model for controller redesign. These developments result in an effective mechanism for achieving high-performance control of flexible space structures.

  4. Glass sample characterization

    NASA Technical Reports Server (NTRS)

    Ahmad, Anees

    1990-01-01

    The development of an in-house integrated optical performance modelling capability at MSFC is described. This performance model will take into account the effects of structural and thermal distortions, as well as metrology errors in optical surfaces, to predict the performance of large and complex optical systems, such as the Advanced X-Ray Astrophysics Facility. The necessary hardware and software were identified to implement an integrated optical performance model. A number of design, development, and testing tasks were supported, including identification of the debonded mirror pad and rebuilding of the Technology Mirror Assembly. Over 300 samples of Zerodur were prepared in different sizes and shapes for acid etching, coating, and polishing experiments to characterize the subsurface damage and stresses produced by the grinding and polishing operations.

  5. Discrete tyre model application for evaluation of vehicle limit handling performance

    NASA Astrophysics Data System (ADS)

    Siramdasu, Y.; Taheri, S.

    2016-11-01

    The goal of this study is twofold: first, to understand the transient and nonlinear effects of anti-lock braking systems (ABS), road undulations and driving dynamics on the lateral performance of the tyre, and second, to develop objective handling manoeuvres and respective metrics to characterise these effects on vehicle behaviour. For studying the transient and nonlinear handling performance of the vehicle, the variations of the tyre's relaxation length and the tyre's inertial properties play significant roles [Pacejka HB. Tire and vehicle dynamics. 3rd ed. Butterworth-Heinemann; 2012]. Accurately simulating these nonlinear effects during high-frequency vehicle dynamic manoeuvres requires a high-frequency dynamic tyre model (? Hz). A 6 DOF dynamic tyre model integrated with an enveloping model is developed and validated using fixed-axle high-speed oblique cleat experimental data. The commercially available vehicle dynamics software CarSim® is used for vehicle simulation. The vehicle model was validated by comparing simulation results with experimental sinusoidal steering tests. The validated tyre model is then integrated with the vehicle model and a commercial-grade rule-based ABS model to perform various objective simulations. Two test scenarios are considered: ABS braking in a turn on a smooth road, and accelerating in a turn on uneven and smooth roads. Both test cases reiterated that while the tyre is operating in the nonlinear region of slip or slip angle, any road disturbance or high-frequency brake torque input variation can excite the inertial belt vibrations of the tyre. It is shown that these inertial vibrations can directly affect the developed performance metrics and potentially degrade the handling performance of the vehicle.

  6. High-Performance Computing for the Electromagnetic Modeling and Simulation of Interconnects

    NASA Technical Reports Server (NTRS)

    Schutt-Aine, Jose E.

    1996-01-01

    The electromagnetic modeling of packages and interconnects plays a very important role in the design of high-speed digital circuits, and is most efficiently performed by using computer-aided design algorithms. In recent years, packaging has become a critical area in the design of high-speed communication systems and fast computers, and the importance of the software support for their development has increased accordingly. Throughout this project, our efforts have focused on the development of modeling and simulation techniques and algorithms that permit the fast computation of the electrical parameters of interconnects and the efficient simulation of their electrical performance.

  7. lean-ISD.

    ERIC Educational Resources Information Center

    Wallace, Guy W.

    2001-01-01

    Explains lean instructional systems design/development (ISD) as it relates to curriculum architecture design, based on Japan's lean production system. Discusses performance-based systems; ISD models; processes for organizational training and development; curriculum architecture to support job performance; and modular curriculum development. (LRW)

  8. Development of speed models for improving travel forecasting and highway performance evaluation : [technical summary].

    DOT National Transportation Integrated Search

    2013-12-01

    Travel forecasting models predict travel demand based on the present transportation system and its use. Transportation modelers must develop, validate, and calibrate models to ensure that predicted travel demand is as close to reality as possible. Mo...

  9. Summary of photovoltaic system performance models

    NASA Technical Reports Server (NTRS)

    Smith, J. H.; Reiter, L. J.

    1984-01-01

    A detailed overview of photovoltaic (PV) performance modeling capabilities developed for analyzing PV system and component design and policy issues is provided. A set of 10 performance models is selected which spans a representative range of capabilities, from generalized first-order calculations to highly specialized electrical network simulations. A set of performance modeling topics and characteristics is defined and used to examine some of the major issues associated with photovoltaic performance modeling. Each of the models is described in the context of these topics and characteristics to assess its purpose, approach, and level of detail. The issues are discussed in terms of the range of model capabilities available and summarized in tabular form for quick reference. The models are grouped into categories to illustrate their purposes and perspectives.
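
    At the "generalized first-order" end of the range surveyed, a PV performance model can be a single expression: rated power scaled by plane-of-array irradiance and derated by a cell-temperature coefficient. A sketch with typical illustrative constants (not drawn from any specific model in the survey):

      def pv_power_first_order(g_poa_w_m2, t_cell_c, p_stc_w=250.0, gamma=-0.004):
          # First-order module model: linear in irradiance relative to the
          # 1000 W/m2 standard test condition, with a temperature derating.
          return p_stc_w * (g_poa_w_m2 / 1000.0) * (1.0 + gamma * (t_cell_c - 25.0))

      print(pv_power_first_order(850.0, 45.0))   # ~196 W for a 250 W module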

  10. Development of a normal tissue complication probability (NTCP) model for radiation-induced hypothyroidism in nasopharyngeal carcinoma patients.

    PubMed

    Luo, Ren; Wu, Vincent W C; He, Binghui; Gao, Xiaoying; Xu, Zhenxi; Wang, Dandan; Yang, Zhining; Li, Mei; Lin, Zhixiong

    2018-05-18

    The objectives of this study were to build a normal tissue complication probability (NTCP) model of radiation-induced hypothyroidism (RHT) for nasopharyngeal carcinoma (NPC) patients and to compare it with four other published NTCP models to evaluate its efficacy. Medical notes of 174 NPC patients after radiotherapy were reviewed. Biochemical hypothyroidism was defined as an elevated level of serum thyroid-stimulating hormone (TSH) with a normal or decreased level of serum free thyroxine (fT4) after radiotherapy. Logistic regression with leave-one-out cross-validation was performed to establish the NTCP model. Model performance was evaluated and compared by the area under the receiver operating characteristic curve (AUC) in our NPC cohort. With a median follow-up of 24 months, 39 (22.4%) patients developed biochemical hypothyroidism. Gender, chemotherapy, the percentage of thyroid volume receiving more than 50 Gy (V50), and the maximum dose to the pituitary (Pmax) were identified as the most predictive factors for RHT. An NTCP model based on these four parameters was developed. The model comparison was made in our NPC cohort, and our NTCP model performed better in RHT prediction than the other four models. This study developed a four-variable NTCP model for biochemical hypothyroidism in NPC patients post-radiotherapy. Our NTCP model for RHT presents high prediction capability. This is a retrospective study without registration.
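
    The four-variable NTCP model described takes the standard logistic form; the sketch below shows that form with placeholder coefficients (the paper's fitted values are not reproduced here):

      import numpy as np

      def ntcp_rht(female, chemo, v50_pct, p_max_gy,
                   beta=(-4.0, 0.9, 0.8, 0.03, 0.02)):   # placeholder coefficients
          # Logistic NTCP: gender (1 = female), chemotherapy (0/1), thyroid
          # V50 (%), and pituitary maximum dose (Gy).
          b0, b1, b2, b3, b4 = beta
          z = b0 + b1 * female + b2 * chemo + b3 * v50_pct + b4 * p_max_gy
          return 1.0 / (1.0 + np.exp(-z))

      print(ntcp_rht(female=1, chemo=1, v50_pct=60.0, p_max_gy=55.0))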

  11. Development of Risk Insights for Regulatory Review of a Near-Surface Disposal Facility for Radioactive Waste

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Esh, D.W.; Ridge, A.C.; Thaggard, M.

    2006-07-01

    Section 3116 of the Ronald W. Reagan National Defense Authorization Act for Fiscal Year 2005 (NDAA) requires the Department of Energy (DOE) to consult with the Nuclear Regulatory Commission (NRC) about non-High Level Waste (HLW) determinations. In its consultative role, NRC performs technical reviews of DOE's waste determinations but does not have regulatory authority over DOE's waste disposal activities. The safety of disposal is evaluated by comparing predicted disposal facility performance to the performance objectives specified in NRC regulations for the disposal of low-level waste (10 CFR Part 61 Subpart C). The performance objectives contain criteria for protection of the public, protection of inadvertent intruders, protection of workers, and stability of the disposal site after closure. The potential radiological dose to receptors typically is evaluated with a performance assessment (PA) model that simulates the release of radionuclides from the disposal site, transport of radionuclides through the environment, and exposure of potential receptors to residual contamination for thousands of years. This paper describes NRC's development and use of independent performance assessment modeling to facilitate review of DOE's non-HLW determination for the Saltstone Disposal Facility (SDF) at the Savannah River Site. NRC's review of the safety of near-surface disposal of radioactive waste at the SDF was facilitated and focused by risk insights developed with an independent PA model. The main components of NRC's performance assessment model are presented. The development of risk insights that allow the staff to focus review efforts on those areas that are most important to satisfying the performance objectives is discussed. An uncertainty analysis of the full stochastic model was performed using genetic variable selection algorithms. The results of the uncertainty analysis were then used to guide the development of simulations of other scenarios to understand the key risk drivers and risk limiters of the SDF. Review emphasis was placed on those aspects of the disposal system that were expected to drive performance: the physical and chemical performance of the cementitious wasteform and concrete vaults. Refinement of the modeling of the degradation and release from the cementitious wasteform had a significant effect on the predicted dose to a member of the public. (authors)

  12. Abstract - Cooperative Research and Development Agreement between Ames National Laboratory and National Energy Technology Laboratory AGMT-0609

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bryden, Mark; Tucker, David A.

    The goal of this project is to develop a merged environment for simulation and analysis (MESA) at the National Energy Technology Laboratory’s (NETL) Hybrid Performance (Hyper) project laboratory. The MESA sensor lab developed as a component of this research will provide a development platform for investigating: 1) advanced control strategies, 2) testing and development of sensor hardware, 3) various modeling in-the-loop algorithms and 4) other advanced computational algorithms for improved plant performance using sensors, real-time models, and complex systems tools.

  13. A generalized development model for testing GPS user equipment

    NASA Technical Reports Server (NTRS)

    Hemesath, N.

    1978-01-01

    The generalized development model (GDM) program, which was intended to establish how well GPS user equipment can perform under a combination of jamming and dynamics, is described. The systems design and the characteristics of the GDM are discussed. The performance aspects of the GDM are listed and the application of the GDM to civil aviation is examined.

  14. Fuel Performance Experiments and Modeling: Fission Gas Bubble Nucleation and Growth in Alloy Nuclear Fuels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McDeavitt, Sean; Shao, Lin; Tsvetkov, Pavel

    2014-04-07

    Advanced fast reactor systems being developed under the DOE's Advanced Fuel Cycle Initiative are designed to destroy TRU isotopes generated in existing and future nuclear energy systems. Over the past 40 years, multiple experiments and demonstrations have been completed using U-Zr, U-Pu-Zr, U-Mo and other metal alloys. As a result, multiple empirical and semi-empirical relationships have been established to develop empirical performance modeling codes. Many mechanistic questions about fission gas mobility, bubble coalescence, and gas release have been answered through industrial experience, research, and empirical understanding. The advent of modern computational materials science, however, opens new doors of development such that physics-based multi-scale models may be developed to enable a new generation of predictive fuel performance codes that are not limited by empiricism.

  15. Using a High-Performance Planning Model to Increase Levels of Functional Effectiveness Within Professional Development.

    PubMed

    Winter, Peggi

    2016-01-01

    Nursing professional practice models continue to shape how we practice nursing by putting families and members at the heart of everything we do. Faced with enormous challenges around healthcare reform, models create frameworks for practice by unifying, uniting, and guiding our nurses. The Kaiser Permanente Practice model was developed to ensure consistency for nursing practice across the continuum. Four key pillars support this practice model and the work of nursing: quality and safety, leadership, professional development, and research/evidence-based practice. These four pillars form the foundation that makes transformational practice possible and aligns nursing with Kaiser Permanente's mission. The purpose of this article is to discuss the pillar of professional development and the components of the Nursing Professional Development: Scope and Standards of Practice model (American Nurses Association & National Nursing Staff Development Organization, 2010) and place them in a five-level development framework. This process allowed us to identify the current organizational level of practice, prioritize each nursing professional development component, and design an operational strategy to move nursing professional development toward a level of high performance. This process is suggested for nursing professional development specialists.

  16. Validation of Storm Water Management Model Storm Control Measures Modules

    NASA Astrophysics Data System (ADS)

    Simon, M. A.; Platz, M. C.

    2017-12-01

    EPA's Storm Water Management Model (SWMM) is a computational code heavily relied upon by industry for the simulation of wastewater and stormwater infrastructure performance. Many municipalities rely on SWMM results to design multi-billion-dollar, multi-decade infrastructure upgrades. Since the 1970s, EPA and others have developed five major releases, the most recent ones containing storm control measures modules for green infrastructure. The main objective of this study was to quantify the accuracy with which SWMM v5.1.10 simulates the hydrologic activity of previously monitored low impact developments. Model performance was evaluated with a mathematical comparison of outflow hydrographs and total outflow volumes, using empirical data and a multi-event, multi-objective calibration method. The calibration methodology utilized PEST++ Version 3, a parameter estimation tool, which aided in the selection of unmeasured hydrologic parameters. From the validation study and sensitivity analysis, several model improvements were identified to advance SWMM LID Module performance for permeable pavements, infiltration units, and green roofs; these were implemented and are reported herein. Overall, it was determined that SWMM can successfully simulate low impact development controls given accurate model confirmation, parameter measurement, and model calibration.
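    A hedged sketch of the hydrograph comparison described above (not the study's calibration code): compute Nash-Sutcliffe efficiency and total-volume error for a synthetic observed/simulated outflow pair.

```python
# Sketch of hydrograph goodness-of-fit metrics on synthetic outflow series.
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, <0 is worse than the mean."""
    obs, sim = np.asarray(obs), np.asarray(sim)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def volume_error(obs, sim):
    """Percent error in total outflow volume (time step cancels out)."""
    return 100.0 * (np.sum(sim) - np.sum(obs)) / np.sum(obs)

t = np.linspace(0, 6, 100)                      # hours
obs = np.exp(-((t - 2.0) / 0.8) ** 2)           # observed outflow hydrograph
sim = 0.9 * np.exp(-((t - 2.2) / 0.9) ** 2)     # simulated outflow hydrograph
print(f"NSE = {nse(obs, sim):.3f}, volume error = {volume_error(obs, sim):+.1f}%")
```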

  17. Performance Model and Sensitivity Analysis for a Solar Thermoelectric Generator

    NASA Astrophysics Data System (ADS)

    Rehman, Naveed Ur; Siddiqui, Mubashir Ali

    2017-03-01

    In this paper, a regression model for evaluating the performance of solar concentrated thermoelectric generators (SCTEGs) is established and the significance of contributing parameters is discussed in detail. The model is based on several natural, design and operational parameters of the system, including the thermoelectric generator (TEG) module and its intrinsic material properties, the connected electrical load, concentrator attributes, heat transfer coefficients, solar flux, and ambient temperature. The model is developed by fitting a response curve, using the least-squares method, to the results. The sample points for the model were obtained by simulating a thermodynamic model, also developed in this paper, over a range of values of input variables. These samples were generated employing the Latin hypercube sampling (LHS) technique using a realistic distribution of parameters. The coefficient of determination was found to be 99.2%. The proposed model is validated by comparing the predicted results with those in the published literature. In addition, based on the elasticity for parameters in the model, sensitivity analysis was performed and the effects of parameters on the performance of SCTEGs are discussed in detail. This research will contribute to the design and performance evaluation of any SCTEG system for a variety of applications.
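    The sampling-and-fitting workflow described above can be sketched as follows, assuming SciPy's qmc module for Latin hypercube sampling; the two-parameter "thermodynamic model", its ranges, and the quadratic response surface are stand-ins invented for the example.

```python
# Illustrative sketch: draw Latin hypercube samples, evaluate a placeholder
# model, fit a least-squares response surface, and report R^2.
import numpy as np
from scipy.stats import qmc

rng = np.random.default_rng(2)
sampler = qmc.LatinHypercube(d=2, seed=2)
unit = sampler.random(n=200)
# Assumed parameter ranges: solar flux (W/m^2) and ambient temperature (K)
X = qmc.scale(unit, [400.0, 280.0], [1000.0, 320.0])

def toy_model(flux, t_amb):
    # Stand-in for the thermodynamic SCTEG model simulated in the paper
    return 1e-4 * flux**1.2 - 5e-4 * (t_amb - 280.0)

y = toy_model(X[:, 0], X[:, 1]) + rng.normal(0, 0.002, len(X))

# Quadratic response surface fit by ordinary least squares
A = np.column_stack([np.ones(len(X)), X, X**2, X[:, 0] * X[:, 1]])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
y_hat = A @ coef
r2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
print(f"R^2 = {r2:.3f}")
```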

  18. An analytical framework to assist decision makers in the use of forest ecosystem model predictions

    USDA-ARS?s Scientific Manuscript database

    The predictions of most terrestrial ecosystem models originate from deterministic simulations. Relatively few uncertainty evaluation exercises in model outputs are performed by either model developers or users. This issue has important consequences for decision makers who rely on models to develop n...

  19. SUMMA and Model Mimicry: Understanding Differences Among Land Models

    NASA Astrophysics Data System (ADS)

    Nijssen, B.; Nearing, G. S.; Ou, G.; Clark, M. P.

    2016-12-01

    Model inter-comparison and model ensemble experiments suffer from an inability to explain the mechanisms behind differences in model outcomes. We can clearly demonstrate that the models are different, but we cannot necessarily identify the reasons why, because most models exhibit myriad differences in process representations, model parameterizations, model parameters and numerical solution methods. This inability to identify the reasons for differences in model performance hampers our understanding and limits model improvement, because we cannot easily identify the most promising paths forward. We have developed the Structure for Unifying Multiple Modeling Alternatives (SUMMA) to allow for controlled experimentation with model construction, numerical techniques, and parameter values and therefore isolate differences in model outcomes to specific choices during the model development process. In developing SUMMA, we recognized that hydrologic models can be thought of as individual instantiations of a master modeling template that is based on a common set of conservation equations for energy and water. Given this perspective, SUMMA provides a unified approach to hydrologic modeling that integrates different modeling methods into a consistent structure with the ability to instantiate alternative hydrologic models at runtime. Here we employ SUMMA to revisit a previous multi-model experiment and demonstrate its use for understanding differences in model performance. Specifically, we implement SUMMA to mimic the spread of behaviors exhibited by the land models that participated in the Protocol for the Analysis of Land Surface Models (PALS) Land Surface Model Benchmarking Evaluation Project (PLUMBER) and draw conclusions about the relative performance of specific model parameterizations for water and energy fluxes through the soil-vegetation continuum. SUMMA's ability to mimic the spread of model ensembles and the behavior of individual models can be an important tool in focusing model development and improvement efforts.

  20. Airframe Icing Research Gaps: NASA Perspective

    NASA Technical Reports Server (NTRS)

    Potapczuk, Mark

    2009-01-01

    Current airframe icing technology gaps:
    - Development of a full 3D ice accretion simulation model.
    - Development of an improved simulation model for SLD conditions.
    - CFD modeling of stall behavior for ice-contaminated wings/tails.
    - Computational methods for simulation of stability and control parameters.
    - Analysis of thermal ice protection system performance.
    - Quantification of 3D ice shape geometric characteristics.
    - Development of accurate ground-based simulation of SLD conditions.
    - Development of scaling methods for SLD conditions.
    - Development of advanced diagnostic techniques for assessment of tunnel cloud conditions.
    - Identification of critical ice shapes for aerodynamic performance degradation.
    - Aerodynamic scaling issues associated with testing scale-model ice shape geometries.
    - Development of altitude scaling methods for thermal ice protection systems.
    - Development of accurate parameter identification methods.
    - Measurement of stability and control parameters for an ice-contaminated swept-wing aircraft.
    - Creation of control law modifications to prevent loss of control during icing encounters.
    Data needs include: 3D ice shape geometries; collection efficiency data for ice shape geometries; SLD ice shape data, in-flight and ground-based, for simulation verification; aerodynamic performance data for 3D geometries and various icing conditions; stability and control parameter data for iced aircraft configurations; and thermal ice protection system data for simulation validation.

  1. Scripting MODFLOW Model Development Using Python and FloPy.

    PubMed

    Bakker, M; Post, V; Langevin, C D; Hughes, J D; White, J T; Starn, J J; Fienen, M N

    2016-09-01

    Graphical user interfaces (GUIs) are commonly used to construct and postprocess numerical groundwater flow and transport models. Scripting model development with the programming language Python is presented here as an alternative approach. One advantage of Python is that there are many packages available to facilitate the model development process, including packages for plotting, array manipulation, optimization, and data analysis. For MODFLOW-based models, the FloPy package was developed by the authors to construct model input files, run the model, and read and plot simulation results. Use of Python with the available scientific packages and FloPy facilitates data exploration, alternative model evaluations, and model analyses that can be difficult to perform with GUIs. Furthermore, Python scripts are a complete, transparent, and repeatable record of the modeling process. The approach is introduced with a simple FloPy example to create and postprocess a MODFLOW model. A more complicated capture-fraction analysis with a real-world model is presented to demonstrate the types of analyses that can be performed using Python and FloPy.
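    As a hedged illustration of the scripting workflow the paper advocates, the sketch below builds, runs, and postprocesses a minimal one-layer MODFLOW-2005 model with FloPy. It assumes flopy is installed and an mf2005 executable is on the path; the model name, grid, and property values are invented for the example.

```python
# Minimal FloPy sketch: build, run, and postprocess a one-layer MODFLOW-2005
# model. Grid dimensions and properties are invented for illustration.
import numpy as np
import flopy

m = flopy.modflow.Modflow(modelname="demo", exe_name="mf2005")
flopy.modflow.ModflowDis(m, nlay=1, nrow=10, ncol=10,
                         delr=100.0, delc=100.0, top=10.0, botm=0.0)

# Constant heads along the left (10 m) and right (5 m) edges, active elsewhere
ibound = np.ones((1, 10, 10), dtype=int)
ibound[:, :, 0] = ibound[:, :, -1] = -1
strt = np.full((1, 10, 10), 10.0)
strt[:, :, -1] = 5.0
flopy.modflow.ModflowBas(m, ibound=ibound, strt=strt)

flopy.modflow.ModflowLpf(m, hk=10.0)   # hydraulic conductivity (m/d)
flopy.modflow.ModflowPcg(m)            # solver
flopy.modflow.ModflowOc(m)             # output control (saves heads by default)

m.write_input()
success, _ = m.run_model(silent=True)
if success:
    heads = flopy.utils.HeadFile("demo.hds").get_data()  # last saved time
    print("head range:", heads.min(), heads.max())
```

    Because the whole run is a script, it can be re-executed, versioned, or parameterized in a loop, which is the repeatability argument the abstract makes.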

  2. Comparison of Conventional and ANN Models for River Flow Forecasting

    NASA Astrophysics Data System (ADS)

    Jain, A.; Ganti, R.

    2011-12-01

    Hydrological models are useful in many water resources applications such as flood control, irrigation and drainage, hydropower generation, water supply, and erosion and sediment control. Estimates of runoff are needed in many water resources planning, design, development, operation, and maintenance activities. River flow is generally estimated using time series or rainfall-runoff models. Recently, soft artificial intelligence tools such as Artificial Neural Networks (ANNs) have become popular for research purposes but have not been extensively adopted in operational hydrological forecasts. There is a strong need to develop ANN models based on real catchment data and compare them with the conventional models. In this paper, a comparative study has been carried out for river flow forecasting using conventional and ANN models. Among the conventional models, multiple linear and nonlinear regression models and time series models of the autoregressive (AR) type were developed. A feed-forward neural network structure, trained using the back-propagation algorithm (a gradient search method), was adopted. Daily river flow data from the Godavari Basin at Polavaram, Andhra Pradesh, India were used to develop all the models included here. Two inputs, the flows at the two previous time steps (Q(t-1) and Q(t-2)), were selected using partial autocorrelation analysis for forecasting the flow at time t, Q(t). A wide range of error statistics was used to evaluate the performance of all the models developed in this study. The regression and AR models performed comparably, and the ANN model performed best among all the models investigated. It is concluded that the ANN model should be adopted in real catchments for hydrological modeling and forecasting.
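    A minimal sketch of the ANN setup described above, forecasting Q(t) from the lagged flows Q(t-1) and Q(t-2) with a feed-forward network; the flow series below is synthetic, standing in for the Godavari record.

```python
# Feed-forward ANN forecasting Q(t) from Q(t-1) and Q(t-2) (synthetic data).
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(3)
# Synthetic daily flow series with a seasonal cycle plus noise
q = 100 + 50 * np.sin(np.arange(2000) / 30.0) + rng.normal(0, 5, 2000)

X = np.column_stack([q[1:-1], q[:-2]])   # Q(t-1), Q(t-2)
y = q[2:]                                # Q(t)
split = int(0.8 * len(y))                # chronological train/test split

model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
model.fit(X[:split], y[:split])
rmse = mean_squared_error(y[split:], model.predict(X[split:])) ** 0.5
print(f"test RMSE = {rmse:.2f}")
```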

  3. Evaluation of weighted regression and sample size in developing a taper model for loblolly pine

    Treesearch

    Kenneth L. Cormier; Robin M. Reich; Raymond L. Czaplewski; William A. Bechtold

    1992-01-01

    A stem profile model, fit using pseudo-likelihood weighted regression, was used to estimate merchantable volume of loblolly pine (Pinus taeda L.) in the southeast. The weighted regression increased model fit marginally, but did not substantially increase model performance. In all cases, the unweighted regression models performed as well as the...

  4. An integrated radar model solution for mission level performance and cost trades

    NASA Astrophysics Data System (ADS)

    Hodge, John; Duncan, Kerron; Zimmerman, Madeline; Drupp, Rob; Manno, Mike; Barrett, Donald; Smith, Amelia

    2017-05-01

    A fully integrated Mission-Level Radar model is in development as part of a multi-year effort under the Northrop Grumman Mission Systems (NGMS) sector's Model Based Engineering (MBE) initiative to digitally interconnect and unify previously separate performance and cost models. In 2016, an NGMS internal research and development (IR&D) funded multidisciplinary team integrated radio frequency (RF), power, control, size, weight, thermal, and cost models using commercial off-the-shelf software, ModelCenter, for an Active Electronically Scanned Array (AESA) radar system. Each represented model was digitally connected with standard interfaces and unified to allow end-to-end mission system optimization and trade studies. The radar model was then linked to the Air Force's own mission modeling framework (AFSIM). The team first had to identify the necessary models and, with the aid of subject matter experts (SMEs), understand and document the inputs, outputs, and behaviors of the component models. This agile development process and collaboration enabled rapid integration of disparate models and the validation of their combined system performance. This MBE framework will allow NGMS to design systems more efficiently and affordably, optimize architectures, and provide increased value to the customer. The model integrates detailed component models that validate cost and performance at the physics level with high-level models that provide visualization of a platform mission. This connectivity of component to mission models allows hardware and software design solutions to be better optimized to meet mission needs, creating cost-optimal solutions for the customer, while reducing design cycle time through risk mitigation and early validation of design decisions.

  5. Developing and validating risk prediction models in an individual participant data meta-analysis

    PubMed Central

    2014-01-01

    Background Risk prediction models estimate the risk of developing future outcomes for individuals based on one or more underlying characteristics (predictors). We review how researchers develop and validate risk prediction models within an individual participant data (IPD) meta-analysis, in order to assess the feasibility and conduct of the approach. Methods A qualitative review of the aims, methodology, and reporting in 15 articles that developed a risk prediction model using IPD from multiple studies. Results The IPD approach offers many opportunities but methodological challenges exist, including: unavailability of requested IPD, missing patient data and predictors, and between-study heterogeneity in methods of measurement, outcome definitions and predictor effects. Most articles develop their model using IPD from all available studies and perform only an internal validation (on the same set of data). Ten of the 15 articles did not allow for any study differences in baseline risk (intercepts), potentially limiting their model’s applicability and performance in some populations. Only two articles used external validation (on different data), including a novel method which develops the model on all but one of the IPD studies, tests performance in the excluded study, and repeats by rotating the omitted study. Conclusions An IPD meta-analysis offers unique opportunities for risk prediction research. Researchers can make more of this by allowing separate model intercept terms for each study (population) to improve generalisability, and by using ‘internal-external cross-validation’ to simultaneously develop and validate their model. Methodological challenges can be reduced by prospectively planned collaborations that share IPD for risk prediction. PMID:24397587
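    The 'internal-external cross-validation' idea highlighted in the conclusions can be sketched as a leave-one-study-out loop; the synthetic IPD below, with study-specific intercepts, is invented for illustration.

```python
# Sketch of internal-external cross-validation: fit on all studies but one,
# test on the held-out study, and rotate. All data are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(4)
n_per_study, n_studies = 300, 5
study = np.repeat(np.arange(n_studies), n_per_study)
X = rng.normal(size=(n_per_study * n_studies, 3))
# Study-specific baseline risk (intercept) plus shared predictor effects
logit = -1.0 + 0.3 * study + X @ np.array([0.8, -0.5, 0.2])
y = (rng.uniform(size=len(logit)) < 1 / (1 + np.exp(-logit))).astype(int)

for held_out in range(n_studies):
    train, test = study != held_out, study == held_out
    model = LogisticRegression().fit(X[train], y[train])
    auc = roc_auc_score(y[test], model.predict_proba(X[test])[:, 1])
    print(f"held-out study {held_out}: AUC = {auc:.2f}")
```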

  6. A Hybrid Actuation System Demonstrating Significantly Enhanced Electromechanical Performance

    NASA Technical Reports Server (NTRS)

    Su, Ji; Xu, Tian-Bing; Zhang, Shujun; Shrout, Thomas R.; Zhang, Qiming

    2004-01-01

    A hybrid actuation system (HYBAS) utilizing advantages of a combination of electromechanical responses of an electroactive polymer (EAP), an electrostrictive copolymer, and an electroactive ceramic single crystal, PZN-PT single crystal, has been developed. The system employs the contribution of the actuation elements cooperatively and exhibits a significantly enhanced electromechanical performance compared to the performances of devices made of each constituent material, the electroactive polymer or the ceramic single crystal, individually. The theoretical modeling of the performance of the HYBAS is in good agreement with experimental observation. The consistency between the theoretical modeling and the experimental tests makes the design concept an effective route for the development of high performance actuating devices for many applications. The theoretical modeling, fabrication of the HYBAS, and the initial experimental results will be presented and discussed.

  7. Solar power plant performance evaluation: simulation and experimental validation

    NASA Astrophysics Data System (ADS)

    Natsheh, E. M.; Albarbar, A.

    2012-05-01

    In this work the performance of a solar power plant is evaluated using a developed model comprising a photovoltaic array, battery storage, a controller, and converters. The model is implemented using the MATLAB/SIMULINK software package. A perturb and observe (P&O) algorithm is used for maximizing the generated power based on a maximum power point tracker (MPPT) implementation. The outcomes of the developed model are validated and supported by a case study carried out using an operational 28.8 kW grid-connected solar power plant located in central Manchester. Measurements were taken over a 21-month period, using hourly average irradiance and cell temperature. It was found that system degradation could be clearly monitored by determining the residual (the difference) between the output power predicted by the model and the actual measured power. It was found that the residual exceeded the healthy threshold, 1.7 kW, due to heavy snow in Manchester last winter. More importantly, the developed performance evaluation technique could be adopted to detect any other causes of degraded PV panel performance, such as shading and dirt. Repeatability and reliability of the developed system performance were validated during this period. Good agreement was achieved between the theoretical simulation and the real-time measurements taken from the online grid-connected solar power plant.
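    A hedged sketch of the residual-based monitoring idea described above: flag degradation when the gap between model-predicted and measured power exceeds a healthy threshold. The 1.7 kW threshold follows the record; the hourly series and injected fault are synthetic.

```python
# Residual-based degradation detection on synthetic hourly power data.
import numpy as np

rng = np.random.default_rng(5)
hours = 24 * 30
# Toy predicted output (kW): clipped sinusoid standing in for the plant model
predicted = np.clip(20 * np.sin(np.linspace(0, 30 * np.pi, hours)), 0, None)
measured = predicted + rng.normal(0, 0.3, hours)
measured[500:520] -= 2.5          # injected fault (e.g., snow or shading)

residual = predicted - measured
alarms = np.flatnonzero(np.abs(residual) > 1.7)   # healthy threshold from the record
print(f"{len(alarms)} hourly samples exceed the 1.7 kW healthy threshold")
```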

  8. Coupled modeling of the competitive gettering of transition metals and impact on performance of lifetime sensitive devices

    NASA Astrophysics Data System (ADS)

    Yazdani, Armin; Chen, Renyu; Dunham, Scott T.

    2017-03-01

    This work models competitive gettering of metals (Cu, Ni, Fe, Mo, and W) by boron, phosphorus, and dislocation loops, and connects those results directly to device performance. Density functional theory calculations were first performed to determine the binding energies of metals to the gettering sites, and based on that, continuum models were developed to model the redistribution and trapping of the metals. Our models found that Fe is most strongly trapped by the dislocation loops while Cu and Ni are most strongly trapped by the P4V clusters formed in high phosphorus concentrations. In addition, it is found that none of the mentioned gettering sites are effective in gettering Mo and W. The calculated metal redistribution along with the associated capture cross sections and trap energy levels are passed to device simulation via the recombination models to calculate carrier lifetime and the resulting device performance. Thereby, a comprehensive and predictive TCAD framework is developed to optimize the processing conditions to maximize performance of lifetime sensitive devices.

  9. The Performance Blueprint: An Integrated Logic Model Developed To Enhance Performance Measurement Literacy: The Case of Performance-Based Contract Management.

    ERIC Educational Resources Information Center

    Longo, Paul J.

    This study explored the mechanics of using an enhanced, comprehensive multipurpose logic model, the Performance Blueprint, as a means of building evaluation capacity, referred to in this paper as performance measurement literacy, to facilitate the attainment of both service-delivery oriented and community-oriented outcomes. The application of this…

  10. Mechanisms of Developmental Change in Infant Categorization

    ERIC Educational Resources Information Center

    Westermann, Gert; Mareschal, Denis

    2012-01-01

    Computational models are tools for testing mechanistic theories of learning and development. Formal models allow us to instantiate theories of cognitive development in computer simulations. Model behavior can then be compared to real performance. Connectionist models, loosely based on neural information processing, have been successful in…

  11. Formation of an internal model of environment dynamics during upper limb reaching movements: a fuzzy approach.

    PubMed

    MacDonald, Chad; Moussavi, Zahra; Sarkodie-Gyan, Thompson

    2007-01-01

    This paper presents the development and simulation of a fuzzy logic based learning mechanism to emulate human motor learning. In particular, fuzzy inference was used to develop an internal model of a novel dynamic environment experienced during planar reaching movements with the upper limb. A dynamic model of the human arm was developed and a fuzzy if-then rule base was created to relate trajectory movement and velocity errors to internal model update parameters. An experimental simulation was performed to compare the fuzzy system's performance with that of human subjects. It was found that the dynamic model behaved as expected, and the fuzzy learning mechanism created an internal model that was capable of opposing the environmental force field to regain a trajectory closely resembling the desired ideal.

  12. New model framework and structure and the commonality evaluation model. [concerning unmanned spacecraft projects

    NASA Technical Reports Server (NTRS)

    1977-01-01

    The development of a framework and structure for shuttle era unmanned spacecraft projects and the development of a commonality evaluation model is documented. The methodology developed for model utilization in performing cost trades and comparative evaluations for commonality studies is discussed. The model framework consists of categories of activities associated with the spacecraft system's development process. The model structure describes the physical elements to be treated as separate identifiable entities. Cost estimating relationships for subsystem and program-level components were calculated.

  13. Development of a dynamic computational model of social cognitive theory.

    PubMed

    Riley, William T; Martin, Cesar A; Rivera, Daniel E; Hekler, Eric B; Adams, Marc A; Buman, Matthew P; Pavel, Misha; King, Abby C

    2016-12-01

    Social cognitive theory (SCT) is among the most influential theories of behavior change and has been used as the conceptual basis of health behavior interventions for smoking cessation, weight management, and other health behaviors. SCT and other behavior theories were developed primarily to explain differences between individuals, but explanatory theories of within-person behavioral variability are increasingly needed as new technologies allow for intensive longitudinal measures and interventions adapted from these inputs. These within-person explanatory theoretical applications can be modeled as dynamical systems. SCT constructs, such as reciprocal determinism, are inherently dynamical in nature, but SCT has not been modeled as a dynamical system. This paper describes the development of a dynamical system model of SCT using fluid analogies and control systems principles drawn from engineering. Simulations of this model were performed to assess if the model performed as predicted based on theory and empirical studies of SCT. This initial model generates precise and testable quantitative predictions for future intensive longitudinal research. Dynamic modeling approaches provide a rigorous method for advancing health behavior theory development and refinement and for guiding the development of more potent and efficient interventions.
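    As a loose, hedged illustration of the fluid-analogy approach described above (not the paper's actual SCT model), the sketch below Euler-integrates a single first-order 'inventory' that fills in response to an input and drains with a time constant; all names and parameter values are invented.

```python
# Toy fluid-analogy dynamical model: one first-order inventory.
import numpy as np

def simulate(u, gain=0.6, tau=5.0, dt=1.0, steps=100):
    """First-order inventory: tau * dy/dt = gain * u(t) - y(t), Euler-integrated."""
    y = np.zeros(steps)
    for t in range(1, steps):
        y[t] = y[t - 1] + dt * (gain * u(t) - y[t - 1]) / tau
    return y

# Step input at t = 20 (e.g., onset of an intervention component)
y = simulate(lambda t: 1.0 if t >= 20 else 0.0)
print(f"steady-state response -> {y[-1]:.2f} (expected gain 0.6)")
```

    Coupling several such inventories, with outputs of one feeding inputs of another, is one way constructs like reciprocal determinism can be given a testable dynamical form.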

  14. How can animal models inform on the transition to chronic symptoms in whiplash?

    PubMed Central

    Winkelstein, Beth A.

    2011-01-01

    Study Design: A non-systematic review of the literature. Objective: The objective was to present a general schema for mechanisms of whiplash pain and review the role of animal models in understanding the development of chronic pain from whiplash injury. Summary of Background Data: Extensive biomechanical and clinical studies of whiplash have been performed to understand the injury mechanisms and symptoms of whiplash injury. However, only recently have animal models of this painful disorder been developed, based on other pain models in the literature. Methods: A non-systematic review was performed and findings were integrated to formulate a generalized picture of the mechanisms by which chronic whiplash pain develops from mechanical tissue injuries. Results: The development of chronic pain from tissue injuries in the neck due to whiplash involves complex interactions between the injured tissue and spinal neuroimmune circuits. A variety of animal models are beginning to define these mechanisms. Conclusion: Continued work is needed in developing appropriate animal models to investigate chronic pain from whiplash injuries, and care must be taken to determine whether such models aim to model the injury event or the pain symptom. PMID:22020616

  15. Influence of Back-Up Bearings and Support Structure Dynamics on the Behavior of Rotors With Active Supports

    NASA Technical Reports Server (NTRS)

    Flowers, George T.

    1996-01-01

    This report presents a synopsis of the research work. Specific accomplishments are itemized below: (1) Experimental facilities have been developed, including a magnetic bearing test rig and an auxiliary bearing test rig. In addition, components have been designed, constructed, and tested for use with a rotordynamics test rig located at NASA Lewis Research Center. (2) A study of the rotordynamics of an auxiliary-bearing-supported T-501 engine model was performed. (3) An experimental/simulation study of auxiliary bearing rotordynamics has been performed. (4) A rotordynamical model for a magnetic-bearing-supported rotor system, including auxiliary bearing effects, has been developed and simulation studies performed. (5) A finite element model for a foil bearing has been developed and studies of a rotor supported by foil bearings have been performed. (6) Two students affiliated with this project have graduated with M.S. degrees.

  16. Modelling cephalopod-inspired pulsed-jet locomotion for underwater soft robots.

    PubMed

    Renda, F; Giorgio-Serchi, F; Boyer, F; Laschi, C

    2015-09-28

    Cephalopods (i.e., octopuses and squids) are being looked upon as a source of inspiration for the development of unmanned underwater vehicles. One kind of cephalopod-inspired soft-bodied vehicle developed by the authors entails a hollow, elastic shell capable of performing a routine of recursive ingestion and expulsion of discrete slugs of fluid, which enables the vehicle to propel itself in water. The vehicle's performance was found to depend largely on the elastic response of the shell to the actuation cycle, thus motivating the development of a coupled propulsion-elastodynamics model of such vehicles. The model is developed and validated against a set of experimental results obtained with the existing cephalopod-inspired prototypes. A metric of the efficiency of the propulsion routine which accounts for the elastic energy contribution during the ingestion/expulsion phases of the actuation is formulated. Demonstrations of the use of this model to estimate the efficiency of the propulsion routine for various pulsation frequencies and for different morphologies of the vehicles are provided. This metric of efficiency, employed in association with the present elastodynamics model, provides a useful tool for performing a priori energetic analyses which encompass both the design specifications and the actuation pattern of this new kind of underwater vehicle.

  17. The Development of a Modelling Solution to Address Manpower and Personnel Issues Using the IPME

    DTIC Science & Technology

    2010-11-01

    training for a military system. It deals with the number of personnel spaces and available people. One of the main concerns in this domain is to... are often addressed by examining existing solutions for similar systems and/or a trial-and-error method based on human-in-the-loop tests. Such an... significant effort and resources on the development of a human performance modelling software, the Integrated Performance Modelling Environment (IPME).

  18. Development of automation and robotics for space via computer graphic simulation methods

    NASA Technical Reports Server (NTRS)

    Fernandez, Ken

    1988-01-01

    A robot simulation system has been developed to perform automation and robotics system design studies. The system uses a procedure-oriented solid modeling language to produce a model of the robotic mechanism. The simulator generates the kinematics, inverse kinematics, dynamics, control, and real-time graphic simulations needed to evaluate the performance of the model. Simulation examples are presented, including simulation of the Space Station and the design of telerobotics for the Orbital Maneuvering Vehicle.

  19. MOGO: Model-Oriented Global Optimization of Petascale Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Malony, Allen D.; Shende, Sameer S.

    The MOGO project was initiated in 2008 under the DOE Program Announcement for Software Development Tools for Improved Ease-of-Use on Petascale Systems (LAB 08-19). The MOGO team consisted of Oak Ridge National Lab, Argonne National Lab, and the University of Oregon. The overall goal of MOGO was to attack petascale performance analysis by developing a general framework in which empirical performance data could be efficiently and accurately compared with performance expectations at various levels of abstraction. This information could then be used to automatically identify and remediate performance problems. MOGO was based on performance models derived from application knowledge, performance experiments, and symbolic analysis. MOGO was able to make a reasonable impact on existing DOE applications and systems. New tools and techniques were developed which, in turn, were used on important DOE applications on DOE LCF systems to show significant performance improvements.

  20. Vehicular traffic noise prediction using soft computing approach.

    PubMed

    Singh, Daljeet; Nigam, S P; Agrawal, V P; Kumar, Maneek

    2016-12-01

    A new approach for the development of vehicular traffic noise prediction models is presented. Four different soft computing methods, namely Generalized Linear Models, Decision Trees, Random Forests, and Neural Networks, have been used to develop models to predict the hourly equivalent continuous sound pressure level, Leq, at different locations in Patiala city in India. The input variables include the traffic volume per hour, the percentage of heavy vehicles, and the average speed of vehicles. The performance of the four models is compared on the basis of the coefficient of determination, mean square error, and accuracy. 10-fold cross-validation is performed to check the stability of the Random Forest model, which gave the best results. A t-test is performed to check the fit of the model to the field data.
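    A hedged sketch of the best-performing setup described above: a random forest predicting hourly Leq from traffic volume, heavy-vehicle share, and average speed, checked with 10-fold cross-validation. The data and the Leq surrogate are synthetic, not the Patiala measurements.

```python
# Random forest with 10-fold cross-validation on synthetic traffic-noise data.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(6)
n = 500
volume = rng.uniform(200, 3000, n)       # vehicles per hour
heavy = rng.uniform(0, 30, n)            # % heavy vehicles
speed = rng.uniform(20, 70, n)           # km/h
X = np.column_stack([volume, heavy, speed])
# Toy Leq surrogate loosely shaped like a traffic-noise relation (dB)
y = 50 + 10 * np.log10(volume) + 0.1 * heavy + 0.05 * speed + rng.normal(0, 1, n)

model = RandomForestRegressor(n_estimators=200, random_state=0)
scores = cross_val_score(model, X, y, cv=10, scoring="r2")
print(f"10-fold R^2: {scores.mean():.3f} +/- {scores.std():.3f}")
```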

  1. Thermo-Mechanical and Electrochemistry Modeling of Planar SOFC Stacks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khaleel, Mohammad A.; Recknagle, Kurtis P.; Lin, Zijing

    2002-12-01

    Modeling activities at PNNL support design and development of modular SOFC systems. The SOFC stack modeling capability at PNNL has developed to a level at which planar stack designs can be compared and optimized for startup performance. Thermal-fluids and stress modeling is being performed to predict the transient temperature distribution and to determine the thermal stresses based on that distribution. Current efforts also include the development of a model for calculating current density, cell voltage, and heat production in SOFC stacks with hydrogen or other fuels. The model includes the heat generation from both Joule heating and chemical reactions. It also accounts for species production and destruction via mass balance. The model is being linked to the finite element code MARC to allow for the evaluation of temperatures and stresses during steady-state operation.

  2. Performance and Weight Estimates for an Advanced Open Rotor Engine

    NASA Technical Reports Server (NTRS)

    Hendricks, Eric S.; Tong, Michael T.

    2012-01-01

    NASA's Environmentally Responsible Aviation Project and Subsonic Fixed Wing Project are focused on developing concepts and technologies which may enable dramatic reductions in the environmental impact of future generations of subsonic aircraft. The open rotor concept (also historically referred to as an unducted fan or advanced turboprop) may allow for the achievement of this objective by reducing engine fuel consumption. To evaluate the potential impact of open rotor engines, cycle modeling and engine weight estimation capabilities have been developed. The initial development of the cycle modeling capabilities in the Numerical Propulsion System Simulation (NPSS) tool was presented in a previous paper. Following that initial development, further advancements have been made to the cycle modeling and weight estimation capabilities for open rotor engines and are presented in this paper. The developed modeling capabilities are used to predict the performance of an advanced open rotor concept using modern counter-rotating propeller designs. Finally, performance and weight estimates for this engine are presented and compared to results from a previous NASA study of advanced geared and direct-drive turbofans.

  3. Physiologically Based Absorption Modeling to Design Extended-Release Clinical Products for an Ester Prodrug.

    PubMed

    Ding, Xuan; Day, Jeffrey S; Sperry, David C

    2016-11-01

    Absorption modeling has demonstrated its great value in modern drug product development due to its utility in understanding and predicting in vivo performance. In this case, we integrated physiologically based modeling in the development processes to effectively design extended-release (ER) clinical products for an ester prodrug LY545694. By simulating the trial results of immediate-release products, we delineated complex pharmacokinetics due to prodrug conversion and established an absorption model to describe the clinical observations. This model suggested the prodrug has optimal biopharmaceutical properties to warrant developing an ER product. Subsequently, we incorporated release profiles of prototype ER tablets into the absorption model to simulate the in vivo performance of these products observed in an exploratory trial. The models suggested that the absorption of these ER tablets was lower than the IR products because the extended release from the formulations prevented the drug from taking advantage of the optimal absorption window. Using these models, we formed a strategy to optimize the ER product to minimize the impact of the absorption window limitation. Accurate prediction of the performance of these optimized products by modeling was confirmed in a third clinical trial.

  4. All-in-one model for designing optimal water distribution pipe networks

    NASA Astrophysics Data System (ADS)

    Aklog, Dagnachew; Hosoi, Yoshihiko

    2017-05-01

    This paper discusses the development of an easy-to-use, all-in-one model for designing optimal water distribution networks. The model combines different optimization techniques into a single package in which a user can easily choose what optimizer to use and compare the results of different optimizers to gain confidence in the performances of the models. At present, three optimization techniques are included in the model: linear programming (LP), genetic algorithm (GA) and a heuristic one-by-one reduction method (OBORM) that was previously developed by the authors. The optimizers were tested on a number of benchmark problems and performed very well in terms of finding optimal or near-optimal solutions with a reasonable computation effort. The results indicate that the model effectively addresses the issues of complexity and limited performance trust associated with previous models and can thus be used for practical purposes.

  5. Visualization and Analysis of Climate Simulation Performance Data

    NASA Astrophysics Data System (ADS)

    Röber, Niklas; Adamidis, Panagiotis; Behrens, Jörg

    2015-04-01

    Visualization is the key process of transforming abstract (scientific) data into a graphical representation to aid the understanding of the information hidden within the data. Climate simulation data sets are typically quite large, time varying, and consist of many different variables sampled on an underlying grid. A large variety of climate models - and sub-models - exist to simulate various aspects of the climate system. Generally, one is mainly interested in the physical variables produced by the simulation runs, but model developers are also interested in performance data measured along with these simulations. Climate simulation models are carefully developed complex software systems, designed to run in parallel on large HPC systems. An important goal is to utilize the hardware as efficiently as possible, that is, to distribute the workload as evenly as possible among the individual components. This is a very challenging task, and detailed performance data, such as timings and cache misses, have to be used to locate and understand performance problems in order to optimize the model implementation. Furthermore, the correlation of performance data to the processes of the application and the sub-domains of the decomposed underlying grid is vital when addressing communication and load imbalance issues. High resolution climate simulations are carried out on tens to hundreds of thousands of cores, thus yielding a vast amount of profiling data, which cannot be analyzed without appropriate visualization techniques. This PICO presentation displays and discusses the ICON simulation model, which is jointly developed by the Max Planck Institute for Meteorology and the German Weather Service, in partnership with DKRZ. The visualization and analysis of the model's performance data allow us to optimize and fine-tune the model, as well as to understand its execution on the HPC system. We show and discuss our workflow, and present new ideas and solutions that greatly aided our understanding. The software employed is based on Avizo Green, ParaView, and SimVis, as well as our own software extensions.

  6. Microworlds of the dynamic balanced scorecard for university (DBSC-UNI)

    NASA Astrophysics Data System (ADS)

    Hawari, Nurul Nazihah; Tahar, Razman Mat

    2015-12-01

    This research focuses on the development of a Microworld of the dynamic balanced scorecard for university (DBSC-UNI) in order to enhance the university strategic planning process. To develop the model, we integrated the balanced scorecard method and the system dynamics modelling method. In contrast to traditional university planning tools, the developed model addresses university management problems holistically and dynamically. Using the system dynamics modelling method, the cause-and-effect relationships among variables related to the four conventional balanced scorecard perspectives are better understood, as are the dynamic processes that give rise to differences between targeted and actual performance. It is therefore expected that the quality of the decisions taken improves because they are better informed. The developed Microworld can be exploited by university management to design policies that positively influence the future in the direction of desired goals with minimal side effects. This paper integrates the balanced scorecard and system dynamics modelling methods in analyzing university performance, and thereby demonstrates the effectiveness and strength of the system dynamics modelling method in solving strategic planning problems, particularly in the higher education sector.

  7. Use of third-party aircraft performance tools in the development of the Aviation Environmental Design Tool (AEDT).

    DOT National Transportation Integrated Search

    2011-07-01

    This report documents work done to enhance terminal area aircraft performance modeling in the Federal : Aviation Administration's Aviation Environmental Design Tool. A commercially available aircraft : performance software tool was used to develop da...

  8. Simulation and performance of brushless dc motor actuators

    NASA Astrophysics Data System (ADS)

    Gerba, A., Jr.

    1985-12-01

    The simulation model for a brushless DC motor and the associated commutation power conditioner transistor model are presented. The necessary conditions for maximum power output while operating at steady-state speed with sinusoidally distributed air-gap flux are developed. Comparisons of the simulated model with the measured performance of a typical motor are made both on time-response waveforms and on average performance characteristics. These preliminary results indicate good agreement. Plans for model improvement and for testing of a motor-driven positioning device for model evaluation are outlined.

  9. A performance improvement case study in aircraft maintenance and its implications for hazard identification.

    PubMed

    Ward, Marie; McDonald, Nick; Morrison, Rabea; Gaynor, Des; Nugent, Tony

    2010-02-01

    Aircraft maintenance is a highly regulated, safety critical, complex and competitive industry. There is a need to develop innovative solutions to address process efficiency without compromising safety and quality. This paper presents the case that in order to improve a highly complex system such as aircraft maintenance, it is necessary to develop a comprehensive and ecologically valid model of the operational system, which represents not just what is meant to happen, but what normally happens. This model then provides the backdrop against which to change or improve the system. A performance report, the Blocker Report, specific to aircraft maintenance and related to the model was developed gathering data on anything that 'blocks' task or check performance. A Blocker Resolution Process was designed to resolve blockers and improve the current check system. Significant results were obtained for the company in the first trial and implications for safety management systems and hazard identification are discussed. Statement of Relevance: Aircraft maintenance is a safety critical, complex, competitive industry with a need to develop innovative solutions to address process and safety efficiency. This research addresses this through the development of a comprehensive and ecologically valid model of the system linked with a performance reporting and resolution system.

  10. A fuel-efficient cruise performance model for general aviation piston engine airplanes. Ph.D. Thesis. Final Report

    NASA Technical Reports Server (NTRS)

    Parkinson, R. C. H.

    1983-01-01

    A fuel-efficient cruise performance model which facilitates maximizing the specific range of General Aviation airplanes powered by spark-ignition piston engines and propellers is presented. Airplanes of fixed design only are considered. The uses and limitations of typical Pilot Operating Handbook cruise performance data for constructing cruise performance models suitable for maximizing specific range are first examined. These data are found to be inadequate for constructing such models. A new model of General Aviation piston-prop airplane cruise performance is then developed. This model consists of two subsystem models: the airframe-propeller-atmosphere subsystem model and the engine-atmosphere subsystem model. The new model facilitates maximizing specific range and, by virtue of its simplicity and low-volume data storage requirements, appears suitable for airborne microprocessor implementation.
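    The core idea, choosing the cruise speed that maximizes specific range SR(V) = V / (fuel flow at V), can be sketched with a generic parabolic-drag piston-prop model; this is not the paper's model, and every parameter value below is invented for illustration.

```python
# Illustrative specific-range maximization for a toy piston-prop model.
import numpy as np

rho, S, W = 1.0, 16.0, 10_000.0   # air density (kg/m^3), wing area (m^2), weight (N)
CD0, k = 0.025, 0.055             # parabolic drag polar: CD = CD0 + k * CL^2
sfc, eta = 8.0e-8, 0.8            # specific fuel consumption (kg/W/s), prop efficiency

V = np.linspace(40, 100, 600)     # candidate true airspeeds (m/s)
CL = 2 * W / (rho * V**2 * S)     # lift coefficient for level flight
drag = 0.5 * rho * V**2 * S * (CD0 + k * CL**2)
fuel_flow = sfc * drag * V / eta  # kg/s (power required divided by prop efficiency)
SR = V / fuel_flow                # meters flown per kg of fuel

best = np.argmax(SR)
print(f"best-range speed ~ {V[best]:.1f} m/s, SR ~ {SR[best] / 1000:.0f} km/kg")
```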

  11. Predicting stillbirth in a low resource setting.

    PubMed

    Kayode, Gbenga A; Grobbee, Diederick E; Amoakoh-Coleman, Mary; Adeleke, Ibrahim Taiwo; Ansah, Evelyn; de Groot, Joris A H; Klipstein-Grobusch, Kerstin

    2016-09-20

    Stillbirth is a major contributor to perinatal mortality and it is particularly common in low- and middle-income countries, where annually about three million stillbirths occur in the third trimester. This study aims to develop a prediction model for early detection of pregnancies at high risk of stillbirth. This retrospective cohort study examined 6,573 pregnant women who delivered at Federal Medical Centre Bida, a tertiary level of healthcare in Nigeria from January 2010 to December 2013. Descriptive statistics were performed and missing data imputed. Multivariable logistic regression was applied to examine the associations between selected candidate predictors and stillbirth. Discrimination and calibration were used to assess the model's performance. The prediction model was validated internally and over-optimism was corrected. We developed a prediction model for stillbirth that comprised maternal comorbidity, place of residence, maternal occupation, parity, bleeding in pregnancy, and fetal presentation. As a secondary analysis, we extended the model by including fetal growth rate as a predictor, to examine how beneficial ultrasound parameters would be for the predictive performance of the model. After internal validation, both calibration and discriminative performance of both the basic and extended model were excellent (i.e. C-statistic basic model = 0.80 (95 % CI 0.78-0.83) and extended model = 0.82 (95 % CI 0.80-0.83)). We developed a simple but informative prediction model for early detection of pregnancies with a high risk of stillbirth for early intervention in a low resource setting. Future research should focus on external validation of the performance of this promising model.

  12. The use of neural network technology to model swimming performance.

    PubMed

    Silva, António José; Costa, Aldo Manuel; Oliveira, Paulo Moura; Reis, Victor Machado; Saavedra, José; Perl, Jurgen; Rouboa, Abel; Marinho, Daniel Almeida

    2007-01-01

    The aims of this study were to identify the factors which are able to explain performance in the 200 m individual medley and 400 m front crawl events in young swimmers, to model the performance in those events using non-linear mathematical methods through artificial neural networks (multi-layer perceptrons), and to assess the precision of the neural network models in predicting performance. A sample of 138 young swimmers (65 males and 73 females) of national level was submitted to a test battery comprising four different domains: kinanthropometric evaluation, dry-land functional evaluation (strength and flexibility), swimming functional evaluation (hydrodynamic, hydrostatic, and bioenergetic characteristics), and swimming technique evaluation. To establish a profile of the young swimmer, non-linear combinations between preponderant variables for each gender and swim performance in the 200 m medley and 400 m front crawl events were developed. For this purpose a feed-forward neural network (multilayer perceptron) with three neurons in a single hidden layer was used. The prognostic precision of the model (error lower than 0.8% between true and estimated performances) is supported by recent evidence. Therefore, we consider that the neural network tool can be a good approach to the resolution of complex problems such as performance modeling and talent identification in swimming and, possibly, in a wide variety of sports. Key points:
    - The non-linear analysis resulting from the use of a feed-forward neural network allowed the development of four performance models.
    - The mean difference between the true and estimated results produced by each of the four neural network models was low.
    - The neural network tool can be a good approach to performance modeling, as an alternative to standard statistical models that presume well-defined distributions and independence among all inputs.
    - The use of neural networks in sports science allowed the creation of very realistic models for swimming performance prediction based on previously selected criteria related to the dependent variable (performance).

  13. Phosphoric acid fuel cell power plant system performance model and computer program

    NASA Technical Reports Server (NTRS)

    Alkasab, K. A.; Lu, C. Y.

    1984-01-01

    A FORTRAN computer program was developed for analyzing the performance of phosphoric acid fuel cell power plant systems. Energy, mass, and electrochemical analyses of the reformer, the shift converters, the heat exchangers, and the fuel cell stack were combined to develop a mathematical model of the power plant for both atmospheric and pressurized conditions and for several commercial fuels.

  14. NASA Software Cost Estimation Model: An Analogy Based Estimation Model

    NASA Technical Reports Server (NTRS)

    Hihn, Jairus; Juster, Leora; Menzies, Tim; Mathew, George; Johnson, James

    2015-01-01

    The cost estimation of software development activities is increasingly critical for large-scale integrated projects such as those at DOD and NASA, especially as software systems become larger and more complex. As an example, MSL (Mars Science Laboratory), developed at the Jet Propulsion Laboratory, launched with over 2 million lines of code, making it the largest robotic spacecraft ever flown in terms of software size. Software development activities are also notorious for their cost growth, with NASA flight software averaging over 50% cost growth. All across the agency, estimators and analysts are increasingly being tasked to develop reliable cost estimates in support of program planning and execution. While there has been extensive work on improving parametric methods, there is very little focus on models based on analogy and clustering algorithms. In this paper we summarize our findings on effort/cost model estimation and model development, based on ten years of software effort estimation research using data mining and machine learning methods to develop estimation models based on analogy and clustering. The NASA Software Cost Model's performance is evaluated by comparing it to the performance of COCOMO II, linear regression, and k-nearest neighbor prediction models on the same data set.
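    A hedged sketch of analogy-based estimation in the spirit described above: predict a project's effort from its k nearest neighbors in a scaled feature space. The features, data, and error measure below are illustrative, not the NASA data set.

```python
# k-nearest-neighbor (analogy-based) effort estimation on synthetic projects.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(7)
n = 120
# Assumed features: size (KLOC), team experience (yrs), requirements volatility
X = np.column_stack([
    rng.lognormal(3.0, 0.8, n),
    rng.uniform(1, 15, n),
    rng.uniform(0, 1, n),
])
effort = 3.0 * X[:, 0] ** 1.1 * (1 + 0.5 * X[:, 2]) / np.sqrt(X[:, 1])

# Scale features so no single feature dominates the distance metric
scaler = StandardScaler().fit(X[:100])
knn = KNeighborsRegressor(n_neighbors=3).fit(scaler.transform(X[:100]), effort[:100])
pred = knn.predict(scaler.transform(X[100:]))
mre = np.abs(pred - effort[100:]) / effort[100:]
print(f"mean magnitude of relative error: {mre.mean():.2f}")
```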

  15. Study of Adaptive Mathematical Models for Deriving Automated Pilot Performance Measurement Techniques. Volume I. Model Development.

    ERIC Educational Resources Information Center

    Connelly, Edward A.; And Others

    A new approach to deriving human performance measures and criteria for use in automatically evaluating trainee performance is documented in this report. The ultimate application of the research is to provide methods for automatically measuring pilot performance in a flight simulator or from recorded in-flight data. An efficient method of…

  16. Analytical Performance Modeling and Validation of Intel’s Xeon Phi Architecture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chunduri, Sudheer; Balaprakash, Prasanna; Morozov, Vitali

    Modeling the performance of scientific applications on emerging hardware plays a central role in achieving extreme-scale computing goals. Analytical models that capture the interaction between applications and hardware characteristics are attractive because even a reasonably accurate model can be useful for performance tuning before the hardware is made available. In this paper, we develop a hardware model for Intel's second-generation Xeon Phi architecture, code-named Knights Landing (KNL), for the SKOPE framework. We validate the KNL hardware model by projecting the performance of mini-benchmarks and application kernels. The results show that our KNL model can project performance with prediction errors of 10% to 20%. The hardware model also provides informative recommendations for code transformations and tuning.

  17. Multitasking TORT under UNICOS: Parallel performance models and measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barnett, A.; Azmy, Y.Y.

    1999-09-27

    The existing parallel algorithms in the TORT discrete ordinates code were updated to function in a UNICOS environment. A performance model for the parallel overhead was derived for the existing algorithms. The largest contributors to the parallel overhead were identified and a new algorithm was developed, for which a parallel overhead model was also derived. The performance models were compared to measurements from applications of the code to two TORT standard test problems and a large production problem. The parallel performance models agree well with the measured parallel overhead.

  19. Applications of psychophysical models to the study of auditory development

    NASA Astrophysics Data System (ADS)

    Werner, Lynne

    2003-04-01

    Psychophysical models of listening, such as the energy detector model, have provided a framework from which to characterize the function of the mature auditory system and to explore how mature listeners make use of auditory information in sound identification. The application of such models to the study of auditory development has similarly provided insight into the characteristics of infant hearing and listening. Infants' intensity, frequency, temporal, and spatial resolution have been described at least grossly, and some contributions of immature listening strategies to infant hearing have been identified. Infants' psychoacoustic performance is typically poorer than adults' under identical stimulus conditions. However, the infant's performance typically varies with stimulus condition in a way that is qualitatively similar to the adult's performance. In some cases, though, infants perform in a qualitatively different way from adults in psychoacoustic experiments. Further, recent psychoacoustic studies of children suggest that the classic models of listening may be inadequate to describe the children's performance. The characteristics of a model that might be appropriate for the immature listener will be outlined and the implications for models of mature listening will be discussed. [Work supported by NIH grants DC00396 and DC04661.]

  20. A comparison of performance of several artificial intelligence methods for forecasting monthly discharge time series

    NASA Astrophysics Data System (ADS)

    Wang, Wen-Chuan; Chau, Kwok-Wing; Cheng, Chun-Tian; Qiu, Lin

    2009-08-01

    Developing a hydrological forecasting model based on past records is crucial to effective hydropower reservoir management and scheduling. Traditionally, time series analysis and modeling is used to build mathematical models for generating hydrologic records in hydrology and water resources. Artificial intelligence (AI), as a branch of computer science, is capable of analyzing long-series and large-scale hydrological data. In recent years, applying AI techniques to hydrological forecasting has become a prominent research topic. In this paper, autoregressive moving-average (ARMA) models, artificial neural network (ANN) approaches, adaptive neural-based fuzzy inference system (ANFIS) techniques, genetic programming (GP) models, and the support vector machine (SVM) method are examined using long-term observations of monthly river flow discharges. Four standard quantitative statistical performance evaluation measures, the coefficient of correlation (R), the Nash-Sutcliffe efficiency coefficient (E), the root mean squared error (RMSE), and the mean absolute percentage error (MAPE), are employed to evaluate the performances of the various models developed. Two case study river sites are also provided to illustrate their respective performances. The results indicate that the best performance can be obtained by ANFIS, GP and SVM, in terms of different evaluation criteria, during the training and validation phases.
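
    The four evaluation measures named above are straightforward to compute; a small sketch with illustrative observed and simulated discharge values follows.

    ```python
    # Sketch of the four evaluation measures above; discharge values illustrative.
    import numpy as np

    def evaluate(obs, sim):
        obs, sim = np.asarray(obs, float), np.asarray(sim, float)
        r = np.corrcoef(obs, sim)[0, 1]                                     # correlation R
        e = 1 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)  # Nash-Sutcliffe E
        rmse = np.sqrt(np.mean((obs - sim) ** 2))
        mape = 100 * np.mean(np.abs((obs - sim) / obs))
        return {"R": r, "E": e, "RMSE": rmse, "MAPE": mape}

    obs = [120, 95, 180, 210, 160, 140]   # observed monthly discharge (m^3/s)
    sim = [110, 100, 170, 220, 150, 150]  # model output for the same months
    print(evaluate(obs, sim))
    ```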

  1. Development and verification of a model for estimating the screening utility in the detection of PCBs in transformer oil.

    PubMed

    Terakado, Shingo; Glass, Thomas R; Sasaki, Kazuhiro; Ohmura, Naoya

    2014-01-01

    A simple new model for estimating the screening performance (false positive and false negative rates) of a given test for a specific sample population is presented. The model is shown to give good results on a test population, and is used to estimate the performance on a sampled population. Using the model developed in conjunction with regulatory requirements and the relative costs of the confirmatory and screening tests allows evaluation of the screening test's utility in terms of cost savings. Testers can use the methods developed to estimate the utility of a screening program using available screening tests with their own sample populations.
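
    A minimal sketch of the screening-utility idea follows: given assumed false positive and false negative rates, it compares the expected per-sample cost of confirming every sample against screening first and confirming only flagged samples. All rates, costs, and the function name are illustrative, not the published model's parameters.

    ```python
    # Hedged sketch of screening utility as expected cost per sample;
    # all rates and costs below are illustrative assumptions.
    def expected_cost(prevalence, fp_rate, fn_rate, screen_cost, confirm_cost):
        # Fraction flagged by screening: detected positives plus false positives.
        flagged = prevalence * (1 - fn_rate) + (1 - prevalence) * fp_rate
        # Every sample is screened; only flagged samples get the confirmatory test.
        return screen_cost + flagged * confirm_cost

    all_confirm = 200.0  # baseline: confirmatory test on every sample
    with_screen = expected_cost(prevalence=0.05, fp_rate=0.10, fn_rate=0.02,
                                screen_cost=15.0, confirm_cost=200.0)
    print(f"Per-sample cost: {all_confirm:.2f} (confirm all) vs {with_screen:.2f} (screen first)")
    ```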

  2. Determination of photovoltaic concentrator optical design specifications using performance modeling

    NASA Astrophysics Data System (ADS)

    Kerschen, Kevin A.; Levy, Sheldon L.

    The strategy used to develop an optical design specification for a 500X concentration photovoltaic module to be used with a 28-percent-efficient concentrator photovoltaic cell is reported. The computer modeling code (PVOPTICS) developed for this purpose, a Fresnel lens design strategy, and optical component specification procedures are described. Comparisons are made between the predicted performance and the measured performance of components fabricated to those specifications. An acrylic lens and a reflective secondary optical element have been tested, showing efficiencies exceeding 88 percent.

  3. NASA's Cryogenic Fluid Management Technology Project

    NASA Technical Reports Server (NTRS)

    Tramel, Terri L.; Motil, Susan M.

    2008-01-01

    The Cryogenic Fluid Management (CFM) Project's primary objective is to develop storage, transfer, and handling technologies for cryogens that enable high-performance cryogenic propulsion systems, lunar surface systems, and economical ground operations. Such technologies can significantly reduce propellant launch mass and required on-orbit margins, reduce or even eliminate propellant tank fluid boil-off losses for long-term missions, and simplify vehicle operations. This paper will present the status of the specific technologies that the CFM Project is developing. The two main areas of concentration are analysis model development and CFM hardware development. The project develops analysis tools and models based on thermodynamics, hydrodynamics, and existing flight/test data. These tools assist in the development of pressure/thermal control devices (such as the Thermodynamic Vent System (TVS) and multi-layer insulation), with the ultimate goal being to develop a mature set of tools and models that can characterize the performance of the pressure/thermal control devices incorporated in the design of an entire CFM system with minimal cryogen loss. The project performs hardware development and testing to verify our understanding of the physical principles involved, and to validate the performance of CFM components, subsystems, and systems. This database provides information to anchor our analytical models. This paper describes some of the current activities of NASA's Cryogenic Fluid Management Project.

  4. Combining Simulation and Optimization Models for Hardwood Lumber Production

    Treesearch

    G.A. Mendoza; R.J. Meimban; W.G. Luppold; Philip A. Araman

    1991-01-01

    Published literature contains a number of optimization and simulation models dealing with the primary processing of hardwood and softwood logs. Simulation models have been developed primarily as descriptive models for characterizing the general operations and performance of a sawmill. Optimization models, on the other hand, were developed mainly as analytical tools for...

  5. The model for Fundamentals of Endovascular Surgery (FEVS) successfully defines the competent endovascular surgeon.

    PubMed

    Duran, Cassidy; Estrada, Sean; O'Malley, Marcia; Sheahan, Malachi G; Shames, Murray L; Lee, Jason T; Bismuth, Jean

    2015-12-01

    Fundamental skills testing is now required for certification in general surgery. No model for assessing fundamental endovascular skills exists. Our objective was to develop a model that tests the fundamental endovascular skills and differentiates competent from noncompetent performance. The Fundamentals of Endovascular Surgery model was developed in silicone and virtual-reality versions. Twenty individuals (with a range of experience) performed four tasks on each model in three separate sessions. Tasks on the silicone model were performed under fluoroscopic guidance, and electromagnetic tracking captured motion metrics for catheter tip position. Image processing captured tool tip position and motion on the virtual model. Performance was evaluated using a global rating scale, blinded video assessment of error metrics, and catheter tip movement and position. Motion analysis was based on derivations of speed and position that define proficiency of movement (spectral arc length, duration of submovement, and number of submovements). Performance was significantly different between competent and noncompetent interventionalists for the three performance measures of motion metrics, error metrics, and global rating scale. The mean error metric score was 6.83 for noncompetent individuals and 2.51 for the competent group (P < .0001). Median global rating scores were 2.25 for the noncompetent group and 4.75 for the competent users (P < .0001). The Fundamentals of Endovascular Surgery model successfully differentiates competent and noncompetent performance of fundamental endovascular skills based on a series of objective performance measures. This model could serve as a platform for skills testing for all trainees. Copyright © 2015 Society for Vascular Surgery. Published by Elsevier Inc. All rights reserved.
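
    One of the motion metrics named above, spectral arc length, quantifies movement smoothness from the frequency content of the speed profile; below is a simplified sketch of that computation (published spectral arc length formulations differ in details such as cutoff handling), applied to two synthetic speed profiles.

    ```python
    # Simplified sketch of the spectral arc length smoothness metric;
    # cutoff handling is simplified relative to published formulations.
    import numpy as np

    def spectral_arc_length(speed, fs, f_max=10.0):
        """More negative values indicate less smooth (more corrective) movement."""
        n = 4 * 2 ** int(np.ceil(np.log2(len(speed))))  # zero-padded FFT length
        freqs = np.fft.rfftfreq(n, d=1.0 / fs)
        mag = np.abs(np.fft.rfft(speed, n))
        mag = mag / mag.max()                 # normalized magnitude spectrum
        sel = freqs <= f_max                  # restrict to the analysis band
        f, m = freqs[sel] / f_max, mag[sel]
        return -np.sum(np.sqrt(np.diff(f) ** 2 + np.diff(m) ** 2))

    t = np.linspace(0, 2, 200)
    smooth = np.sin(np.pi * t / 2)                 # smooth speed profile
    jerky = smooth + 0.1 * np.sin(20 * np.pi * t)  # same profile with submovements
    print(spectral_arc_length(smooth, fs=100), spectral_arc_length(jerky, fs=100))
    ```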

  6. Utilizing soil polypedons to improve model performance for digital soil mapping

    USDA-ARS?s Scientific Manuscript database

    Most digital soil mapping approaches that use point data to develop relationships with covariate data intersect sample locations with one raster pixel regardless of pixel size. Resulting models are subject to spurious values in covariate data which may limit model performance. An alternative approac...

  7. A Generative Approach to the Development of Hidden-Figure Items.

    ERIC Educational Resources Information Center

    Bejar, Issac I.; Yocom, Peter

    This report explores an approach to item development and psychometric modeling which explicitly incorporates knowledge about the mental models used by examinees in the solution of items into a psychometric model that characterizes performance on a test, as well as incorporating that knowledge into the item development process. The paper focuses on…

  8. Theoretical Model of Development of Information Competence among Students Enrolled in Elective Courses

    ERIC Educational Resources Information Center

    Zhumasheva, Anara; Zhumabaeva, Zaida; Sakenov, Janat; Vedilina, Yelena; Zhaxylykova, Nuriya; Sekenova, Balkumis

    2016-01-01

    The current study focuses on the research topic of creating a theoretical model of development of information competence among students enrolled in elective courses. In order to examine specific features of the theoretical model of development of information competence among students enrolled in elective courses, we performed an analysis of…

  9. Prediction Models for 30-Day Mortality and Complications After Total Knee and Hip Arthroplasties for Veteran Health Administration Patients With Osteoarthritis.

    PubMed

    Harris, Alex Hs; Kuo, Alfred C; Bowe, Thomas; Gupta, Shalini; Nordin, David; Giori, Nicholas J

    2018-05-01

    Statistical models to preoperatively predict patients' risk of death and major complications after total joint arthroplasty (TJA) could improve the quality of preoperative management and informed consent. Although risk models for TJA exist, they have limitations including poor transparency and/or unknown or poor performance. Thus, it is currently impossible to know how well currently available models predict short-term complications after TJA, or whether newly developed models are more accurate. We sought to develop and conduct cross-validation of predictive risk models, and to report details and performance metrics as benchmarks. Over 90 preoperative variables were used as candidate predictors of death and major complications within 30 days for Veterans Health Administration patients with osteoarthritis who underwent TJA. Data were split into 3 samples: one for selection of model tuning parameters, one for model development, and one for cross-validation. C-indexes (discrimination) and calibration plots were produced. A total of 70,569 patients diagnosed with osteoarthritis who received primary TJA were included. C-statistics and bootstrapped confidence intervals for the cross-validation of the boosted regression models were highest for cardiac complications (0.75; 0.71-0.79) and 30-day mortality (0.73; 0.66-0.79) and lowest for deep vein thrombosis (0.59; 0.55-0.64) and return to the operating room (0.60; 0.57-0.63). Moderately accurate predictive models of 30-day mortality and cardiac complications after TJA in Veterans Health Administration patients were developed and internally cross-validated. By reporting model coefficients and performance metrics, other model developers can test these models on new samples and have a procedure- and indication-specific benchmark to surpass. Published by Elsevier Inc.
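
    For binary 30-day outcomes like those above, the C-index (discrimination) reduces to the ROC AUC: the probability that a randomly chosen event case receives a higher predicted risk than a non-case. A small sketch with synthetic risks illustrates the computation.

    ```python
    # Sketch of C-index computation for a binary outcome (equals the ROC AUC);
    # the risks and outcomes below are synthetic, not the study data.
    import numpy as np
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)
    risk = rng.uniform(size=1000)                   # illustrative predicted risks
    outcome = rng.binomial(1, 0.02 + 0.05 * risk)   # rare events, risk-dependent
    print(f"C-index: {roc_auc_score(outcome, risk):.2f}")
    ```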

  10. Validation of a Monte Carlo simulation of the Philips Allegro/GEMINI PET systems using GATE

    NASA Astrophysics Data System (ADS)

    Lamare, F.; Turzo, A.; Bizais, Y.; Cheze LeRest, C.; Visvikis, D.

    2006-02-01

    A newly developed simulation toolkit, GATE (Geant4 Application for Tomographic Emission), was used to develop a Monte Carlo simulation of a fully three-dimensional (3D) clinical PET scanner. The Philips Allegro/GEMINI PET systems were simulated in order to (a) allow a detailed study of the parameters affecting the system's performance under various imaging conditions, (b) study the optimization and quantitative accuracy of emission acquisition protocols for dynamic and static imaging, and (c) further validate the potential of GATE for the simulation of clinical PET systems. A model of the detection system and its geometry was developed. The accuracy of the developed detection model was tested through the comparison of simulated and measured results obtained with the Allegro/GEMINI systems for a number of NEMA NU2-2001 performance protocols including spatial resolution, sensitivity and scatter fraction. In addition, an approximate model of the system's dead time at the level of detected single events and coincidences was developed in an attempt to simulate the count rate related performance characteristics of the scanner. The developed dead-time model was assessed under different imaging conditions using the count rate loss and noise equivalent count rates performance protocols of standard and modified NEMA NU2-2001 (whole body imaging conditions) and NEMA NU2-1994 (brain imaging conditions) comparing simulated with experimental measurements obtained with the Allegro/GEMINI PET systems. Finally, a reconstructed image quality protocol was used to assess the overall performance of the developed model. An agreement of <3% was obtained in scatter fraction, with a difference between 4% and 10% in the true and random coincidence count rates respectively, throughout a range of activity concentrations and under various imaging conditions, resulting in <8% differences between simulated and measured noise equivalent count rates performance. Finally, the image quality validation study revealed a good agreement in signal-to-noise ratio and contrast recovery coefficients for a number of different volume spheres and two different (clinical level based) tumour-to-background ratios. In conclusion, these results support the accurate modelling of the Philips Allegro/GEMINI PET systems using GATE in combination with a dead-time model for the signal flow description, which leads to an agreement of <10% in coincidence count rates under different imaging conditions and clinically relevant activity concentration levels.

  11. Performance Improvement [in HRD].

    ERIC Educational Resources Information Center

    1995

    These four papers are from a symposium that was facilitated by Richard J. Torraco at the 1995 conference of the Academy of Human Resource Development (HRD). "Performance Technology--Isn't It Time We Found Some New Models?" (William J. Rothwell) reviews briefly two classic models, describes criteria for the high performance workplace…

  12. A Composite Model for Employees' Performance Appraisal and Improvement

    ERIC Educational Resources Information Center

    Manoharan, T. R.; Muralidharan, C.; Deshmukh, S. G.

    2012-01-01

    Purpose: The purpose of this paper is to develop an innovative method of performance appraisal that will be useful for designing a structured training programme. Design/methodology/approach: Employees' performance appraisals are conducted using new approaches, namely data envelopment analysis and an integrated fuzzy model. Interpretive structural…

  13. An Integrated Performance-Based Budgeting Model for Thai Higher Education

    ERIC Educational Resources Information Center

    Charoenkul, Nantarat; Siribanpitak, Pruet

    2012-01-01

    This research mainly aims to develop an administrative model of performance-based budgeting for autonomous state universities. The sample population in this study covers 4 representatives of autonomous state universities from 4 regions of Thailand, where the performance-based budgeting system has been fully practiced. The research informants…

  14. The Role of Citizenship Performance in Academic Achievement and Graduate Employability

    ERIC Educational Resources Information Center

    Poropat, Arthur E.

    2011-01-01

    Purpose: Employability is a major educational goal, but employability programmes emphasise skill development, while employers value performance. Education acts as a model for employment, so educational performance assessment should be aligned with employment models. Consequently, the aim of this paper is to examine the relationship between…

  15. Performance evaluation of automated manufacturing systems using generalized stochastic Petri Nets. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Al-Jaar, Robert Y.; Desrochers, Alan A.

    1989-01-01

    The main objective of this research is to develop a generic modeling methodology with a flexible and modular framework to aid in the design and performance evaluation of integrated manufacturing systems using a unified model. After a thorough examination of the available modeling methods, the Petri Net approach was adopted. The concurrent and asynchronous nature of manufacturing systems is easily captured by Petri Net models. Three basic modules were developed: machine, buffer, and Decision Making Unit. The machine and buffer modules are used for modeling transfer lines and production networks. The Decision Making Unit models the functions of a computer node in a complex Decision Making Unit Architecture. The underlying model is a Generalized Stochastic Petri Net (GSPN) that can be used for performance evaluation and structural analysis. GSPNs were chosen because they help manage the complexity of modeling large manufacturing systems. There is no need to enumerate all the possible states of the Markov Chain, since they are automatically generated from the GSPN model.

  16. Development of a Higher Fidelity Model for the Cascade Distillation Subsystem (CDS)

    NASA Technical Reports Server (NTRS)

    Perry, Bruce; Anderson, Molly

    2014-01-01

    Significant improvements have been made to the ACM model of the CDS, enabling accurate predictions of dynamic operations with fewer assumptions. The model has been utilized to predict how CDS performance would be impacted by changing operating parameters, revealing performance trade-offs and possibilities for improvement. CDS efficiency is driven by the THP coefficient of performance, which in turn is dependent on heat transfer within the system. Based on the remaining limitations of the simulation, priorities for further model development include: relaxing the assumption of total condensation; incorporating dynamic simulation capability for the buildup of dissolved inert gases in condensers; examining CDS operation with more complex feeds; and extending heat transfer analysis to all surfaces.

  17. A Multiple Agent Model of Human Performance in Automated Air Traffic Control and Flight Management Operations

    NASA Technical Reports Server (NTRS)

    Corker, Kevin; Pisanich, Gregory; Condon, Gregory W. (Technical Monitor)

    1995-01-01

    A predictive model of human operator performance (flight crew and air traffic control (ATC)) has been developed and applied in order to evaluate the impact of automation developments in flight management and air traffic control. The model is used to predict the performance of a two-person flight crew and the ATC operators generating and responding to clearances aided by the Center TRACON Automation System (CTAS). The purpose of the modeling is to support evaluation and design of automated aids for flight management and airspace management, and to predict required changes in procedure, both in the air and on the ground, in response to advancing automation in both domains. Additional information is contained in the original extended abstract.

  18. ILS Glide Slope Performance Prediction. Volume B

    DTIC Science & Technology

    1974-09-01

    To study the effects of terrain irregularities on the performance of ILS Glide Slope antenna systems, a mathematical electromagnetic scattering computer model has been developed for predicting the performance of ILS glide slope arrays in the presence of such irregularities.

  19. Visual performance modeling in the human operator simulator

    NASA Technical Reports Server (NTRS)

    Strieb, M. I.

    1979-01-01

    A brief description of the history of the development of the human operator simulator (HOS) model is presented. Features of the HOS micromodels that affect the collection of visual performance data are discussed, along with preliminary details of a HOS pilot model designed to predict the visual performance workload results obtained through oculometer studies of pilots in real and simulated approaches and landings.

  20. Modeling and Performance Considerations for Automated Fault Isolation in Complex Systems

    NASA Technical Reports Server (NTRS)

    Ferrell, Bob; Oostdyk, Rebecca

    2010-01-01

    The purpose of this paper is to document the modeling considerations and performance metrics that were examined in the development of a large-scale Fault Detection, Isolation and Recovery (FDIR) system. The FDIR system is envisioned to perform health management functions for both a launch vehicle and the ground systems that support the vehicle during checkout and launch countdown by using a suite of complementary software tools that alert operators to anomalies and failures in real time. The FDIR team members developed a set of operational requirements for the models that would be used for fault isolation and worked closely with the vendor of the software tools selected for fault isolation to ensure that the software was able to meet the requirements. Once the requirements were established, example models of sufficient complexity were used to test the performance of the software. The results of the performance testing demonstrated the need for enhancements to the software in order to meet the demands of the full-scale ground and vehicle FDIR system. The paper highlights the importance of the development of operational requirements and preliminary performance testing as a strategy for identifying deficiencies in highly scalable systems and rectifying those deficiencies before they imperil the success of the project.

  1. Modeling the seasonal circulation in Massachusetts Bay

    USGS Publications Warehouse

    Signell, Richard P.; Jenter, Harry L.; Blumberg, Alan F.; ,

    1994-01-01

    An 18 month simulation of circulation was conducted in Massachusetts Bay, a roughly 35 m deep, 100 × 50 km embayment on the northeastern shelf of the United States. Using a variant of the Blumberg-Mellor (1987) model, it was found that a continuous 18 month run was only possible if the velocity field was Shapiro filtered to remove two-grid-length energy that developed along the open boundary due to mismatch in locally generated and climatologically forced water properties. The seasonal development of temperature and salinity stratification was well represented by the model once σ-coordinate errors were reduced by subtracting domain-averaged vertical profiles of temperature, salinity and density before horizontal differencing was performed. Comparison of modeled and observed subtidal currents at fixed locations revealed that the model performance varies strongly with season and distance from the open boundaries. The model performs best during unstratified conditions and in the interior of the bay. The model performs poorest during stratified conditions and in the regions where the bay is driven predominantly by remote fluctuations from the Gulf of Maine.
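
    The Shapiro filtering mentioned above is designed to remove exactly this kind of two-grid-length noise. A sketch of the simplest (1-2-1) form, which zeroes the 2Δx wave at interior points while only slightly damping well-resolved signals, is shown below on a synthetic 1-D field.

    ```python
    # Sketch of the simplest (1-2-1) Shapiro filter; it exactly removes
    # two-grid-length (2*dx) noise at interior points of a 1-D field.
    import numpy as np

    def shapiro_filter(u):
        out = u.copy()
        out[1:-1] = u[1:-1] + 0.25 * (u[:-2] - 2.0 * u[1:-1] + u[2:])
        return out

    x = np.arange(20)
    field = np.sin(0.3 * x) + 0.5 * (-1) ** x   # smooth signal plus 2*dx noise
    residual = shapiro_filter(field) - np.sin(0.3 * x)
    print(np.round(residual, 3))                # interior noise largely removed
    ```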

  2. Review and verification of CARE 3 mathematical model and code

    NASA Technical Reports Server (NTRS)

    Rose, D. M.; Altschul, R. E.; Manke, J. W.; Nelson, D. L.

    1983-01-01

    The CARE-III mathematical model and code verification performed by Boeing Computer Services were documented. The mathematical model was verified for permanent and intermittent faults. The transient fault model was not addressed. The code verification was performed on CARE-III, Version 3. A CARE III Version 4, which corrects deficiencies identified in Version 3, is being developed.

  3. High-level PC-based laser system modeling

    NASA Astrophysics Data System (ADS)

    Taylor, Michael S.

    1991-05-01

    Since the inception of the Strategic Defense Initiative (SDI) there have been a multitude of comparison studies done in an attempt to evaluate the effectiveness and relative sizes of complementary, and sometimes competitive, laser weapon systems. It became more and more apparent that what the systems analyst needed was not only a fast but also a cost-effective way to perform high-level trade studies. In the present investigation, a general procedure is presented for the development of PC-based algorithmic system models for laser systems. This procedure points out all of the major issues that should be addressed in the design and development of such a model. Issues addressed include defining the problem to be modeled, defining a strategy for development, and finally, effective use of the model once developed. Being a general procedure, it allows a systems analyst to develop a model to meet specific needs. To illustrate this method of model development, a description of the Strategic Defense Simulation - Design To (SDS-DT) model developed and used by Science Applications International Corporation (SAIC) is presented. SDS-DT is a menu-driven, fast-executing, PC-based program that can be used either to calculate performance, weight, volume, and cost values for a particular design or, alternatively, to run parametrics on particular system parameters to perhaps optimize a design.

  4. Performance model for grid-connected photovoltaic inverters.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boyson, William Earl; Galbraith, Gary M.; King, David L.

    2007-09-01

    This document provides an empirically based performance model for grid-connected photovoltaic inverters, used for system performance (energy) modeling and for continuous monitoring of inverter performance during system operation. The versatility and accuracy of the model were validated for a variety of both residential and commercial size inverters. Default parameters for the model can be obtained from manufacturers' specification sheets, and the accuracy of the model can be further refined using either well-instrumented field measurements in operational systems or detailed measurements from a recognized testing laboratory. An initial database of inverter performance parameters was developed based on measurements conducted at Sandia National Laboratories and at laboratories supporting the solar programs of the California Energy Commission.
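
    A sketch of the general form of this empirical inverter model, AC power as a quadratic in DC power with voltage-dependent coefficients, is given below; the coefficient values are illustrative placeholders, not entries from the parameter database.

    ```python
    # Sketch of the empirical Sandia-style inverter model form; the coefficient
    # values in the example call are illustrative, not database parameters.
    def inverter_ac_power(p_dc, v_dc, pac0, pdc0, vdc0, ps0, c0, c1, c2, c3):
        a = pdc0 * (1 + c1 * (v_dc - vdc0))   # DC power at which AC output = pac0
        b = ps0 * (1 + c2 * (v_dc - vdc0))    # DC power needed to start inversion
        c = c0 * (1 + c3 * (v_dc - vdc0))     # curvature of the AC-vs-DC curve
        return (pac0 / (a - b) - c * (a - b)) * (p_dc - b) + c * (p_dc - b) ** 2

    # Example: ~4 kW DC into a nominal 5 kW inverter at its reference voltage.
    print(inverter_ac_power(p_dc=4000, v_dc=310, pac0=5000, pdc0=5200,
                            vdc0=310, ps0=30, c0=-1e-5, c1=0, c2=0, c3=0))
    ```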

  5. Stage-by-Stage and Parallel Flow Path Compressor Modeling for a Variable Cycle Engine

    NASA Technical Reports Server (NTRS)

    Kopasakis, George; Connolly, Joseph W.; Cheng, Larry

    2015-01-01

    This paper covers the development of stage-by-stage and parallel flow path compressor modeling approaches for a Variable Cycle Engine. The stage-by-stage compressor modeling approach is an extension of a technique for lumped volume dynamics and performance characteristic modeling. It was developed to improve the accuracy of axial compressor dynamics over lumped volume dynamics modeling. The stage-by-stage compressor model presented here is formulated into a parallel flow path model that includes both axial and rotational dynamics. This is done to enable the study of compressor and propulsion system dynamic performance under flow distortion conditions. The approaches utilized here are generic and should be applicable for the modeling of any axial flow compressor design.

  6. Development of a High Fidelity Dynamic Module of the Advanced Resistive Exercise Device (ARED) Using Adams

    NASA Technical Reports Server (NTRS)

    Humphreys, B. T.; Thompson, W. K.; Lewandowski, B. E.; Cadwell, E. E.; Newby, N. J.; Fincke, R. S.; Sheehan, C.; Mulugeta, L.

    2012-01-01

    NASA's Digital Astronaut Project (DAP) implements well-vetted computational models to predict and assess spaceflight health and performance risks, and enhance countermeasure development. DAP provides expertise and computation tools to its research customers for model development, integration, or analysis. DAP is currently supporting the NASA Exercise Physiology and Countermeasures (ExPC) project by integrating their biomechanical models of specific exercise movements with dynamic models of the devices on which the exercises were performed. This presentation focuses on the development of a high fidelity dynamic module of the Advanced Resistive Exercise Device (ARED) on board the ISS. The ARED module was developed using the Adams (MSC, Santa Ana, California) simulation package. The Adams package provides the capabilities to perform multi-rigid-body, flexible-body, and mixed dynamic analyses of complex mechanisms. These capabilities were applied to accurately simulate: inertial and mass properties of the device, such as the vibration isolation system (VIS) effects and other ARED components; non-linear joint friction effects; the gas law dynamics of the vacuum cylinders and VIS components, using custom-written differential state equations; and the ARED flywheel dynamics, including the torque-limiting clutch. Design data from the JSC ARED Engineering team were utilized in developing the model, including solid-modeling geometry files, component/system specifications, engineering reports, and available data sets. The Adams ARED module is importable into LifeMOD (Life Modeler, Inc., San Clemente, CA) for biomechanical analyses of different resistive exercises such as the squat and dead-lift. Using motion capture data from ground test subjects, the ExPC developed biomechanical exercise models in LifeMOD. The Adams ARED device module was then integrated with the exercise subject model into one integrated dynamic model. This presentation will describe the development of the Adams ARED module including its capabilities, limitations, and assumptions. Preliminary results, validation activities, and a practical application of the module to inform the relative effect of the flywheels on exercise will be discussed.

  7. A Motivation Contract Model of Employee Appraisal.

    ERIC Educational Resources Information Center

    Glenn, Robert B.

    The purpose of this paper is to develop a process model for identification and assessment of employee job performance, through motivation contracting. The model integrated various components of expectancy theories of motivation and performance contracting and is based on humanistic assumptions about the nature of people. More specifically, the…

  8. ENVIRONMENTAL RESEARCH BRIEF : ANALYTIC ELEMENT MODELING OF GROUND-WATER FLOW AND HIGH PERFORMANCE COMPUTING

    EPA Science Inventory

    Several advances in the analytic element method have been made to enhance its performance and facilitate three-dimensional ground-water flow modeling in a regional aquifer setting. First, a new public domain modular code (ModAEM) has been developed for modeling ground-water flow ...

  9. Dual Arm Work Package performance estimates and telerobot task network simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Draper, J.V.; Blair, L.M.

    1997-02-01

    This paper describes the methodology and results of a network simulation study of the Dual Arm Work Package (DAWP), to be employed for dismantling the Argonne National Laboratory CP-5 reactor. The development of the simulation model was based upon the results of a task analysis for the same system. This study was performed by the Oak Ridge National Laboratory (ORNL), in the Robotics and Process Systems Division. Funding was provided by the US Department of Energy's Office of Technology Development, Robotics Technology Development Program (RTDP). The RTDP is developing methods of computer simulation to estimate telerobotic system performance. Data were collected to provide point estimates to be used in a task network simulation model. Three skilled operators performed six repetitions of a pipe cutting task representative of typical teleoperation cutting operations.

  10. Development of a novel ex vivo porcine laparoscopic Heller myotomy and Nissen fundoplication training model (Toronto lap-Nissen simulator).

    PubMed

    Ujiie, Hideki; Kato, Tatsuya; Hu, Hsin-Pei; Bauer, Patrycja; Patel, Priya; Wada, Hironobu; Lee, Daiyoon; Fujino, Kosuke; Schieman, Colin; Pierre, Andrew; Waddell, Thomas K; Keshavjee, Shaf; Darling, Gail E; Yasufuku, Kazuhiro

    2017-06-01

    Surgical trainees are required to develop competency in a variety of laparoscopic operations. Developing laparoscopic technical skills can be difficult as there has been a decrease in the number of procedures performed. This study aims to develop an inexpensive and anatomically relevant model for training in laparoscopic foregut procedures. An ex vivo, anatomic model of the human upper abdomen was developed using intact porcine esophagus, stomach, diaphragm and spleen. The Toronto lap-Nissen simulator was contained in a laparoscopic box-trainer and included an arch system to simulate the normal radial shape and tension of the diaphragm. We integrated the use of this training model as a part of our laparoscopic skills laboratory-training curriculum. Afterwards, we surveyed trainees to evaluate the observed benefit of the learning session. Twenty-five trainees and five faculty members completed a survey regarding the use of this model. Among the trainees, only 4 (16%) had experience with laparoscopic Heller myotomy and Nissen fundoplication. They reported that practicing with the model was a valuable use of their limited time, repeating the exercise would be of additional benefit, and that the exercise improved their ability to perform or assist in an actual case in the operating room. Significant improvements were found in the following subjective measures comparing pre- vs. post-training: (I) knowledge level (5.6 vs. 8.0, P<0.001); (II) comfort level in assisting (6.3 vs. 7.6, P<0.001); and (III) comfort level in performing as the primary surgeon (4.9 vs. 7.1, P<0.001). The trainees and faculty members agreed that this model was of adequate fidelity and was a representative simulation of actual human anatomy. We developed an easily reproducible training model for laparoscopic procedures. This simulator reproduces human anatomy and increases the trainees' comfort level in performing and assisting with myotomy and fundoplication.

  12. Integrated Computational Materials Engineering Development of Advanced High Strength Steel for Lightweight Vehicles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hector, Jr., Louis G.; McCarty, Eric D.

    The goal of the ICME 3GAHSS project was to successfully demonstrate the applicability of Integrated Computational Materials Engineering (ICME) for the development and deployment of third generation advanced high strength steels (3GAHSS) for immediate weight reduction in passenger vehicles. The ICME approach integrated results from well-established computational and experimental methodologies to develop a suite of material constitutive models (deformation and failure), manufacturing process and performance simulation modules, a properties database, as well as the computational environment linking them together for both performance prediction and material optimization. This is the Final Report for the ICME 3GAHSS project, which achieved the following objectives: 1) Developed a 3GAHSS ICME model, which includes atomistic, crystal plasticity, state variable and forming models. The 3GAHSS model was implemented in commercially available LS-DYNA and a user guide was developed to facilitate use of the model. 2) Developed and produced two 3GAHSS alloys using two different chemistries and manufacturing processes, for use in calibrating and validating the 3GAHSS ICME Model. 3) Optimized the design of an automotive subassembly by substituting 3GAHSS for AHSS, yielding a design that met or exceeded all baseline performance requirements with a 30% mass savings. A technical cost model was also developed to estimate the cost per pound of weight saved when substituting 3GAHSS for AHSS. The project demonstrated the potential for 3GAHSS to achieve up to 30% weight savings in an automotive structure at a cost penalty of up to $0.32 to $1.26 per pound of weight saved. The 3GAHSS ICME Model enables the user to design 3GAHSS to desired mechanical properties in terms of strength and ductility.

  13. Assessment of predictive models for chlorophyll-a concentration of a tropical lake

    PubMed Central

    2011-01-01

    Background: This study assesses four predictive ecological models, Fuzzy Logic (FL), Recurrent Artificial Neural Network (RANN), Hybrid Evolutionary Algorithm (HEA) and multiple linear regression (MLR), to forecast chlorophyll-a concentration using limnological data from 2001 through 2004 of the unstratified, shallow, oligotrophic to mesotrophic tropical Putrajaya Lake (Malaysia). Performances of the models are assessed using the Root Mean Square Error (RMSE), the correlation coefficient (r), and the Area under the Receiver Operating Characteristic curve (AUC). Chlorophyll-a has been used to estimate algal biomass in aquatic ecosystems, as it is common to most algae, and algal biomass indicates the trophic status of a water body. Chlorophyll-a is therefore an effective indicator for monitoring eutrophication, a common problem of lakes and reservoirs all over the world. Assessment of these predictive models is necessary for developing a reliable algorithm to estimate chlorophyll-a concentration for eutrophication management of tropical lakes. Results: The same data set was used for all model development, divided into training and testing sets to avoid bias in the results. The FL and RANN models were developed using parameters selected through sensitivity analysis: water temperature, pH, dissolved oxygen, ammonia nitrogen, nitrate nitrogen and Secchi depth. Dissolved oxygen, selected through a stepwise procedure, was used to develop the MLR model. The HEA model used parameters selected by a genetic algorithm (GA): pH, Secchi depth, dissolved oxygen and nitrate nitrogen. The RMSE, r, and AUC values were (4.60, 0.5, 0.76) for the MLR model, (4.49, 0.6, 0.84) for the FL model, (4.28, 0.7, 0.79) for the RANN model, and (4.27, 0.7, 0.82) for the HEA model. Performance inconsistencies between the four models across these criteria result from the methodologies used to measure performance: RMSE is based on the level of prediction error, whereas AUC is based on a binary classification task. Conclusions: Overall, HEA produced the best performance in terms of RMSE, r, and AUC values, followed by FL, RANN, and MLR. PMID:22372859

  14. Information Extraction for System-Software Safety Analysis: Calendar Year 2007 Year-End Report

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.

    2008-01-01

    This annual report describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis on the models to identify possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations; 4) perform discrete-time-based simulation on the models to investigate scenarios where these paths may play a role in failures and mishaps; and 5) identify resulting candidate scenarios for software integration testing. This paper describes new challenges in a NASA abort system case, and enhancements made to develop the integrated tool set.

  15. Land use regression models to assess air pollution exposure in Mexico City using finer spatial and temporal input parameters.

    PubMed

    Son, Yeongkwon; Osornio-Vargas, Álvaro R; O'Neill, Marie S; Hystad, Perry; Texcalac-Sangrador, José L; Ohman-Strickland, Pamela; Meng, Qingyu; Schwander, Stephan

    2018-05-17

    The Mexico City Metropolitan Area (MCMA) is one of the largest and most populated urban environments in the world and experiences high air pollution levels. To develop models that estimate pollutant concentrations at fine spatiotemporal scales and provide improved air pollution exposure assessments for health studies in Mexico City, we developed finer spatiotemporal land use regression (LUR) models for PM2.5, PM10, O3, NO2, CO and SO2 using mixed effect models with the Least Absolute Shrinkage and Selection Operator (LASSO). Hourly traffic density was included as a temporal variable besides meteorological and holiday variables. Models of hourly, daily, monthly, 6-monthly and annual averages were developed and evaluated using traditional and novel indices. The developed spatiotemporal LUR models yielded predicted concentrations with good spatial and temporal agreement with measured pollutant levels, except for the hourly PM2.5, PM10 and SO2 models. Most of the LUR models met performance goals based on the standardized indices. LUR models with temporal scales greater than one hour were successfully developed using mixed effect models with LASSO and showed superior model performance compared to earlier LUR models, especially for time scales of a day or longer. The newly developed LUR models will be further refined with ongoing Mexico City air pollution sampling campaigns to improve personal exposure assessments. Copyright © 2018. Published by Elsevier B.V.
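
    As a simplified sketch of the LASSO variable-selection step (omitting the mixed-effects component used in the paper), cross-validated LASSO shrinks uninformative LUR predictors toward zero; the predictor names and data below are synthetic.

    ```python
    # Sketch of LASSO-based predictor selection for a LUR model; the data and
    # predictor names are synthetic, and the mixed-effects part is omitted.
    import numpy as np
    from sklearn.linear_model import LassoCV
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(1)
    n = 200
    # Candidate covariates, e.g. road length, traffic density, elevation, NDVI.
    X = rng.normal(size=(n, 4))
    no2 = 30 + 5 * X[:, 0] + 3 * X[:, 1] + rng.normal(scale=2, size=n)  # synthetic NO2

    Xs = StandardScaler().fit_transform(X)
    lasso = LassoCV(cv=5).fit(Xs, no2)
    print("Selected coefficients:", np.round(lasso.coef_, 2))  # irrelevant ones shrink to ~0
    ```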

  16. Distributed parameter modeling to prevent charge cancellation for discrete thickness piezoelectric energy harvester

    NASA Astrophysics Data System (ADS)

    Krishnasamy, M.; Qian, Feng; Zuo, Lei; Lenka, T. R.

    2018-03-01

    The charge cancellation due to the change of strain along a single continuous piezoelectric layer can remarkably affect the performance of a cantilever-based harvester. In this paper, analytical models using distributed parameters are developed that avert, to some extent, the charge cancellation in cantilever piezoelectric transducers by segmenting the piezoelectric layers at the strain nodes of the vibration mode of concern. The electrodes of the piezoelectric segments are connected in parallel with a single external resistive load in the first model (Model 1), while in the second model (Model 2) each bimorph piezoelectric layer is connected in parallel to its own resistor to form an independent circuit. The analytical expressions of the closed-form electromechanical coupling responses in the frequency domain under harmonic base excitation are derived based on the Euler-Bernoulli beam assumption for both models. The developed analytical models are validated by COMSOL and experimental results. The results demonstrate that the energy harvesting performance of the developed segmented piezoelectric layer models is better than that of the traditional model with a continuous piezoelectric layer.

  17. Development of a Human Motor Model for the Evaluation of an Integrated Alerting and Notification Flight Deck System

    NASA Technical Reports Server (NTRS)

    Daiker, Ron; Schnell, Thomas

    2010-01-01

    A human motor model was developed on the basis of performance data collected in a flight simulator. The motor model is under consideration as one component of a virtual pilot model for the evaluation of NextGen crew alerting and notification systems in flight decks. This model may be used in a digital Monte Carlo simulation to compare flight deck layout design alternatives. The virtual pilot model is being developed as part of a NASA project to evaluate multiple crew alerting and notification flight deck configurations. Model parameters were derived from empirical distributions of pilot data collected in a flight simulator experiment. The goal of this model is to simulate pilot motor performance in the approach-to-landing task. The unique challenges associated with modeling the complex dynamics of humans interacting with the cockpit environment are discussed, along with the current state and future direction of the model.

  18. Estimating Traffic Accidents in Turkey Using Differential Evolution Algorithm

    NASA Astrophysics Data System (ADS)

    Akgüngör, Ali Payıdar; Korkmaz, Ersin

    2017-06-01

    Estimating traffic accidents plays a vital role in applying road safety procedures. This study proposes Differential Evolution Algorithm (DEA) models to estimate the number of accidents in Turkey. In the model development, population (P) and the number of vehicles (N) are selected as model parameters. Three model forms, linear, exponential and semi-quadratic, are developed using DEA with data covering 2000 to 2014. The developed models are statistically compared to select the best-fit model. The results of the DE models show that the linear form is suitable for estimating the number of accidents: its statistics are better than those of the other forms in terms of the performance criteria, the Mean Absolute Percentage Error (MAPE) and the Root Mean Square Error (RMSE). To investigate the performance of the linear DE model for future estimations, a ten-year period from 2015 to 2024 is considered. The results obtained from the future estimations confirm the suitability of the DE method for road safety applications.
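
    A minimal sketch of this approach follows, fitting a linear form A = aP + bN + c by minimizing RMSE with SciPy's differential evolution; the data points are illustrative, not the Turkish records used in the study.

    ```python
    # Sketch of fitting a linear accident model with differential evolution;
    # the (P, N, A) data points below are illustrative, not the study data.
    import numpy as np
    from scipy.optimize import differential_evolution

    P = np.array([64.0, 66.0, 68.0, 70.0, 72.0])   # population (millions)
    N = np.array([8.0, 10.0, 13.0, 16.0, 19.0])    # registered vehicles (millions)
    A = np.array([0.9, 1.0, 1.1, 1.2, 1.3])        # accidents (millions)

    def rmse(params):
        a, b, c = params
        return np.sqrt(np.mean((a * P + b * N + c - A) ** 2))

    result = differential_evolution(rmse, bounds=[(-1, 1), (-1, 1), (-5, 5)], seed=0)
    print("a, b, c =", np.round(result.x, 4), " RMSE =", round(result.fun, 4))
    ```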

  19. Performance Analysis of Transposition Models Simulating Solar Radiation on Inclined Surfaces: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xie, Yu; Sengupta, Manajit

    2016-06-01

    Transposition models are widely used in the solar energy industry to simulate solar radiation on inclined photovoltaic (PV) panels. These transposition models have been developed using various assumptions about the distribution of the diffuse radiation, and most of the parameterizations in these models have been developed using hourly ground data sets. Numerous studies have compared the performance of transposition models, but this paper aims to understand the quantitative uncertainty in the state-of-the-art transposition models and the sources leading to the uncertainty using high-resolution ground measurements in the plane of array. Our results suggest that the amount of aerosol optical depth can affect the accuracy of isotropic models. The choice of empirical coefficients and the use of decomposition models can both result in uncertainty in the output from the transposition models. It is expected that the results of this study will ultimately lead to improvements of the parameterizations as well as the development of improved physical models.
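
    For reference, the simplest of the model classes discussed above, the isotropic-sky transposition model, can be written in a few lines: beam, sky-diffuse, and ground-reflected components are summed on the tilted plane. The input values below are illustrative.

    ```python
    # Worked example of an isotropic-sky transposition model; inputs illustrative.
    import numpy as np

    def poa_isotropic(dni, dhi, ghi, tilt_deg, aoi_deg, albedo=0.2):
        tilt, aoi = np.radians(tilt_deg), np.radians(aoi_deg)
        beam = dni * max(np.cos(aoi), 0.0)               # direct beam on the tilted plane
        sky = dhi * (1 + np.cos(tilt)) / 2               # isotropic sky-diffuse component
        ground = ghi * albedo * (1 - np.cos(tilt)) / 2   # ground-reflected component
        return beam + sky + ground

    # 30-degree tilt, 35-degree angle of incidence, typical clear-sky inputs.
    print(f"POA irradiance: {poa_isotropic(700, 100, 600, 30, 35):.0f} W/m^2")
    ```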

  20. Box truss analysis and technology development. Task 1: Mesh analysis and control

    NASA Technical Reports Server (NTRS)

    Bachtell, E. E.; Bettadapur, S. S.; Coyner, J. V.

    1985-01-01

    An analytical tool was developed to model, analyze, and predict the RF performance of box truss antennas with reflective mesh surfaces. The analysis system is unique in that it integrates custom-written programs for cord-tied mesh surfaces, thereby drastically reducing the cost of analysis. The analysis system is capable of determining the RF performance of antennas under any type of manufacturing or operating environment by integrating the various disciplines of design, finite element analysis, surface best-fit analysis, and RF analysis. The Integrated Mesh Analysis System consists of six separate programs: the Mesh Tie System Model Generator, the Loadcase Generator, the Model Optimizer, the Model Solver, the Surface Topography Solver, and the RF Performance Solver. Additionally, a study using the mesh analysis system was performed to determine the effect of on-orbit calibration, i.e., surface adjustment, on a typical box truss antenna.

  1. Open Innovation at NASA: A New Business Model for Advancing Human Health and Performance Innovations

    NASA Technical Reports Server (NTRS)

    Davis, Jeffrey R.; Richard, Elizabeth E.; Keeton, Kathryn E.

    2014-01-01

    This paper describes a new business model for advancing NASA human health and performance innovations and demonstrates how open innovation shaped its development. A 45 percent research and technology development budget reduction drove formulation of a strategic plan grounded in collaboration. We describe the strategy execution, including adoption and results of open innovation initiatives, the challenges of cultural change, and the development of virtual centers and a knowledge management tool to educate and engage the workforce and promote cultural change.

  2. Real-time simulation of a Doubly-Fed Induction Generator based wind power system on the eMEGASim Real-Time Digital Simulator

    NASA Astrophysics Data System (ADS)

    Boakye-Boateng, Nasir Abdulai

    The growing demand for wind power integration into the generation mix prompts the need to subject these systems to stringent performance requirements. This study sought to identify the tools and procedures required to perform real-time simulation studies of Doubly-Fed Induction Generator (DFIG) based wind generation systems, as a basis for more practical tests of reliability and performance for both grid-connected and islanded wind generation systems. The author focused on developing a platform for wind generation studies and, in addition, tested the performance of two DFIG models on the platform's real-time simulation model: an average SimPowerSystems DFIG wind turbine, and a detailed DFIG-based wind turbine using ARTEMiS components. The platform model implemented here consists of a high-voltage transmission system with four integrated wind farm models comprising in total 65 DFIG-based wind turbines; it was developed and tested on OPAL-RT's eMEGASim Real-Time Digital Simulator.

  3. A numerical investigation on the influence of engine shape and mixing processes on wave engine performance

    NASA Astrophysics Data System (ADS)

    Erickson, Robert R.

    Wave engines are a class of unsteady, air-breathing propulsion devices that use an intermittent combustion process to generate thrust. The inherently simple mechanical design of the wave engine allows for a relatively low cost per unit propulsion system, yet unsatisfactory overall performance has severely limited the development of commercially successful wave engines. The primary objective of this investigation was to develop a more detailed physical understanding of the influence of gas dynamic nonlinearities, unsteady combustion processes, and engine shape on overall wave engine performance. Within this study, several numerical models were developed and applied to wave engines and related applications. The first portion of this investigation examined the influence of duct shape on driven oscillations in acoustic compression devices, which represent a simplified physical system closely related in several ways to the wave engine. A numerical model based on an application of the Galerkin method was developed to simulate large amplitude, one-dimensional acoustic waves driven in closed ducts. Results from this portion of the investigation showed that gas-dynamic nonlinearities significantly influence the properties of driven oscillations by transferring acoustic energy from the fundamental driven mode into higher harmonic modes. The second portion of this investigation presented and analyzed results from a numerical model of wave engine dynamics based on the quasi one-dimensional conservation equations in addition to separate sub-models for mixing and heat release. This model was then used to perform parametric studies of the characteristics of mixing and engine shape. The objectives of these studies were to determine the influence of mixing characteristics and engine shape on overall wave engine performance and to develop insight into the physical processes controlling overall performance trends. Results from this model showed that wave engine performance was strongly dependent on the coupling between the unsteady heat release that drives oscillations in the engine and the characteristics that determine the acoustic properties of the engine such as engine shape and mean property gradients. Simulation results showed that average thrust generation decreased dramatically when the natural acoustic mode frequencies of the engine and the frequency content of the unsteady heat release were not aligned.

  4. Determination of Uncertainties for the New SSME Model

    NASA Technical Reports Server (NTRS)

    Coleman, Hugh W.; Hawk, Clark W.

    1996-01-01

    This report discusses the uncertainty analysis performed in support of a new test analysis and performance prediction model for the Space Shuttle Main Engine. The new model utilizes uncertainty estimates for experimental data and for the analytical model to obtain the most plausible operating condition for the engine system. This report discusses the development of the data sets and uncertainty estimates used in the new model, presents the application of uncertainty analysis to analytical models, including the uncertainty analysis for the conservation of mass and energy balance relations, and presents a new methodology for assessing the uncertainty associated with linear regressions.
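
    For the regression-uncertainty theme above, here is a minimal sketch (on synthetic data, not the SSME data sets) of obtaining 1-sigma coefficient uncertainties from a least-squares fit via the coefficient covariance matrix.

```python
# Minimal sketch of uncertainty estimation for a linear regression fit,
# using synthetic data; this is not the report's methodology.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 25)
y = 2.0 * x + 1.0 + rng.normal(scale=0.5, size=x.size)   # synthetic measurements

coeffs, cov = np.polyfit(x, y, 1, cov=True)   # fit y = m*x + b with coefficient covariance
m, b = coeffs
m_err, b_err = np.sqrt(np.diag(cov))          # 1-sigma uncertainties of the coefficients
print(f"slope = {m:.3f} +/- {m_err:.3f}, intercept = {b:.3f} +/- {b_err:.3f}")
```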

  5. Performance Steel Castings

    DTIC Science & Technology

    2012-09-30

    [Fragmented DTIC record; recoverable contents listing: Development of Sand Properties (103), Advanced Modeling Dataset (105), High Strength Low Alloy (HSLA) Steels (107), Steel Casting and Engineering Support.] … to achieve the performance goals required for new systems. The dramatic reduction in weight and increase in capability will require high performance … for improved weapon system reliability. SFSA developed innovative casting design and manufacturing processes for high performance parts. SFSA is …

  6. In-Drift Microbial Communities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    D. Jolley

    2000-11-09

    As directed by written work direction (CRWMS M and O 1999f), Performance Assessment (PA) developed a model for microbial communities in the engineered barrier system (EBS) as documented here. The purpose of this model is to assist Performance Assessment and its Engineered Barrier Performance Section in modeling the geochemical environment within a potential repository drift for TSPA-SR/LA, thus allowing PA to provide a more detailed and complete near-field geochemical model and to answer the key technical issues (KTI) raised in the NRC Issue Resolution Status Report (IRSR) for the Evolution of the Near Field Environment (NFE) Revision 2 (NRC 1999). This model and its predecessor (the in-drift microbial communities model as documented in Chapter 4 of the TSPA-VA Technical Basis Document, CRWMS M and O 1998a) were developed to respond to the applicable KTIs. Additionally, because of the previous development of the in-drift microbial communities model as documented in Chapter 4 of the TSPA-VA Technical Basis Document (CRWMS M and O 1998a), the M and O was effectively able to resolve a previous KTI concern regarding the effects of microbial processes on seepage and flow (NRC 1998). This document supersedes the in-drift microbial communities model as documented in Chapter 4 of the TSPA-VA Technical Basis Document (CRWMS M and O 1998a). This document provides the conceptual framework of the revised in-drift microbial communities model to be used in subsequent performance assessment (PA) analyses.

  7. ASTP ranging system mathematical model

    NASA Technical Reports Server (NTRS)

    Ellis, M. R.; Robinson, L. H.

    1973-01-01

    A mathematical model of the VHF ranging system is presented and used to analyze its performance for the Apollo-Soyuz Test Project (ASTP). The system was adapted for use in the ASTP. The ranging system mathematical model is presented in block diagram form, and a brief description of the overall model is also included. A procedure for implementing the math model is presented along with a discussion of the validation of the math model and the overall summary and conclusions of the study effort. Detailed appendices of the five study tasks are presented: early/late gate model development, unlock probability development, system error model development, probability of acquisition model development, and math model validation testing.

  8. Application of multivariate Gaussian detection theory to known non-Gaussian probability density functions

    NASA Astrophysics Data System (ADS)

    Schwartz, Craig R.; Thelen, Brian J.; Kenton, Arthur C.

    1995-06-01

    A statistical parametric multispectral sensor performance model was developed by ERIM to support mine field detection studies, multispectral sensor design/performance trade-off studies, and target detection algorithm development. The model assumes target detection algorithms and their performance models which are based on data assumed to obey multivariate Gaussian probability distribution functions (PDFs). The applicability of these algorithms and performance models can be generalized to data having non-Gaussian PDFs through the use of transforms which convert non-Gaussian data to Gaussian (or near-Gaussian) data. An example of one such transform is the Box-Cox power law transform. In practice, such a transform can be applied to non-Gaussian data prior to the introduction of a detection algorithm that is formally based on the assumption of multivariate Gaussian data. This paper presents an extension of these techniques to the case where the joint multivariate probability density function of the non-Gaussian input data is known, and where the joint estimate of the multivariate Gaussian statistics, under the Box-Cox transform, is desired. The jointly estimated multivariate Gaussian statistics can then be used to predict the performance of a target detection algorithm which has an associated Gaussian performance model.
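
    A minimal sketch of the Box-Cox step described above, applied to synthetic lognormal data with SciPy; the downstream detection-algorithm stage is omitted.

```python
# Sketch of the Box-Cox transform used to Gaussianize non-Gaussian data
# before applying Gaussian-based detection statistics (illustrative only).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
data = rng.lognormal(mean=0.0, sigma=0.7, size=5000)   # positive, non-Gaussian samples

transformed, lam = stats.boxcox(data)    # maximum-likelihood estimate of lambda
mu, sigma = transformed.mean(), transformed.std(ddof=1)
print(f"lambda = {lam:.3f}, Gaussian stats: mu = {mu:.3f}, sigma = {sigma:.3f}")
```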

  9. Mathematical Model Development and Simulation Support

    NASA Technical Reports Server (NTRS)

    Francis, Ronald C.; Tobbe, Patrick A.

    2000-01-01

    This report summarizes the work performed in support of the Contact Dynamics 6DOF Facility and the Flight Robotics Lab at NASA/ MSFC in the areas of Mathematical Model Development and Simulation Support.

  10. Palm: Easing the Burden of Analytical Performance Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tallent, Nathan R.; Hoisie, Adolfy

    2014-06-01

    Analytical (predictive) application performance models are critical for diagnosing performance-limiting resources, optimizing systems, and designing machines. Creating models, however, is difficult because they must be both accurate and concise. To ease the burden of performance modeling, we developed Palm, a modeling tool that combines top-down (human-provided) semantic insight with bottom-up static and dynamic analysis. To express insight, Palm defines a source code modeling annotation language. By coordinating models and source code, Palm's models are 'first-class' and reproducible. Unlike prior work, Palm formally links models, functions, and measurements. As a result, Palm (a) uses functions to either abstract or express complexity; (b) generates hierarchical models (representing an application's static and dynamic structure); and (c) automatically incorporates measurements to focus attention, represent constant behavior, and validate models. We discuss generating models for three different applications.

  11. Revel8or: Model Driven Capacity Planning Tool Suite

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhu, Liming; Liu, Yan; Bui, Ngoc B.

    2007-05-31

    Designing complex multi-tier applications that must meet strict performance requirements is a challenging software engineering problem. Ideally, the application architect could derive accurate performance predictions early in the project life-cycle, leveraging initial application design-level models and a description of the target software and hardware platforms. To this end, we have developed a capacity planning tool suite for component-based applications, called Revel8or. The tool adheres to the model driven development paradigm and supports benchmarking and performance prediction for J2EE, .Net and Web services platforms. The suite is composed of three different tools: MDAPerf, MDABench and DSLBench. MDAPerf allows annotation of design diagrams and derives performance analysis models. MDABench allows a customized benchmark application to be modeled in the UML 2.0 Testing Profile and automatically generates a deployable application, with measurement automatically conducted. DSLBench allows the same benchmark modeling and generation to be conducted using a simple performance engineering Domain Specific Language (DSL) in Microsoft Visual Studio. DSLBench integrates with Visual Studio and reuses its load testing infrastructure. Together, the tool suite can assist capacity planning across platforms in an automated fashion.

  12. Mechanical-Kinetic Modeling of a Molecular Walker from a Modular Design Principle

    NASA Astrophysics Data System (ADS)

    Hou, Ruizheng; Loh, Iong Ying; Li, Hongrong; Wang, Zhisong

    2017-02-01

    Artificial molecular walkers beyond burnt-bridge designs are complex nanomachines that potentially replicate biological walkers in mechanisms and functionalities. Improving man-made walkers to the level of performance needed for widespread applications remains difficult, largely because their biomimetic design principles involve entangled kinetic and mechanical effects, complicating the link between a walker's construction and its ultimate performance. Here, a synergic mechanical-kinetic model is developed for a recently reported DNA bipedal walker, which is based on a modular design principle that potentially enables many directional walkers driven by a length-switching engine. The model reproduces the experimental data of the walker and identifies its performance-limiting factors. The model also captures features common to the underlying design principle, including counterintuitive performance-construction relations that are explained by detailed balance, entropy production, and bias cancellation. While indicating a low directional fidelity for the present walker, the model suggests the possibility of improving the fidelity above 90% with a more powerful engine, which may be an improved version of the present engine or an entirely new engine motif, thanks to the flexible design principle. The model is readily adaptable to aid these experimental developments towards high-performance molecular walkers.
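
    As a toy illustration of how detailed balance links an engine's free-energy bias to directional fidelity, consider a two-outcome stepping model; the bias values below are assumed, and this is far simpler than the paper's full mechanical-kinetic model.

```python
# Toy biased-stepping model of a molecular walker: detailed balance sets the
# forward/backward step ratio from the engine's free-energy bias (a sketch,
# not the paper's model).
import math

def directional_fidelity(bias_kT: float) -> float:
    """P(forward step) when forward/backward rates obey detailed balance."""
    return 1.0 / (1.0 + math.exp(-bias_kT))

for bias in (0.5, 1.0, 2.2, 3.0):   # bias in units of kT (assumed values)
    print(f"bias = {bias:.1f} kT -> fidelity = {directional_fidelity(bias):.2f}")
```

    On this toy model, fidelity above 90% requires a bias of at least ln 9 ≈ 2.2 kT, which illustrates why a more powerful engine raises directional fidelity.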

  13. What’s in a game? A systems approach to enhancing performance analysis in football

    PubMed Central

    2017-01-01

    Purpose Performance analysis (PA) in football is considered to be an integral component of understanding the requirements for optimal performance. Despite vast amounts of research in this area, key gaps remain, including what comprises PA in football and methods to minimise research-practitioner gaps. The aim of this study was to develop a model of the football match system in order to better describe and understand the components of football performance. Such a model could inform the design of new PA methods. Method Eight elite-level football Subject Matter Experts (SMEs) participated in two workshops to develop a systems model of the football match system. The model was developed using a first-of-its-kind application of Cognitive Work Analysis (CWA) in football. CWA has been used in many other non-sporting domains to analyse and understand complex systems. Result Using CWA, a model of the football match ‘system’ was developed. The model enabled identification of several PA measures not currently utilised, including communication between team members, adaptability of teams, playing at the appropriate tempo, as well as attacking and defending related measures. Conclusion The results indicate that football is characteristic of a complex sociotechnical system, and revealed potential new and unique PA measures regarded as important by SMEs, yet not currently measured. Importantly, these results have identified a gap between the current PA research and the information that is meaningful to football coaches and practitioners. PMID:28212392

  14. U1108 performance model

    NASA Technical Reports Server (NTRS)

    Trachta, G.

    1976-01-01

    A model of Univac 1108 work flow has been developed to assist in performance evaluation studies and configuration planning. Workload profiles and system configurations are parameterized for ease of experimental modification. Outputs include capacity estimates and performance evaluation functions. The U1108 system is conceptualized as a service network; classical queueing theory is used to evaluate network dynamics.
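
    Since the record describes the system as a service network evaluated with classical queueing theory, a generic M/M/1 sketch of the kind of calculation involved may help; the actual U1108 model is not reproduced here.

```python
# Classical M/M/1 queueing formulas of the kind used to evaluate a
# service-network model (a generic sketch, not the U1108 model itself).
def mm1_metrics(arrival_rate: float, service_rate: float) -> dict:
    """Steady-state metrics for an M/M/1 queue (requires utilization < 1)."""
    rho = arrival_rate / service_rate          # utilization
    if rho >= 1.0:
        raise ValueError("queue is unstable: utilization >= 1")
    return {
        "utilization": rho,
        "mean_jobs_in_system": rho / (1.0 - rho),
        "mean_response_time": 1.0 / (service_rate - arrival_rate),
    }

print(mm1_metrics(arrival_rate=8.0, service_rate=10.0))
```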

  15. Digital Astronaut Project Biomechanical Models: Biomechanical Modeling of Squat, Single-Leg Squat and Heel Raise Exercises on the Hybrid Ultimate Lifting Kit (HULK)

    NASA Technical Reports Server (NTRS)

    Thompson, William K.; Gallo, Christopher A.; Crentsil, Lawton; Lewandowski, Beth E.; Humphreys, Brad T.; DeWitt, John K.; Fincke, Renita S.; Mulugeta, Lealem

    2015-01-01

    The NASA Digital Astronaut Project (DAP) implements well-vetted computational models to predict and assess spaceflight health and performance risks, and to enhance countermeasure development. The DAP Musculoskeletal Modeling effort is developing computational models to inform exercise countermeasure development and to predict physical performance capabilities after a length of time in space. For example, integrated exercise device-biomechanical models can determine localized loading, which will be used as input to muscle and bone adaptation models to estimate the effectiveness of the exercise countermeasure. In addition, simulations of mission tasks can be used to estimate the astronaut's ability to perform the task after exposure to microgravity and after using various exercise countermeasures. The software package OpenSim (Stanford University, Palo Alto, CA) (Ref. 1) is being used to create the DAP biomechanical models and its built-in muscle model is the starting point for the DAP muscle model. During Exploration missions, such as those to asteroids and Mars, astronauts will be exposed to reduced gravity for extended periods. Therefore, the crew must have access to exercise countermeasures that can maintain their musculoskeletal and aerobic health. Exploration vehicles may have very limited volume and power available to accommodate such capabilities, even more so than the International Space Station (ISS). The exercise devices flown on Exploration missions must be designed to provide sufficient load during the performance of various resistance and aerobic/anaerobic exercises while meeting potential additional requirements of limited mass, volume and power. Given that it is not practical to manufacture and test (ground, analog and/or flight) all candidate devices, nor is it always possible to obtain data such as localized muscle and bone loading empirically, computational modeling can estimate the localized loading during various exercise modalities performed on a given device to help formulate exercise prescriptions and other operational considerations. With this in mind, NASA's Digital Astronaut Project (DAP) is supporting the Advanced Exercise Concepts (AEC) Project, Exercise Physiology and Countermeasures (ExPC) laboratory and NSBRI-funded researchers by developing and implementing well-validated computational models of exercises with advanced exercise device concepts. This report focuses specifically on lower-body resistance exercises performed with the Hybrid Ultimate Lifting Kit (HULK) device as a deliverable to the AEC Project.

  16. Labyrinth Seal Analysis. Volume 3. Analytical and Experimental Development of a Design Model for Labyrinth Seals

    DTIC Science & Technology

    1986-01-01

    … the information that has been determined experimentally. The Labyrinth Seal Analysis program was, therefore, directed to the development of an … labyrinth seal performance; the program included the development of an improved empirical design model to provide the calculation of the flow … program. Phase I was directed to the analytical development of both an "analysis" model and an improved empirical "design" model. Supporting rig tests …

  17. A spiral model of musical decision-making.

    PubMed

    Bangert, Daniel; Schubert, Emery; Fabian, Dorottya

    2014-01-01

    This paper describes a model of how musicians make decisions about performing notated music. The model builds on psychological theories of decision-making and was developed from empirical studies of Western art music performance that aimed to identify intuitive and deliberate processes of decision-making, a distinction consistent with dual-process theories of cognition. The model proposes that the proportion of intuitive (Type 1) and deliberate (Type 2) decision-making processes changes with increasing expertise and conceptualizes this change as movement along a continually narrowing upward spiral where the primary axis signifies principal decision-making type and the vertical axis marks level of expertise. The model is intended to have implications for the development of expertise as described in two main phases. The first is movement from a primarily intuitive approach in the early stages of learning toward greater deliberation as analytical techniques are applied during practice. The second phase occurs as deliberate decisions gradually become automatic (procedural), increasing the role of intuitive processes. As a performer examines more issues or reconsiders decisions, the spiral motion toward the deliberate side and back to the intuitive is repeated indefinitely. With increasing expertise, the spiral tightens to signify greater control over decision type selection. The model draws on existing theories, particularly Evans' (2011) Intervention Model of dual-process theories, Cognitive Continuum Theory (Hammond et al., 1987; Hammond, 2007), and Baylor's (2001) U-shaped model for the development of intuition by level of expertise. By theorizing how musical decision-making operates over time and with increasing expertise, this model could be used as a framework for future research in music performance studies and performance science more generally.

  18. Waterhammer Transient Simulation and Model Anchoring for the Robotic Lunar Lander Propulsion System

    NASA Technical Reports Server (NTRS)

    Stein, William B.; Trinh, Huu P.; Reynolds, Michael E.; Sharp, David J.

    2011-01-01

    Waterhammer transients have the potential to adversely impact propulsion system design if not properly addressed. Waterhammer can potentially damage system plumbing and components. Multi-thruster propulsion systems also develop constructive/destructive wave interference, which becomes difficult to predict without detailed models. Therefore, it is important to sufficiently characterize propulsion system waterhammer in order to develop a robust design with minimal impact to other systems. A risk reduction activity was performed at Marshall Space Flight Center to develop a tool for estimating waterhammer through the use of anchored simulation for the Robotic Lunar Lander (RLL) propulsion system design. Testing was performed to simulate waterhammer surges due to rapid valve closure and consisted of twenty-two series of waterhammer tests, resulting in more than 300 valve actuations. These tests were performed using different valve actuation schemes and three system pressures. Data from the valve characterization tests were used to anchor models built in MSC Software's EASY5 v.2010, which represents transient fluid phenomena using transient forms of mass and energy conservation. The anchoring process was performed by comparing initial model results to experimental data and then iterating the model input to match the simulation results with the experimental data. The models provide good correlation with experimental results, supporting the use of EASY5 as a tool to model fluid transients and providing a baseline for future RLL system modeling. This paper addresses tasks performed during the waterhammer risk reduction activity for the RLL propulsion system. The problem of waterhammer simulation anchoring as applied to the RLL system is discussed with results from the corresponding experimental valve tests. Important factors for waterhammer mitigation are discussed along with potential design impacts to the RLL propulsion system.
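
    For orientation, the classic Joukowsky relation gives a first-order estimate of the surge pressure from rapid valve closure; the sketch below uses assumed propellant and line properties and is far simpler than the anchored EASY5 models described above.

```python
# First-order waterhammer surge estimate via the Joukowsky relation
# dP = rho * a * dv (a sketch; the RLL work used anchored EASY5 models).
def joukowsky_surge(rho: float, wave_speed: float, delta_v: float) -> float:
    """Peak pressure rise (Pa) for an instantaneous valve closure."""
    return rho * wave_speed * delta_v

# Assumed illustrative values: ~1000 kg/m^3 propellant density,
# 1200 m/s wave speed in a stiff line, 3 m/s arrested flow velocity.
dP = joukowsky_surge(rho=1000.0, wave_speed=1200.0, delta_v=3.0)
print(f"Joukowsky surge estimate: {dP / 1e5:.1f} bar")
```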

  19. A spiral model of musical decision-making

    PubMed Central

    Bangert, Daniel; Schubert, Emery; Fabian, Dorottya

    2014-01-01

    This paper describes a model of how musicians make decisions about performing notated music. The model builds on psychological theories of decision-making and was developed from empirical studies of Western art music performance that aimed to identify intuitive and deliberate processes of decision-making, a distinction consistent with dual-process theories of cognition. The model proposes that the proportion of intuitive (Type 1) and deliberate (Type 2) decision-making processes changes with increasing expertise and conceptualizes this change as movement along a continually narrowing upward spiral where the primary axis signifies principal decision-making type and the vertical axis marks level of expertise. The model is intended to have implications for the development of expertise as described in two main phases. The first is movement from a primarily intuitive approach in the early stages of learning toward greater deliberation as analytical techniques are applied during practice. The second phase occurs as deliberate decisions gradually become automatic (procedural), increasing the role of intuitive processes. As a performer examines more issues or reconsiders decisions, the spiral motion toward the deliberate side and back to the intuitive is repeated indefinitely. With increasing expertise, the spiral tightens to signify greater control over decision type selection. The model draws on existing theories, particularly Evans’ (2011) Intervention Model of dual-process theories, Cognitive Continuum Theory (Hammond et al., 1987; Hammond, 2007), and Baylor’s (2001) U-shaped model for the development of intuition by level of expertise. By theorizing how musical decision-making operates over time and with increasing expertise, this model could be used as a framework for future research in music performance studies and performance science more generally. PMID:24795673

  20. AZOrange - High performance open source machine learning for QSAR modeling in a graphical programming environment

    PubMed Central

    2011-01-01

    Background Machine learning has a vast range of applications. In particular, advanced machine learning methods are routinely and increasingly used in quantitative structure activity relationship (QSAR) modeling. QSAR data sets often encompass tens of thousands of compounds and the size of proprietary, as well as public data sets, is rapidly growing. Hence, there is a demand for computationally efficient machine learning algorithms, easily available to researchers without extensive machine learning knowledge. Because they uphold the scientific principles of transparency and reproducibility, Open Source solutions are increasingly acknowledged by regulatory authorities. Thus, an Open Source state-of-the-art high performance machine learning platform, interfacing multiple, customized machine learning algorithms for both graphical programming and scripting, to be used for large scale development of QSAR models of regulatory quality, is of great value to the QSAR community. Results This paper describes the implementation of the Open Source machine learning package AZOrange. AZOrange was developed specifically to support batch generation of QSAR models, providing the full workflow of QSAR modeling, from descriptor calculation to automated model building, validation and selection. The automated workflow relies upon the customization of the machine learning algorithms and a generalized, automated model hyper-parameter selection process. Several high performance machine learning algorithms are interfaced for efficient data set specific selection of the statistical method, promoting model accuracy. Using the high performance machine learning algorithms of AZOrange does not require programming knowledge as flexible applications can be created, not only at a scripting level, but also in a graphical programming environment. Conclusions AZOrange is a step towards meeting the needs for an Open Source high performance machine learning platform, supporting the efficient development of highly accurate QSAR models fulfilling regulatory requirements. PMID:21798025

  1. AZOrange - High performance open source machine learning for QSAR modeling in a graphical programming environment.

    PubMed

    Stålring, Jonna C; Carlsson, Lars A; Almeida, Pedro; Boyer, Scott

    2011-07-28

    Machine learning has a vast range of applications. In particular, advanced machine learning methods are routinely and increasingly used in quantitative structure activity relationship (QSAR) modeling. QSAR data sets often encompass tens of thousands of compounds and the size of proprietary, as well as public data sets, is rapidly growing. Hence, there is a demand for computationally efficient machine learning algorithms, easily available to researchers without extensive machine learning knowledge. Because they uphold the scientific principles of transparency and reproducibility, Open Source solutions are increasingly acknowledged by regulatory authorities. Thus, an Open Source state-of-the-art high performance machine learning platform, interfacing multiple, customized machine learning algorithms for both graphical programming and scripting, to be used for large scale development of QSAR models of regulatory quality, is of great value to the QSAR community. This paper describes the implementation of the Open Source machine learning package AZOrange. AZOrange was developed specifically to support batch generation of QSAR models, providing the full workflow of QSAR modeling, from descriptor calculation to automated model building, validation and selection. The automated workflow relies upon the customization of the machine learning algorithms and a generalized, automated model hyper-parameter selection process. Several high performance machine learning algorithms are interfaced for efficient data set specific selection of the statistical method, promoting model accuracy. Using the high performance machine learning algorithms of AZOrange does not require programming knowledge as flexible applications can be created, not only at a scripting level, but also in a graphical programming environment. AZOrange is a step towards meeting the needs for an Open Source high performance machine learning platform, supporting the efficient development of highly accurate QSAR models fulfilling regulatory requirements.
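
    To make the automated hyper-parameter selection workflow concrete, here is a generic cross-validated grid search in scikit-learn; this illustrates the concept only and is not AZOrange's API.

```python
# Generic automated hyper-parameter selection with cross-validation,
# illustrating the kind of workflow AZOrange automates (scikit-learn is
# used purely for illustration; it is not AZOrange's API).
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import GridSearchCV

# Synthetic stand-in for a descriptor matrix and activity values.
X, y = make_regression(n_samples=300, n_features=20, noise=0.2, random_state=0)

search = GridSearchCV(
    RandomForestRegressor(random_state=0),
    param_grid={"n_estimators": [100, 300], "max_depth": [None, 10]},
    cv=5,
    scoring="r2",
)
search.fit(X, y)
print("best params:", search.best_params_, "CV R2:", round(search.best_score_, 3))
```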

  2. Automated Decomposition of Model-based Learning Problems

    NASA Technical Reports Server (NTRS)

    Williams, Brian C.; Millar, Bill

    1996-01-01

    A new generation of sensor-rich, massively distributed autonomous systems is being developed that has the potential for unprecedented performance, such as smart buildings, reconfigurable factories, adaptive traffic systems and remote earth ecosystem monitoring. To achieve high performance these massive systems will need to accurately model themselves and their environment from sensor information. Accomplishing this on a grand scale requires automating the art of large-scale modeling. This paper presents a formalization of decompositional model-based learning (DML), a method developed by observing a modeler's expertise at decomposing large scale model estimation tasks. The method exploits a striking analogy between learning and consistency-based diagnosis. Moriarty, an implementation of DML, has been applied to thermal modeling of a smart building, demonstrating a significant improvement in learning rate.

  3. Ski jump takeoff performance predictions for a mixed-flow, remote-lift STOVL aircraft

    NASA Technical Reports Server (NTRS)

    Birckelbaw, Lourdes G.

    1992-01-01

    A ski jump model was developed to predict ski jump takeoff performance for a short takeoff and vertical landing (STOVL) aircraft. The objective was to verify the model with results from a piloted simulation of a mixed-flow, remote-lift STOVL aircraft. The prediction model is discussed. The predicted results are compared with the piloted simulation results. The ski jump model can be utilized for basic research of other thrust-vectoring STOVL aircraft performing a ski jump takeoff.

  4. Development of an inorganic and organic aerosol model (CHIMERE 2017β v1.0): seasonal and spatial evaluation over Europe

    NASA Astrophysics Data System (ADS)

    Couvidat, Florian; Bessagnet, Bertrand; Garcia-Vivanco, Marta; Real, Elsa; Menut, Laurent; Colette, Augustin

    2018-01-01

    A new aerosol module was developed and integrated into the air quality model CHIMERE. Developments include the use of the Model of Emissions of Gases and Aerosols from Nature (MEGAN) 2.1 for biogenic emissions, the implementation of the inorganic thermodynamic model ISORROPIA 2.1, revision of wet deposition processes and of the algorithms for condensation/evaporation and coagulation, and the implementation of the secondary organic aerosol (SOA) mechanism H2O and the thermodynamic model SOAP. Concentrations of particles over Europe were simulated by the model for the year 2013. Model concentrations were compared to the European Monitoring and Evaluation Programme (EMEP) observations and other observations available in the EBAS database to evaluate the performance of the model. Performance was determined for several components of particles (sea salt, sulfate, ammonium, nitrate, organic aerosol) with a seasonal and regional analysis of results. The model gives satisfactory performance in general. For sea salt, the model succeeds in reproducing the seasonal evolution of concentrations for western and central Europe. For sulfate, except for an overestimation in northern Europe, modeled concentrations are close to observations and the model succeeds in reproducing the seasonal evolution of concentrations. For organic aerosol, the model satisfactorily reproduces concentrations for stations with strong modeled biogenic SOA concentrations. However, the model strongly overestimates ammonium nitrate concentrations during late autumn (possibly due to problems in the temporal evolution of emissions) and strongly underestimates summer organic aerosol concentrations over most of the stations (especially in the northern half of Europe). This underestimation could be due to a lack of anthropogenic SOA or biogenic emissions in northern Europe. A list of recommended tests and developments to improve the model is also given.

  5. Simulation of car movement along circular path

    NASA Astrophysics Data System (ADS)

    Fedotov, A. I.; Tikhov-Tinnikov, D. A.; Ovchinnikova, N. I.; Lysenko, A. V.

    2017-10-01

    Under operating conditions, suspension system performance changes, which negatively affects vehicle stability and handling. The paper aims to simulate the impact of changes in suspension system performance on vehicle stability and handling. Methods: the paper describes monitoring of suspension system performance and testing of vehicle stability and handling, and analyzes methods of monitoring suspension system performance under operating conditions. A mathematical model of car movement along a circular path was developed, together with mathematical tools describing the circular movement of a vehicle along a horizontal road. Turning movements of the car were simulated, and calculation and experiment results were compared. The simulation demonstrates the applicability of the mathematical model for assessing the impact of suspension system performance on vehicle stability and handling.
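
    A point-mass sketch of steady cornering, relating speed, radius, and the friction needed to supply v²/R; this is deliberately much simpler than the paper's vehicle model, and the numbers are assumed.

```python
# Point-mass kinematics for steady cornering on a circular path
# (a hedged sketch, not the paper's full vehicle model).
import math

G = 9.81  # gravitational acceleration, m/s^2

def max_cornering_speed(radius_m: float, friction_coeff: float) -> float:
    """Highest steady speed for which tire friction can supply v^2/R."""
    return math.sqrt(friction_coeff * G * radius_m)

# Assumed values: 30 m radius, dry-asphalt friction coefficient 0.8.
v = max_cornering_speed(30.0, 0.8)
print(f"max steady cornering speed ~ {v:.1f} m/s ({v * 3.6:.0f} km/h)")
```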

  6. Terrestrial Planet Finder Coronagraph Optical Modeling

    NASA Technical Reports Server (NTRS)

    Basinger, Scott A.; Redding, David C.

    2004-01-01

    The Terrestrial Planet Finder Coronagraph will rely heavily on modeling and analysis throughout its mission lifecycle. Optical modeling is especially important, since the tolerances on the optics as well as scattered light suppression are critical for the mission's success. The high contrast imaging necessary to observe a planet orbiting a distant star requires new and innovative technologies to be developed and tested, and detailed optical modeling provides predictions for evaluating design decisions. It also provides a means to develop and test algorithms designed to actively suppress scattered light via deformable mirrors and other techniques. The optical models are used in conjunction with structural and thermal models to create fully integrated optical/structural/thermal models that are used to evaluate dynamic effects of disturbances on the overall performance of the coronagraph. The optical models we have developed have been verified on the High Contrast Imaging Testbed. Results of the optical modeling verification and the methods used to perform full three-dimensional near-field diffraction analysis are presented.

  7. Integrating the Advanced Human Eye Model (AHEM) and optical instrument models to model complete visual optical systems inclusive of the typical or atypical eye

    NASA Astrophysics Data System (ADS)

    Donnelly, William J., III

    2012-06-01

    PURPOSE: To present a commercially available optical modeling software tool to assist the development of optical instrumentation and systems that utilize and/or integrate with the human eye. METHODS: A commercially available flexible eye modeling system is presented, the Advanced Human Eye Model (AHEM). AHEM is a module that the engineer can use to perform rapid development and test scenarios on systems that integrate with the eye. Methods include merging modeled systems initially developed outside of AHEM and performing a series of wizard-type operations that free the user from needing an optometric or ophthalmic background to produce a complete eye-inclusive system. Scenarios consist of retinal imaging of targets and sources through integrated systems. Uses include, but are not limited to, optimization, telescopes, microscopes, spectacles, contact and intraocular lenses, ocular aberrations, cataract simulation and scattering, and twin eye model (binocular) systems. RESULTS: Metrics, graphical data, and exportable CAD geometry are generated from the various modeling scenarios.

  8. Improving Risk Adjustment for Mortality After Pediatric Cardiac Surgery: The UK PRAiS2 Model.

    PubMed

    Rogers, Libby; Brown, Katherine L; Franklin, Rodney C; Ambler, Gareth; Anderson, David; Barron, David J; Crowe, Sonya; English, Kate; Stickley, John; Tibby, Shane; Tsang, Victor; Utley, Martin; Witter, Thomas; Pagel, Christina

    2017-07-01

    Partial Risk Adjustment in Surgery (PRAiS), a risk model for 30-day mortality after children's heart surgery, has been used by the UK National Congenital Heart Disease Audit to report expected risk-adjusted survival since 2013. This study aimed to improve the model by incorporating additional comorbidity and diagnostic information. The model development dataset was all procedures performed between 2009 and 2014 in all UK and Ireland congenital cardiac centers. The outcome measure was death within each 30-day surgical episode. Model development followed an iterative process of clinical discussion and development and assessment of models using logistic regression under 25 × 5 cross-validation. Performance was measured using Akaike information criterion, the area under the receiver-operating characteristic curve (AUC), and calibration. The final model was assessed in an external 2014 to 2015 validation dataset. The development dataset comprised 21,838 30-day surgical episodes, with 539 deaths (mortality, 2.5%). The validation dataset comprised 4,207 episodes, with 97 deaths (mortality, 2.3%). The updated risk model included 15 procedural, 11 diagnostic, and 4 comorbidity groupings, and nonlinear functions of age and weight. Performance under cross-validation was: median AUC of 0.83 (range, 0.82 to 0.83), median calibration slope and intercept of 0.92 (range, 0.64 to 1.25) and -0.23 (range, -1.08 to 0.85) respectively. In the validation dataset, the AUC was 0.86 (95% confidence interval [CI], 0.82 to 0.89), and the calibration slope and intercept were 1.01 (95% CI, 0.83 to 1.18) and 0.11 (95% CI, -0.45 to 0.67), respectively, showing excellent performance. A more sophisticated PRAiS2 risk model for UK use was developed with additional comorbidity and diagnostic information, alongside age and weight as nonlinear variables.
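
    A sketch of how AUC and a calibration slope/intercept can be computed for a binary risk model; the data are synthetic rather than the PRAiS2 dataset, and regressing outcomes on the model's linear predictor is one common way to obtain the calibration terms.

```python
# Sketch of discrimination (AUC) and calibration slope/intercept for a
# binary risk model, on synthetic data (not the PRAiS2 dataset).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)
p_pred = rng.uniform(0.01, 0.30, size=5000)   # hypothetical predicted risks
y = rng.binomial(1, p_pred)                   # outcomes drawn from those risks

logit = np.log(p_pred / (1.0 - p_pred)).reshape(-1, 1)
recal = LogisticRegression().fit(logit, y)    # regress outcome on the linear predictor
print("AUC:", round(roc_auc_score(y, p_pred), 3))
print("calibration slope:", round(recal.coef_[0][0], 2),
      "intercept:", round(recal.intercept_[0], 2))
```

    For well-calibrated predictions, the slope should be near 1 and the intercept near 0, as in the PRAiS2 external validation above.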

  9. Program management model study

    NASA Technical Reports Server (NTRS)

    Connelly, J. J.; Russell, J. E.; Seline, J. R.; Sumner, N. R., Jr.

    1972-01-01

    Two models, a system performance model and a program assessment model, have been developed to assist NASA management in the evaluation of development alternatives for the Earth Observations Program. Two computer models were developed and demonstrated on the Goddard Space Flight Center Computer Facility. Procedures have been outlined to guide the user of the models through specific evaluation processes, and the preparation of inputs describing earth observation needs and earth observation technology. These models are intended to assist NASA in increasing the effectiveness of the overall Earth Observation Program by providing a broader view of system and program development alternatives.

  10. Seasonal precipitation forecasting for the Melbourne region using a Self-Organizing Maps approach

    NASA Astrophysics Data System (ADS)

    Pidoto, Ross; Wallner, Markus; Haberlandt, Uwe

    2017-04-01

    The Melbourne region experiences highly variable inter-annual rainfall. For close to a decade during the 2000s, below-average rainfall seriously affected the environment, water supplies and agriculture. A seasonal rainfall forecasting model for the Melbourne region based on the novel approach of a Self-Organizing Map has been developed and tested for its prediction performance. Predictor variables at varying lead times were first assessed for inclusion within the model by calculating their importance via Random Forests. Predictor variables tested include the climate indices SOI, DMI and N3.4, in addition to gridded global sea surface temperature data. Five forecasting models were developed: an annual model and four seasonal models, each individually optimized for performance through Pearson's correlation r and the Nash-Sutcliffe Efficiency (NSE). The annual model showed a prediction performance of r = 0.54 and NSE = 0.14. The best seasonal model was for spring, with r = 0.61 and NSE = 0.31; autumn was the worst-performing seasonal model. The sea surface temperature data contributed fewer predictor variables than the climate indices. Most predictor variables entered at the minimum lead time; however, some predictors were found at lead times of up to a year.
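
    A minimal sketch of the Random Forest screening step described above, with hypothetical lagged predictors and synthetic rainfall standing in for the real data.

```python
# Sketch of screening candidate predictors (e.g., lagged climate indices)
# with Random Forest importances; names and data are synthetic placeholders.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(3)
names = ["SOI_lag1", "DMI_lag1", "N3.4_lag1", "SST_box_lag2"]   # hypothetical predictors
X = rng.normal(size=(200, len(names)))
rain = 2.0 * X[:, 0] - 1.0 * X[:, 2] + rng.normal(scale=0.5, size=200)  # synthetic rainfall

rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X, rain)
for name, imp in sorted(zip(names, rf.feature_importances_), key=lambda t: -t[1]):
    print(f"{name}: {imp:.2f}")
```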

  11. TOD to TTP calibration

    NASA Astrophysics Data System (ADS)

    Bijl, Piet; Reynolds, Joseph P.; Vos, Wouter K.; Hogervorst, Maarten A.; Fanning, Jonathan D.

    2011-05-01

    The TTP (Targeting Task Performance) metric, developed at NVESD, is the current standard US Army model to predict EO/IR Target Acquisition performance. This model, however, does not have a corresponding lab or field test to empirically assess the performance of a camera system. The TOD (Triangle Orientation Discrimination) method, developed at TNO in The Netherlands, provides such a measurement. In this study, we make a direct comparison between TOD performance for a range of sensors and the extensive historical US observer performance database built to develop and calibrate the TTP metric. The US perception data were collected from military personnel performing an identification task on a standard 12-target, 12-aspect tactical vehicle image set that was processed through simulated sensors for which the most fundamental sensor parameters such as blur, sampling, and spatial and temporal noise were varied. In the present study, we measured TOD sensor performance using exactly the same sensors processing a set of TOD triangle test patterns. The study shows that good overall agreement is obtained when the ratio between target characteristic size and TOD test pattern size at threshold equals 6.3. Note that this number is purely based on empirical data without any intermediate modeling. The calibration of the TOD to the TTP is highly beneficial to the sensor modeling and testing community for a variety of reasons. These include: i) a connection between requirement specification and acceptance testing, and ii) a very efficient method to quickly validate or extend the TTP range prediction model to new systems and tasks.

  12. SMART empirical approaches for predicting field performance of PV modules from results of reliability tests

    NASA Astrophysics Data System (ADS)

    Hardikar, Kedar Y.; Liu, Bill J. J.; Bheemreddy, Venkata

    2016-09-01

    Understanding degradation mechanisms and characterizing them are critical to developing relevant accelerated tests that ensure PV module performance warranty over a typical lifetime of 25 years. As newer technologies are adapted for PV, including new PV cell technologies, new packaging materials, and newer product designs, the availability of field data over extended periods of time for product performance assessment cannot be expected within the typical timeframe for business decisions. In this work, to enable product design decisions and product performance assessment for PV modules utilizing newer technologies, the Simulation and Mechanism based Accelerated Reliability Testing (SMART) methodology and empirical approaches to predict field performance from accelerated test results are presented. The method is demonstrated for field life assessment of flexible PV modules based on degradation mechanisms observed in two accelerated tests, namely Damp Heat and Thermal Cycling. The method is based on the design of an accelerated testing scheme with the intent to develop relevant acceleration factor models. The acceleration factor model is validated by extensive reliability testing under different conditions going beyond the established certification standards. Once the acceleration factor model is validated for the test matrix, a modeling scheme is developed to predict field performance from results of accelerated testing for particular failure modes of interest. Further refinement of the model can continue as more field data become available. While the demonstration of the method in this work is for thin film flexible PV modules, the framework and methodology can be adapted to other PV products.
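
    An Arrhenius-type acceleration factor is one common ingredient of such acceleration factor models; the sketch below uses an assumed activation energy and temperatures and is not the specific SMART model.

```python
# Arrhenius-type acceleration factor of the kind used to map accelerated
# test hours to field exposure (a sketch with assumed activation energy).
import math

K_B = 8.617e-5  # Boltzmann constant, eV/K

def arrhenius_af(ea_ev: float, t_use_c: float, t_test_c: float) -> float:
    """Acceleration factor between field and chamber temperatures."""
    t_use, t_test = t_use_c + 273.15, t_test_c + 273.15
    return math.exp((ea_ev / K_B) * (1.0 / t_use - 1.0 / t_test))

# Assumed values: Ea = 0.7 eV, 45 C field module temperature, 85 C damp-heat chamber.
print(f"AF = {arrhenius_af(0.7, 45.0, 85.0):.1f}")
```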

  13. Summary of the key features of seven biomathematical models of human fatigue and performance.

    PubMed

    Mallis, Melissa M; Mejdal, Sig; Nguyen, Tammy T; Dinges, David F

    2004-03-01

    Biomathematical models that quantify the effects of circadian and sleep/wake processes on the regulation of alertness and performance have been developed in an effort to predict the magnitude and timing of fatigue-related responses in a variety of contexts (e.g., transmeridian travel, sustained operations, shift work). This paper summarizes key features of seven biomathematical models reviewed as part of the Fatigue and Performance Modeling Workshop held in Seattle, WA, on June 13-14, 2002. The Workshop was jointly sponsored by the National Aeronautics and Space Administration, U.S. Department of Defense, U.S. Army Medical Research and Materiel Command, Office of Naval Research, Air Force Office of Scientific Research, and U.S. Department of Transportation. An invitation was sent to developers of seven biomathematical models that were commonly cited in scientific literature and/or supported by government funding. On acceptance of the invitation to attend the Workshop, developers were asked to complete a survey of the goals, capabilities, inputs, and outputs of their biomathematical models of alertness and performance. Data from the completed surveys were summarized and juxtaposed to provide a framework for comparing features of the seven models. Survey responses revealed that models varied greatly relative to their reported goals and capabilities. While all modelers reported that circadian factors were key components of their capabilities, they differed markedly with regard to the roles of sleep and work times as input factors for prediction: four of the seven models had work time as their sole input variable(s), while the other three models relied on various aspects of sleep timing for model input. Models also differed relative to outputs: five sought to predict results from laboratory experiments, field, and operational data, while two models were developed without regard to predicting laboratory experimental results. All modelers provided published papers describing their models, with three of the models being proprietary. Although all models appear to have been fundamentally influenced by the two-process model of sleep regulation by Borbély, there is considerable diversity among them in the number and type of input and output variables, and their stated goals and capabilities.
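
    Since the survey notes that all seven models trace back to the two-process model of sleep regulation, a toy sketch of that model may help; the time constants, weights, and circadian phase below are assumed illustrative values.

```python
# Minimal two-process (Borbely-style) alertness sketch: homeostatic
# pressure S rises during wake and decays during sleep, combined with a
# sinusoidal circadian process C (all parameters are assumed).
import math

def simulate(hours=48, wake=(7, 23), dt=0.25):
    S, out = 0.5, []
    for step in range(int(hours / dt)):
        t = step * dt
        awake = wake[0] <= (t % 24) < wake[1]
        if awake:
            S += (1.0 - S) * (1 - math.exp(-dt / 18.2))   # rise toward 1 (tau ~ 18.2 h)
        else:
            S *= math.exp(-dt / 4.2)                      # decay during sleep (tau ~ 4.2 h)
        C = 0.25 * math.sin(2 * math.pi * (t - 16.0) / 24.0)  # circadian drive, evening peak
        out.append((t, C - S))                            # higher value ~ higher alertness
    return out

trace = simulate()
t, alert = trace[60]       # 60 * 0.25 h = 15 h into the simulation
print(f"alertness proxy at t = {t:.1f} h: {alert:.2f}")
```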

  14. Summary of the key features of seven biomathematical models of human fatigue and performance

    NASA Technical Reports Server (NTRS)

    Mallis, Melissa M.; Mejdal, Sig; Nguyen, Tammy T.; Dinges, David F.

    2004-01-01

    BACKGROUND: Biomathematical models that quantify the effects of circadian and sleep/wake processes on the regulation of alertness and performance have been developed in an effort to predict the magnitude and timing of fatigue-related responses in a variety of contexts (e.g., transmeridian travel, sustained operations, shift work). This paper summarizes key features of seven biomathematical models reviewed as part of the Fatigue and Performance Modeling Workshop held in Seattle, WA, on June 13-14, 2002. The Workshop was jointly sponsored by the National Aeronautics and Space Administration, U.S. Department of Defense, U.S. Army Medical Research and Materiel Command, Office of Naval Research, Air Force Office of Scientific Research, and U.S. Department of Transportation. METHODS: An invitation was sent to developers of seven biomathematical models that were commonly cited in scientific literature and/or supported by government funding. On acceptance of the invitation to attend the Workshop, developers were asked to complete a survey of the goals, capabilities, inputs, and outputs of their biomathematical models of alertness and performance. Data from the completed surveys were summarized and juxtaposed to provide a framework for comparing features of the seven models. RESULTS: Survey responses revealed that models varied greatly relative to their reported goals and capabilities. While all modelers reported that circadian factors were key components of their capabilities, they differed markedly with regard to the roles of sleep and work times as input factors for prediction: four of the seven models had work time as their sole input variable(s), while the other three models relied on various aspects of sleep timing for model input. Models also differed relative to outputs: five sought to predict results from laboratory experiments, field, and operational data, while two models were developed without regard to predicting laboratory experimental results. All modelers provided published papers describing their models, with three of the models being proprietary. CONCLUSIONS: Although all models appear to have been fundamentally influenced by the two-process model of sleep regulation by Borbely, there is considerable diversity among them in the number and type of input and output variables, and their stated goals and capabilities.

  15. Control structural interaction testbed: A model for multiple flexible body verification

    NASA Technical Reports Server (NTRS)

    Chory, M. A.; Cohen, A. L.; Manning, R. A.; Narigon, M. L.; Spector, V. A.

    1993-01-01

    Conventional end-to-end ground tests for verification of control system performance become increasingly complicated with the development of large, multiple flexible body spacecraft structures. The expense of accurately reproducing the on-orbit dynamic environment and the attendant difficulties in reducing and accounting for ground test effects limits the value of these tests. TRW has developed a building block approach whereby a combination of analysis, simulation, and test has replaced end-to-end performance verification by ground test. Tests are performed at the component, subsystem, and system level on engineering testbeds. These tests are aimed at authenticating models to be used in end-to-end performance verification simulations: component and subassembly engineering tests and analyses establish models and critical parameters, unit level engineering and acceptance tests refine models, and subsystem level tests confirm the models' overall behavior. The Precision Control of Agile Spacecraft (PCAS) project has developed a control structural interaction testbed with a multibody flexible structure to investigate new methods of precision control. This testbed is a model for TRW's approach to verifying control system performance. This approach has several advantages: (1) no allocation for test measurement errors is required, increasing flight hardware design allocations; (2) the approach permits greater latitude in investigating off-nominal conditions and parametric sensitivities; and (3) the simulation approach is cost effective, because the investment is in understanding the root behavior of the flight hardware and not in the ground test equipment and environment.

  16. Development of an Open Rotor Cycle Model in NPSS Using a Multi-Design Point Approach

    NASA Technical Reports Server (NTRS)

    Hendricks, Eric S.

    2011-01-01

    NASA's Environmentally Responsible Aviation Project and Subsonic Fixed Wing Project are focused on developing concepts and technologies which may enable dramatic reductions to the environmental impact of future generation subsonic aircraft (Refs. 1 and 2). The open rotor concept (also referred to as the Unducted Fan or advanced turboprop) may allow the achievement of this objective by reducing engine emissions and fuel consumption. To evaluate its potential impact, an open rotor cycle modeling capability is needed. This paper presents the initial development of an open rotor cycle model in the Numerical Propulsion System Simulation (NPSS) computer program which can then be used to evaluate the potential benefit of this engine. The development of this open rotor model necessitated addressing two modeling needs within NPSS. First, a method for evaluating the performance of counter-rotating propellers was needed. Therefore, a new counter-rotating propeller NPSS component was created. This component uses propeller performance maps developed from historic counter-rotating propeller experiments to determine the thrust delivered and power required. Second, several methods for modeling a counter-rotating power turbine within NPSS were explored. These techniques used several combinations of turbine components within NPSS to provide the necessary power to the propellers. Ultimately, a single turbine component with a conventional turbine map was selected. Using these modeling enhancements, an open rotor cycle model was developed in NPSS using a multi-design point approach. The multi-design point (MDP) approach improves the engine cycle analysis process by making it easier to properly size the engine to meet a variety of thrust targets throughout the flight envelope. A number of design points are considered including an aerodynamic design point, sea-level static, takeoff and top of climb. The development of this MDP model was also enabled by the selection of a simple power management scheme which schedules propeller blade angles with the freestream Mach number. Finally, sample open rotor performance results and areas for further model improvements are presented.
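
    A toy illustration of the multi-design point idea follows: size a notional engine so that the scaled thrust meets the most demanding of several flight-condition targets. All numbers and the lapse-rate abstraction are assumptions, not the NPSS model.

```python
# Toy multi-design-point sizing sketch: find the engine scale factor that
# satisfies every flight-condition thrust target (not the NPSS model).
def required_scale(design_points, base_thrust_per_scale):
    """design_points: list of (name, thrust_required_N, thrust_lapse_factor)."""
    worst = None
    for name, t_req, lapse in design_points:
        scale = t_req / (base_thrust_per_scale * lapse)   # scale needed at this point
        if worst is None or scale > worst[1]:
            worst = (name, scale)                         # keep the most demanding point
    return worst

points = [                      # assumed illustrative targets and lapse rates
    ("sea-level static", 80e3, 1.00),
    ("takeoff",          70e3, 0.90),
    ("top of climb",     18e3, 0.20),
]
name, scale = required_scale(points, base_thrust_per_scale=75e3)
print(f"sizing point: {name}, engine scale factor = {scale:.2f}")
```

    In this toy example, top of climb drives the sizing even though its absolute thrust target is smallest, which is the kind of insight an MDP formulation makes explicit.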

  17. Developing Information Power Grid Based Algorithms and Software

    NASA Technical Reports Server (NTRS)

    Dongarra, Jack

    1998-01-01

    This exploratory study initiated our effort to understand performance modeling on parallel systems. The basic goal of performance modeling is to understand and predict the performance of a computer program or set of programs on a computer system. Performance modeling has numerous applications, including evaluation of algorithms, optimization of code implementations, parallel library development, comparison of system architectures, parallel system design, and procurement of new systems. Our work lays the basis for the construction of parallel libraries that allow for the reconstruction of application codes on several distinct architectures so as to assure performance portability. Following our strategy, once the requirements of applications are well understood, one can then construct a library in a layered fashion. The top level of this library will consist of architecture-independent geometric, numerical, and symbolic algorithms that are needed by the sample of applications. These routines should be written in a language that is portable across the targeted architectures.

  18. Task scheduling in dataflow computer architectures

    NASA Technical Reports Server (NTRS)

    Katsinis, Constantine

    1994-01-01

    Dataflow computers provide a platform for the solution of a large class of computational problems, which includes digital signal processing and image processing. Many typical applications are represented by a set of tasks which can be repetitively executed in parallel as specified by an associated dataflow graph. Research in this area aims to model these architectures, develop scheduling procedures, and predict the transient and steady state performance. Researchers at NASA have created a model and developed associated software tools which are capable of analyzing a dataflow graph and predicting its runtime performance under various resource and timing constraints. These models and tools were extended and used in this work. Experiments using these tools revealed certain properties of such graphs that require further study. Specifically, the transient behavior at the beginning of the execution of a graph can have a significant effect on the steady state performance. Transformation and retiming of the application algorithm and its initial conditions can produce a different transient behavior and consequently different steady state performance. The effect of such transformations on the resource requirements or under resource constraints requires extensive study. Task scheduling to obtain maximum performance (based on user-defined criteria), or to satisfy a set of resource constraints, can also be significantly affected by a transformation of the application algorithm. Since task scheduling is performed by heuristic algorithms, further research is needed to determine if new scheduling heuristics can be developed that can exploit such transformations. This work has provided the initial development for further long-term research efforts. A simulation tool was completed to provide insight into the transient and steady state execution of a dataflow graph. A set of scheduling algorithms was completed which can operate in conjunction with the modeling and performance tools previously developed. Initial studies on the performance of these algorithms were done to examine the effects of application algorithm transformations as measured by such quantities as number of processors, time between outputs, time between input and output, communication time, and memory size.
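
    A minimal list-scheduling heuristic for a task DAG on identical processors, of the general kind discussed above; the specific NASA tools and heuristics are not reproduced here.

```python
# Generic list-scheduling sketch for a task DAG on P identical processors.
def list_schedule(tasks, deps, durations, processors=2):
    """tasks: ordered ids; deps: {task: set of predecessors}; durations: {task: time}."""
    finish = {}                          # task -> finish time
    proc_free = [0.0] * processors       # next free time for each processor
    done, schedule = set(), []
    while len(done) < len(tasks):
        # ready tasks have all predecessors finished
        ready = [t for t in tasks if t not in done and deps.get(t, set()) <= done]
        task = min(ready, key=lambda t: durations[t])        # shortest-task-first heuristic
        proc = min(range(processors), key=lambda p: proc_free[p])
        start = max(proc_free[proc],
                    max((finish[d] for d in deps.get(task, set())), default=0.0))
        finish[task] = start + durations[task]
        proc_free[proc] = finish[task]
        done.add(task)
        schedule.append((task, proc, start, finish[task]))
    return schedule

deps = {"B": {"A"}, "C": {"A"}, "D": {"B", "C"}}
durs = {"A": 2.0, "B": 3.0, "C": 1.0, "D": 2.0}
for row in list_schedule(["A", "B", "C", "D"], deps, durs):
    print(row)
```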

  19. Nuclear Energy Advanced Modeling and Simulation (NEAMS) waste Integrated Performance and Safety Codes (IPSC) : gap analysis for high fidelity and performance assessment code development.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Joon H.; Siegel, Malcolm Dean; Arguello, Jose Guadalupe, Jr.

    2011-03-01

    This report describes a gap analysis performed in the process of developing the Waste Integrated Performance and Safety Codes (IPSC) in support of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The goal of the Waste IPSC is to develop an integrated suite of computational modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with rigorous verification, validation, and software quality requirements. The gap analyses documented in this report were performed during an initial gap analysis to identify candidate codes and tools to support the development and integration of the Waste IPSC, and during follow-on activities that delved into more detailed assessments of the various codes that were acquired, studied, and tested. The current Waste IPSC strategy is to acquire and integrate the necessary Waste IPSC capabilities wherever feasible, and develop only those capabilities that cannot be acquired or suitably integrated, verified, or validated. The gap analysis indicates that significant capabilities may already exist in the existing THC codes, although there is no single code able to fully account for all physical and chemical processes involved in a waste disposal system. Large gaps exist in modeling chemical processes and their couplings with other processes. The coupling of chemical processes with flow transport and mechanical deformation remains challenging. The data for extreme environments (e.g., for elevated temperature and high ionic strength media) that are needed for repository modeling are severely lacking. In addition, most existing reactive transport codes were developed for non-radioactive contaminants, and they need to be adapted to account for radionuclide decay and in-growth. Accessibility to the source codes is generally limited. Because the problems of interest for the Waste IPSC are likely to result in relatively large computational models, a compact memory-usage footprint and a fast/robust solution procedure will be needed. A robust massively parallel processing (MPP) capability will also be required to provide reasonable turnaround times on the analyses that will be performed with the code. A performance assessment (PA) calculation for a waste disposal system generally requires a large number (hundreds to thousands) of model simulations to quantify the effect of model parameter uncertainties on the predicted repository performance. A set of codes for a PA calculation must be sufficiently robust and fast in terms of code execution. A PA system as a whole must be able to provide multiple alternative models for a specific set of physical/chemical processes, so that users can choose various levels of modeling complexity based on their modeling needs. This requires PA codes, preferably, to be highly modularized. Most of the existing codes have difficulties meeting these requirements.
Based on the gap analysis results, we have made the following recommendations for code selection and code development for the NEAMS Waste IPSC: (1) build fully coupled high-fidelity THCMBR codes using the existing SIERRA codes (e.g., ARIA and ADAGIO) and platform, (2) use DAKOTA to build an enhanced performance assessment system (EPAS), and (3) build a modular code architecture and key code modules for performance assessments. The key chemical calculation modules will be built by expanding the existing CANTERA capabilities as well as by extracting useful components from other existing codes.
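
    To make the scale of such a PA calculation concrete, the sketch below shows the kind of sampling loop a PA driver (the role DAKOTA plays in the recommended architecture) runs around a repository simulator; the simulator call, parameter names, and ranges are hypothetical placeholders, not the Waste IPSC interfaces.

        import numpy as np

        rng = np.random.default_rng(0)

        # Hypothetical uncertain parameters: log-uniform permeability, uniform porosity.
        N = 1000  # hundreds to thousands of realizations, as the report notes
        log_k = rng.uniform(-14.0, -11.0, N)   # log10 permeability [m^2]
        phi = rng.uniform(0.05, 0.25, N)       # porosity [-]

        def simulate_release(log_k, phi):
            """Placeholder for one expensive repository simulation run."""
            # A real PA would launch a coupled THC simulation here.
            return 10.0 ** log_k * (1.0 - phi)  # dummy response, illustration only

        doses = np.array([simulate_release(k, p) for k, p in zip(log_k, phi)])

        # Quantify the effect of parameter uncertainty on predicted performance.
        print("mean:", doses.mean(), "95th percentile:", np.percentile(doses, 95))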

  20. Monitoring the performance of the next Climate Forecast System version 3, throughout its development stage at EMC/NCEP

    NASA Astrophysics Data System (ADS)

    Peña, M.; Saha, S.; Wu, X.; Wang, J.; Tripp, P.; Moorthi, S.; Bhattacharjee, P.

    2016-12-01

    The next version of the operational Climate Forecast System (version 3, CFSv3) will be a fully coupled six-component system with diverse applications to earth system modeling, including weather and climate predictions. This system will couple the earth's atmosphere, land, ocean, sea ice, waves, and aerosols for both data assimilation and modeling. It will also use the NOAA Environmental Modeling System (NEMS) software superstructure to couple these components. The CFSv3 is part of the next Unified Global Coupled System (UGCS), which will unify the global prediction systems that are now operational at NCEP. The UGCS is being developed through the efforts of dedicated research and engineering teams and through coordination across many CPO/MAPP and NGGPS groups. During this development phase, the UGCS is being tested for seasonal purposes and undergoes frequent revisions. Each new revision is evaluated to quickly discover, isolate, and solve problems that negatively impact its performance. In the UGCS-seasonal model, components (e.g., ocean, sea ice, atmosphere) are coupled through a NEMS-based "mediator". In this numerical infrastructure, model diagnostics and forecast validation are carried out both component by component and for the system as a whole. The next stage, model optimization, will require enhanced performance diagnostics tools to help prioritize areas of numerical improvement. After the technical development of the UGCS-seasonal is completed, it will become the first realization of the CFSv3. All future development of this system will be carried out by the climate team at NCEP, in scientific collaboration with the groups that developed the individual components, as well as the climate community. A unique challenge in evaluating this unified weather-climate system is the large number of variables, which evolve over a wide range of temporal and spatial scales. A small set of performance measures and scorecard displays is being created, and collaboration and software contributions from research and operational centers are being incorporated. A status of the CFSv3/UGCS-seasonal development and examples of its performance and measuring tools will be presented.
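
    As a concrete illustration of the "small set of performance measures" idea, a minimal scorecard can be built from a few standard metrics applied variable by variable; the variable names and gridded fields below are synthetic stand-ins, not CFSv3 output.

        import numpy as np

        def rmse(fcst, obs):
            return float(np.sqrt(np.mean((fcst - obs) ** 2)))

        def bias(fcst, obs):
            return float(np.mean(fcst - obs))

        def anomaly_correlation(fcst, obs, clim):
            fa, oa = fcst - clim, obs - clim
            return float(np.sum(fa * oa) / np.sqrt(np.sum(fa ** 2) * np.sum(oa ** 2)))

        rng = np.random.default_rng(1)
        clim = rng.normal(size=(90, 180))           # stand-in climatology grid
        for var in ("sst", "t2m"):                  # two of the many coupled variables
            obs = clim + rng.normal(scale=0.5, size=clim.shape)
            fcst = obs + rng.normal(scale=0.3, size=clim.shape)
            print(f"{var}: RMSE={rmse(fcst, obs):.3f} bias={bias(fcst, obs):+.3f} "
                  f"ACC={anomaly_correlation(fcst, obs, clim):.3f}")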

  1. Telerobotic system performance measurement - Motivation and methods

    NASA Technical Reports Server (NTRS)

    Kondraske, George V.; Khoury, George J.

    1992-01-01

    A systems performance-based strategy for modeling and conducting experiments relevant to the design and performance characterization of telerobotic systems is described, along with a developmental testbed consisting of a distributed telerobotics network and initial efforts to implement the strategy. Consideration is given to general systems performance theory (GSPT), developed to tackle human performance problems, as a basis for: measurement of overall telerobotic system (TRS) performance; task decomposition; development of a generic TRS model; and characterization of the performance of the subsystems comprising the generic model. GSPT employs a resource construct to model performance and resource economic principles to govern the interface of systems to tasks. It provides a comprehensive modeling/measurement strategy applicable to complex systems including both human and artificial components. Application is presented within the framework of a distributed telerobotics network used as a testbed, and insight is given into the design of test protocols which elicit application-independent data.

  2. Differentiation and Exploration of Model MACP for HE VER 1.0 on Prototype Performance Measurement Application for Higher Education

    NASA Astrophysics Data System (ADS)

    El Akbar, R. Reza; Anshary, Muhammad Adi Khairul; Hariadi, Dennis

    2018-02-01

    Model MACP for HE ver. 1.0 is a model that describes how to measure and monitor performance for higher education. Based on a review of the research related to the model, several components of the model warrant further development, so this research has four main objectives. The first objective is to differentiate the CSF (critical success factor) components of the previous model; the second is to explore the KPIs (key performance indicators) of the previous model; the third, building on the first two, is to design a new and more detailed model. The fourth and final objective is to design a prototype application for performance measurement in higher education based on the new model. The methods used are exploratory research and application design using the prototyping method. The first result of this study is a more detailed new model for measuring and monitoring performance in higher education, obtained by differentiation and exploration of Model MACP for HE ver. 1.0. The second is a dictionary of higher-education performance measures compiled by re-evaluating the existing indicators. The third is the design of a prototype application for performance measurement in higher education.

  3. Understanding GPU Power. A Survey of Profiling, Modeling, and Simulation Methods

    DOE PAGES

    Bridges, Robert A.; Imam, Neena; Mintz, Tiffany M.

    2016-09-01

    Modern graphics processing units (GPUs) have complex architectures that admit exceptional performance and energy efficiency for high-throughput applications. Though GPUs consume large amounts of power, their use for high-throughput applications facilitates state-of-the-art energy efficiency and performance. Consequently, continued development relies on understanding their power consumption. Our work is a survey of GPU power modeling and profiling methods with increased detail on noteworthy efforts. Moreover, as direct measurement of GPU power is necessary for model evaluation and parameter initialization, internal and external power sensors are discussed. Hardware counters, which are low-level tallies of hardware events, correlate strongly with power use and performance. Statistical correlation between power and performance counters has yielded worthwhile GPU power models, yet the complexity inherent to GPU architectures presents new hurdles for power modeling. Developments and challenges of counter-based GPU power modeling are discussed. Often building on the counter-based models, research efforts in GPU power simulation, which make power predictions from input code and hardware knowledge, provide opportunities for optimization in programming or architectural design. Noteworthy strides in power simulation for GPUs are included, along with their performance or functional simulator counterparts where appropriate. Lastly, possible directions for future research are discussed.
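
    In its simplest statistical form, the counter-based modeling idea reduces to regressing measured power on hardware-event rates. The sketch below uses ordinary least squares with made-up counter data; in practice the counters would come from a profiler and the power trace from one of the sensors the survey discusses.

        import numpy as np

        rng = np.random.default_rng(2)

        # Hypothetical per-interval hardware-event rates (instruction throughput,
        # DRAM accesses, SM occupancy) and measured board power from a sensor.
        X = rng.uniform(0.0, 1.0, size=(200, 3))
        true_w = np.array([60.0, 90.0, 40.0])
        power = 45.0 + X @ true_w + rng.normal(scale=2.0, size=200)  # watts

        # Fit linear model: power ~ w0 + w . counters
        A = np.column_stack([np.ones(len(X)), X])
        w, *_ = np.linalg.lstsq(A, power, rcond=None)
        pred = A @ w
        print("idle power estimate: %.1f W" % w[0])
        print("mean absolute error: %.2f W" % np.mean(np.abs(pred - power)))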

  4. Real-Time Simulation of the X-33 Aerospace Engine

    NASA Technical Reports Server (NTRS)

    Aguilar, Robert

    1999-01-01

    This paper discusses the development and performance of the X-33 Aerospike Engine Real-Time Model. This model was developed for the purposes of control law development, six-degree-of-freedom trajectory analysis, vehicle system integration testing, and hardware-in-the-loop controller verification. The Real-Time Model uses a time-step-marching solution of nonlinear differential equations representing the physical processes involved in the operation of a liquid propellant rocket engine, albeit in a simplified form. These processes include heat transfer, fluid dynamics, combustion, and turbomachine performance. Two engine models are typically employed in order to accurately model maneuvering and the powerpack-out condition, where the power section of one engine is used to supply propellants to both engines if one engine malfunctions. The X-33 Real-Time Model has been compared to actual hot-fire test data and found to be in good agreement.
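
    For illustration, a real-time model of this kind advances simplified nonlinear state equations one fixed time step at a time. The single-state thermal/pressure lag below is a hypothetical stand-in for the engine's coupled heat-transfer, fluid, and turbomachinery states, meant only to show the time-marching pattern.

        # Minimal fixed-step (explicit Euler) time-marching loop; the dynamics
        # here are illustrative only, not the X-33 engine equations.
        dt = 0.001            # 1 ms frame, typical of hardware-in-the-loop rates
        tau = 0.05            # hypothetical chamber-pressure time constant [s]
        p_cmd, p = 1.0, 0.0   # commanded and current (normalized) chamber pressure

        for step in range(200):
            dp = (p_cmd - p) / tau      # simplified first-order dynamics
            p += dt * dp                # explicit Euler update
            if step % 50 == 0:
                print(f"t={step*dt:.3f} s  p={p:.3f}")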

  5. An Analysis of the Relationship between the Organizational Culture and the Performance of Staff Work Groups in Schools and the Development of an Explanatory Model

    ERIC Educational Resources Information Center

    James, Chris; Connolly, Michael

    2009-01-01

    This article analyses the concept of organizational culture and the relationship between the organizational culture and the performance of staff work groups in schools. The article draws upon a study of 12 schools in Wales, UK, which despite being in disadvantaged settings have high levels of pupil attainment. A model is developed linking the…

  6. Electrochemical carbon dioxide concentrator: Math model

    NASA Technical Reports Server (NTRS)

    Marshall, R. D.; Schubert, F. H.; Carlson, J. N.

    1973-01-01

    A steady state computer simulation model of an Electrochemical Depolarized Carbon Dioxide Concentrator (EDC) has been developed. The mathematical model combines EDC heat and mass balance equations with empirical correlations derived from experimental data to describe EDC performance as a function of the operating parameters involved. The model is capable of accurately predicting performance over EDC operating ranges. Model simulation results agree with the experimental data obtained over the prediction range.

  7. Closed-form solutions of performability [modeling of a degradable buffer/multiprocessor system]

    NASA Technical Reports Server (NTRS)

    Meyer, J. F.

    1981-01-01

    Methods which yield closed form performability solutions for continuous valued variables are developed. The models are similar to those employed in performance modeling (i.e., Markovian queueing models) but are extended so as to account for variations in structure due to faults. In particular, the modeling of a degradable buffer/multiprocessor system is considered whose performance Y is the (normalized) average throughput rate realized during a bounded interval of time. To avoid known difficulties associated with exact transient solutions, an approximate decomposition of the model is employed permitting certain submodels to be solved in equilibrium. These solutions are then incorporated in a model with fewer transient states and by solving the latter, a closed form solution of the system's performability is obtained. In conclusion, some applications of this solution are discussed and illustrated, including an example of design optimization.
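
    The decomposition idea (equilibrium throughput rewards per structure state plus a small transient structure model) can be sketched numerically. The failure rates and per-state throughputs below are hypothetical, and the integral yields the expected time-averaged reward rather than the paper's closed-form performability distribution.

        import numpy as np
        from scipy.linalg import expm

        # Structure states: 2, 1, 0 working processors; each fails at rate lam.
        lam = 1e-3                                  # hypothetical failure rate [1/h]
        Q = np.array([[-2*lam, 2*lam, 0.0],
                      [0.0,   -lam,   lam],
                      [0.0,    0.0,   0.0]])        # generator of the structure CTMC

        # Equilibrium throughput of the queueing submodel in each structure state,
        # normalized to the fault-free rate (hypothetical values).
        r = np.array([1.0, 0.55, 0.0])

        T, n = 1000.0, 200                          # bounded mission interval [h]
        p0 = np.array([1.0, 0.0, 0.0])
        ts = np.linspace(0.0, T, n)
        rates = [p0 @ expm(Q * t) @ r for t in ts]  # instantaneous expected reward
        print("expected normalized throughput:", np.trapz(rates, ts) / T)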

  8. System analysis tools for an ELT at ESO

    NASA Astrophysics Data System (ADS)

    Mueller, Michael; Koch, Franz

    2006-06-01

    Engineering of complex, large-scale systems like the ELT designs currently investigated and developed in Europe and North America requires powerful and sophisticated tools within specific technical disciplines such as mechanics, optics, and control engineering. However, even analyzing a single component of the telescope, such as the telescope structure, necessitates a system approach to evaluate the structural effects on the optical performance. This paper presents several software tools developed by the European Southern Observatory (ESO) which focus on the system approach in the analyses: Using modal results of a finite element analysis, the SMI-toolbox allows easy generation of structural models with different sizes and levels of accuracy for control design and closed-loop simulations. The optical modeling code BeamWarrior was developed by ESO and Astrium GmbH (Germany) especially for integrated modeling and for interfacing with a structural model. Within BeamWarrior, displacements and deformations can be applied in an arbitrary coordinate system, and hence also in the global coordinates of the FE model, avoiding error-prone transformations. In addition, a sparse state-space model object was developed for Matlab to gain computational efficiency and reduce memory requirements by exploiting the sparsity pattern of both the structural models and the control architecture. Together these tools allow building an integrated model in order to reliably simulate interactions, cross-coupling effects, and system responses, and to evaluate global performance. In order to evaluate disturbance effects on the optical performance in open loop more efficiently, an optical evaluation toolbox was built in the FE software ANSYS; it performs Zernike decomposition and best-fit computation of the deformations directly in the FE analysis.
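
    As an illustration of the Zernike-decomposition step, surface deformations from an FE model can be least-squares fitted with a few low-order Zernike terms. The polynomial set, sample points, and synthetic deformation below are simplified assumptions, not the ANSYS toolbox implementation.

        import numpy as np

        # Unit-disk sample points (stand-ins for FE node positions on a mirror).
        rng = np.random.default_rng(3)
        r = np.sqrt(rng.uniform(0, 1, 500))
        th = rng.uniform(0, 2 * np.pi, 500)

        # A few low-order Zernike polynomials.
        Z = np.column_stack([
            np.ones_like(r),        # piston
            r * np.cos(th),         # tilt x
            r * np.sin(th),         # tilt y
            2 * r**2 - 1,           # defocus
            r**2 * np.cos(2 * th),  # astigmatism
        ])

        # Synthetic FE deformation: mostly defocus plus noise (illustrative only).
        w = Z @ np.array([0.0, 0.1, -0.05, 0.4, 0.02]) \
            + rng.normal(scale=0.01, size=500)

        coeffs, *_ = np.linalg.lstsq(Z, w, rcond=None)
        residual = w - Z @ coeffs
        print("fitted Zernike coefficients:", np.round(coeffs, 3))
        print("best-fit RMS residual:", residual.std())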

  9. The European Thoracic Surgery Database project: modelling the risk of in-hospital death following lung resection.

    PubMed

    Berrisford, Richard; Brunelli, Alessandro; Rocco, Gaetano; Treasure, Tom; Utley, Martin

    2005-08-01

    To identify pre-operative factors associated with in-hospital mortality following lung resection and to construct a risk model that could be used prospectively to inform decisions and retrospectively to enable fair comparisons of outcomes. Data were submitted to the European Thoracic Surgery Database from 27 units in 14 countries. We analysed data concerning all patients that had a lung resection. Logistic regression was used with a random sample of 60% of cases to identify pre-operative factors associated with in-hospital mortality and to build a model of risk. The resulting model was tested on the remaining 40% of patients. A second model based on age and ppoFEV1% was developed for risk of in-hospital death amongst tumour resection patients. Of the 3426 adult patients that had a first lung resection for whom mortality data were available, 66 died within the same hospital admission. Within the data used for model development, dyspnoea (according to the Medical Research Council classification), ASA (American Society of Anaesthesiologists) score, class of procedure and age were found to be significantly associated with in-hospital death in a multivariate analysis. The logistic model developed on these data displayed predictive value when tested on the remaining data. Two models of the risk of in-hospital death amongst adult patients undergoing lung resection have been developed. The models show predictive value and can be used to discern between high-risk and low-risk patients. Amongst the test data, the model developed for all diagnoses performed well at low risk, underestimated mortality at medium risk and overestimated mortality at high risk. The second model for resection of lung neoplasms was developed after establishing the performance of the first model and so could not be tested robustly. That said, we were encouraged by its performance over the entire range of estimated risk. The first of these two models could be regarded as an evaluation based on clinically available criteria while the second uses data obtained from objective measurement. We are optimistic that further model development and testing will provide a tool suitable for case mix adjustment.
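
    The development/validation pattern described (fit a logistic model on a random 60% of cases, test on the remaining 40%) looks roughly like the following; the predictor names mirror the abstract, but the data are synthetic and the coefficients are invented for illustration.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(4)
        n = 3426
        X = np.column_stack([
            rng.integers(1, 6, n),     # dyspnoea grade (MRC)
            rng.integers(1, 5, n),     # ASA score
            rng.integers(1, 4, n),     # class of procedure
            rng.uniform(30, 85, n),    # age
        ])
        logit = -9.0 + 0.3*X[:, 0] + 0.5*X[:, 1] + 0.4*X[:, 2] + 0.05*X[:, 3]
        y = rng.uniform(size=n) < 1 / (1 + np.exp(-logit))  # synthetic deaths

        X_dev, X_test, y_dev, y_test = train_test_split(
            X, y, train_size=0.6, random_state=0)
        model = LogisticRegression().fit(X_dev, y_dev)
        auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
        print(f"test-set AUC: {auc:.2f}")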

  10. Functional structure and dynamics of the human nervous system

    NASA Technical Reports Server (NTRS)

    Lawrence, J. A.

    1981-01-01

    The status of an effort to define the directions that need to be taken in extending pilot models is reported. These models are needed to perform closed-loop (man-in-the-loop) feedback flight control system designs and to develop cockpit display requirements. The approach taken is to develop a hypothetical working model of the human nervous system by reviewing the current literature in neurology and psychology, and then to develop a computer model of this hypothetical working model.

  11. Development of an in Silico Model of DPPH• Free Radical Scavenging Capacity: Prediction of Antioxidant Activity of Coumarin Type Compounds.

    PubMed

    Goya Jorge, Elizabeth; Rayar, Anita Maria; Barigye, Stephen J; Jorge Rodríguez, María Elisa; Sylla-Iyarreta Veitía, Maité

    2016-06-07

    A quantitative structure-activity relationship (QSAR) study of the 2,2-diphenyl-1-picrylhydrazyl (DPPH•) radical scavenging ability of 1373 chemical compounds was developed using DRAGON molecular descriptors (MD) and a neural network technique based on the multilayer perceptron (MLP). The model demonstrated satisfactory performance on the training set (R² = 0.713) and the test set (Q²ext = 0.654). To gain greater insight into the relevance of the MD contained in the MLP model, sensitivity and principal component analyses were performed. Moreover, structural and mechanistic interpretation was carried out to comprehend the relationship of the variables in the model with the modeled property. The constructed MLP model was employed to predict the radical scavenging ability of a group of coumarin-type compounds. Finally, in order to validate the model's predictions, an in vitro assay for one of the compounds (4-hydroxycoumarin) was performed, showing satisfactory proximity between the experimental and predicted pIC50 values.
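
    A minimal version of such an MLP-based QSAR workflow, with random values standing in for the DRAGON descriptors and pIC50-like targets, could look like this (descriptor count, layer size, and split are assumptions):

        import numpy as np
        from sklearn.neural_network import MLPRegressor
        from sklearn.model_selection import train_test_split
        from sklearn.preprocessing import StandardScaler
        from sklearn.pipeline import make_pipeline

        rng = np.random.default_rng(5)
        X = rng.normal(size=(1373, 20))      # stand-in molecular descriptors
        y = X[:, 0] - 0.5*X[:, 3] + 0.2*X[:, 7] \
            + rng.normal(scale=0.5, size=1373)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                                  random_state=0)
        qsar = make_pipeline(StandardScaler(),
                             MLPRegressor(hidden_layer_sizes=(16,),
                                          max_iter=2000, random_state=0))
        qsar.fit(X_tr, y_tr)
        print("train R^2: %.3f" % qsar.score(X_tr, y_tr))
        print("test  R^2: %.3f" % qsar.score(X_te, y_te))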

  12. An automatic and effective parameter optimization method for model tuning

    NASA Astrophysics Data System (ADS)

    Zhang, T.; Li, L.; Lin, Y.; Xue, W.; Xie, F.; Xu, H.; Huang, X.

    2015-05-01

    Physical parameterizations in General Circulation Models (GCMs), having various uncertain parameters, greatly impact model performance and model climate sensitivity. Traditional manual and empirical tuning of these parameters is time consuming and ineffective. In this study, a "three-step" methodology is proposed to automatically and effectively obtain the optimum combination of some key parameters in cloud and convective parameterizations according to a comprehensive set of objective evaluation metrics. Different from traditional optimization methods, two extra steps, one determining parameter sensitivity and the other choosing optimum initial values for the sensitive parameters, are introduced before the downhill simplex method to reduce the computational cost and improve the tuning performance. Atmospheric GCM simulation results show that the optimum combination of these parameters determined using this method is able to improve the model's overall performance by 9%. The proposed methodology and software framework can be easily applied to other GCMs to speed up the model development process, especially the comprehensive parameter tuning that is unavoidable during the model development stage.
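
    A stripped-down version of the three-step idea (screen sensitivity, pick a good starting point, then downhill simplex) can be written around scipy's Nelder-Mead implementation; the objective function here is a cheap hypothetical stand-in for a GCM evaluation metric.

        import numpy as np
        from scipy.optimize import minimize

        def metric(params):
            """Hypothetical stand-in for a comprehensive evaluation metric."""
            x, y, z = params
            return (x - 0.7)**2 + 5*(y - 0.3)**2 + 0.01*(z - 0.5)**2

        p0 = np.array([0.5, 0.5, 0.5])

        # Step 1: one-at-a-time sensitivity screening around default values.
        sens = []
        for i in range(len(p0)):
            p = p0.copy(); p[i] += 0.05
            sens.append(abs(metric(p) - metric(p0)))
        sensitive = [i for i, s in enumerate(sens) if s > 1e-3]
        print("sensitive parameters:", sensitive)

        # Step 2: coarse scan to pick initial values for sensitive parameters.
        best = p0.copy()
        for i in sensitive:
            grid = np.linspace(0.0, 1.0, 11)
            vals = []
            for g in grid:
                p = best.copy(); p[i] = g
                vals.append(metric(p))
            best[i] = grid[int(np.argmin(vals))]

        # Step 3: downhill simplex (Nelder-Mead) from the improved start.
        res = minimize(metric, best, method="Nelder-Mead")
        print("optimum:", np.round(res.x, 3), "metric:", res.fun)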

  13. Developing a physiologically based approach for modeling plutonium decorporation therapy with DTPA.

    PubMed

    Kastl, Manuel; Giussani, Augusto; Blanchardon, Eric; Breustedt, Bastian; Fritsch, Paul; Hoeschen, Christoph; Lopez, Maria Antonia

    2014-11-01

    To develop a physiologically based compartmental approach for modeling plutonium decorporation therapy with the chelating agent diethylenetriaminepentaacetic acid (Ca-DTPA/Zn-DTPA). Model calculations were performed using the software package SAAM II (The Epsilon Group, Charlottesville, Virginia, USA). The Luciani/Polig compartmental model, with age-dependent description of the bone recycling processes, was used for the biokinetics of plutonium. It was slightly modified in order to account for the speciation of plutonium in blood and for the different affinities of the chemical species present for DTPA. The introduction of two separate blood compartments, describing low-molecular-weight complexes of plutonium (Pu-LW) and transferrin-bound plutonium (Pu-Tf), respectively, and of one additional compartment describing plutonium in the interstitial fluids was accomplished successfully. The next step of the work is the modeling of the chelation process, coupling the physiologically modified structure with the biokinetic model for DTPA. Results of animal studies performed under controlled conditions will enable a better understanding of the mechanisms involved.
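
    Compartmental biokinetic models of this kind are systems of linear ODEs with first-order transfer coefficients. The three-compartment sketch below (Pu-LW and Pu-Tf blood pools exchanging with interstitial fluid) uses hypothetical rate constants, not the Luciani/Polig parameter values.

        import numpy as np
        from scipy.integrate import solve_ivp

        # Hypothetical first-order transfer coefficients [1/d].
        k_lw_tf, k_tf_lw = 2.0, 0.5   # Pu-LW <-> Pu-Tf (transferrin-bound)
        k_lw_if, k_if_lw = 1.0, 0.3   # Pu-LW <-> interstitial fluid

        def rhs(t, q):
            lw, tf, fl = q            # compartment contents
            d_lw = -(k_lw_tf + k_lw_if)*lw + k_tf_lw*tf + k_if_lw*fl
            d_tf = k_lw_tf*lw - k_tf_lw*tf
            d_fl = k_lw_if*lw - k_if_lw*fl
            return [d_lw, d_tf, d_fl]

        sol = solve_ivp(rhs, (0.0, 10.0), [1.0, 0.0, 0.0])
        print("compartment contents at t = 10 d:", np.round(sol.y[:, -1], 4))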

  14. Neural network models for biological waste-gas treatment systems.

    PubMed

    Rene, Eldon R; Estefanía López, M; Veiga, María C; Kennes, Christian

    2011-12-15

    This paper outlines the procedure for developing artificial neural network (ANN) based models for three bioreactor configurations used for waste-gas treatment. The three bioreactor configurations chosen for this modelling work were: biofilter (BF), continuous stirred tank bioreactor (CSTB) and monolith bioreactor (MB). Using styrene as the model pollutant, this paper also serves as a general database of information pertaining to bioreactor operation and the important factors affecting gas-phase styrene removal in these biological systems. Biological waste-gas treatment systems are considered to be both advantageous and economically effective in treating a stream of polluted air containing low to moderate concentrations of the target contaminant, over a rather wide range of gas-flow rates. The bioreactors were inoculated with the fungus Sporothrix variecibatus, and their performances were evaluated at different empty bed residence times (EBRT) and at different inlet styrene concentrations (Ci). The experimental data from these bioreactors were modelled to predict bioreactor performance in terms of removal efficiency (RE, %) by adequate training and testing of a three-layered back-propagation neural network (input layer-hidden layer-output layer). Two models (BIOF1 and BIOF2) were developed for the BF with different combinations of easily measurable BF parameters as the inputs, that is, concentration (g m⁻³), unit flow (h⁻¹) and pressure drop (cm of H₂O). The model developed for the CSTB used two inputs (concentration and unit flow), while the model for the MB had three inputs (concentration, G/L (gas/liquid) ratio, and pressure drop). Sensitivity analysis in the form of absolute average sensitivity (AAS) was performed for all the developed ANN models to ascertain the importance of the different input parameters and to assess their direct effect on bioreactor performance. The performance of the models was estimated by the regression coefficient values (R²) for the test data set. The results obtained from this modelling work can be useful for obtaining important relationships between different bioreactor parameters and for estimating their safe operating regimes.

  15. Model Verification and Validation Concepts for a Probabilistic Fracture Assessment Model to Predict Cracking of Knife Edge Seals in the Space Shuttle Main Engine High Pressure Oxidizer

    NASA Technical Reports Server (NTRS)

    Pai, Shantaram S.; Riha, David S.

    2013-01-01

    Physics-based models are routinely used to predict the performance of engineered systems to make decisions such as when to retire system components, how to extend the life of an aging system, or if a new design will be safe or available. Model verification and validation (V&V) is a process to establish credibility in model predictions. Ideally, carefully controlled validation experiments will be designed and performed to validate models or submodels. In reality, time and cost constraints limit experiments and even model development. This paper describes elements of model V&V during the development and application of a probabilistic fracture assessment model to predict cracking in space shuttle main engine high-pressure oxidizer turbopump knife-edge seals. The objective of this effort was to assess the probability of initiating and growing a crack to a specified failure length in specific flight units for different usage and inspection scenarios. The probabilistic fracture assessment model developed in this investigation combined a series of submodels describing the usage, temperature history, flutter tendencies, tooth stresses and numbers of cycles, fatigue cracking, nondestructive inspection, and finally the probability of failure. The analysis accounted for unit-to-unit variations in temperature, flutter limit state, flutter stress magnitude, and fatigue life properties. The investigation focused on the calculation of relative risk rather than absolute risk between the usage scenarios. Verification predictions were first performed for three units with known usage and cracking histories to establish credibility in the model predictions. Then, numerous predictions were performed for an assortment of operating units that had flown recently or that were projected for future flights. Calculations were performed using two NASA-developed software tools: NESSUS® for the probabilistic analysis, and NASGRO® for the fracture mechanics analysis. The goal of these predictions was to provide additional information to guide decisions on the potential of reusing existing and installed units prior to the new design certification.

  16. Determinant Factors of Long-Term Performance Development in Young Swimmers.

    PubMed

    Morais, Jorge E; Silva, António J; Marinho, Daniel A; Lopes, Vítor P; Barbosa, Tiago M

    2017-02-01

    To develop a performance predictor model based on swimmers' biomechanical profile, relate the partial contribution of the main predictors to the training program, and analyze the time effect, sex effect, and time × sex interaction. 91 swimmers (44 boys, 12.04 ± 0.81 y; 47 girls, 11.22 ± 0.98 y) were evaluated over a 3-y period. The decimal age and anthropometric, kinematic, and efficiency features were collected on 10 different occasions over 3 seasons (i.e., longitudinal research). Hierarchical linear modeling was the procedure used to estimate the performance predictors. Performance improved between season 1 early and season 3 late for both sexes (boys 26.9% [20.88;32.96], girls 16.1% [10.34;22.54]). Decimal age (estimate [EST] -2.05, P < .001), arm span (EST -0.59, P < .001), stroke length (EST 3.82, P = .002), and propelling efficiency (EST -0.17, P = .001) were entered in the final model. Over 3 consecutive seasons, young swimmers' performance improved. Performance is a multifactorial phenomenon in which anthropometrics, kinematics, and efficiency were the main determinants. The change of these factors over time was coupled with the training plans of this talent identification and development program.
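
    Hierarchical (mixed-effects) models like the one described can be fitted with statsmodels; the data frame below is synthetic, and the variable names simply echo the predictors in the abstract rather than the study's data.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(6)
        n_swimmers, n_obs = 91, 10
        ids = np.repeat(np.arange(n_swimmers), n_obs)
        age = rng.uniform(10, 14, n_swimmers)[ids] \
              + np.tile(np.linspace(0, 3, n_obs), n_swimmers)
        arm_span = rng.normal(150, 10, n_swimmers)[ids]
        stroke_len = rng.normal(1.6, 0.2, size=n_swimmers * n_obs)
        swimmer_eff = rng.normal(0, 2, n_swimmers)[ids]   # random intercepts
        perf = 120 - 2.0*age - 0.2*(arm_span - 150) - 4*stroke_len \
               + swimmer_eff + rng.normal(0, 1, n_swimmers * n_obs)

        df = pd.DataFrame(dict(perf=perf, age=age, arm_span=arm_span,
                               stroke_len=stroke_len, swimmer=ids))
        # Random intercept per swimmer; fixed effects as in the abstract.
        model = smf.mixedlm("perf ~ age + arm_span + stroke_len", df,
                            groups=df["swimmer"]).fit()
        print(model.summary())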

  17. Prediction of risk of recurrence of venous thromboembolism following treatment for a first unprovoked venous thromboembolism: systematic review, prognostic model and clinical decision rule, and economic evaluation.

    PubMed

    Ensor, Joie; Riley, Richard D; Jowett, Sue; Monahan, Mark; Snell, Kym Ie; Bayliss, Susan; Moore, David; Fitzmaurice, David

    2016-02-01

    Unprovoked first venous thromboembolism (VTE) is defined as VTE in the absence of a temporary provoking factor such as surgery, immobility or other transient factors. Recurrent VTE in unprovoked patients is highly prevalent, but easily preventable with oral anticoagulant (OAC) therapy. The unprovoked population is highly heterogeneous in terms of the risk of recurrent VTE. The first aim of the project is to review existing prognostic models which stratify individuals by their recurrence risk, therefore potentially allowing tailored treatment strategies. The second aim is to enhance the existing research in this field by developing and externally validating a new prognostic model for individual risk prediction, using a pooled database containing individual patient data (IPD) from several studies. The final aim is to assess the economic cost-effectiveness of the proposed prognostic model if it is used as a decision rule for resuming OAC therapy, compared with current standard treatment strategies. Standard systematic review methodology was used to identify relevant prognostic model development, validation and cost-effectiveness studies. Bibliographic databases (including MEDLINE, EMBASE and The Cochrane Library) were searched using terms relating to the clinical area and prognosis. Reviewing was undertaken by two reviewers independently using pre-defined criteria. Included full-text articles were data extracted and quality assessed. Critical appraisal of included full texts was undertaken and comparisons made of model performance. A prognostic model was developed using IPD from the pooled database of seven trials. A novel internal-external cross-validation (IECV) approach was used to develop and validate the prognostic model, with external validation undertaken iteratively in each of the trials. Given good performance in the IECV approach, a final model was developed using data from all trials. A Markov patient-level simulation was used to consider the economic cost-effectiveness of using a decision rule (based on the prognostic model) to decide on resumption of OAC therapy (or not). Three full-text articles were identified by the systematic review. Critical appraisal identified methodological and applicability issues; in particular, none of the three existing models had been externally validated. To address this, new prognostic models with external validation were sought. Two potential models were considered: one for use at cessation of therapy (pre D-dimer), and one for use after cessation of therapy (post D-dimer). Model performance measured in the external validation trials showed strong calibration for both models. The post D-dimer model performed substantially better in terms of discrimination (c = 0.69), better separating high- and low-risk patients. The economic evaluation identified that a decision rule based on the final post D-dimer model may be cost-effective for patients with a predicted risk of recurrence of over 8% annually; this suggests continued therapy for patients with predicted risks ≥ 8% and cessation of therapy otherwise. The post D-dimer model performed strongly and could be useful to predict individuals' risk of recurrence at any time up to 2-3 years, thereby aiding patient counselling and treatment decisions. A decision rule using this model may be cost-effective for informing clinical judgement and patient opinion in treatment decisions.
Further research may investigate new predictors to enhance model performance and should aim to further externally validate the model to confirm its performance in new, non-trial populations. Finally, it is essential that further research is conducted to develop a model predicting bleeding risk on therapy, to manage the balance between the risks of recurrence and bleeding. This study is registered as PROSPERO CRD42013003494 and was funded by the National Institute for Health Research Health Technology Assessment programme.
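
    The internal-external cross-validation idea (iteratively hold out one whole trial for validation and develop on the rest, then refit on everything) is easy to express as a loop; the trial labels, predictors, and coefficients here are synthetic.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(7)
        n = 1400
        trial = rng.integers(0, 7, n)        # pooled IPD from seven trials
        X = rng.normal(size=(n, 3))          # stand-ins for age, D-dimer, sex
        beta = np.array([0.8, 0.6, 0.3])
        y = rng.uniform(size=n) < 1 / (1 + np.exp(-(-2.0 + X @ beta)))

        # Internal-external cross-validation: hold out each trial in turn.
        for t in range(7):
            dev, val = trial != t, trial == t
            m = LogisticRegression().fit(X[dev], y[dev])
            c = roc_auc_score(y[val], m.predict_proba(X[val])[:, 1])
            print(f"trial {t} held out: c-statistic {c:.2f}")

        # Given adequate performance across trials, refit on all data.
        final_model = LogisticRegression().fit(X, y)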

  18. Performance of fire behavior fuel models developed for the Rothermel Surface Fire Spread Model

    Treesearch

    Robert Ziel; W. Matt Jolly

    2009-01-01

    In 2005, 40 new fire behavior fuel models were published for use with the Rothermel Surface Fire Spread Model. These new models are intended to augment the original 13 developed in 1972 and 1976. As a compiled set of quantitative fuel descriptions that serve as input to the Rothermel model, the selected fire behavior fuel model has always been critical to the resulting...

  19. Proposed evaluation framework for assessing operator performance with multisensor displays

    NASA Technical Reports Server (NTRS)

    Foyle, David C.

    1992-01-01

    Despite aggressive work on the development of sensor fusion algorithms and techniques, no formal evaluation procedures have been proposed. Based on existing integration models in the literature, an evaluation framework is developed to assess an operator's ability to use multisensor, or sensor fusion, displays. The proposed framework is a normative approach: the operator's performance with the sensor fusion display is compared to the models' predictions based on the operator's performance when viewing the original sensor displays prior to fusion. This allows one to determine when a sensor fusion system leads to: (1) poorer performance than one of the original sensor displays (clearly an undesirable system, in which the fused sensor system causes some distortion or interference); (2) better performance than with either single sensor system alone, but at a sub-optimal level compared to the model predictions; (3) optimal performance (matching the model predictions); or (4) super-optimal performance, which may occur if the operator is able to use some highly diagnostic 'emergent features' in the sensor fusion display that were unavailable in the original sensor displays. An experiment demonstrating the usefulness of the proposed evaluation framework is discussed.

  1. Developing and Testing a Model to Predict Outcomes of Organizational Change

    PubMed Central

    Gustafson, David H; Sainfort, François; Eichler, Mary; Adams, Laura; Bisognano, Maureen; Steudel, Harold

    2003-01-01

    Objective To test the effectiveness of a Bayesian model employing subjective probability estimates for predicting success and failure of health care improvement projects. Data Sources Experts' subjective assessment data for model development and independent retrospective data on 221 healthcare improvement projects in the United States, Canada, and the Netherlands collected between 1996 and 2000 for validation. Methods A panel of theoretical and practical experts and literature in organizational change were used to identify factors predicting the outcome of improvement efforts. A Bayesian model was developed to estimate probability of successful change using subjective estimates of likelihood ratios and prior odds elicited from the panel of experts. A subsequent retrospective empirical analysis of change efforts in 198 health care organizations was performed to validate the model. Logistic regression and ROC analysis were used to evaluate the model's performance using three alternative definitions of success. Data Collection For the model development, experts' subjective assessments were elicited using an integrative group process. For the validation study, a staff person intimately involved in each improvement project responded to a written survey asking questions about model factors and project outcomes. Results Logistic regression chi-square statistics and areas under the ROC curve demonstrated a high level of model performance in predicting success. Chi-square statistics were significant at the 0.001 level and areas under the ROC curve were greater than 0.84. Conclusions A subjective Bayesian model was effective in predicting the outcome of actual improvement projects. Additional prospective evaluations as well as testing the impact of this model as an intervention are warranted. PMID:12785571
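
    The subjective Bayesian model described combines prior odds with expert-elicited likelihood ratios, factor by factor; a minimal sketch follows, with hypothetical factors and ratios and the usual conditional-independence (naive Bayes) assumption.

        # Posterior odds = prior odds x product of likelihood ratios for the
        # observed state of each predictive factor (all numbers hypothetical).
        prior_odds = 0.8                 # elicited prior odds of project success

        likelihood_ratios = {
            "strong leadership support": 2.5,
            "staff involved in design": 1.8,
            "tension for change": 1.4,
            "weak middle-management buy-in": 0.5,
        }

        odds = prior_odds
        for factor, lr in likelihood_ratios.items():
            odds *= lr                   # naive (conditionally independent) update

        prob = odds / (1.0 + odds)
        print(f"predicted probability of success: {prob:.2f}")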

  2. ILS Glide Slope Performance Prediction Multipath Scattering

    DOT National Transportation Integrated Search

    1976-12-01

    A mathematical model has been developed which predicts the performance of ILS glide slope systems subject to multipath scattering and the effects of irregular terrain contours. The model is discussed in detail and then applied to a test case for purp...

  3. Effect of Using Extreme Years in Hydrologic Model Calibration Performance

    NASA Astrophysics Data System (ADS)

    Goktas, R. K.; Tezel, U.; Kargi, P. G.; Ayvaz, T.; Tezyapar, I.; Mesta, B.; Kentel, E.

    2017-12-01

    Hydrological models are useful in predicting and developing management strategies for controlling system behaviour. Specifically, they can be used for evaluating streamflow at ungaged catchments, the effects of climate change and best management practices on water resources, or the identification of pollution sources in a watershed. This study is part of a TUBITAK project named "Development of a geographical information system based decision-making tool for water quality management of Ergene Watershed using pollutant fingerprints". Within the scope of this project, first, water resources in the Ergene Watershed are studied. Streamgages found in the basin are identified and daily streamflow measurements are obtained from State Hydraulic Works of Turkey. Streamflow data are analysed using box-whisker plots, hydrographs and flow-duration curves, focusing on the identification of extreme (dry or wet) periods. Then a hydrological model is developed for the Ergene Watershed using HEC-HMS in the Watershed Modeling System (WMS) environment. The model is calibrated for various time periods, including dry and wet ones, and the performance of calibration is evaluated using Nash-Sutcliffe Efficiency (NSE), correlation coefficient, percent bias (PBIAS) and root mean square error. It is observed that the calibration period affects model performance, and the main purpose of the hydrological model should guide the selection of the calibration period. Acknowledgement: This study is funded by The Scientific and Technological Research Council of Turkey (TUBITAK) under Project Number 115Y064.
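
    The calibration metrics named in the abstract are all one-liners; here is a sketch with a tiny example series (PBIAS follows the common convention in which positive values indicate model underestimation):

        import numpy as np

        def nse(sim, obs):
            """Nash-Sutcliffe Efficiency: 1 is perfect, 0 no better than the mean."""
            sim, obs = np.asarray(sim), np.asarray(obs)
            return 1.0 - np.sum((obs - sim)**2) / np.sum((obs - obs.mean())**2)

        def pbias(sim, obs):
            """Percent bias (positive = model underestimates, by convention)."""
            sim, obs = np.asarray(sim), np.asarray(obs)
            return 100.0 * np.sum(obs - sim) / np.sum(obs)

        def rmse(sim, obs):
            sim, obs = np.asarray(sim), np.asarray(obs)
            return float(np.sqrt(np.mean((sim - obs)**2)))

        obs = np.array([1.2, 3.4, 2.8, 0.9, 5.1])   # observed flows (example)
        sim = np.array([1.0, 3.0, 3.1, 1.2, 4.6])   # simulated flows
        print(f"NSE={nse(sim, obs):.3f}  PBIAS={pbias(sim, obs):.1f}%  "
              f"RMSE={rmse(sim, obs):.3f}")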

  4. Scattering effects on the performance of carbon nanotube field effect transistor in a compact model

    NASA Astrophysics Data System (ADS)

    Hamieh, S. D.; Desgreys, P.; Naviner, J. F.

    2010-01-01

    Carbon nanotube field-effect transistors (CNTFETs) are being extensively studied as possible successors to CMOS. Device simulators have been developed to estimate their performance at sub-10-nm scales, and device structures have been fabricated. In this work, a new compact model of the single-walled semiconducting CNTFET is proposed that implements the calculation of energy conduction sub-band minima and the treatment of scattering effects through an energy shift in the CNTFET. The developed model has been used to simulate I-V characteristics using a VHDL-AMS simulator.

  5. NDRAM: nonlinear dynamic recurrent associative memory for learning bipolar and nonbipolar correlated patterns.

    PubMed

    Chartier, Sylvain; Proulx, Robert

    2005-11-01

    This paper presents a new unsupervised attractor neural network which, contrary to optimal linear associative memory models, is able to develop nonbipolar attractors as well as bipolar attractors. Moreover, the model is able to develop fewer spurious attractors and has better recall performance under random noise than any other Hopfield-type neural network. These properties are obtained by a simple Hebbian/anti-Hebbian online learning rule that directly incorporates feedback from a specific nonlinear transmission rule. Several computer simulations show the model's distinguishing properties.

  6. Incorporation of RAM techniques into simulation modeling

    NASA Astrophysics Data System (ADS)

    Nelson, S. C., Jr.; Haire, M. J.; Schryver, J. C.

    1995-01-01

    This work concludes that reliability, availability, and maintainability (RAM) analytical techniques can be incorporated into computer network simulation modeling to yield an important new analytical tool. This paper describes the incorporation of failure and repair information into network simulation to build a stochastic computer model representing the RAM performance of two vehicles being developed for the US Army: the Advanced Field Artillery System (AFAS) and the Future Armored Resupply Vehicle (FARV). The AFAS is the US Army's next-generation self-propelled cannon artillery system. The FARV is a resupply vehicle for the AFAS. Both vehicles utilize automation technologies to improve operational performance and reduce manpower. The network simulation model used in this work is task based. The model programmed in this application represents a typical battle mission and the failures and repairs that occur during that battle. Each task that the FARV performs--upload, travel to the AFAS, refuel, perform tactical/survivability moves, return to logistic resupply, etc.--is modeled. Such a model reproduces operational phenomena (e.g., failures and repairs) that are likely to occur in actual performance. Simulation tasks are modeled as discrete chronological steps; after the completion of each task, decisions are programmed that determine the next path to be followed. The result is a complex logic diagram or network. The network simulation model is developed within a hierarchy of vehicle systems, subsystems, and equipment, and includes failure management subnetworks. RAM information and other performance measures that have an impact on design requirements are collected. Design changes are evaluated through 'what if' questions, sensitivity studies, and battle scenario changes.

  7. Quasi 1D Modeling of Mixed Compression Supersonic Inlets

    NASA Technical Reports Server (NTRS)

    Kopasakis, George; Connolly, Joseph W.; Paxson, Daniel E.; Woolwine, Kyle J.

    2012-01-01

    The AeroServoElasticity task under the NASA Supersonics Project is developing dynamic models of the propulsion system and the vehicle in order to conduct research for integrated vehicle dynamic performance. As part of this effort, a nonlinear quasi 1-dimensional model of the 2-dimensional bifurcated mixed compression supersonic inlet is being developed. The model utilizes computational fluid dynamics for both the supersonic and subsonic diffusers. The oblique shocks are modeled utilizing compressible flow equations. This model also implements variable geometry required to control the normal shock position. The model is flexible and can also be utilized to simulate other mixed compression supersonic inlet designs. The model was validated both in time and in the frequency domain against the legacy LArge Perturbation INlet code, which has been previously verified using test data. This legacy code written in FORTRAN is quite extensive and complex in terms of the amount of software and number of subroutines. Further, the legacy code is not suitable for closed loop feedback controls design, and the simulation environment is not amenable to systems integration. Therefore, a solution is to develop an innovative, more simplified, mixed compression inlet model with the same steady state and dynamic performance as the legacy code that also can be used for controls design. The new nonlinear dynamic model is implemented in MATLAB Simulink. This environment allows easier development of linear models for controls design for shock positioning. The new model is also well suited for integration with a propulsion system model to study inlet/propulsion system performance, and integration with an aero-servo-elastic system model to study integrated vehicle ride quality, vehicle stability, and efficiency.
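
    The compressible-flow relations used for the oblique-shock portion of such a model are standard gas dynamics; for example, the jump conditions for the normal Mach component across a shock (with γ = 1.4 for air and an illustrative wave angle):

        import numpy as np

        GAMMA = 1.4  # ratio of specific heats for air

        def normal_shock(M1):
            """Standard normal-shock jump relations for upstream Mach M1 > 1."""
            M2 = np.sqrt((1 + 0.5*(GAMMA - 1)*M1**2)
                         / (GAMMA*M1**2 - 0.5*(GAMMA - 1)))
            p_ratio = 1 + 2*GAMMA/(GAMMA + 1)*(M1**2 - 1)  # static pressure ratio
            return M2, p_ratio

        def oblique_normal_mach(M1, beta_deg):
            """Normal Mach component ahead of an oblique shock of wave angle beta."""
            return M1 * np.sin(np.radians(beta_deg))

        M1 = 2.4
        Mn1 = oblique_normal_mach(M1, 35.0)   # example wave angle, not a design value
        Mn2, pr = normal_shock(Mn1)
        print(f"Mn1={Mn1:.3f}  Mn2={Mn2:.3f}  p2/p1={pr:.3f}")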

  8. Evaluation of computing systems using functionals of a Stochastic process

    NASA Technical Reports Server (NTRS)

    Meyer, J. F.; Wu, L. T.

    1980-01-01

    An intermediate model was used to represent the probabilistic nature of a total system at a level which is higher than the base model and thus closer to the performance variable. A class of intermediate models, which are generally referred to as functionals of a Markov process, were considered. A closed form solution of performability for the case where performance is identified with the minimum value of a functional was developed.

  9. Low NOx Combustor Development

    NASA Technical Reports Server (NTRS)

    Kastl, J. A.; Herberling, P. V.; Matulaitis, J. M.

    2005-01-01

    The goal of these efforts was the development of an ultra-low-emissions, lean-burn combustor for the High Speed Civil Transport (HSCT). The HSCT Mach 2.4 FLADE C1 cycle was selected as the baseline engine cycle. A preliminary compilation of performance requirements for the HSCT combustor system was developed, taking into account the emissions goals of the program, the baseline engine cycle, and standard combustor performance requirements. Seven combustor system designs were developed. The development of these system designs was facilitated by the use of spreadsheet-type models which predicted the performance of the combustor systems over the entire flight envelope of the HSCT. A chemical kinetic model was developed for an LPP (lean, premixed, prevaporized) combustor and employed to study NOx formation kinetics and CO burnout. These predictions helped to define the combustor residence time. Five fuel-air mixer concepts were analyzed for use in the combustor system designs. One of the seven system designs, using the Swirl-Jet and Cyclone Swirler fuel-air mixers, was selected for a preliminary mechanical design study.
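
    In the same spirit as the chemical kinetic model described, a lean premixed NOx-formation and CO-burnout calculation can be set up in Cantera (mentioned elsewhere in this listing as CANTERA). The mixture, inlet temperature, and pressure below are illustrative assumptions, with GRI-Mech 3.0 supplying the kinetics.

        import cantera as ct

        gas = ct.Solution("gri30.yaml")   # GRI-Mech 3.0 (CH4 + NOx chemistry)
        # Lean premixed CH4/air near equivalence ratio 0.8; state is illustrative.
        gas.TPX = 1300.0, 10.0 * ct.one_atm, "CH4:0.8, O2:2.0, N2:7.52"

        reactor = ct.IdealGasConstPressureReactor(gas)
        net = ct.ReactorNet([reactor])

        # March through residence times to watch CO burnout and NO formation.
        for t_ms in (1, 2, 5, 10, 20):
            net.advance(t_ms * 1e-3)
            x_co = reactor.thermo["CO"].X[0]
            x_no = reactor.thermo["NO"].X[0]
            print(f"t={t_ms:3d} ms  X_CO={x_co:.2e}  X_NO={x_no:.2e}")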

  10. Automatic translation of digraph to fault-tree models

    NASA Technical Reports Server (NTRS)

    Iverson, David L.

    1992-01-01

    The author presents a technique for converting digraph models, including those models containing cycles, to a fault-tree format. A computer program which automatically performs this translation using an object-oriented representation of the models has been developed. The fault-trees resulting from translations can be used for fault-tree analysis and diagnosis. Programs to calculate fault-tree and digraph cut sets and perform diagnosis with fault-tree models have also been developed. The digraph to fault-tree translation system has been successfully tested on several digraphs of varying size and complexity. Details of some representative translation problems are presented. Most of the computation performed by the program is dedicated to finding minimal cut sets for digraph nodes in order to break cycles in the digraph. Fault-trees produced by the translator have been successfully used with NASA's Fault-Tree Diagnosis System (FTDS) to produce automated diagnostic systems.

  11. The Use of Neural Network Technology to Model Swimming Performance

    PubMed Central

    Silva, António José; Costa, Aldo Manuel; Oliveira, Paulo Moura; Reis, Victor Machado; Saavedra, José; Perl, Jurgen; Rouboa, Abel; Marinho, Daniel Almeida

    2007-01-01

    The aims of the present study were: to identify the factors which are able to explain performance in the 200 meters individual medley and 400 meters front crawl events in young swimmers; to model performance in those events using non-linear mathematical methods through artificial neural networks (multi-layer perceptrons); and to assess the neural network models' precision in predicting performance. A sample of 138 young swimmers (65 males and 73 females) of national level was submitted to a test battery comprising four different domains: kinanthropometric evaluation, dry-land functional evaluation (strength and flexibility), swimming functional evaluation (hydrodynamic, hydrostatic and bioenergetic characteristics) and swimming technique evaluation. To establish a profile of the young swimmer, non-linear combinations between preponderant variables for each gender and swim performance in the 200 meters medley and 400 meters front crawl events were developed. For this purpose a feed-forward neural network (multilayer perceptron) with three neurons in a single hidden layer was used. The prognostic precision of the model (error lower than 0.8% between true and estimated performances) is supported by recent evidence. Therefore, we consider that the neural network tool can be a good approach for the resolution of complex problems such as performance modeling and talent identification in swimming and, possibly, in a wide variety of sports. Key points: (1) The non-linear analysis resulting from the use of a feed-forward neural network allowed the development of four performance models. (2) The mean difference between the true and estimated results produced by each of the four neural network models was low. (3) The neural network tool can be a good approach for performance modeling, as an alternative to standard statistical models that presume well-defined distributions and independence among all inputs. (4) The use of neural networks for sports science applications allowed us to create very realistic models for swimming performance prediction based on previously selected criteria related to the dependent variable (performance). PMID:24149233

  12. Engineered Barrier System: Physical and Chemical Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    P. Dixon

    2004-04-26

    The conceptual and predictive models documented in this Engineered Barrier System: Physical and Chemical Environment Model report describe the evolution of the physical and chemical conditions within the waste emplacement drifts of the repository. The modeling approaches and model output data will be used in the total system performance assessment (TSPA-LA) to assess the performance of the engineered barrier system and the waste form. These models evaluate the range of potential water compositions within the emplacement drifts, resulting from the interaction of introduced materials and minerals in dust with water seeping into the drifts and with aqueous solutions forming by deliquescence of dust (as influenced by atmospheric conditions), and from thermal-hydrological-chemical (THC) processes in the drift. These models also consider the uncertainty and variability in water chemistry inside the drift and the compositions of introduced materials within the drift. This report develops and documents a set of process- and abstraction-level models that constitute the engineered barrier system: physical and chemical environment model. Where possible, these models use information directly from other process model reports as input, which promotes integration among process models used for total system performance assessment. Specific tasks and activities of modeling the physical and chemical environment are included in the technical work plan ''Technical Work Plan for: In-Drift Geochemistry Modeling'' (BSC 2004 [DIRS 166519]). As described in the technical work plan, the development of this report is coordinated with the development of other engineered barrier system analysis model reports.

  13. Spatial prediction of landslides using a hybrid machine learning approach based on Random Subspace and Classification and Regression Trees

    NASA Astrophysics Data System (ADS)

    Pham, Binh Thai; Prakash, Indra; Tien Bui, Dieu

    2018-02-01

    A hybrid machine learning approach of Random Subspace (RSS) and Classification And Regression Trees (CART) is proposed to develop a model named RSSCART for spatial prediction of landslides. This model is a combination of the RSS method, an efficient ensemble technique, and the CART, a state-of-the-art classifier. The Luc Yen district of Yen Bai province, a prominent landslide-prone area of Viet Nam, was selected for the model development. Performance of the RSSCART model was evaluated through the Receiver Operating Characteristic (ROC) curve, statistical analysis methods, and the Chi-square test. Results were compared with other benchmark landslide models, namely Support Vector Machines (SVM), single CART, Naïve Bayes Trees (NBT), and Logistic Regression (LR). In the development of the model, ten important landslide-affecting factors related to geomorphology, geology and geo-environment were considered, namely slope angle, elevation, slope aspect, curvature, lithology, distance to faults, distance to rivers, distance to roads, and rainfall. Performance of the RSSCART model (AUC = 0.841) is the best compared with the other landslide models, namely SVM (0.835), single CART (0.822), NBT (0.821), and LR (0.723). These results indicate that the RSSCART model is a promising method for spatial landslide prediction.
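
    In scikit-learn terms, a Random Subspace ensemble of CART trees can be expressed with BaggingClassifier drawing a random feature subset for each tree (recent scikit-learn, which names the argument estimator=; older releases call it base_estimator). The landslide factors here are random stand-ins, not the Luc Yen data.

        import numpy as np
        from sklearn.ensemble import BaggingClassifier
        from sklearn.tree import DecisionTreeClassifier
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(8)
        n = 600
        X = rng.normal(size=(n, 9))   # slope, elevation, aspect, curvature, ...
        y = (X[:, 0] + 0.7*X[:, 4] - 0.5*X[:, 6]
             + rng.normal(scale=0.8, size=n)) > 0

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

        # Random Subspace: each CART-style tree sees a random feature subset
        # (features vary per tree; all samples are kept).
        rsscart = BaggingClassifier(
            estimator=DecisionTreeClassifier(),
            n_estimators=100,
            max_features=0.5,          # random 50% feature subspace per tree
            bootstrap=False,
            bootstrap_features=True,
            random_state=0,
        ).fit(X_tr, y_tr)

        auc = roc_auc_score(y_te, rsscart.predict_proba(X_te)[:, 1])
        print(f"AUC = {auc:.3f}")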

  14. Translation from UML to Markov Model: A Performance Modeling Framework

    NASA Astrophysics Data System (ADS)

    Khan, Razib Hayat; Heegaard, Poul E.

    Performance engineering focuses on the quantitative investigation of the behavior of a system during the early phase of the system development life cycle. Bearing this in mind, we delineate a performance modeling framework for communication-system applications that proposes a translation process from high-level UML notation to a Continuous Time Markov Chain (CTMC) model and solves the model for relevant performance metrics. The framework utilizes UML collaborations, activity diagrams, and deployment diagrams to generate a performance model for a communication system. The system dynamics are captured by UML collaboration and activity diagrams as reusable specification building blocks, while the deployment diagram highlights the components of the system. The collaboration and activity diagrams show how reusable building blocks can compose the service components through input and output pins, highlighting the behavior of the components; a mapping between the collaborations and the system components identified by the deployment diagram is then established. Moreover, the UML models are annotated with performance-related quality of service (QoS) information, which is necessary for solving the performance model for relevant performance metrics through our proposed framework. The applicability of our proposed performance modeling framework to performance evaluation is demonstrated in the context of modeling a communication system.
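
    Once the annotated UML models are translated to a CTMC, solving for performance metrics means solving the generator-matrix balance equations πQ = 0 with Σπ = 1. A small sketch with a hypothetical three-state generator:

        import numpy as np

        # Generator matrix Q of a hypothetical 3-state CTMC (rows sum to zero);
        # the states could represent, e.g., idle, processing, blocked.
        Q = np.array([[-0.5,  0.5,  0.0],
                      [ 0.3, -0.7,  0.4],
                      [ 0.0,  0.9, -0.9]])

        # Solve pi Q = 0 with sum(pi) = 1 by replacing one balance equation
        # with the normalization condition.
        A = Q.T.copy()
        A[-1, :] = 1.0                 # normalization row
        b = np.zeros(3); b[-1] = 1.0
        pi = np.linalg.solve(A, b)

        print("steady-state distribution:", np.round(pi, 4))
        print("utilization (prob. not idle):", round(1 - pi[0], 4))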

  15. Toward Realism in Human Performance Simulation

    DTIC Science & Technology

    2004-01-01

    toward the development of improved human-like performance of synthetic agents. However, several serious problems continue to challenge researchers and developers. Developers have insufficient behavioral knowledge. To date, models of emotivity and behavior that have been commercialized still tend... (Bindiganavale, 1999). There has even been significant development of architectures to produce animated characters that react appropriately to a small...

  16. The self-care behavior inventory: a model for behavioral instrument development.

    PubMed

    McLaughlin, J; Sliepcevich, E M

    1985-09-01

    The Self-Care Behavior (SCB) Inventory was developed as part of a long-term study of self-care practices of persons who have multiple sclerosis (MS) in Denmark. The universe of behaviors regarding the physical, social, emotional, environmental, and spiritual aspects of coping with the illness was ascertained by informal and formal interviews. Respondents were asked not only what behavior was performed, but also who performed it, how it was performed, why, when, and where it was performed, and where the knowledge to perform the behavior in this manner was acquired, such as a lay-referral network, physician, social worker, spouse, or media. The inventory went through a series of drafts and pre-tests, resulting in a final version that met criteria for validity and reliability. The model presented for the development of the SCB Inventory can be useful for designing behavioral inventories and assessment tools for other chronic conditions such as arthritis, epilepsy, and diabetes.

  17. Modelling innovation performance of European regions using multi-output neural networks

    PubMed Central

    Hajek, Petr; Henriques, Roberto

    2017-01-01

    Regional innovation performance is an important indicator for decision-making regarding the implementation of policies intended to support innovation. However, patterns in regional innovation structures are becoming increasingly diverse, complex and nonlinear. To address these issues, this study aims to develop a model based on a multi-output neural network. Both intra- and inter-regional determinants of innovation performance are empirically investigated using data from the 4th and 5th Community Innovation Surveys of NUTS 2 (Nomenclature of Territorial Units for Statistics) regions. The results suggest that specific innovation strategies must be developed based on the current state of input attributes in the region. Thus, it is possible to develop appropriate strategies and targeted interventions to improve regional innovation performance. We demonstrate that support of entrepreneurship is an effective instrument of innovation policy. We also provide empirical support that both business and government R&D activity have a sigmoidal effect, implying that the most effective R&D support should be directed to regions with below-average and average R&D activity. We further show that the multi-output neural network outperforms traditional statistical and machine learning regression models. In general, therefore, it seems that the proposed model can effectively reflect both the multiple-output nature of innovation performance and the interdependency of the output attributes. PMID:28968449
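
    A minimal sketch of a multi-output neural network regressor of the kind described, assuming scikit-learn: one network jointly predicts several output attributes, so their interdependency is shared through common hidden units. The data are synthetic stand-ins for the CIS indicators.

        import numpy as np
        from sklearn.neural_network import MLPRegressor
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import r2_score

        rng = np.random.default_rng(1)
        X = rng.normal(size=(400, 6))                 # regional input attributes (synthetic)
        W = rng.normal(size=(6, 3))
        Y = np.tanh(X @ W) + 0.1 * rng.normal(size=(400, 3))  # 3 correlated outputs

        X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, random_state=0)
        net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
        net.fit(X_tr, Y_tr)   # MLPRegressor handles multi-output targets natively
        print("per-output R^2:",
              r2_score(Y_te, net.predict(X_te), multioutput="raw_values"))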

  18. Modelling innovation performance of European regions using multi-output neural networks.

    PubMed

    Hajek, Petr; Henriques, Roberto

    2017-01-01

    Regional innovation performance is an important indicator for decision-making regarding the implementation of policies intended to support innovation. However, patterns in regional innovation structures are becoming increasingly diverse, complex and nonlinear. To address these issues, this study aims to develop a model based on a multi-output neural network. Both intra- and inter-regional determinants of innovation performance are empirically investigated using data from the 4th and 5th Community Innovation Surveys of NUTS 2 (Nomenclature of Territorial Units for Statistics) regions. The results suggest that specific innovation strategies must be developed based on the current state of input attributes in the region. Thus, it is possible to develop appropriate strategies and targeted interventions to improve regional innovation performance. We demonstrate that support of entrepreneurship is an effective instrument of innovation policy. We also provide empirical support that both business and government R&D activity have a sigmoidal effect, implying that the most effective R&D support should be directed to regions with below-average and average R&D activity. We further show that the multi-output neural network outperforms traditional statistical and machine learning regression models. In general, therefore, it seems that the proposed model can effectively reflect both the multiple-output nature of innovation performance and the interdependency of the output attributes.

  19. Development of an Integrated Team Training Design and Assessment Architecture to Support Adaptability in Healthcare Teams

    DTIC Science & Technology

    2016-10-01

    and implementation of embedded, adaptive feedback and performance assessment. The investigators also initiated work designing a Bayesian Belief...Keywords: trauma teams; team training; teamwork; adaptability; adaptive performance; leadership; simulation; modeling; Bayesian belief networks (BBN)

  20. What Are Error Rates for Classifying Teacher and School Performance Using Value-Added Models?

    ERIC Educational Resources Information Center

    Schochet, Peter Z.; Chiang, Hanley S.

    2013-01-01

    This article addresses likely error rates for measuring teacher and school performance in the upper elementary grades using value-added models applied to student test score gain data. Using a realistic performance measurement system scheme based on hypothesis testing, the authors develop error rate formulas based on ordinary least squares and…

  1. A Mathematical Model Development for the Lateral Collapse of Octagonal Tubes

    NASA Astrophysics Data System (ADS)

    Ghazali Kamardan, M.; Sufahani, Suliadi; Othman, M. Z. M.; Che-Him, Norziha; Khalid, Kamil; Roslan, Rozaini; Ali, Maselan; Zaidi, A. M. A.

    2018-04-01

    Much research has been done on the lateral collapse of tubes. However, previous studies focused only on cylindrical and square tubes. A subsequent study examined the collapse behaviour of hexagonal tubes, and a mathematical model of the deformation behaviour was developed [8]. The purpose of this research is to study the lateral collapse behaviour of symmetric octagonal tubes and to develop a mathematical model of their collapse behaviour. To that end, a predictive mathematical model was developed and a finite element analysis was conducted for the lateral collapse of symmetric octagonal tubes. The mathematical model was then verified against the finite element simulation results. It was discovered that these tubes exhibit deformation behaviour different from that of cylindrical tubes: symmetric octagonal tubes undergo two phases of elastic-plastic deformation. The mathematical model captures the fundamentals of the deformation behaviour of octagonal tubes; however, further studies are needed to improve the proposed model.

  2. Evaluating Performance Measurement Systems in Nonprofit Agencies: The Program Accountability Quality Scale (PAQS).

    ERIC Educational Resources Information Center

    Poole, Dennis L.; Nelson, Joan; Carnahan, Sharon; Chepenik, Nancy G.; Tubiak, Christine

    2000-01-01

    Developed and field tested the Program Accountability Quality Scale (PAQS) on 191 program performance measurement systems developed by nonprofit agencies in central Florida. Preliminary findings indicate that the PAQS provides a structure for obtaining expert opinions, based on a theory-driven model, about the quality of proposed measurement…

  3. Embedded Multiprocessor Technology for VHSIC Insertion

    NASA Technical Reports Server (NTRS)

    Hayes, Paul J.

    1990-01-01

    Viewgraphs on embedded multiprocessor technology for VHSIC insertion are presented. The objective was to develop multiprocessor system technology providing user-selectable fault tolerance, increased throughput, and ease of application representation for concurrent operation. The approach was to develop graph-management mapping theory for proper performance, to model multiprocessor performance, and to demonstrate performance in selected hardware systems.

  4. Modeling ready biodegradability of fragrance materials.

    PubMed

    Ceriani, Lidia; Papa, Ester; Kovarich, Simona; Boethling, Robert; Gramatica, Paola

    2015-06-01

    In the present study, quantitative structure-activity relationships were developed for predicting the ready biodegradability of approximately 200 heterogeneous fragrance materials. Two classification methods, classification and regression tree (CART) and k-nearest neighbors (kNN), were applied to perform the modeling. The models were validated with multiple external prediction sets, and the structural applicability domain was verified by the leverage approach. The best models had good sensitivity (internal ≥80%; external ≥68%), specificity (internal ≥80%; external 73%), and overall accuracy (≥75%). Results from the comparison with the BIOWIN global models, based on the group contribution method, show that the specific models developed in the present study perform better in prediction than BIOWIN6, in particular for the correct classification of not readily biodegradable fragrance materials. © 2015 SETAC.
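
    A minimal sketch of the external-validation scoring used for such classifiers, assuming scikit-learn: fit a kNN model and report sensitivity, specificity, and accuracy on a held-out set. The descriptors and labels are synthetic.

        import numpy as np
        from sklearn.neighbors import KNeighborsClassifier
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import confusion_matrix

        rng = np.random.default_rng(2)
        X = rng.normal(size=(200, 5))              # molecular descriptors (synthetic)
        y = (X[:, 1] - X[:, 3] > 0).astype(int)    # 1 = readily biodegradable

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
        knn = KNeighborsClassifier(n_neighbors=5).fit(X_tr, y_tr)

        tn, fp, fn, tp = confusion_matrix(y_te, knn.predict(X_te)).ravel()
        print(f"sensitivity = {tp / (tp + fn):.2f}")   # true positive rate
        print(f"specificity = {tn / (tn + fp):.2f}")   # true negative rate
        print(f"accuracy    = {(tp + tn) / (tp + tn + fp + fn):.2f}")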

  5. A Methodological Framework for Instructional Design Model Development: Critical Dimensions and Synthesized Procedures

    ERIC Educational Resources Information Center

    Lee, Jihyun; Jang, Seonyoung

    2014-01-01

    Instructional design (ID) models have been developed to promote understandings of ID reality and guide ID performance. As the number and diversity of ID practices grows, implicit doubts regarding the reliability, validity, and usefulness of ID models suggest the need for methodological guidance that would help to generate ID models that are…

  6. Matrix population models as a tool in development of habitat models

    Treesearch

    Gregory D. Hayward; David B. McDonald

    1997-01-01

    Sophisticated habitat models for owl conservation must stem from an understanding of the relative quality of habitats at a variety of geographic and temporal scales. Developing these models requires knowing the relationship between habitat conditions and owl performance. What measure should be used to compare the quality of habitats? Matrix population...
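
    A minimal sketch of the matrix population model idea with NumPy: habitat-specific vital rates populate a Leslie matrix, and the dominant eigenvalue lambda summarizes habitat quality (lambda > 1 indicates a growing population). The vital rates below are illustrative, not owl data.

        import numpy as np

        A = np.array([
            [0.0, 1.5, 2.0],   # fecundities of age classes 1..3
            [0.4, 0.0, 0.0],   # survival from class 1 to class 2
            [0.0, 0.6, 0.0],   # survival from class 2 to class 3
        ])

        lam = max(np.linalg.eigvals(A).real)   # dominant eigenvalue = asymptotic growth rate
        verdict = "habitat supports growth" if lam > 1 else "sink habitat"
        print(f"lambda = {lam:.3f} ({verdict})")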

  7. Modeling the Hydrologic Processes of a Permeable Pavement ...

    EPA Pesticide Factsheets

    A permeable pavement system can capture stormwater to reduce runoff volume and flow rate, improve onsite groundwater recharge, and enhance pollutant controls within the site. A new unit-process model for evaluating the hydrologic performance of a permeable pavement system has been developed in this study. The model can continuously simulate infiltration through the permeable pavement surface, exfiltration from the storage to the surrounding in situ soils, and clogging impacts on infiltration/exfiltration capacity at the pavement surface and the bottom of the subsurface storage unit. The exfiltration component simulates vertical and horizontal exfiltration independently, based on Darcy's formula with the Green-Ampt approximation. The model can be parameterized with physically based modeling parameters, such as hydraulic conductivity, Manning's friction flow parameters, saturated and field-capacity volumetric water contents, porosity, and density. The model was calibrated using high-frequency observed data, and the modeled water depths match the observed values well (R² = 0.90). The modeling results show that horizontal exfiltration through the side walls of the subsurface storage unit is a prevailing factor in determining the hydrologic performance of the system, especially where the storage unit is built in a long, narrow shape or with a high risk of bottom compaction and clogging. This paper presents unit…
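
    A minimal sketch of the Green-Ampt approximation the exfiltration component relies on: infiltration capacity f = K(1 + psi*dtheta/F), stepped explicitly in time. Parameter values are illustrative, not calibrated to the study site.

        # Green-Ampt infiltration capacity, explicit time stepping.
        K = 1.0e-6        # saturated hydraulic conductivity [m/s]
        psi = 0.11        # wetting-front suction head [m]
        d_theta = 0.3     # saturated minus initial volumetric water content

        dt, t_end = 60.0, 6 * 3600.0   # 1-min steps over 6 h
        F = 1.0e-3                      # cumulative infiltration [m]; small seed
                                        # avoids the singularity at F = 0
        t = 0.0
        while t < t_end:
            f = K * (1.0 + psi * d_theta / F)   # infiltration capacity [m/s]
            F += f * dt                          # update cumulative depth
            t += dt
        print(f"cumulative infiltration after 6 h: {F * 1000:.1f} mm")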

  8. A gunner model for an AAA tracking task with interrupted observations

    NASA Technical Reports Server (NTRS)

    Yu, C. F.; Wei, K. C.; Vikmanis, M.

    1982-01-01

    The problem of modeling a trained human operator's tracking performance in an anti-aircraft system under various display blanking conditions is discussed. The input to the gunner is the observable tracking error, subjected to repeated interruptions (blanking). A simple and effective gunner model was developed; the effect of blanking on the gunner's tracking performance is captured by modeling the observer and controller gains.

  9. Review of Methods for Buildings Energy Performance Modelling

    NASA Astrophysics Data System (ADS)

    Krstić, Hrvoje; Teni, Mihaela

    2017-10-01

    Research presented in this paper gives a brief review of methods used for modelling the energy performance of buildings, together with a comprehensive review of the advantages and disadvantages of the available methods and the input parameters used. The European EPBD directive obliges the implementation of an energy certification procedure, which gives insight into buildings' energy performance via existing energy certificate databases. Some of the modelling methods mentioned in this paper were developed from data sets of buildings that have already undergone the energy certification procedure. Such a database is used in this paper; the majority of buildings in it have already undergone some form of partial retrofitting - replacement of windows or installation of thermal insulation - but still have poor energy performance. The case study utilizes an energy certificate database of residential units in Croatia (over 400 buildings) to determine the dependence between buildings' energy performance and the database variables using statistical dependence tests. Energy performance is expressed in the database as an energy efficiency rating (from A+ to G) based on the specific annual energy need for heating for referential climatic data [kWh/(m²a)]. Independent variables in the database are the surface areas and volume of the conditioned part of the building, building shape factor, energy used for heating, CO2 emission, building age, and year of reconstruction. The results give insight into the possibilities of the methods used for modelling building energy performance, together with an analysis of the dependencies between energy performance as the dependent variable and the independent variables from the database. The presented results could be used for the development of a new predictive model of building energy performance.
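
    A minimal sketch, assuming SciPy, of the kind of statistical dependence test described: a chi-square test between the energy-efficiency rating and one binned building attribute. The contingency table is illustrative, not the Croatian database.

        from scipy.stats import chi2_contingency

        # rows: construction period (pre-1970, 1970-1990, post-1990)
        # cols: energy rating group (A+-B, C-D, E-G); counts are illustrative
        table = [
            [ 5, 30, 80],
            [10, 45, 40],
            [40, 35, 15],
        ]
        chi2, p, dof, expected = chi2_contingency(table)
        print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.2g}")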

  10. Development of an Integrated Performance Measurement (PM) Model for Pharmaceutical Industry

    PubMed Central

    Shabaninejad, Hosein; Mirsalehian, Mohammad Hossein; Mehralian, Gholamhossein

    2014-01-01

    Given the special characteristics of the pharmaceutical industry and the lack of reported performance measures, this study designs an integrated PM model for pharmaceutical companies. To generate this model, we first identified the key performance indicators (KPIs) and the key result indicators (KRIs) of a typical pharmaceutical company. Then, based on experts' opinions, the identified indicators were ranked by importance, and the most important of them were selected for use in the proposed model, which comprises 25 KPIs and 12 KRIs. Although this model is most appropriate for measuring the performance of pharmaceutical companies, it can also be used to measure the performance of other industries with some modifications. We strongly recommend that pharmaceutical managers link these indicators to their payment and reward systems, which can dramatically affect the performance of employees and consequently their organization's success. PMID:24711848

  11. A Model for Developing Meta-Cognitive Tools in Teacher Apprenticeships

    ERIC Educational Resources Information Center

    Bray, Paige; Schatz, Steven

    2013-01-01

    This research investigates a model for developing meta-cognitive tools to be used by pre-service teachers during apprenticeship (student teaching) experience to operationalise the epistemological model of Cook and Brown (2009). Meta-cognitive tools have proven to be effective for increasing performance and retention of undergraduate students.…

  12. Developing a Model for an Innovative Culinary Competency Curriculum and Examining Its Effects on Students' Performance

    ERIC Educational Resources Information Center

    Hu, Meng-Lei I-Chen Monica; Horng, Jeou-Shyan; Teng, Chih-Ching

    2016-01-01

    The present study designs and develops an innovative culinary competency curriculum (ICCC) model comprising seven sections: innovative culture, aesthetics, techniques, service, product, management, and creativity. The model is formulated based on culinary concept, creativity, innovation, and competency theory. The four elements of curriculum…

  13. Development and Calibration of Regional Dynamic Traffic Assignment Models for the Estimation of Traffic Performance Measures in Nevada

    DOT National Transportation Integrated Search

    2017-02-01

    This project covered the development and calibration of a Dynamic Traffic Assignment (DTA) model and explained the procedures, constraints, and considerations for usage of this model for the Reno-Sparks area roadway network in Northern Nevada. A lite...

  14. A Complete Procedure for Predicting and Improving the Performance of HAWT's

    NASA Astrophysics Data System (ADS)

    Al-Abadi, Ali; Ertunç, Özgür; Sittig, Florian; Delgado, Antonio

    2014-06-01

    A complete procedure for predicting and improving the performance of horizontal axis wind turbines (HAWTs) has been developed. The first step is predicting the power extracted by the turbine and the resulting rotor torque, which should match that of the drive unit. The BEM method, together with a post-stall treatment developed for stall-regulated HAWTs, is incorporated in the prediction. A modified stall-regulated prediction model, derived from existing models, predicts HAWT performance over the operating range of oncoming wind velocity; because it involves radius and chord, it generalizes to predicting the performance of HAWTs of different scales and rotor shapes. The second step modifies the rotor shape through an optimization process, applicable to any existing HAWT, to improve its performance: a gradient-based optimization adjusts the chord and twist-angle distributions of the rotor blade to increase power extraction while keeping the drive torque constant, so the same drive unit can be retained. The final step tests the modified turbine to predict its enhanced performance. The procedure is applied to the NREL Phase VI 10 kW turbine as a baseline. The study demonstrates the applicability of the developed model in predicting the performance of both the baseline and the optimized turbine, and the optimization shows that the power coefficient can be increased while keeping the same design rotational speed.
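
    A minimal sketch of the fixed-point iteration at the core of BEM for one blade element: update the axial and tangential induction factors from the inflow angle and airfoil coefficients. The solidity, local tip-speed ratio, and thin-airfoil aerodynamics are illustrative assumptions (no tip-loss or post-stall correction, which the paper's model adds).

        import numpy as np

        tsr_local = 5.0          # local speed ratio (omega * r / U)
        sigma = 0.05             # local solidity (B * c / (2 * pi * r))
        a, a_prime = 0.3, 0.0    # initial guesses for induction factors

        for _ in range(200):
            phi = np.arctan((1 - a) / ((1 + a_prime) * tsr_local))  # inflow angle
            alpha = phi - np.radians(2.0)          # assumed 2 deg twist + pitch
            cl, cd = 2 * np.pi * alpha, 0.01       # thin-airfoil lift, constant drag
            cn = cl * np.cos(phi) + cd * np.sin(phi)   # normal force coefficient
            ct = cl * np.sin(phi) - cd * np.cos(phi)   # tangential force coefficient
            a_new = 1.0 / (4 * np.sin(phi) ** 2 / (sigma * cn) + 1)
            a_prime = 1.0 / (4 * np.sin(phi) * np.cos(phi) / (sigma * ct) - 1)
            if abs(a_new - a) < 1e-6:
                a = a_new
                break
            a = 0.5 * (a + a_new)   # under-relaxation for stability
        print(f"converged axial induction a = {a:.3f}")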

  15. Numerical Investigation of the Effect of Radial Lip Seal Geometry on Sealing Performance

    NASA Astrophysics Data System (ADS)

    Tok, G.; Parlar, Z.; Temiz, V.

    2018-01-01

    Sealing elements are often needed in industry and especially in machine design. As machine technology changes and develops, sealing elements show continuous development in parallel. Many factors influence the performance of sealing elements, such as shaft surface roughness, radial force, and lip geometry. In addition, radial lip seals must have a certain pre-load and interference to seal well, which also affects the friction torque. Researchers are developing new seal designs to reduce friction losses in mechanical systems. In the presented study, the effect of lip seal geometry on sealing performance is examined numerically. The numerical model created for this purpose is first verified with experimental data. In the numerical model, the shaft and seal are modeled in 2D and 3D with a hyper-elastic material law; NBR (nitrile butadiene rubber) as the seal material is analyzed for a shaft rotating at constant speed under a uniform radial force.

  16. Hypnosis in sport: an Isomorphic Model.

    PubMed

    Robazza, C; Bortoli, L

    1994-10-01

    Hypnosis in sport can be applied according to an Isomorphic Model. Active-alert hypnosis is induced before or during practice whereas traditional hypnosis is induced after practice to establish connections between the two experiences. The fundamental goals are to (a) develop mental skills important to both motor and hypnotic performance, (b) supply a wide range of motor and hypnotic bodily experiences important to performance, and (c) induce alert hypnosis before or during performance. The model is based on the assumption that hypnosis and motor performance share common skills modifiable through training. Similarities between hypnosis and peak performance in the model are also considered. Some predictions are important from theoretical and practical points of view.

  17. Structural, Thermal, and Optical Performance (STOP) Modeling and Results for the James Webb Space Telescope Integrated Science Instrument Module

    NASA Technical Reports Server (NTRS)

    Gracey, Renee; Bartoszyk, Andrew; Cofie, Emmanuel; Comber, Brian; Hartig, George; Howard, Joseph; Sabatke, Derek; Wenzel, Greg; Ohl, Raymond

    2016-01-01

    The James Webb Space Telescope includes the Integrated Science Instrument Module (ISIM) element that contains four science instruments (SI), including a Guider. We performed extensive structural, thermal, and optical performance (STOP) modeling in support of all phases of ISIM development. In this paper, we focus on modeling and results associated with test and verification. ISIM's test program is bounded by ground environments, most notably the 1g and test-chamber thermal environments. This paper describes the STOP modeling used to predict ISIM system performance in 0g and at various on-orbit temperature environments. The predictions are used to project results obtained during testing to on-orbit performance.

  18. Axisymmetric whole pin life modelling of advanced gas-cooled reactor nuclear fuel

    NASA Astrophysics Data System (ADS)

    Mella, R.; Wenman, M. R.

    2013-06-01

    Thermo-mechanical contributions to pellet-clad interaction (PCI) in advanced gas-cooled reactors (AGRs) are modelled in the ABAQUS finite element (FE) code. User-supplied subroutines permit the modelling of the non-linear behaviour of AGR fuel through life. Utilising ABAQUS's well-developed pre- and post-processing ability, the behaviour of the axially constrained, steel-clad fuel was modelled. The 2D axisymmetric model includes thermo-mechanical behaviour of the fuel with time- and condition-dependent material properties; pellet-cladding gap dynamics and thermal behaviour are also modelled. The model treats heat-up as a fully coupled temperature-displacement study. Dwell time and direct power cycling were applied to model the impact of online refuelling, a key feature of the AGR. The model includes the visco-plastic behaviour of the fuel under the stress and irradiation conditions within an AGR core and a non-linear heat transfer model. A multiscale fission gas release model computes pin pressure; it is coupled to the PCI gap model through an explicit fission gas inventory code. Whole-pin, whole-life models show the impact of the fuel on all segments of cladding, including weld end caps and the cladding-pellet locking mechanisms unique to AGR fuel. The development of this model in a commercial FE package shows that a potentially verified and future-proof fuel performance code can be created and used; pre- and post-processors have lowered the entry barrier for developing a fuel performance model of complicated systems, and the usability of an FE-based fuel performance code would be an enhancement over past codes. A typical 5-year axisymmetric model runs in less than one hour on a single-core workstation. The current model implements: non-linear fuel thermal behaviour, including a complex description of heat flow in the fuel coupled with a variety of FE and finite-difference models; non-linear mechanical behaviour of the fuel and cladding, including fuel creep and swelling and cladding creep and plasticity, each depending on a variety of properties; a fission gas release model that takes inputs from first-principles calculations, with explicitly integrated inventory calculations performed in a coupled manner; and the freedom to model steady-state and transient behaviour using implicit time integration, with the whole pin geometry considered over an entire typical fuel life. Examination of normal operation and a subsequent transient, chosen for software demonstration purposes, showed that ABAQUS may be a sufficiently flexible platform on which to develop a complete and verified fuel performance code. The importance and effectiveness of the geometry of the fuel spacer pellets was characterised. The fuel's performance under normal conditions (high friction, no power spikes) does not suggest serious degradation of the cladding within the fuel life. Large plastic strains were found when pellet bonding was strong; these would appear at all pellet-cladding triple points and at all pellet radial crack/cladding interfaces, indicating a possible axial direction for cracks forming from ductility exhaustion.
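
    The thermal part of such a model ultimately rests on heat conduction in the pellet. A minimal sketch of the simplest special case, steady-state radial conduction with uniform heat generation and constant conductivity, T(r) = Ts + q'''(R² - r²)/(4k); the values are illustrative, and the code described above uses non-linear, condition-dependent properties instead.

        import numpy as np

        q_vol = 3.0e8       # volumetric heat rate [W/m^3]
        R = 7.0e-3          # pellet radius [m]
        k = 3.0             # thermal conductivity [W/(m K)], constant here
        T_surface = 700.0   # pellet surface temperature [K]

        r = np.linspace(0.0, R, 5)
        T = T_surface + q_vol * (R**2 - r**2) / (4.0 * k)
        for ri, Ti in zip(r, T):
            print(f"r = {ri * 1000:4.2f} mm  T = {Ti:7.1f} K")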

  19. Artificial intelligence models for predicting the performance of biological wastewater treatment plant in the removal of Kjeldahl Nitrogen from wastewater

    NASA Astrophysics Data System (ADS)

    Manu, D. S.; Thalla, Arun Kumar

    2017-11-01

    The current work demonstrates support vector machine (SVM) and adaptive neuro-fuzzy inference system (ANFIS) modeling to assess the Kjeldahl nitrogen removal efficiency of a full-scale aerobic biological wastewater treatment plant. Influent variables such as pH, chemical oxygen demand, total solids (TS), free ammonia, ammonia nitrogen, and Kjeldahl nitrogen are used as input variables. Model development focused on postulating an adaptive, functional, real-time, alternative approach for modeling the removal efficiency of Kjeldahl nitrogen. The inputs were daily time-series data recorded at a wastewater treatment plant (WWTP) in Mangalore during June-September 2014. The performance of the ANFIS models developed with generalized bell and trapezoidal membership functions (MFs) and of the SVM is assessed using statistical indices: root mean square error, correlation coefficient (CC), and Nash-Sutcliffe efficiency (NSE). The errors in predicting effluent Kjeldahl nitrogen concentration with the SVM appeared reasonable compared with those of the ANFIS models with generalized bell and trapezoidal MFs. The performance evaluation shows that the SVM approach is capable of defining the interrelationships between wastewater quality variables, and thus SVM can potentially be applied to evaluate the efficiency of aerobic biological processes in WWTPs.
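
    A minimal sketch of the SVM-regression workflow with the three reported indices, assuming scikit-learn; the influent data are synthetic stand-ins for the plant's daily time series.

        import numpy as np
        from sklearn.svm import SVR
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(3)
        X = rng.normal(size=(120, 6))   # pH, COD, TS, free NH3, NH3-N, TKN (synthetic)
        y = 70 + 5 * X[:, 1] - 3 * X[:, 5] + rng.normal(scale=2, size=120)  # removal %

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        svm = make_pipeline(StandardScaler(), SVR(C=10.0, epsilon=0.1)).fit(X_tr, y_tr)
        pred = svm.predict(X_te)

        rmse = np.sqrt(np.mean((y_te - pred) ** 2))
        cc = np.corrcoef(y_te, pred)[0, 1]
        nse = 1 - np.sum((y_te - pred) ** 2) / np.sum((y_te - y_te.mean()) ** 2)
        print(f"RMSE = {rmse:.2f}, CC = {cc:.3f}, NSE = {nse:.3f}")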

  20. An integrated physiology model to study regional lung damage effects and the physiologic response

    PubMed Central

    2014-01-01

    Background This work expands upon a previously developed exercise dynamic physiology model (DPM) with the addition of an anatomic pulmonary system in order to quantify the impact of lung damage on oxygen transport and physical performance decrement. Methods A pulmonary model is derived with an anatomic structure based on morphometric measurements, accounting for heterogeneous ventilation and perfusion observed experimentally. The model is incorporated into an existing exercise physiology model; the combined system is validated using human exercise data. Pulmonary damage from blast, blunt trauma, and chemical injury is quantified in the model based on lung fluid infiltration (edema) which reduces oxygen delivery to the blood. The pulmonary damage component is derived and calibrated based on published animal experiments; scaling laws are used to predict the human response to lung injury in terms of physical performance decrement. Results The augmented dynamic physiology model (DPM) accurately predicted the human response to hypoxia, altitude, and exercise observed experimentally. The pulmonary damage parameters (shunt and diffusing capacity reduction) were fit to experimental animal data obtained in blast, blunt trauma, and chemical damage studies which link lung damage to lung weight change; the model is able to predict the reduced oxygen delivery in damage conditions. The model accurately estimates physical performance reduction with pulmonary damage. Conclusions We have developed a physiologically-based mathematical model to predict performance decrement endpoints in the presence of thoracic damage; simulations can be extended to estimate human performance and escape in extreme situations. PMID:25044032

  1. Unnatural selection: talent identification and development in sport.

    PubMed

    Abbott, Angela; Button, Chris; Pepping, Gert-Jan; Collins, Dave

    2005-01-01

    The early identification of talented individuals has become increasingly important across many performance domains. Current talent identification (TI) schemes in sport typically select on the basis of discrete, unidimensional measures at unstable periods in the athlete's development. In this article, the concept of talent is revised as a complex, dynamical system in which future behaviors emerge from an interaction of key performance determinants such as psychological behaviors, motor abilities, and physical characteristics. Key nonlinear dynamics concepts are related to TI approaches such as sensitivity to initial conditions, transitions, and exponential behavioral distributions. It is concluded that many TI models place an overemphasis on early identification rather than the development of potentially talented performers. A generic model of talent identification and development is proposed that addresses these issues and provides direction for future research.

  2. Leveraging simulation to evaluate system performance in presence of fixed pattern noise

    NASA Astrophysics Data System (ADS)

    Teaney, Brian P.

    2017-05-01

    The development of image simulation techniques which map the effects of a notional, modeled sensor system onto an existing image can be used to evaluate the image quality of camera systems prior to the development of prototype systems. In addition, image simulation or 'virtual prototyping' can be utilized to reduce the time and expense associated with conducting extensive field trials. In this paper we examine the development of a perception study designed to assess the performance of the NVESD imager performance metrics as a function of fixed pattern noise. This paper discusses the development of the model theory and the implementation and execution of the perception study. In addition, other applications of the image simulation component, including the evaluation of limiting resolution and other test targets, are discussed.

  3. Development of a Pavement Maintenance Management System. Volume 9. Development of Airfield Pavement Performance Prediction Models.

    DTIC Science & Technology

    1984-05-01

    materials, traffic, and climate, were used to develop PCI and key distress prediction models for both asphalt-concrete- and jointed-concrete-surfaced...Predicted PCI for PCC and AC/PCC Pavements Using Model Presented in Section III...Effect of PCC Thickness on the PCI as a Function of Age...of Corner Breaking Observed vs Predicted Percent of Corner Breaking Using Model Presented in Section III

  4. Using finite element modelling and experimental methods to investigate planar coil sensor topologies for inductive measurement of displacement

    NASA Astrophysics Data System (ADS)

    Moreton, Gregory; Meydan, Turgut; Williams, Paul

    2018-04-01

    The usage of planar sensors is widespread due to their non-contact nature and small size profiles; however, only a few basic design types are generally considered. In order to develop planar coil designs, we have performed extensive finite element modelling (FEM) and experimentation to understand the performance of different planar sensor topologies when used in inductive sensing, and we have applied this approach to develop a novel displacement sensor. Models of different topologies with varying pitch values were analysed using the ANSYS Maxwell FEM package; the models incorporated a movable soft-magnetic amorphous ribbon element. The different models used in the FEM were then constructed and experimentally tested, with topologies that included mesh, meander, square coil, and circular coil configurations. The sensors were used to detect the displacement of the amorphous ribbon. A LabView program controlled both the displacement stage and the impedance analyser, the latter capturing the varying inductance values with ribbon displacement. There was good correlation between the FEM models and the experimental data, confirming that the methodology described here offers an effective way to develop planar-coil-based sensors with improved performance.

  5. Effect of environmental factors on pavement deterioration : Final report, Volume II of II

    DOT National Transportation Integrated Search

    1988-11-01

    A computerized model for the determination of pavement deterioration responsibilities due to load and non-load related factors was developed. The model is based on predicted pavement performance and the relationship of pavement performance to a quant...

  6. Effect of environmental factors on pavement deterioration : Final report, Volume I of II.

    DOT National Transportation Integrated Search

    1988-11-01

    A computerized model for the determination of pavement deterioration responsibilities due to load and non-load related factors was developed. The model is based on predicted pavement performance and the relationship of pavement performance to a quant...

  7. Predicting Energy Performance of a Net-Zero Energy Building: A Statistical Approach

    PubMed Central

    Kneifel, Joshua; Webb, David

    2016-01-01

    Performance-based building requirements have become more prevalent because they give freedom in building design while still maintaining or exceeding the energy performance required by prescriptive-based requirements. In order to determine if building designs reach target energy efficiency improvements, it is necessary to estimate the energy performance of a building using predictive models and different weather conditions. Physics-based whole building energy simulation modeling is the most common approach. However, these physics-based models include underlying assumptions and require significant amounts of information in order to specify the input parameter values. An alternative approach to test the performance of a building is to develop a statistically derived predictive regression model using post-occupancy data that can accurately predict energy consumption and production based on a few common weather-based factors, thus requiring less information than simulation models. A regression model based on measured data should be able to predict energy performance of a building for a given day as long as the weather conditions are similar to those during the data collection time frame. This article uses data from the National Institute of Standards and Technology (NIST) Net-Zero Energy Residential Test Facility (NZERTF) to develop and validate a regression model to predict the energy performance of the NZERTF using two weather variables aggregated to the daily level, applies the model to estimate the energy performance of hypothetical NZERTFs located in different cities in the Mixed-Humid climate zone, and compares these estimates to the results from already existing EnergyPlus whole building energy simulations. This regression model exhibits agreement with EnergyPlus predictive trends in energy production and net consumption, but differs greatly in energy consumption. The model can be used as a framework for alternative and more complex models based on the experimental data collected from the NZERTF. PMID:27956756
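
    A minimal sketch of the statistical approach: an ordinary least-squares regression of daily net energy consumption on two weather-derived features. The variable choices (outdoor temperature and solar irradiance) and the synthetic data are assumptions, stand-ins for the NZERTF measurements.

        import numpy as np
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(4)
        temp = rng.uniform(-5, 35, size=365)    # daily mean outdoor temperature [C]
        solar = rng.uniform(1, 8, size=365)     # daily insolation [kWh/m^2]
        # net consumption rises with heating/cooling load, falls with PV output
        net_kwh = 0.08 * (temp - 18) ** 2 - 3.0 * solar + rng.normal(scale=2, size=365)

        X = np.column_stack([(temp - 18) ** 2, solar])   # physically motivated features
        model = LinearRegression().fit(X, net_kwh)
        print("R^2 on training data:", round(model.score(X, net_kwh), 3))
        print("predicted net kWh for 30 C, 6 kWh/m^2:",
              round(model.predict([[(30 - 18) ** 2, 6.0]])[0], 1))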

  8. Predicting Energy Performance of a Net-Zero Energy Building: A Statistical Approach.

    PubMed

    Kneifel, Joshua; Webb, David

    2016-09-01

    Performance-based building requirements have become more prevalent because they give freedom in building design while still maintaining or exceeding the energy performance required by prescriptive-based requirements. In order to determine if building designs reach target energy efficiency improvements, it is necessary to estimate the energy performance of a building using predictive models and different weather conditions. Physics-based whole building energy simulation modeling is the most common approach. However, these physics-based models include underlying assumptions and require significant amounts of information in order to specify the input parameter values. An alternative approach to test the performance of a building is to develop a statistically derived predictive regression model using post-occupancy data that can accurately predict energy consumption and production based on a few common weather-based factors, thus requiring less information than simulation models. A regression model based on measured data should be able to predict energy performance of a building for a given day as long as the weather conditions are similar to those during the data collection time frame. This article uses data from the National Institute of Standards and Technology (NIST) Net-Zero Energy Residential Test Facility (NZERTF) to develop and validate a regression model to predict the energy performance of the NZERTF using two weather variables aggregated to the daily level, applies the model to estimate the energy performance of hypothetical NZERTFs located in different cities in the Mixed-Humid climate zone, and compares these estimates to the results from already existing EnergyPlus whole building energy simulations. This regression model exhibits agreement with EnergyPlus predictive trends in energy production and net consumption, but differs greatly in energy consumption. The model can be used as a framework for alternative and more complex models based on the experimental data collected from the NZERTF.

  9. The development of comparative bias index

    NASA Astrophysics Data System (ADS)

    Aimran, Ahmad Nazim; Ahmad, Sabri; Afthanorhan, Asyraf; Awang, Zainudin

    2017-08-01

    Structural Equation Modeling (SEM) is a second-generation statistical analysis technique developed for analyzing the inter-relationships among multiple variables in a model simultaneously. The two most commonly used SEM methods are Covariance-Based Structural Equation Modeling (CB-SEM) and Partial Least Squares Path Modeling (PLS-PM). There have been continuous debates among researchers on the use of PLS-PM versus CB-SEM, yet few studies have tested the bias of CB-SEM and PLS-PM in estimating simulated data. This study addresses that gap by (a) developing the Comparative Bias Index and (b) testing the performance of CB-SEM and PLS-PM using the developed index. Based on a balanced experimental design, two multivariate normal simulated data sets with distinct specifications, of sizes 50, 100, 200, and 500, are generated and analyzed using CB-SEM and PLS-PM.
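
    A minimal sketch of the simulation logic behind such a comparative bias index: generate multivariate-normal samples at the four sizes, estimate a known parameter with two competing methods, and compare mean absolute bias. The two "methods" below are simple stand-ins, not CB-SEM and PLS-PM themselves.

        import numpy as np

        rng = np.random.default_rng(5)
        true_corr = 0.6
        cov = np.array([[1.0, true_corr], [true_corr, 1.0]])

        for n in (50, 100, 200, 500):
            bias_a, bias_b = [], []
            for _ in range(1000):
                x = rng.multivariate_normal([0, 0], cov, size=n)
                r = np.corrcoef(x[:, 0], x[:, 1])[0, 1]   # "method A": Pearson r
                r_shrunk = r * (n - 1) / n                # "method B": a shrunken r
                bias_a.append(abs(r - true_corr))
                bias_b.append(abs(r_shrunk - true_corr))
            print(f"n={n:3d}  mean |bias| A={np.mean(bias_a):.4f}  B={np.mean(bias_b):.4f}")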

  10. Membrane dish analysis: A summary of structural and optical analysis capabilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steele, C.R.; Balch, C.D.; Jorgensen, G.J.

    Research at SERI within the Department of Energy's Solar Thermal Technology Program has focused on the development of membrane dish concentrators for space and terrestrial power applications. As potentially lightweight, inexpensive, high-performance structures, they are excellent candidates for space-deployable energy sources as well as cost-effective terrestrial energy concepts. A thorough engineering research treatment of these types of structures consists primarily of two parts: (1) structural mechanics of the membrane and ring support and (2) analysis and characterization of the concentrator optical performance. It is important to understand the effects of the membrane's structure and support system on the optical performance of the concentrator. This requires an interface between appropriate structural and optical models. Until recently, such models and the required interface have not existed. This report documents research that has been conducted at SERI in this area. It is a compilation of several papers describing structural models of membrane dish structures and optical models used to predict dish concentrator optical and thermal performance. The structural models were developed under SERI subcontract by Dr. Steele and Dr. Balch of Stanford University. The optical model was developed in-house by SERI staff. In addition, the interface between the models is described. It allows easy and thorough characterization of membrane dish systems from the mechanics to the resulting optical performance. The models described herein have been and continue to be extremely useful to SERI, industry, and universities involved with the modeling and analysis of lightweight membrane concentrators for solar thermal applications.

  11. The fourth radiation transfer model intercomparison (RAMI-IV): Proficiency testing of canopy reflectance models with ISO-13528

    NASA Astrophysics Data System (ADS)

    Widlowski, J.-L.; Pinty, B.; Lopatka, M.; Atzberger, C.; Buzica, D.; Chelle, M.; Disney, M.; Gastellu-Etchegorry, J.-P.; Gerboles, M.; Gobron, N.; Grau, E.; Huang, H.; Kallel, A.; Kobayashi, H.; Lewis, P. E.; Qin, W.; Schlerf, M.; Stuckens, J.; Xie, D.

    2013-07-01

    The radiation transfer model intercomparison (RAMI) activity aims at assessing the reliability of physics-based radiative transfer (RT) models under controlled experimental conditions. RAMI focuses on computer simulation models that mimic the interactions of radiation with plant canopies. These models are increasingly used in the development of satellite retrieval algorithms for terrestrial essential climate variables (ECVs). Rather than applying ad hoc performance metrics, RAMI-IV makes use of existing ISO standards to enhance the rigor of its protocols evaluating the quality of RT models. ISO-13528 was developed "to determine the performance of individual laboratories for specific tests or measurements." More specifically, it aims to guarantee that measurement results fall within specified tolerance criteria from a known reference. Of particular interest to RAMI is that ISO-13528 provides guidelines for comparisons where the true value of the target quantity is unknown. In those cases, "truth" must be replaced by a reliable "conventional reference value" to enable absolute performance tests. This contribution will show, for the first time, how the ISO-13528 standard developed by the chemical and physical measurement communities can be applied to proficiency testing of computer simulation models. Step by step, the pre-screening of data, the identification of reference solutions, and the choice of proficiency statistics will be discussed and illustrated with simulation results from the RAMI-IV "abstract canopy" scenarios. Detailed performance statistics of the participating RT models will be provided and the role of the accuracy of the reference solutions as well as the choice of the tolerance criteria will be highlighted.

  12. A high-resolution global flood hazard model

    NASA Astrophysics Data System (ADS)

    Sampson, Christopher C.; Smith, Andrew M.; Bates, Paul B.; Neal, Jeffrey C.; Alfieri, Lorenzo; Freer, Jim E.

    2015-09-01

    Floods are a natural hazard that affect communities worldwide, but to date the vast majority of flood hazard research and mapping has been undertaken by wealthy developed nations. As populations and economies have grown across the developing world, so too has demand from governments, businesses, and NGOs for modeled flood hazard data in these data-scarce regions. We identify six key challenges faced when developing a flood hazard model that can be applied globally and present a framework methodology that leverages recent cross-disciplinary advances to tackle each challenge. The model produces return period flood hazard maps at ~90 m resolution for the whole terrestrial land surface between 56°S and 60°N, and results are validated against high-resolution government flood hazard data sets from the UK and Canada. The global model is shown to capture between two thirds and three quarters of the area determined to be at risk in the benchmark data without generating excessive false positive predictions. When aggregated to ~1 km, mean absolute error in flooded fraction falls to ~5%. The full complexity global model contains an automatically parameterized subgrid channel network, and comparison to both a simplified 2-D only variant and an independently developed pan-European model shows the explicit inclusion of channels to be a critical contributor to improved model performance. While careful processing of existing global terrain data sets enables reasonable model performance in urban areas, adoption of forthcoming next-generation global terrain data sets will offer the best prospect for a step-change improvement in model performance.

  13. High Performance Computing for Modeling Wind Farms and Their Impact

    NASA Astrophysics Data System (ADS)

    Mavriplis, D.; Naughton, J. W.; Stoellinger, M. K.

    2016-12-01

    As energy generated by wind penetrates further into our electrical system, modeling of power production, power distribution, and the economic impact of wind-generated electricity is growing in importance. The models used for this work can range in fidelity from simple codes that run on a single computer to those that require high performance computing capabilities. Over the past several years, high fidelity models have been developed and deployed on the NCAR-Wyoming Supercomputing Center's Yellowstone machine. One of the primary modeling efforts focuses on developing the capability to compute the behavior of a wind farm in complex terrain under realistic atmospheric conditions. Fully modeling this system requires the simulation of continental flows to modeling the flow over a wind turbine blade, including down to the blade boundary level, fully 10 orders of magnitude in scale. To accomplish this, the simulations are broken up by scale, with information from the larger scales being passed to the lower scale models. In the code being developed, four scale levels are included: the continental weather scale, the local atmospheric flow in complex terrain, the wind plant scale, and the turbine scale. The current state of the models in the latter three scales will be discussed. These simulations are based on a high-order accurate dynamic overset and adaptive mesh approach, which runs at large scale on the NWSC Yellowstone machine. A second effort on modeling the economic impact of new wind development as well as improvement in wind plant performance and enhancements to the transmission infrastructure will also be discussed.

  14. A Model of Desired Performance in Phylogenetic Tree Construction for Teaching Evolution.

    ERIC Educational Resources Information Center

    Brewer, Steven D.

    This research paper examines phylogenetic tree construction-a form of problem solving in biology-by studying the strategies and heuristics used by experts. One result of the research is the development of a model of desired performance for phylogenetic tree construction. A detailed description of the model and the sample problems which illustrate…

  15. Investigating different approaches to develop informative priors in hierarchical Bayesian safety performance functions.

    PubMed

    Yu, Rongjie; Abdel-Aty, Mohamed

    2013-07-01

    The Bayesian inference method has been frequently adopted to develop safety performance functions. One advantage of Bayesian inference is that prior information about the independent variables can be included in the inference procedure. However, few studies have discussed how to formulate informative priors for the independent variables or evaluated the effects of incorporating informative priors in developing safety performance functions. This paper addresses this deficiency by introducing four approaches to developing informative priors for the independent variables based on historical data and expert experience. Merits of these informative priors have been tested along with two types of Bayesian hierarchical models (Poisson-gamma and Poisson-lognormal models). Deviance information criterion (DIC), R-square values, and coefficients of variance for the estimations were utilized as evaluation measures to select the best model(s). Comparison across the models indicated that the Poisson-gamma model is superior, with a better model fit, and is much more robust with the informative priors. Moreover, the two-stage Bayesian updating informative priors provided the best goodness-of-fit and coefficient estimation accuracies. Furthermore, informative priors for the inverse dispersion parameter have also been introduced and tested, and the effects of different types of informative priors on the model estimations and goodness-of-fit have been compared. Finally, based on the results, recommendations for future research topics and study applications have been made. Copyright © 2013 Elsevier Ltd. All rights reserved.
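
    A minimal sketch of a Poisson-gamma safety performance function with informative priors, assuming PyMC (the study does not name its software) and expressing the Poisson-gamma mixture as a negative binomial. The prior means and the synthetic crash data are illustrative; in practice the priors would come from historical data or expert experience, as the paper describes.

        import numpy as np
        import pymc as pm

        rng = np.random.default_rng(6)
        aadt = rng.uniform(5e3, 5e4, size=100)                   # traffic exposure
        crashes = rng.poisson(np.exp(-6 + 0.9 * np.log(aadt)))   # synthetic counts

        with pm.Model():
            b0 = pm.Normal("b0", mu=-6.0, sigma=1.0)    # informative prior (historical)
            b1 = pm.Normal("b1", mu=0.9, sigma=0.2)     # informative prior (historical)
            phi = pm.Gamma("phi", alpha=1.0, beta=1.0)  # inverse dispersion parameter
            mu = pm.math.exp(b0 + b1 * np.log(aadt))
            pm.NegativeBinomial("y", mu=mu, alpha=phi, observed=crashes)
            idata = pm.sample(1000, tune=1000, chains=2, progressbar=False)

        print("posterior mean of b1:", idata.posterior["b1"].mean().item())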

  16. Regional analyses of labor markets and demography: a model based Norwegian example.

    PubMed

    Stambol, L S; Stolen, N M; Avitsland, T

    1998-01-01

    The authors discuss the regional REGARD model, developed by Statistics Norway to analyze the regional implications of macroeconomic development of employment, labor force, and unemployment. "In building the model, empirical analyses of regional producer behavior in manufacturing industries have been performed, and the relation between labor market development and regional migration has been investigated. Apart from providing a short description of the REGARD model, this article demonstrates the functioning of the model, and presents some results of an application." excerpt

  17. Feature Extraction of Event-Related Potentials Using Wavelets: An Application to Human Performance Monitoring

    NASA Technical Reports Server (NTRS)

    Trejo, Leonard J.; Shensa, Mark J.; Remington, Roger W. (Technical Monitor)

    1998-01-01

    This report describes the development and evaluation of mathematical models for predicting human performance from discrete wavelet transforms (DWT) of event-related potentials (ERP) elicited by task-relevant stimuli. The DWT was compared to principal components analysis (PCA) for representation of ERPs in linear regression and neural network models developed to predict a composite measure of human signal detection performance. Linear regression models based on coefficients of the decimated DWT predicted signal detection performance with half as many free parameters as comparable models based on PCA scores. In addition, the DWT-based models were more resistant to model degradation due to over-fitting than PCA-based models. Feed-forward neural networks were trained using the backpropagation algorithm to predict signal detection performance based on raw ERPs, PCA scores, or high-power coefficients of the DWT. Neural networks based on high-power DWT coefficients trained with fewer iterations, generalized to new data better, and were more resistant to overfitting than networks based on raw ERPs. Networks based on PCA scores did not generalize to new data as well as either the DWT network or the raw ERP network. The results show that wavelet expansions represent the ERP efficiently and extract behaviorally important features for use in linear regression or neural network models of human performance. The efficiency of the DWT is discussed in terms of its decorrelation and energy compaction properties. In addition, the DWT models provided evidence that a pattern of low-frequency activity (1 to 3.5 Hz) occurring at specific times and scalp locations is a reliable correlate of human signal detection performance.
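
    A minimal sketch of the feature-extraction step, assuming PyWavelets: a decimated DWT of one ERP epoch with only the highest-power coefficients retained as regression features. The synthetic epoch, wavelet choice (db4), and decomposition level are illustrative assumptions.

        import numpy as np
        import pywt

        rng = np.random.default_rng(7)
        t = np.linspace(0, 1, 256)                  # 1-s epoch sampled at 256 Hz
        erp = np.exp(-((t - 0.3) / 0.05) ** 2) + 0.2 * rng.normal(size=t.size)

        coeffs = pywt.wavedec(erp, "db4", level=5)  # multilevel decimated DWT
        flat = np.concatenate(coeffs)

        k = 16                                      # keep the 16 highest-power coefficients
        idx = np.argsort(np.abs(flat))[-k:]
        features = flat[idx]
        print("feature vector shape:", features.shape)   # input to a linear/NN model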

  18. Feature extraction of event-related potentials using wavelets: an application to human performance monitoring

    NASA Technical Reports Server (NTRS)

    Trejo, L. J.; Shensa, M. J.

    1999-01-01

    This report describes the development and evaluation of mathematical models for predicting human performance from discrete wavelet transforms (DWT) of event-related potentials (ERP) elicited by task-relevant stimuli. The DWT was compared to principal components analysis (PCA) for representation of ERPs in linear regression and neural network models developed to predict a composite measure of human signal detection performance. Linear regression models based on coefficients of the decimated DWT predicted signal detection performance with half as many free parameters as comparable models based on PCA scores. In addition, the DWT-based models were more resistant to model degradation due to over-fitting than PCA-based models. Feed-forward neural networks were trained using the backpropagation algorithm to predict signal detection performance based on raw ERPs, PCA scores, or high-power coefficients of the DWT. Neural networks based on high-power DWT coefficients trained with fewer iterations, generalized to new data better, and were more resistant to overfitting than networks based on raw ERPs. Networks based on PCA scores did not generalize to new data as well as either the DWT network or the raw ERP network. The results show that wavelet expansions represent the ERP efficiently and extract behaviorally important features for use in linear regression or neural network models of human performance. The efficiency of the DWT is discussed in terms of its decorrelation and energy compaction properties. In addition, the DWT models provided evidence that a pattern of low-frequency activity (1 to 3.5 Hz) occurring at specific times and scalp locations is a reliable correlate of human signal detection performance. Copyright 1999 Academic Press.

  19. Building the infrastructure: the effects of role identification behaviors on team cognition development and performance.

    PubMed

    Pearsall, Matthew J; Ellis, Aleksander P J; Bell, Bradford S

    2010-01-01

    The primary purpose of this study was to extend theory and research regarding the emergence of mental models and transactive memory in teams. Utilizing Kozlowski, Gully, Nason, and Smith's (1999) model of team compilation, we examined the effect of role identification behaviors and posited that such behaviors represent the initial building blocks of team cognition during the role compilation phase of team development. We then hypothesized that team mental models and transactive memory would convey the effects of these behaviors onto team performance in the team compilation phase of development. Results from 60 teams working on a command-and-control simulation supported our hypotheses. Copyright 2009 APA, all rights reserved.

  20. A systematic review of predictive models for asthma development in children.

    PubMed

    Luo, Gang; Nkoy, Flory L; Stone, Bryan L; Schmick, Darell; Johnson, Michael D

    2015-11-28

    Asthma is the most common pediatric chronic disease, affecting 9.6% of American children. Delay in asthma diagnosis is prevalent, resulting in suboptimal asthma management. To help avoid delay in asthma diagnosis and advance asthma prevention research, researchers have proposed various models to predict asthma development in children. This paper reviews these models. A systematic review was conducted through searching in PubMed, EMBASE, CINAHL, Scopus, the Cochrane Library, the ACM Digital Library, IEEE Xplore, and OpenGrey up to June 3, 2015. The literature on predictive models for asthma development in children was retrieved, with search results limited to human subjects and children (birth to 18 years). Two independent reviewers screened the literature, performed data extraction, and assessed article quality. The literature search returned 13,101 references in total. After manual review, 32 of these references were determined to be relevant and are discussed in the paper. We identify several limitations of existing predictive models for asthma development in children, and provide preliminary thoughts on how to address these limitations. Existing predictive models for asthma development in children have inadequate accuracy. Efforts to improve these models' performance are needed, but are limited by a lack of a gold standard for asthma development in children.

  1. MARMOT update for oxide fuel modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Yongfeng; Schwen, Daniel; Chakraborty, Pritam

This report summarizes the lower-length-scale research and development progress in FY16 at Idaho National Laboratory in developing mechanistic materials models for oxide fuels, in parallel to the development of the MARMOT code, which will be summarized in a separate report. This effort is a critical component of the microstructure-based fuel performance modeling approach, supported by the Fuels Product Line in the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program. The progress can be classified into three categories: 1) development of materials models to be used in engineering-scale fuel performance modeling regarding the effect of lattice defects on thermal conductivity, 2) development of modeling capabilities for mesoscale fuel behaviors including stage-3 gas release, grain growth, high burn-up structure, fracture, and creep, and 3) improved understanding in materials science gained by calculating the anisotropic grain boundary energies in UO$_2$ and obtaining thermodynamic data for solid fission products. Many of these topics are still under active development; they are updated in the report with an appropriate amount of detail. For some topics, separate reports are generated in parallel and are so noted in the text. These accomplishments have led to a better understanding of fuel behaviors and an enhanced capability of the MOOSE-BISON-MARMOT toolkit.

  2. Study to design and develop remote manipulator systems

    NASA Technical Reports Server (NTRS)

    Hill, J. W.; Salisbury, J. K., Jr.

    1977-01-01

    A description is given of part of a continuing effort both to develop models for and to augment the performance of humans controlling remote manipulators. The project plan calls for the performance of several standard tasks with a number of different manipulators, controls, and viewing conditions, using an automated performance measuring system; in addition, the project plan calls for the development of a force-reflecting joystick and supervisory display system.

  3. Development of a computational model for astronaut reorientation.

    PubMed

    Stirling, Leia; Willcox, Karen; Newman, Dava

    2010-08-26

    The ability to model astronaut reorientations computationally provides a simple way to develop and study human motion control strategies. Since the cost of experimenting in microgravity is high, and underwater training can lead to motions inappropriate for microgravity, these techniques allow for motions to be developed and well-understood prior to any microgravity exposure. By including a model of the current space suit, we have the ability to study both intravehicular and extravehicular activities. We present several techniques for rotating about the axes of the body and show that motions performed by the legs create a greater net rotation than those performed by the arms. Adding a space suit to the motions was seen to increase the resistance torque and limit the available range of motion. While rotations about the body axes can be performed in the current space suit, the resulting motions generated a reduced rotation when compared to the unsuited configuration. 2010 Elsevier Ltd. All rights reserved.

  4. A Dynamic Simulation Model of Organizational Culture and Business Strategy Effects on Performance

    NASA Astrophysics Data System (ADS)

    Trivellas, Panagiotis; Reklitis, Panagiotis; Konstantopoulos, Nikolaos

    2007-12-01

In the past two decades, the organizational culture literature has gained tremendous interest from both academics and practitioners. This is based not only on the suggestion that culture is related to performance, but also on the view that it is subject to direct managerial control and manipulation in the desired direction. In the present paper, we adopt the Competing Values Framework (CVF) to operationalise organizational culture and Porter's typology to conceptualize business strategy (cost leadership, innovative and marketing differentiation, and focus). Although simulation of social events is a quite difficult task, since there are so many considerations (not all well understood) involved, in the present study we developed a dynamic model to simulate the effects of organizational culture and strategy on financial performance. Data obtained from a six-year survey in the banking sector of a European developing economy were used for the proposed dynamic model development.

  5. Benchmarking novel approaches for modelling species range dynamics

    PubMed Central

    Zurell, Damaris; Thuiller, Wilfried; Pagel, Jörn; Cabral, Juliano S; Münkemüller, Tamara; Gravel, Dominique; Dullinger, Stefan; Normand, Signe; Schiffers, Katja H.; Moore, Kara A.; Zimmermann, Niklaus E.

    2016-01-01

Increasing biodiversity loss due to climate change is one of the most vital challenges of the 21st century. To anticipate and mitigate biodiversity loss, models are needed that reliably project species’ range dynamics and extinction risks. Recently, several new approaches to model range dynamics have been developed to supplement correlative species distribution models (SDMs), but applications clearly lag behind model development. Indeed, no comparative analysis has been performed to evaluate their performance. Here, we build on process-based, simulated data for benchmarking five range (dynamic) models of varying complexity including classical SDMs, SDMs coupled with simple dispersal or more complex population dynamic models (SDM hybrids), and a hierarchical Bayesian process-based dynamic range model (DRM). We specifically test the effects of demographic and community processes on model predictive performance. Under current climate, DRMs performed best, although only marginally. Under climate change, predictive performance varied considerably, with no clear winners. Yet, all range dynamic models improved predictions under climate change substantially compared to purely correlative SDMs, and the population dynamic models also predicted reasonable extinction risks for most scenarios. When benchmarking data were simulated with more complex demographic and community processes, simple SDM hybrids including only dispersal often proved most reliable. Finally, we found that structural decisions during model building can have great impact on model accuracy, but prior system knowledge on important processes can reduce these uncertainties considerably. Our results affirm the clear merit of using dynamic approaches for modelling species’ response to climate change but also emphasise several needs for further model and data improvement. We propose and discuss perspectives for improving range projections through combination of multiple models and for making these approaches operational for large numbers of species. PMID:26872305

  6. Benchmarking novel approaches for modelling species range dynamics.

    PubMed

    Zurell, Damaris; Thuiller, Wilfried; Pagel, Jörn; Cabral, Juliano S; Münkemüller, Tamara; Gravel, Dominique; Dullinger, Stefan; Normand, Signe; Schiffers, Katja H; Moore, Kara A; Zimmermann, Niklaus E

    2016-08-01

Increasing biodiversity loss due to climate change is one of the most vital challenges of the 21st century. To anticipate and mitigate biodiversity loss, models are needed that reliably project species' range dynamics and extinction risks. Recently, several new approaches to model range dynamics have been developed to supplement correlative species distribution models (SDMs), but applications clearly lag behind model development. Indeed, no comparative analysis has been performed to evaluate their performance. Here, we build on process-based, simulated data for benchmarking five range (dynamic) models of varying complexity including classical SDMs, SDMs coupled with simple dispersal or more complex population dynamic models (SDM hybrids), and a hierarchical Bayesian process-based dynamic range model (DRM). We specifically test the effects of demographic and community processes on model predictive performance. Under current climate, DRMs performed best, although only marginally. Under climate change, predictive performance varied considerably, with no clear winners. Yet, all range dynamic models improved predictions under climate change substantially compared to purely correlative SDMs, and the population dynamic models also predicted reasonable extinction risks for most scenarios. When benchmarking data were simulated with more complex demographic and community processes, simple SDM hybrids including only dispersal often proved most reliable. Finally, we found that structural decisions during model building can have great impact on model accuracy, but prior system knowledge on important processes can reduce these uncertainties considerably. Our results affirm the clear merit of using dynamic approaches for modelling species' response to climate change but also emphasize several needs for further model and data improvement. We propose and discuss perspectives for improving range projections through combination of multiple models and for making these approaches operational for large numbers of species. © 2016 John Wiley & Sons Ltd.

  7. Integrated modeling tool for performance engineering of complex computer systems

    NASA Technical Reports Server (NTRS)

    Wright, Gary; Ball, Duane; Hoyt, Susan; Steele, Oscar

    1989-01-01

    This report summarizes Advanced System Technologies' accomplishments on the Phase 2 SBIR contract NAS7-995. The technical objectives of the report are: (1) to develop an evaluation version of a graphical, integrated modeling language according to the specification resulting from the Phase 2 research; and (2) to determine the degree to which the language meets its objectives by evaluating ease of use, utility of two sets of performance predictions, and the power of the language constructs. The technical approach followed to meet these objectives was to design, develop, and test an evaluation prototype of a graphical, performance prediction tool. The utility of the prototype was then evaluated by applying it to a variety of test cases found in the literature and in AST case histories. Numerous models were constructed and successfully tested. The major conclusion of this Phase 2 SBIR research and development effort is that complex, real-time computer systems can be specified in a non-procedural manner using combinations of icons, windows, menus, and dialogs. Such a specification technique provides an interface that system designers and architects find natural and easy to use. In addition, PEDESTAL's multiview approach provides system engineers with the capability to perform the trade-offs necessary to produce a design that meets timing performance requirements. Sample system designs analyzed during the development effort showed that models could be constructed in a fraction of the time required by non-visual system design capture tools.

  8. Performance assessment of geospatial simulation models of land-use change--a landscape metric-based approach.

    PubMed

    Sakieh, Yousef; Salmanmahiny, Abdolrassoul

    2016-03-01

    Performance evaluation is a critical step when developing land-use and cover change (LUCC) models. The present study proposes a spatially explicit model performance evaluation method, adopting a landscape metric-based approach. To quantify GEOMOD model performance, a set of composition- and configuration-based landscape metrics including number of patches, edge density, mean Euclidean nearest neighbor distance, largest patch index, class area, landscape shape index, and splitting index were employed. The model takes advantage of three decision rules including neighborhood effect, persistence of change direction, and urbanization suitability values. According to the results, while class area, largest patch index, and splitting indices demonstrated insignificant differences between spatial pattern of ground truth and simulated layers, there was a considerable inconsistency between simulation results and real dataset in terms of the remaining metrics. Specifically, simulation outputs were simplistic and the model tended to underestimate number of developed patches by producing a more compact landscape. Landscape-metric-based performance evaluation produces more detailed information (compared to conventional indices such as the Kappa index and overall accuracy) on the model's behavior in replicating spatial heterogeneity features of a landscape such as frequency, fragmentation, isolation, and density. Finally, as the main characteristic of the proposed method, landscape metrics employ the maximum potential of observed and simulated layers for a performance evaluation procedure, provide a basis for more robust interpretation of a calibration process, and also deepen modeler insight into the main strengths and pitfalls of a specific land-use change model when simulating a spatiotemporal phenomenon.
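
    Two of the metrics named above (number of patches and edge density) are straightforward to compute on a binary developed/undeveloped raster. The sketch below is a minimal illustration; scipy's 8-connectivity labelling, the 30 m cell size, and the random grids are assumptions, not details from the study.

```python
import numpy as np
from scipy import ndimage

def patch_metrics(grid, cell_size=30.0):
    """grid: 2-D 0/1 array of developed cells; cell_size in metres."""
    _, n_patches = ndimage.label(grid, structure=np.ones((3, 3)))  # 8-connectivity
    # Edge length: developed/undeveloped transitions along rows and columns.
    edges = np.abs(np.diff(grid, axis=0)).sum() + np.abs(np.diff(grid, axis=1)).sum()
    area_ha = grid.size * cell_size ** 2 / 1e4
    return n_patches, edges * cell_size / area_ha   # patch count, metres per hectare

rng = np.random.default_rng(1)
observed, simulated = rng.integers(0, 2, (100, 100)), rng.integers(0, 2, (100, 100))
for name, g in [('observed', observed), ('simulated', simulated)]:
    print(name, patch_metrics(g))   # compare spatial pattern, not just cell counts
```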

  9. Design and implementation of ergonomic performance measurement system at a steel plant in India.

    PubMed

    Ray, Pradip Kumar; Tewari, V K

    2012-01-01

Management of Tata Steel, the largest steel making company of India in the private sector, felt the need to develop a framework to determine the levels of ergonomic performance at its different workplaces. The objectives of the study are twofold: to identify and characterize the ergonomic variables for a given worksystem with regard to work efficiency, operator safety, and working conditions, and to design a comprehensive Ergonomic Performance Indicator (EPI) for quantitative determination of the ergonomic status and maturity of a given worksystem. The IIT Kharagpur study team consists of three faculty members, and the management of Tata Steel formed a team of eleven members for implementation of the EPI model. In order to design and develop the EPI model with the full participation and understanding of the concerned personnel of Tata Steel, a three-phase action plan for the project was prepared: preparation and data collection, detailed structuring, and validation of the EPI model. Identification of ergonomic performance factors, development of an interaction matrix, design of the assessment tool, and testing and validation of the assessment tool (EPI) in varied situations are the major steps in these phases. The case study discusses the EPI model and its applications in detail.

  10. Benchmarking Model Variants in Development of a Hardware-in-the-Loop Simulation System

    NASA Technical Reports Server (NTRS)

    Aretskin-Hariton, Eliot D.; Zinnecker, Alicia M.; Kratz, Jonathan L.; Culley, Dennis E.; Thomas, George L.

    2016-01-01

Distributed engine control architecture presents a significant increase in complexity over traditional implementations when viewed from the perspective of system simulation and hardware design and test. Even if the overall function of the control scheme remains the same, the hardware implementation can have a significant effect on the overall system performance due to differences in the creation and flow of data between control elements. A Hardware-in-the-Loop (HIL) simulation system is under development at NASA Glenn Research Center that enables the exploration of these hardware dependent issues. The system is based on, but not limited to, the Commercial Modular Aero-Propulsion System Simulation 40k (C-MAPSS40k). This paper describes the step-by-step conversion from the self-contained baseline model to the hardware in the loop model, and the validation of each step. As the control model hardware fidelity was improved during HIL system development, benchmarking simulations were performed to verify that engine system performance characteristics remained the same. The results demonstrate that the goal of the effort was met: the new HIL configurations have functionality and performance similar to the baseline C-MAPSS40k system.

  11. Real-time flood forecasting by employing artificial neural network based model with zoning matching approach

    NASA Astrophysics Data System (ADS)

    Sulaiman, M.; El-Shafie, A.; Karim, O.; Basri, H.

    2011-10-01

Flood forecasting models are a necessity, as they help in planning for flood events, and thus help prevent loss of life and minimize damage. At present, artificial neural networks (ANN) have been successfully applied in river flow and water level forecasting studies. An ANN requires historical data to develop a forecasting model. However, long-term historical water level data, such as hourly data, pose two crucial problems in data training. First, the high volume of data slows the computation process. Second, training reaches its apparent optimum within a few cycles because normal water level data dominate the training set, while forecasting performance for high water level events remains poor. In this study, the zoning matching approach (ZMA) is used in ANN to accurately monitor flood events in real time by focusing the development of the forecasting model on high water level zones. ZMA is a trial-and-error approach in which several training datasets using high water level data are tested to find the best training dataset for forecasting high water level events. The advantage of ZMA is that relevant knowledge of water level patterns in historical records is used. Importantly, the forecasting model developed based on ZMA achieves highly accurate forecasting results at 1 to 3 h ahead and satisfactory performance at 6 h. Seven performance measures are adopted in this study to describe the accuracy and reliability of the forecasting model developed.
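
    A minimal sketch of the zoning matching idea — drawing candidate training sets from high water level zones and keeping the one that performs best — is given below. The percentile thresholds, lag structure, lead time, and network size are illustrative assumptions; a real application would score candidates on held-out flood events rather than on the training zone itself.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
levels = np.cumsum(rng.standard_normal(5000)) + 50.0   # synthetic hourly stage record

def make_xy(series, lags=6, lead=3):
    """Windows of `lags` past hours predicting the level `lead` hours ahead."""
    X = np.array([series[i - lags:i] for i in range(lags, len(series) - lead)])
    return X, series[lags + lead:]

X, y = make_xy(levels)
best = None
for threshold in np.percentile(levels, [70, 80, 90]):   # candidate high-level zones
    mask = X.max(axis=1) >= threshold        # keep windows reaching into the zone
    net = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
    net.fit(X[mask], y[mask])
    score = net.score(X[mask], y[mask])      # real use: score on held-out flood events
    if best is None or score > best[0]:
        best = (score, threshold)
print('selected zone threshold: %.1f' % best[1])
```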

  12. Flowfield characterization and model development in detonation tubes

    NASA Astrophysics Data System (ADS)

    Owens, Zachary Clark

    A series of experiments and numerical simulations are performed to advance the understanding of flowfield phenomena and impulse generation in detonation tubes. Experiments employing laser-based velocimetry, high-speed schlieren imaging and pressure measurements are used to construct a dataset against which numerical models can be validated. The numerical modeling culminates in the development of a two-dimensional, multi-species, finite-rate-chemistry, parallel, Navier-Stokes solver. The resulting model is specifically designed to assess unsteady, compressible, reacting flowfields, and its utility for studying multidimensional detonation structure is demonstrated. A reduced, quasi-one-dimensional model with source terms accounting for wall losses is also developed for rapid parametric assessment. Using these experimental and numerical tools, two primary objectives are pursued. The first objective is to gain an understanding of how nozzles affect unsteady, detonation flowfields and how they can be designed to maximize impulse in a detonation based propulsion system called a pulse detonation engine. It is shown that unlike conventional, steady-flow propulsion systems where converging-diverging nozzles generate optimal performance, unsteady detonation tube performance during a single-cycle is maximized using purely diverging nozzles. The second objective is to identify the primary underlying mechanisms that cause velocity and pressure measurements to deviate from idealized theory. An investigation of the influence of non-ideal losses including wall heat transfer, friction and condensation leads to the development of improved models that reconcile long-standing discrepancies between predicted and measured detonation tube performance. It is demonstrated for the first time that wall condensation of water vapor in the combustion products can cause significant deviations from ideal theory.

  13. Numerical and analytical investigation towards performance enhancement of a newly developed rockfall protective cable-net structure

    NASA Astrophysics Data System (ADS)

    Dhakal, S.; Bhandary, N. P.; Yatabe, R.; Kinoshita, N.

    2012-04-01

In a previous companion paper, we presented a three-tier modelling of a particular type of rockfall protective cable-net structure (barrier), newly developed in Japan. Therein, we developed a three-dimensional, Finite Element based, nonlinear numerical model, calibrated/back-calculated and verified with the element- and structure-level physical tests. Moreover, using a very simple, lumped-mass, single-degree-of-freedom, equivalently linear analytical model, a global-displacement-predictive correlation was devised by modifying the basic equation - obtained by combining the principles of conservation of linear momentum and energy - based on the back-analysis of the tests on the numerical model. In this paper, we use the developed models to explore the performance enhancement potential of the structure in terms of (a) the control of global displacement - possibly the major performance criterion for the proposed structure owing to the narrow space available at the targeted site, and (b) the increase in energy dissipation by the existing U-bolt-type Friction-brake Devices - which were identified to have performed weakly when integrated into the structure. One set of parametric investigations has revealed correlations for achieving the first objective in terms of the structure's mass, particularly by manipulating the wire-net's characteristics, and has additionally disclosed the effects of the impacting block's parameters. Towards the second objective, another set of parametric investigations has led to a proposal of a few innovative improvements in the constitutive behaviour (model) of the studied brake device (dissipator), along with an important recommendation of careful handling of the device based on the identified potential flaw.

  14. Developing sustainability: a new metaphor for progress.

    PubMed

    Bensimon, Cécile M; Benatar, Solomon R

    2006-01-01

    In this paper, we propose a new model for development, one that transcends the North-South dichotomy and goes beyond a narrow conception of development as an economic process. This model requires a paradigm shift toward a new metaphor that develops sustainability, rather than sustains development. We conclude by defending a 'report card on development' as a means for evaluating how countries perform within this new paradigm.

  15. Competency-Based Human Resource Development Strategy

    ERIC Educational Resources Information Center

    Gangani, Noordeen T.; McLean, Gary N.; Braden, Richard A.

    2004-01-01

    This paper explores issues in developing and implementing a competency-based human resource development strategy. The paper summarizes a literature review on how competency models can improve HR performance. A case study is presented of American Medical Systems (AMS), a mid-sized health-care and medical device company, where the model is being…

  16. Uncovering curvilinear relationships between conscientiousness and job performance: how theoretically appropriate measurement makes an empirical difference.

    PubMed

    Carter, Nathan T; Dalal, Dev K; Boyce, Anthony S; O'Connell, Matthew S; Kung, Mei-Chuan; Delgado, Kristin M

    2014-07-01

    The personality trait of conscientiousness has seen considerable attention from applied psychologists due to its efficacy for predicting job performance across performance dimensions and occupations. However, recent theoretical and empirical developments have questioned the assumption that more conscientiousness always results in better job performance, suggesting a curvilinear link between the 2. Despite these developments, the results of studies directly testing the idea have been mixed. Here, we propose this link has been obscured by another pervasive assumption known as the dominance model of measurement: that higher scores on traditional personality measures always indicate higher levels of conscientiousness. Recent research suggests dominance models show inferior fit to personality test scores as compared to ideal point models that allow for curvilinear relationships between traits and scores. Using data from 2 different samples of job incumbents, we show the rank-order changes that result from using an ideal point model expose a curvilinear link between conscientiousness and job performance 100% of the time, whereas results using dominance models show mixed results, similar to the current state of the literature. Finally, with an independent cross-validation sample, we show that selection based on predicted performance using ideal point scores results in more favorable objective hiring outcomes. Implications for practice and future research are discussed.
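
    In its simplest form, the curvilinearity question reduces to whether a quadratic conscientiousness term adds to the prediction of performance. The sketch below tests that with a plain quadratic regression on simulated scores; fitting an actual ideal point item response model, as the study does, requires specialised software and is not attempted here.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
c = rng.normal(size=500)                                # conscientiousness scores
perf = 0.5 * c - 0.3 * c ** 2 + rng.normal(size=500)    # built-in inverted-U relation

# Regress performance on linear and quadratic terms; a significant negative
# quadratic coefficient indicates a curvilinear (too-much-of-a-good-thing) link.
X = sm.add_constant(np.column_stack([c, c ** 2]))
fit = sm.OLS(perf, X).fit()
print('quadratic coefficient: %.3f (p = %.3g)' % (fit.params[2], fit.pvalues[2]))
```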

  17. Explaining match outcome in elite Australian Rules football using team performance indicators.

    PubMed

    Robertson, Sam; Back, Nicole; Bartlett, Jonathan D

    2016-01-01

The relationships between team performance indicators and match outcome have been examined in many team sports; however, such research is limited in Australian Rules football. Using data from the 2013 and 2014 Australian Football League (AFL) regular seasons, this study assessed the ability of commonly reported discrete team performance indicators presented in their relative form (standardised against their opposition for a given match) to explain match outcome (Win/Loss). Logistic regression and decision tree (chi-squared automatic interaction detection (CHAID)) analyses both revealed relative differences between opposing teams for "kicks" and "goal conversion" as the most influential in explaining match outcome, with the two models achieving 88.3% and 89.8% classification accuracies, respectively. Models incorporating a smaller performance indicator set displayed a slightly reduced ability to explain match outcome (81.0% and 81.5% for logistic regression and CHAID, respectively). However, both were fit to 2014 data with reduced error in comparison to the full models. Despite performance similarities across the two analysis approaches, the CHAID model revealed multiple winning performance indicator profiles, thereby increasing its comparative feasibility for use in the field. Coaches and analysts may find these results useful in informing strategy and game plan development in Australian Rules football, with the development of team-specific models recommended in future.
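
    The logistic-regression arm of such an analysis is compact; the sketch below regresses win/loss on relative (own minus opposition, standardised) indicators. The two indicator names and the simulated data are illustrative assumptions, not the AFL dataset.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 400
rel = rng.standard_normal((n, 2))    # relative 'kicks' and 'goal conversion' (assumed)
win = (1.2 * rel[:, 0] + 1.5 * rel[:, 1] + rng.logistic(size=n) > 0).astype(int)

# Cross-validated classification accuracy of Win/Loss from relative indicators.
acc = cross_val_score(LogisticRegression(), rel, win, cv=10, scoring='accuracy')
print('classification accuracy: %.1f%%' % (100 * acc.mean()))
```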

  18. Estimating Software-Development Costs With Greater Accuracy

    NASA Technical Reports Server (NTRS)

    Baker, Dan; Hihn, Jairus; Lum, Karen

    2008-01-01

COCOMOST is a computer program for use in estimating software development costs. The goal in the development of COCOMOST was to increase estimation accuracy in three ways: (1) develop a set of sensitivity software tools that return not only estimates of costs but also the estimation error; (2) using the sensitivity software tools, precisely define the quantities of data needed to adequately tune cost estimation models; and (3) build a repository of software-cost-estimation information that NASA managers can retrieve to improve the estimates of costs of developing software for their project. COCOMOST implements a methodology, called '2cee', in which a unique combination of well-known pre-existing data-mining and software-development-effort-estimation techniques are used to increase the accuracy of estimates. COCOMOST utilizes multiple models to analyze historical data pertaining to software-development projects and performs an exhaustive data-mining search over the space of model parameters to improve the performance of effort-estimation models. Thus, it is possible to both calibrate and generate estimates at the same time. COCOMOST is written in the C language for execution in the UNIX operating system.
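
    COCOMOST's internals are not reproduced here, but the COCOMO family it builds on estimates effort as a power law of size scaled by cost drivers, and the goal of returning an estimation error alongside the estimate can be mimicked by propagating input uncertainty. A hedged sketch, with the coefficients and size distribution as assumptions:

```python
import numpy as np

def cocomo_effort(ksloc, a=2.94, b=1.10, drivers=(1.0,)):
    """Effort in person-months: a * KSLOC**b * product(cost drivers)."""
    return a * ksloc ** b * np.prod(drivers)

# Propagate uncertainty in the size estimate to get an effort interval,
# echoing the tool's aim of reporting estimation error, not just a point value.
rng = np.random.default_rng(0)
size_kslocs = rng.normal(50.0, 10.0, 10000).clip(min=1.0)  # uncertain size estimate
efforts = cocomo_effort(size_kslocs)
print('effort: %.0f PM (90%% interval %.0f-%.0f PM)'
      % (np.median(efforts), *np.percentile(efforts, [5, 95])))
```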

  19. Fatigue models for applied research in warfighting.

    PubMed

    Hursh, Steven R; Redmond, Daniel P; Johnson, Michael L; Thorne, David R; Belenky, Gregory; Balkin, Thomas J; Storm, William F; Miller, James C; Eddy, Douglas R

    2004-03-01

    The U.S. Department of Defense (DOD) has long pursued applied research concerning fatigue in sustained and continuous military operations. In 1996, Hursh developed a simple homeostatic fatigue model and programmed the model into an actigraph to give a continuous indication of performance. Based on this initial work, the Army conducted a study of 1 wk of restricted sleep in 66 subjects with multiple measures of performance, termed the Sleep Dose-Response Study (SDR). This study provided numerical estimation of parameters for the Walter Reed Army Institute of Research Sleep Performance Model (SPM) and elucidated the relationships among several sleep-related performance measures. Concurrently, Hursh extended the original actigraph modeling structure and software expressions for use in other practical applications. The model became known as the Sleep, Activity, Fatigue, and Task Effectiveness (SAFTE) Model, and Hursh has applied it in the construction of a Fatigue Avoidance Scheduling Tool. This software is designed to help optimize the operational management of aviation ground and flight crews, but is not limited to that application. This paper describes the working fatigue model as it is being developed by the DOD laboratories, using the conceptual framework, vernacular, and notation of the SAFTE Model. At specific points where the SPM may differ from SAFTE, this is discussed. Extensions of the SAFTE Model to incorporate dynamic phase adjustment for both transmeridian relocation and shift work are described. The unexpected persistence of performance effects following chronic sleep restriction found in the SDR study necessitated some revisions of the SAFTE Model that are also described. The paper concludes with a discussion of several important modeling issues that remain to be addressed.
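
    The homeostatic core of a SAFTE-style model can be sketched as a capacity-limited sleep reservoir that drains during wake and refills during sleep, with predicted effectiveness tracking the reservoir level plus a circadian rhythm. The rate constants and circadian amplitude below are illustrative assumptions, not the published SAFTE parameters.

```python
import numpy as np

CAPACITY, USE_RATE, FILL_RATE = 2880.0, 1.0, 4.0   # reservoir units per minute (assumed)

def effectiveness(awake, r0=CAPACITY):
    """awake: boolean array, one entry per minute (True = awake)."""
    r, out = r0, []
    for minute, is_awake in enumerate(awake):
        # Homeostatic process: drain while awake, refill while asleep.
        r = max(0.0, r - USE_RATE) if is_awake else min(CAPACITY, r + FILL_RATE)
        hour = (minute / 60.0) % 24.0
        circadian = 5.0 * np.cos(2.0 * np.pi * (hour - 18.0) / 24.0)  # peak ~18:00
        out.append(100.0 * r / CAPACITY + circadian)
    return np.array(out)

# Sleep 00:00-07:00, awake 07:00-24:00, sampled per minute.
schedule = np.array([False] * (7 * 60) + [True] * (17 * 60))
print('predicted effectiveness at midnight: %.1f' % effectiveness(schedule)[-1])
```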

  20. Predicting BRCA1 and BRCA2 gene mutation carriers: comparison of LAMBDA, BRCAPRO, Myriad II, and modified Couch models.

    PubMed

    Lindor, Noralane M; Lindor, Rachel A; Apicella, Carmel; Dowty, James G; Ashley, Amanda; Hunt, Katherine; Mincey, Betty A; Wilson, Marcia; Smith, M Cathie; Hopper, John L

    2007-01-01

    Models have been developed to predict the probability that a person carries a detectable germline mutation in the BRCA1 or BRCA2 genes. Their relative performance in a clinical setting is unclear. To compare the performance characteristics of four BRCA1/BRCA2 gene mutation prediction models: LAMBDA, based on a checklist and scores developed from data on Ashkenazi Jewish (AJ) women; BRCAPRO, a Bayesian computer program; modified Couch tables based on regression analyses; and Myriad II tables collated by Myriad Genetics Laboratories. Family cancer history data were analyzed from 200 probands from the Mayo Clinic Familial Cancer Program, in a multispecialty tertiary care group practice. All probands had clinical testing for BRCA1 and BRCA2 mutations conducted in a single laboratory. For each model, performance was assessed by the area under the receiver operator characteristic curve (ROC) and by tests of accuracy and dispersion. Cases "missed" by one or more models (model predicted less than 10% probability of mutation when a mutation was actually found) were compared across models. All models gave similar areas under the ROC curve of 0.71 to 0.76. All models except LAMBDA substantially under-predicted the numbers of carriers. All models were too dispersed. In terms of ranking, all prediction models performed reasonably well with similar performance characteristics. Model predictions were widely discrepant for some families. Review of cancer family histories by an experienced clinician continues to be vital to ensure that critical elements are not missed and that the most appropriate risk prediction figures are provided.
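
    The core comparison — scoring each model's predicted carrier probabilities against observed mutation status by ROC area, and checking expected versus observed carrier counts for under-prediction — is easy to express. In the sketch below the probabilities are simulated stand-ins, not outputs of the four named models.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
carrier = rng.integers(0, 2, 200)                    # observed mutation status
models = {name: np.clip(0.3 * carrier + rng.beta(2, 5, 200), 0.0, 1.0)
          for name in ('LAMBDA', 'BRCAPRO', 'Myriad II', 'Couch')}

for name, p in models.items():
    # AUC measures ranking; summed probabilities vs. observed count flags
    # systematic under-prediction of carriers.
    print('%-9s AUC %.2f  expected carriers %5.1f  observed %d'
          % (name, roc_auc_score(carrier, p), p.sum(), carrier.sum()))
```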

  1. User's Manual for Data for Validating Models for PV Module Performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marion, W.; Anderberg, A.; Deline, C.

    2014-04-01

    This user's manual describes performance data measured for flat-plate photovoltaic (PV) modules installed in Cocoa, Florida, Eugene, Oregon, and Golden, Colorado. The data include PV module current-voltage curves and associated meteorological data for approximately one-year periods. These publicly available data are intended to facilitate the validation of existing models for predicting the performance of PV modules, and for the development of new and improved models. For comparing different modeling approaches, using these public data will provide transparency and more meaningful comparisons of the relative benefits.

  2. Queuing Models of Tertiary Storage

    NASA Technical Reports Server (NTRS)

    Johnson, Theodore

    1996-01-01

Large scale scientific projects generate and use large amounts of data. For example, the NASA Earth Observation System Data and Information System (EOSDIS) project is expected to archive one petabyte per year of raw satellite data. This data is made automatically available for processing into higher level data products and for dissemination to the scientific community. Such large volumes of data can only be stored in robotic storage libraries (RSLs) for near-line access. A characteristic of RSLs is the use of a robot arm that transfers media between a storage rack and the read/write drives, thus multiplying the capacity of the system. The performance of the RSLs can be a critical limiting factor for the performance of the archive system. However, the many interacting components of an RSL make a performance analysis difficult. In addition, different RSL components can have widely varying performance characteristics. This paper describes our work to develop performance models of an RSL in isolation. Next we show how the RSL model can be incorporated into a queuing network model. We use the models to make some example performance studies of archive systems. The models described in this paper, developed for the NASA EOSDIS project, are implemented in C with a well defined interface. The source code, accompanying documentation, and sample Java applets are available at: http://www.cis.ufl.edu/ted/
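
    A toy version of the modelling step described — treating a bank of drives fed by a robot arm as a queueing system and estimating the mean time a request spends in it by simulation — is shown below. The arrival rate, mount time, and read-time distribution are assumptions, and contention for the single robot arm itself is ignored for brevity.

```python
import random

def simulate_rsl(n_requests=10000, arrival_rate=0.03, mount=20.0, mean_read=60.0, drives=4):
    """Mean time a retrieval request spends in a robotic storage library (toy model)."""
    random.seed(0)
    t, drive_free, total = 0.0, [0.0] * drives, 0.0
    for _ in range(n_requests):
        t += random.expovariate(arrival_rate)               # Poisson request arrivals
        i = min(range(drives), key=drive_free.__getitem__)  # soonest-free drive
        start = max(t, drive_free[i]) + mount               # wait for drive, then mount
        drive_free[i] = start + random.expovariate(1.0 / mean_read)
        total += drive_free[i] - t                          # time in system
    return total / n_requests

print('mean time in system: %.0f s' % simulate_rsl())
```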

  3. Balancing energy development and conservation: A method utilizing species distribution models

    USGS Publications Warehouse

    Jarnevich, C.S.; Laubhan, M.K.

    2011-01-01

Alternative energy development is increasing, potentially leading to negative impacts on wildlife populations already stressed by other factors. Resource managers require a scientifically based methodology to balance energy development and species conservation, so we investigated modeling habitat suitability using Maximum Entropy to develop maps that could be used with other information to help site energy developments. We selected one species of concern, the Lesser Prairie-Chicken (LPCH; Tympanuchus pallidicinctus) found on the southern Great Plains of North America, as our case study. LPCH populations have been declining and are potentially further impacted by energy development. We used LPCH lek locations in the state of Kansas along with several environmental and anthropogenic parameters to develop models that predict the probability of lek occurrence across the landscape. The models all performed well as indicated by the high test area under the curve (AUC) scores (all >0.9). The inclusion of anthropogenic parameters in models resulted in slightly better performance based on AUC values, indicating that anthropogenic features may impact LPCH lek habitat suitability. Given the positive model results, this methodology may provide additional guidance in designing future survey protocols, as well as siting of energy development in areas of marginal or unsuitable habitat for species of concern. This technique could help to standardize and quantify the impacts various developments have upon at-risk species. © 2011 Springer Science+Business Media, LLC (outside the USA).

  4. Invited review: A position on the Global Livestock Environmental Assessment Model (GLEAM).

    PubMed

    MacLeod, M J; Vellinga, T; Opio, C; Falcucci, A; Tempio, G; Henderson, B; Makkar, H; Mottet, A; Robinson, T; Steinfeld, H; Gerber, P J

    2018-02-01

The livestock sector is one of the fastest growing subsectors of the agricultural economy and, while it makes a major contribution to global food supply and economic development, it also consumes significant amounts of natural resources and alters the environment. In order to improve our understanding of the global environmental impact of livestock supply chains, the Food and Agriculture Organization of the United Nations has developed the Global Livestock Environmental Assessment Model (GLEAM). The purpose of this paper is to provide a review of GLEAM. Specifically, it explains the model architecture, methods, and functionality, that is, the types of analysis that the model can perform. The model focuses primarily on the quantification of greenhouse gas emissions arising from the production of the 11 main livestock commodities. The model inputs and outputs are managed and produced as raster data sets, with a spatial resolution of 0.05 decimal degrees. The Global Livestock Environmental Assessment Model v1.0 consists of five distinct modules: (a) the Herd Module; (b) the Manure Module; (c) the Feed Module; (d) the System Module; (e) the Allocation Module. In terms of the modelling approach, GLEAM has several advantages. For example, spatial information on livestock distributions and crop yields enables rations to be derived that reflect the local availability of feed resources in developing countries. The Global Livestock Environmental Assessment Model also contains a herd model that enables livestock statistics to be disaggregated and variation in livestock performance and management to be captured. Priorities for future development of GLEAM include: improving data quality and the methods used to perform emissions calculations; extending the scope of the model to include selected additional environmental impacts and to enable predictive modelling; and improving the utility of GLEAM output.

  5. Stata Modules for Calculating Novel Predictive Performance Indices for Logistic Models.

    PubMed

    Barkhordari, Mahnaz; Padyab, Mojgan; Hadaegh, Farzad; Azizi, Fereidoun; Bozorgmanesh, Mohammadreza

    2016-01-01

Prediction is a fundamental part of prevention of cardiovascular diseases (CVD). The development of prediction algorithms based on multivariate regression models began several decades ago. In parallel with predictive model development, biomarker research emerged on an impressively large scale. The key question is how best to assess and quantify the improvement in risk prediction offered by new biomarkers or, more basically, how to assess the performance of a risk prediction model. Discrimination, calibration, and added predictive value have recently been suggested for comparing the predictive performance of models with and without novel biomarkers. A lack of user-friendly statistical software has restricted the implementation of novel model assessment methods when examining novel biomarkers. We intended, thus, to develop user-friendly software that could be used by researchers with few programming skills. We have written a Stata command that is intended to help researchers obtain the cut point-free and cut point-based net reclassification improvement (NRI) index and the relative and absolute integrated discrimination improvement (IDI) index for logistic-based regression analyses. We applied the command to real data on women participating in the Tehran Lipid and Glucose Study (TLGS) to examine whether information on a family history of premature CVD, waist circumference, and fasting plasma glucose can improve the predictive performance of the Framingham "general CVD risk" algorithm. The command is addpred for logistic regression models. The Stata package provided herein can encourage the use of novel methods in examining the predictive capacity of the ever-emerging plethora of novel biomarkers.
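
    The cut point-free NRI and the absolute IDI that such a command reports have simple closed forms. Below is a hedged Python rendering of those formulas (not the addpred code itself), applied to simulated old- and new-model probabilities.

```python
import numpy as np

def continuous_nri(p_old, p_new, y):
    """Cut point-free NRI: (P(up|event) - P(down|event)) + (P(down|nonevent) - P(up|nonevent))."""
    up = (p_new > p_old).astype(float)
    return 2.0 * (up[y == 1].mean() - up[y == 0].mean())   # equivalent form, no ties

def idi(p_old, p_new, y):
    """Absolute IDI: gain in mean predicted risk for events minus gain for nonevents."""
    return ((p_new[y == 1].mean() - p_old[y == 1].mean())
            - (p_new[y == 0].mean() - p_old[y == 0].mean()))

rng = np.random.default_rng(0)
y = rng.integers(0, 2, 500)                                 # observed outcomes
p_old = np.clip(0.30 + 0.20 * y + rng.normal(0, 0.15, 500), 0.01, 0.99)
p_new = np.clip(p_old + 0.05 * y + rng.normal(0, 0.05, 500), 0.01, 0.99)
print('NRI(>0) = %.3f, IDI = %.3f' % (continuous_nri(p_old, p_new, y), idi(p_old, p_new, y)))
```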

  6. Behavior of high-performance concrete in structural applications.

    DOT National Transportation Integrated Search

    2007-10-01

    High Performance Concrete (HPC) with improved properties has been developed by obtaining the maximum density of the matrix. Mathematical models developed by J.E. Funk and D.R. Dinger, are used to determine the particle size distribution to achieve th...

  7. Establishing an Explanatory Model for Mathematics Identity.

    PubMed

    Cribbs, Jennifer D; Hazari, Zahra; Sonnert, Gerhard; Sadler, Philip M

    2015-04-01

    This article empirically tests a previously developed theoretical framework for mathematics identity based on students' beliefs. The study employs data from more than 9,000 college calculus students across the United States to build a robust structural equation model. While it is generally thought that students' beliefs about their own competence in mathematics directly impact their identity as a "math person," findings indicate that students' self-perceptions related to competence and performance have an indirect effect on their mathematics identity, primarily by association with students' interest and external recognition in mathematics. Thus, the model indicates that students' competence and performance beliefs are not sufficient for their mathematics identity development, and it highlights the roles of interest and recognition. © 2015 The Authors. Child Development © 2015 Society for Research in Child Development, Inc.

  8. Effects of two types of intra-team feedback on developing a shared mental model in Command & Control teams.

    PubMed

    Rasker, P C; Post, W M; Schraagen, J M

    2000-08-01

    In two studies, the effect of two types of intra-team feedback on developing a shared mental model in Command & Control teams was investigated. A distinction is made between performance monitoring and team self-correction. Performance monitoring is the ability of team members to monitor each other's task execution and give feedback during task execution. Team self-correction is the process in which team members engage in evaluating their performance and in determining their strategies after task execution. In two experiments the opportunity to engage in performance monitoring, respectively team self-correction, was varied systematically. Both performance monitoring as well as team self-correction appeared beneficial in the improvement of team performance. Teams that had the opportunity to engage in performance monitoring, however, performed better than teams that had the opportunity to engage in team self-correction.

  9. Integration of tools for the Design and Assessment of High-Performance, Highly Reliable Computing Systems (DAHPHRS), phase 1

    NASA Technical Reports Server (NTRS)

    Scheper, C.; Baker, R.; Frank, G.; Yalamanchili, S.; Gray, G.

    1992-01-01

    Systems for Space Defense Initiative (SDI) space applications typically require both high performance and very high reliability. These requirements present the systems engineer evaluating such systems with the extremely difficult problem of conducting performance and reliability trade-offs over large design spaces. A controlled development process supported by appropriate automated tools must be used to assure that the system will meet design objectives. This report describes an investigation of methods, tools, and techniques necessary to support performance and reliability modeling for SDI systems development. Models of the JPL Hypercubes, the Encore Multimax, and the C.S. Draper Lab Fault-Tolerant Parallel Processor (FTPP) parallel-computing architectures using candidate SDI weapons-to-target assignment algorithms as workloads were built and analyzed as a means of identifying the necessary system models, how the models interact, and what experiments and analyses should be performed. As a result of this effort, weaknesses in the existing methods and tools were revealed and capabilities that will be required for both individual tools and an integrated toolset were identified.

  10. Performance Dependences of Multiplication Layer Thickness for InP/InGaAs Avalanche Photodiodes Based on Time Domain Modeling

    NASA Technical Reports Server (NTRS)

    Xiao, Yegao; Bhat, Ishwara; Abedin, M. Nurul

    2005-01-01

InP/InGaAs avalanche photodiodes (APDs) are being widely utilized in optical receivers for modern long haul and high bit-rate optical fiber communication systems. The separate absorption, grading, charge, and multiplication (SAGCM) structure is an important design consideration for APDs with high performance characteristics. Time domain modeling techniques have been previously developed to provide better understanding and to optimize design issues, saving time and cost for APD research and development. In this work, performance dependences on multiplication layer thickness have been investigated by time domain modeling. These performance characteristics include breakdown field and breakdown voltage, multiplication gain, excess noise factor, and frequency response and bandwidth. The simulations sweep the multiplication layer thickness at fixed values of the areal charge sheet density, while the other structural and material parameters are held constant. The frequency response is obtained from the impulse response by fast Fourier transformation. The modeling results are presented and discussed, and design considerations, especially for high-speed operation at 10 Gbit/s, are further analyzed.
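
    The bandwidth step — obtaining the frequency response from the impulse response by fast Fourier transformation — looks roughly like the sketch below, where a single-exponential impulse response stands in for the avalanche build-up; in the actual model the impulse response would come from the time domain solver.

```python
import numpy as np

dt = 1e-12                                   # 1 ps time step (assumed)
t = np.arange(0.0, 2e-9, dt)
h = np.exp(-t / 25e-12)                      # assumed single-pole impulse response
H = np.abs(np.fft.rfft(h))
H /= H[0]                                    # normalise to the DC response
f = np.fft.rfftfreq(len(h), dt)
f3db = f[np.argmax(H < 1.0 / np.sqrt(2.0))]  # first bin below -3 dB
print('3 dB bandwidth: %.1f GHz' % (f3db / 1e9))   # ~1/(2*pi*25 ps) ~ 6.4 GHz
```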

  11. Anion exchange membrane fuel cell modelling

    NASA Astrophysics Data System (ADS)

    Fragiacomo, P.; Astorino, E.; Chippari, G.; De Lorenzo, G.; Czarnetzki, W. T.; Schneider, W.

    2018-04-01

A parametric model predicting the performance of a solid polymer electrolyte, anion exchange membrane fuel cell (AEMFC) has been developed in a Matlab environment, based on interrelated electrical and thermal models. The proposed electrical model describes an AEMFC's open-circuit output voltage and irreversible voltage losses along with a mass balance, while the thermal model is based on the energy balance. The proposed model of the AEMFC stack estimates its dynamic behaviour, in particular the operating temperature variation for different discharge current values. The results of the theoretical fuel cell (FC) stack are reported and analysed in order to highlight the FC performance and how it varies by changing the values of parameters such as temperature and pressure. Both the electrical and thermal FC models were validated by comparing the model results with experimental data and the results of other models found in the literature.
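
    The electrical model's skeleton — open-circuit voltage minus activation, ohmic, and concentration losses as functions of current density — can be sketched as follows; every parameter value here is an illustrative assumption rather than one of the paper's fitted constants.

```python
import numpy as np

R, F = 8.314, 96485.0   # gas constant [J/(mol K)], Faraday constant [C/mol]

def cell_voltage(j, T=333.0, E0=1.05, j0=1e-4, jL=1.2, r_ohm=0.15, alpha=0.5):
    """j: current density [A/cm^2]; returns cell voltage [V]."""
    act = (R * T / (alpha * F)) * np.log(j / j0)        # activation loss (Tafel)
    ohm = j * r_ohm                                     # ohmic loss
    conc = -(R * T / (2.0 * F)) * np.log(1.0 - j / jL)  # concentration loss
    return E0 - act - ohm - conc

# Polarization curve samples: voltage falls as current density rises.
for j in (0.1, 0.5, 1.0):
    print('j = %.1f A/cm^2 -> V = %.3f V' % (j, cell_voltage(j)))
```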

  12. A model for sequential decoding overflow due to a noisy carrier reference. [communication performance prediction

    NASA Technical Reports Server (NTRS)

    Layland, J. W.

    1974-01-01

An approximate analysis of the effect of a noisy carrier reference on the performance of sequential decoding is presented. The analysis uses previously developed techniques for evaluating noisy reference performance for medium-rate uncoded communications adapted to sequential decoding for data rates of 8 to 2048 bits/s. In estimating the 10^-4 deletion probability thresholds for Helios, the model agrees with experimental data to within the experimental tolerances. The computational problem involved in sequential decoding, carrier loop effects, the main characteristics of the medium-rate model, modeled decoding performance, and perspectives on future work are discussed.

  13. DoD Product Line Practice Workshop Report

    DTIC Science & Technology

    1998-05-01

capability. The essential enterprise management practices include ensuring sound business goals, providing an appropriate funding model, performing ... business. This way requires vision and explicit support at the organizational level. There must be an explicit funding model to support the development ... the same group seems to work best in smaller organizations. A funding model for core asset development also needs to be developed because the core ...

  14. Conceptual design and analysis of a dynamic scale model of the Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Davis, D. A.; Gronet, M. J.; Tan, M. K.; Thorne, J.

    1994-01-01

This report documents the conceptual design study performed to evaluate design options for a subscale dynamic test model which could be used to investigate the expected on-orbit structural dynamic characteristics of the Space Station Freedom early build configurations. The baseline option was a 'near-replica' model of the SSF SC-7 pre-integrated truss configuration. The approach used to develop conceptual design options involved three sets of studies: evaluation of the full-scale design and analysis databases, scale factor trade studies, and design sensitivity studies. The scale factor trade study was conducted to develop a fundamental understanding of the key scaling parameters that drive the design, performance, and cost of a SSF dynamic scale model. Four scale model options were considered: 1/4, 1/5, 1/7, and 1/10 scale. Prototype hardware was fabricated to assess producibility issues. Based on the results of the study, a 1/4-scale model is recommended because of the increased model fidelity associated with a larger scale factor. A design sensitivity study was performed to identify critical hardware component properties that drive dynamic performance. A total of 118 component properties were identified which require high-fidelity replication. Lower-fidelity dynamic similarity scaling can be used for non-critical components.

  15. Development of an ideal observer that incorporates nuisance parameters and processes list-mode data

    DOE PAGES

    MacGahan, Christopher Jonathan; Kupinski, Matthew Alan; Hilton, Nathan R.; ...

    2016-02-01

Observer models were developed to process data in list-mode format in order to perform binary discrimination tasks for use in an arms-control-treaty context. Data used in this study were generated using GEANT4 Monte Carlo simulations for photons, with custom models of plutonium inspection objects and a radiation imaging system. Observer-model performance was evaluated and presented using the area under the receiver operating characteristic curve. Lastly, we studied the ideal observer both under signal-known-exactly conditions and in the presence of unknowns such as object orientation and absolute count-rate variability; when these additional sources of randomness were present, their incorporation into the observer yielded superior performance.

  16. Challenges of developing a cardiovascular risk calculator for patients with rheumatoid arthritis.

    PubMed

    Crowson, Cynthia S; Rollefstad, Silvia; Kitas, George D; van Riel, Piet L C M; Gabriel, Sherine E; Semb, Anne Grete

    2017-01-01

Cardiovascular disease (CVD) risk calculators designed for use in the general population do not accurately predict the risk of CVD among patients with rheumatoid arthritis (RA), who are at increased risk of CVD. The process of developing risk prediction models involves numerous issues. Our goal was to develop a CVD risk calculator for patients with RA. Thirteen cohorts of patients with RA originating from 10 different countries (UK, Norway, Netherlands, USA, Sweden, Greece, South Africa, Spain, Canada and Mexico) were combined. CVD risk factors and RA characteristics at baseline, in addition to information on CVD outcomes, were collected. Cox models were used to develop a CVD risk calculator, considering traditional CVD risk factors and RA characteristics. Model performance was assessed using measures of discrimination and calibration with 10-fold cross-validation. A total of 5638 RA patients without prior CVD were included (mean age: 55 [SD: 14] years, 76% female). During a mean follow-up of 5.8 years (30,139 person-years), 389 patients developed a CVD event. Event rates varied between cohorts, necessitating inclusion of high and low risk strata in the models. The multivariable analyses yielded 2 risk prediction models, each including age, sex, presence of hypertension, current smoking, and the ratio of total cholesterol to high-density lipoprotein cholesterol, together with either a disease activity score based on a 28-joint count and erythrocyte sedimentation rate (DAS28ESR) or a health assessment questionnaire (HAQ). Unfortunately, the performance of these models was similar to that of general population CVD risk calculators. Efforts to develop a specific CVD risk calculator for patients with RA yielded 2 potential models including RA disease characteristics, but neither demonstrated improved performance compared to risk calculators designed for use in the general population. Challenges encountered and lessons learned are discussed in detail.
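
    The central modelling step — a Cox proportional hazards fit of CVD events on the predictors named above — might be sketched with the lifelines library as below. The data frame is simulated, and the high/low risk stratification and 10-fold cross-validation used in the study are omitted.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    'age': rng.normal(55, 14, n),
    'male': rng.integers(0, 2, n),
    'hypertension': rng.integers(0, 2, n),
    'current_smoking': rng.integers(0, 2, n),
    'tc_hdl_ratio': rng.normal(4.0, 1.0, n),
    'das28esr': rng.normal(3.5, 1.2, n),     # RA disease activity score
    'years': rng.exponential(5.8, n),        # follow-up time
    'cvd_event': rng.integers(0, 2, n),      # event indicator
})

cph = CoxPHFitter()
cph.fit(df, duration_col='years', event_col='cvd_event')
print('c-index: %.3f' % cph.concordance_index_)   # discrimination measure
```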

  17. Orion Active Thermal Control System Dynamic Modeling Using Simulink/MATLAB

    NASA Technical Reports Server (NTRS)

    Wang, Xiao-Yen J.; Yuko, James

    2010-01-01

This paper presents dynamic modeling of the crew exploration vehicle (Orion) active thermal control system (ATCS) using Simulink (Simulink, developed by The MathWorks). The model includes major components in the ATCS, such as heat exchangers and radiator panels. The mathematical models of the heat exchanger and radiator are described first. Four different orbits were used to validate the radiator model. The current model results were compared with an independent Thermal Desktop (TD) (Thermal Desktop, PC/CAD-based thermal model builder, developed in Cullimore & Ring (C&R) Technologies) model results and showed good agreement for all orbits. In addition, the Orion ATCS performance was presented for three orbits, and the current model results were compared with three sets of solutions: FloCAD (FloCAD, PC/CAD-based thermal/fluid model builder, developed in C&R Technologies) model results, SINDA/FLUINT (SINDA/FLUINT, a generalized thermal/fluid network-style solver) model results, and independent Simulink model results. For each case, the fluid temperatures at every component on both the crew module and service module sides were plotted and compared. The overall agreement is reasonable for all orbits, with similar behavior and trends for the system. Some discrepancies exist because the control algorithm might vary from model to model. Finally, the ATCS performance for a 45-hr nominal mission timeline was simulated to demonstrate the capability of the model. The results show that the ATCS performs as expected and approximately 2.3 lb of water was consumed in the sublimator within the 45 hr timeline before Orion docked at the International Space Station.

  18. Performance evaluation of image denoising developed using convolutional denoising autoencoders in chest radiography

    NASA Astrophysics Data System (ADS)

    Lee, Donghoon; Choi, Sunghoon; Kim, Hee-Joung

    2018-03-01

When processing medical images, image denoising is an important pre-processing step. Various image denoising algorithms have been developed in the past few decades. Recently, image denoising using the deep learning method has shown excellent performance compared to conventional image denoising algorithms. In this study, we introduce an image denoising technique based on a convolutional denoising autoencoder (CDAE) and evaluate its clinical applicability by comparison with existing image denoising algorithms. We train the proposed CDAE model using 3000 chest radiograms as training data. To evaluate the performance of the developed CDAE model, we compare it with conventional denoising algorithms including the median filter, total variation (TV) minimization, and non-local mean (NLM) algorithms. Furthermore, to verify the clinical effectiveness of the developed denoising model with CDAE, we investigate the performance of the developed denoising algorithm on chest radiograms acquired from real patients. The results demonstrate that the proposed denoising algorithm developed using CDAE achieves a superior noise-reduction effect in chest radiograms compared to TV minimization and NLM algorithms, which are state-of-the-art algorithms for image noise reduction. For example, the peak signal-to-noise ratio and structural similarity index measure of CDAE were at least 10% higher compared to conventional denoising algorithms. In conclusion, the image denoising algorithm developed using CDAE effectively eliminated noise without loss of information on anatomical structures in chest radiograms. It is expected that the proposed denoising algorithm developed using CDAE will be effective for medical images with microscopic anatomical structures, such as terminal bronchioles.
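
    A minimal convolutional denoising autoencoder of the kind described can be written in Keras as below; the depth, filter counts, input size, and Gaussian noise model are assumptions for illustration, not the architecture trained on the 3000 radiograms.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

inp = keras.Input(shape=(256, 256, 1))
x = layers.Conv2D(32, 3, activation='relu', padding='same')(inp)
x = layers.MaxPooling2D(2, padding='same')(x)            # encoder
x = layers.Conv2D(32, 3, activation='relu', padding='same')(x)
x = layers.UpSampling2D(2)(x)                            # decoder
out = layers.Conv2D(1, 3, activation='sigmoid', padding='same')(x)

cdae = keras.Model(inp, out)
cdae.compile(optimizer='adam', loss='mse')

clean = np.random.rand(16, 256, 256, 1).astype('float32')          # stand-in images
noisy = np.clip(clean + np.random.normal(0, 0.1, clean.shape), 0, 1).astype('float32')
cdae.fit(noisy, clean, epochs=1, batch_size=4)           # learn the noisy -> clean mapping
```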

  19. MoDOT pavement preservation research program volume III, development of pavement family and treatment performance models.

    DOT National Transportation Integrated Search

    2015-10-01

    Pavement performance models describe the deterioration behavior of pavements. They are essential in a pavement management system if the goal is to make more objective, reliable, and cost-effective decisions regarding the timing and nature of paveme...

  20. Analytic Methods for Adjusting Subjective Rating Schemes.

    ERIC Educational Resources Information Center

    Cooper, Richard V. L.; Nelson, Gary R.

    Statistical and econometric techniques for correcting supervisor bias in models of individual performance appraisal were developed, using a variant of the classical linear regression model. Location bias occurs when individual performance is systematically overestimated or underestimated, while scale bias results when raters either exaggerate…

  1. Initial development of prototype performance model for highway design

    DOT National Transportation Integrated Search

    1997-12-01

    The Federal Highway Administration (FHWA) has undertaken a multiyear project to develop the Interactive Highway Safety Design Model (IHSDM), which is a CADD-based integrated set of software tools to analyze a highway design to identify safety issues ...

  2. Nonlinear stability and control study of highly maneuverable high performance aircraft

    NASA Technical Reports Server (NTRS)

    Mohler, R. R.

    1993-01-01

    This project is intended to research and develop new nonlinear methodologies for the control and stability analysis of high-performance, high angle-of-attack aircraft such as HARV (F-18). Past research (reported in our Phase 1, 2, and 3 progress reports) is summarized, and more details of the final Phase 3 research are provided. While the research emphasis is on nonlinear control, other tasks such as associated model development, system identification, stability analysis, and simulation are performed in some detail as well. An overview is provided of the various models that were investigated for different purposes, such as an approximate model reference for control adaptation, as well as another model for accurate rigid-body longitudinal motion. Only a very cursory analysis was made relative to type 8 (flexible body dynamics). Standard nonlinear longitudinal airframe dynamics (type 7) with the available modified F-18 stability derivatives, thrust vectoring, actuator dynamics, and control constraints are utilized for simulated flight evaluation of derived controller performance in all cases studied.

  3. Spectrum Sharing in an ISM Band: Outage Performance of a Hybrid DS/FH Spread Spectrum System with Beamforming

    NASA Astrophysics Data System (ADS)

    Li, Hanyu; Syed, Mubashir; Yao, Yu-Dong; Kamakaris, Theodoros

    2009-12-01

    This paper investigates spectrum sharing issues in the unlicensed industrial, scientific, and medical (ISM) bands. It presents a radio frequency measurement setup and measurement results in the 2.4 GHz band. It then develops an analytical model to characterize coexistence interference in the ISM bands, based on the 2.4 GHz radio frequency measurement results. Outage performance using the interference model is examined for a hybrid direct-sequence frequency-hopping spread spectrum system. The utilization of beamforming techniques in the system is also investigated, and a simplified beamforming model is proposed to analyze the system performance with beamforming. Numerical results show that beamforming significantly improves the system outage performance. The work presented in this paper provides a quantitative evaluation of signal outages in a spectrum sharing environment. It can be used as a tool in the development of future dynamic spectrum access models as well as in engineering designs for applications in unlicensed bands.

  4. A wearable computing platform for developing cloud-based machine learning models for health monitoring applications.

    PubMed

    Patel, Shyamal; McGinnis, Ryan S; Silva, Ikaro; DiCristofaro, Steve; Mahadevan, Nikhil; Jortberg, Elise; Franco, Jaime; Martin, Albert; Lust, Joseph; Raj, Milan; McGrane, Bryan; DePetrillo, Paolo; Aranyosi, A J; Ceruolo, Melissa; Pindado, Jesus; Ghaffari, Roozbeh

    2016-08-01

    Wearable sensors have the potential to enable clinical-grade ambulatory health monitoring outside the clinic. Technological advances have enabled the development of devices that can measure vital signs with great precision, and significant progress has been made towards extracting clinically meaningful information from these devices in research studies. However, translating measurement accuracies achieved in controlled settings such as the lab and clinic to unconstrained environments such as the home remains a challenge. In this paper, we present a novel wearable computing platform for unobtrusive collection of labeled datasets and a new paradigm for continuous development, deployment, and evaluation of machine learning models to ensure robust model performance as we transition from the lab to the home. Using this system, we train activity classification models across two studies and track changes in model performance as we go from constrained to unconstrained settings.

  5. A Framework for Human Performance Criteria for Advanced Reactor Operational Concepts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jacques V Hugo; David I Gertman; Jeffrey C Joe

    2014-08-01

    This report supports the determination of new Operational Concept models needed for the operational design of new reactors. The objective of this research is to establish the technical bases for human performance and human performance criteria frameworks, models, and guidance for operational concepts for advanced reactor designs. The report includes a discussion of operating principles for advanced reactors, the human performance issues and requirements for human performance based upon work domain analysis and current regulatory requirements, and a description of general human performance criteria. The major findings and key observations to date are that there is some operating experience that informs operational concepts for baseline designs for SFRs and HTGRs, with the Experimental Breeder Reactor-II (EBR-II) as a best-case predecessor design. This report summarizes the theoretical and operational foundations for the development of a framework and model for human performance criteria that will influence the development of future Operational Concepts. The report also highlights issues associated with advanced reactor design and clarifies and codifies the identified aspects of technology and operating scenarios.

  6. A Hybrid Fuzzy Model for Lean Product Development Performance Measurement

    NASA Astrophysics Data System (ADS)

    Osezua Aikhuele, Daniel; Mohd Turan, Faiz

    2016-02-01

    As manufacturing companies strive to meet emerging consumer demands for mass-customized products, many are turning to the application of lean principles in their product development process, and this is gradually moving from a competitive advantage to a necessity. However, owing to a lack of clear understanding of lean performance measurement, many of these companies are unable to implement and fully integrate the lean principle into their product development process. The extensive literature shows that only a few studies have focused systematically on lean product development performance (LPDP) evaluation. To fill this gap, the study proposes a novel hybrid model based on the Fuzzy Reasoning Approach (FRA) and extensions of the Fuzzy-AHP and Fuzzy-TOPSIS methods for assessing LPDP. Unlike existing methods, the model considers the importance weight of each decision maker (expert), since the performance criteria/attributes must be rated and the experts have different levels of expertise. The rating is done using a new fuzzy Likert rating scale (membership scale) designed to address problems of information loss and distortion arising from the closed-form scaling and ordinal nature of existing Likert scales.

  7. Multi-objective models of waste load allocation toward a sustainable reuse of drainage water in irrigation.

    PubMed

    Allam, Ayman; Tawfik, Ahmed; Yoshimura, Chihiro; Fleifle, Amr

    2016-06-01

    The present study proposes a waste load allocation (WLA) framework for sustainable quality management of agricultural drainage water (ADW). Two multi-objective models, namely abatement-performance and abatement-equity-performance, were developed through the integration of a water quality model (QUAL2Kw) and a genetic algorithm, by considering (1) the total waste load abatement and (2) the inequity among waste dischargers. To accomplish the modeling tasks, we developed a comprehensive overall performance measure (E_wla) reflecting possible violations of Egyptian standards for ADW reuse in irrigation. This methodology was applied to the Gharbia drain in the Nile Delta, Egypt, during both the summer and winter seasons of 2012. Abatement-performance modeling results for a target of E_wla = 100% corresponded to discharger abatement ratios ranging from 20.7 to 75.6% in summer and from 29.5 to 78.5% in winter, alongside highly varying inequity values. Abatement-equity-performance modeling results for a target of E_wla = 90% revealed the necessity of increasing treatment efforts in three out of five dischargers during summer, and four out of five in winter. The trade-off curves obtained from the WLA models proved reliable for selecting appropriate WLA procedures as a function of budget constraints, principles of social equity, and the desired overall performance level. Hence, the proposed framework is of great importance to decision makers working toward sustainable reuse of ADW in irrigation.
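
    The study couples a genetic algorithm to the QUAL2Kw water quality model. A minimal sketch of the abatement-performance idea follows, with a toy surrogate standing in for QUAL2Kw; the discharger loads, penalty weight, and GA settings are hypothetical numbers for illustration.

```python
# Sketch of the GA side of an abatement-performance model. The real study
# couples the GA to QUAL2Kw; here water_quality() is a stand-in surrogate,
# and all loads and weights are hypothetical.
import random

LOADS = [120.0, 80.0, 150.0, 60.0, 100.0]   # discharger waste loads (assumed)

def water_quality(abatement):
    # Surrogate for the QUAL2Kw run: residual load -> performance in [0, 1].
    residual = sum(l * (1 - a) for l, a in zip(LOADS, abatement))
    return max(0.0, 1.0 - residual / sum(LOADS))

def fitness(abatement, target=0.9):
    total_abated = sum(l * a for l, a in zip(LOADS, abatement))
    penalty = 1e4 * max(0.0, target - water_quality(abatement))
    return -(total_abated + penalty)        # minimize abatement, meet target

pop = [[random.random() for _ in LOADS] for _ in range(50)]
for _ in range(200):                        # simple mutation-only GA
    pop.sort(key=fitness, reverse=True)
    elite = pop[:10]
    pop = elite + [[min(1.0, max(0.0, g + random.gauss(0, 0.05)))
                    for g in random.choice(elite)] for _ in range(40)]
best = max(pop, key=fitness)
print([round(a, 2) for a in best])          # abatement ratio per discharger
```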

  8. Self-reconfigurable ship fluid-network modeling for simulation-based design

    NASA Astrophysics Data System (ADS)

    Moon, Kyungjin

    Our world is filled with large-scale engineering systems, which provide various services and conveniences in our daily life. A distinctive trend in the development of today's large-scale engineering systems is the extensive and aggressive adoption of automation and autonomy, which enable significant improvements in systems' robustness, efficiency, and performance with considerably reduced manning and maintenance costs; the U.S. Navy's DD(X), the next-generation destroyer program, is considered an extreme example of this trend. This thesis pursues a modeling solution for performing simulation-based analysis in the conceptual or preliminary design stage of an intelligent, self-reconfigurable ship fluid system, one of the concepts of DD(X) engineering plant development. Through investigations of the Navy's approach to designing a more survivable ship system, it was found that the current naval simulation-based analysis environment is limited by capability gaps in damage modeling, dynamic model reconfiguration, and simulation speed of the domain-specific models, especially fluid network models. Two essential elements were identified in the formulation of the modeling method as enablers of filling these gaps. The first is a graph-based topological modeling method, employed for rapid model reconstruction and damage modeling; the second is a recurrent neural network-based, component-level surrogate modeling method, used to improve the affordability and efficiency of the modeling and simulation (M&S) computations. The integration of the two methods can deliver computationally efficient, flexible, and automation-friendly M&S, creating an environment for more rigorous damage analysis and exploration of design alternatives. To demonstrate and evaluate the developed method, a simulation model of a notional ship fluid system was created and a damage analysis was performed. Next, models representing different design configurations of the fluid system were created, and damage analyses were performed with them in order to find an optimal design configuration for system survivability. Finally, the benefits and drawbacks of the developed method were discussed based on the results of the demonstration.
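
    A minimal sketch of the graph-based topological idea described above, assuming the networkx library: damaged components are removed from the network graph, and reachability tells which loads remain suppliable. The component names and topology are hypothetical, not from the thesis.

```python
# Sketch of graph-based damage modeling: model the fluid network as a graph,
# knock out a damaged component, and check which service loads can still be
# reached from a source. Node names are hypothetical.
import networkx as nx

G = nx.Graph()
G.add_edges_from([
    ("pump1", "valveA"), ("valveA", "cooler1"),
    ("pump2", "valveB"), ("valveB", "cooler1"),
    ("valveA", "cooler2"), ("valveB", "cooler2"),
])

def suppliable(graph, sources, loads):
    # A load survives if any flow path to some source remains.
    return {l: any(nx.has_path(graph, s, l) for s in sources if s in graph)
            for l in loads if l in graph}

damaged = G.copy()
damaged.remove_node("valveA")   # a damage event removes a component
print(suppliable(damaged, ["pump1", "pump2"], ["cooler1", "cooler2"]))
# Both coolers remain reachable via pump2 -> valveB; the reconfigured
# topology would then be handed to a surrogate model for flow computation.
```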

  9. Cognition-Based and Affect-Based Trust as Mediators of Leader Behavior Influences on Team Performance

    ERIC Educational Resources Information Center

    Schaubroeck, John; Lam, Simon S. K.; Peng, Ann Chunyan

    2011-01-01

    We develop a model in which cognitive and affective trust in the leader mediate the relationship between leader behavior and team psychological states that, in turn, drive team performance. The model is tested on a sample of 191 financial services teams in Hong Kong and the U.S. Servant leadership influenced team performance through affect-based…

  10. Predicting Student Academic Performance in an Engineering Dynamics Course: A Comparison of Four Types of Predictive Mathematical Models

    ERIC Educational Resources Information Center

    Huang, Shaobo; Fang, Ning

    2013-01-01

    Predicting student academic performance has long been an important research topic in many academic disciplines. The present study is the first study that develops and compares four types of mathematical models to predict student academic performance in engineering dynamics--a high-enrollment, high-impact, and core course that many engineering…

  11. Application of Cognitive Apprenticeship Model to a Graduate Course in Performance Systems Analysis: A Case Study

    ERIC Educational Resources Information Center

    Darabi, A. Aubteen

    2005-01-01

    This article reports a case study describing how the principles of a cognitive apprenticeship (CA) model developed by Collins, Brown, and Holum (1991) were applied to a graduate course on performance systems analysis (PSA), and the differences this application made in student performance and evaluation of the course compared to the previous…

  12. A Stirling engine computer model for performance calculations

    NASA Technical Reports Server (NTRS)

    Tew, R.; Jefferies, K.; Miao, D.

    1978-01-01

    To support the development of the Stirling engine as a possible alternative to the automobile spark-ignition engine, the thermodynamic characteristics of the Stirling engine were analyzed and modeled on a computer. The modeling techniques used are presented. The performance of an existing rhombic-drive Stirling engine was simulated by use of this computer program, and some typical results are presented. Engine tests are planned in order to evaluate this model.

  13. System model development for nuclear thermal propulsion

    NASA Technical Reports Server (NTRS)

    Walton, James T.; Hannan, Nelson A.; Perkins, Ken R.; Buksa, John H.; Worley, Brian A.; Dobranich, Dean

    1992-01-01

    A critical enabling technology in the evolutionary development of nuclear thermal propulsion (NTP) is the ability to predict the system performance under a variety of operating conditions. This is crucial for mission analysis and for control subsystem testing, as well as for the modeling of various failure modes. Performance must be accurately predicted during steady-state and transient operation, including startup, shutdown, and post-operation cooling. The development and application of verified and validated system models has the potential to reduce the design and testing effort and the cost and time required for the technology to reach flight-ready status. Since October 1991, the U.S. Department of Energy (DOE), Department of Defense (DOD), and NASA have initiated critical technology development efforts for NTP systems to be used on Space Exploration Initiative (SEI) missions to the Moon and Mars. This paper presents the strategy and progress of an interagency NASA/DOE/DOD team for NTP system modeling. It is the intent of the interagency team to develop several levels of computer programs to simulate various NTP systems. The first level will provide rapid, parameterized calculations of overall system performance. Succeeding computer programs will provide analysis of each component in sufficient detail to guide the design teams and experimental efforts. The computer programs will allow simulation of the entire system to allow prediction of the integrated performance. An interagency team was formed for this task to use the best capabilities available and to assure appropriate peer review.

  14. A Novel Approach to Develop the Lower Order Model of Multi-Input Multi-Output System

    NASA Astrophysics Data System (ADS)

    Rajalakshmy, P.; Dharmalingam, S.; Jayakumar, J.

    2017-10-01

    A mathematical model is a virtual entity that uses mathematical language to describe the behavior of a system. Mathematical models are used particularly in the natural sciences and engineering disciplines such as physics, biology, and electrical engineering, as well as in the social sciences such as economics, sociology, and political science. Physicists, engineers, computer scientists, and economists use mathematical models most extensively. With the advent of high-performance processors and advanced mathematical computation, it is possible to develop high-performance simulators for complicated Multi-Input Multi-Output (MIMO) systems such as quadruple-tank systems, aircraft, and boilers. This paper presents the development of the mathematical model of a 500 MW utility boiler, which is a highly complex system. A synergistic combination of operational experience, system identification, and a lower-order modeling philosophy has been used to develop a simplified but accurate model of the circulation system of a utility boiler, which is a MIMO system. The results obtained are found to be in good agreement with the physics of the process and with the results obtained through the design procedure. The model obtained can be directly used for control system studies and to realize hardware simulators for boiler testing and operator training.

  15. Overview of Iodine Propellant Hall Thruster Development Activities at NASA Glenn Research Center

    NASA Technical Reports Server (NTRS)

    Kamhawi, Hani; Benavides, Gabriel; Haag, Thomas; Hickman, Tyler; Smith, Timothy; Williams, George; Myers, James; Polzin, Kurt; Dankanich, John; Byrne, Larry

    2016-01-01

    NASA is continuing to invest in advancing Hall thruster technologies for implementation in commercial and government missions. There have been several recent iodine Hall propulsion system development activities performed by the team of the NASA Glenn Research Center, the NASA Marshall Space Flight Center, and Busek Co. Inc. In particular, the work focused on qualification of the 200 W Busek BHT-200-I and on continued development of the BHT-600-I Hall thruster propulsion systems. This presentation provides an overview of these development activities and reports the results of short-duration tests performed on the engineering model BHT-200-I and the development model BHT-600-I Hall thrusters.

  16. A Novel Hybrid Classification Model of Genetic Algorithms, Modified k-Nearest Neighbor and Developed Backpropagation Neural Network

    PubMed Central

    Salari, Nader; Shohaimi, Shamarina; Najafi, Farid; Nallappan, Meenakshii; Karishnarajah, Isthrinayagy

    2014-01-01

    Among numerous artificial intelligence approaches, k-Nearest Neighbor algorithms, genetic algorithms, and artificial neural networks are considered the most common and effective methods for classification problems in numerous studies. In the present study, the results of implementing a novel hybrid feature selection-classification model using the above-mentioned methods are presented. The purpose is to benefit from the synergies obtained by combining these technologies for the development of classification models. Such a combination creates an opportunity to invest in the strengths of each algorithm and to compensate for their deficiencies. To develop the proposed model, with the aim of obtaining the best array of features, feature ranking techniques such as Fisher's discriminant ratio and class separability criteria were first used to prioritize features. Second, the resulting arrays of top-ranked features were used as the initial population of a genetic algorithm to produce optimum arrays of features. Third, using a modified k-Nearest Neighbor method as well as an improved method of backpropagation neural networks, the classification process was carried out on the optimum feature arrays selected by the genetic algorithm. The performance of the proposed model was compared with thirteen well-known classification models on seven datasets. Furthermore, statistical analysis was performed using the Friedman test followed by post-hoc tests. The experimental findings indicated that the novel hybrid model resulted in significantly better classification performance than all 13 classification methods. Finally, the performance results of the proposed model were benchmarked against the best results reported for state-of-the-art classifiers in terms of classification accuracy on the same data sets. The findings of this comprehensive comparative study revealed that the performance of the proposed model in terms of classification accuracy is desirable, promising, and competitive with existing state-of-the-art classification models. PMID:25419659
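
    A sketch of the first and third stages of the pipeline described above, assuming scikit-learn: features are ranked by Fisher's discriminant ratio, and a standard k-nearest-neighbor classifier (standing in for the paper's modified kNN) is trained on the top-ranked subset. The dataset, number of retained features, and k are illustrative choices.

```python
# Fisher-ratio feature ranking feeding a kNN classifier; scikit-learn's
# standard kNN stands in for the paper's modified kNN.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = load_breast_cancer(return_X_y=True)

def fisher_ratio(X, y):
    # (mu0 - mu1)^2 / (var0 + var1), computed per feature
    X0, X1 = X[y == 0], X[y == 1]
    return (X0.mean(0) - X1.mean(0)) ** 2 / (X0.var(0) + X1.var(0))

top = np.argsort(fisher_ratio(X, y))[::-1][:10]   # keep top 10 features
Xtr, Xte, ytr, yte = train_test_split(X[:, top], y, random_state=0)
clf = KNeighborsClassifier(n_neighbors=5).fit(Xtr, ytr)
print("accuracy on top-ranked features:", clf.score(Xte, yte))
```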

  17. Comparison of Conceptual and Neural Network Rainfall-Runoff Models

    NASA Astrophysics Data System (ADS)

    Vidyarthi, V. K.; Jain, A.

    2014-12-01

    The rainfall-runoff (RR) model is a key component of any water resource application. Two types of techniques are usually employed for RR modeling: physics-based and data-driven techniques. Although physics-based models have been used for operational purposes for a very long time, they provide only reasonable accuracy in modeling and forecasting. On the other hand, Artificial Neural Networks (ANNs) have been reported to provide superior modeling performance; however, they have not been accepted by practitioners, decision makers, and water resources engineers as operational tools. ANNs, one of the data-driven techniques, became popular for efficient modeling of complex natural systems in the last couple of decades. In this paper, comparative results for conceptual and ANN models in RR modeling are presented. The conceptual models were developed using the rainfall-runoff library (RRL), and a genetic algorithm (GA) was used for their calibration. A feed-forward neural network structure trained with the Levenberg-Marquardt (LM) algorithm was adopted to develop all the ANN models included here. Daily rainfall, runoff, and various climatic data derived from the Bird Creek basin, Oklahoma, USA, were employed to develop all the models. Daily potential evapotranspiration (PET), which was used in conceptual model development, was calculated using the Penman equation. The input variables were selected on the basis of correlation analysis. Performance evaluation statistics such as the average absolute relative error (AARE), Pearson's correlation coefficient (R), and threshold statistics (TS) were used for assessing the performance of all the models. The results obtained in this study show that the ANN models outperform the conventional conceptual models due to their ability to learn the non-linearity and complexity inherent in rainfall-runoff data in a more efficient manner. There is a strong need to carry out such studies to demonstrate the superiority of ANN models over conventional methods and to make them acceptable to the water resources community responsible for operating water resources systems.
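
    The modeling setup and the AARE statistic lend themselves to a short sketch. Below, synthetic data stand in for the Bird Creek series, and scikit-learn's Adam-trained MLP stands in for the Levenberg-Marquardt-trained network; the lag structure is an assumption.

```python
# Data-driven RR model sketch plus the AARE statistic from the paper.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
rain = rng.gamma(2.0, 3.0, 500)   # synthetic daily rainfall
# synthetic runoff: smoothed rainfall plus baseflow and noise
flow = np.convolve(rain, [0.3, 0.4, 0.2], "same") + 1.0 + rng.normal(0, 0.3, 500)

# inputs: rainfall at t, t-1, t-2 (lags chosen here for illustration)
X = np.column_stack([rain[2:], rain[1:-1], rain[:-2]])
y = flow[2:]
model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000,
                     random_state=0).fit(X[:400], y[:400])
pred = model.predict(X[400:])

# average absolute relative error, in percent
aare = np.mean(np.abs((y[400:] - pred) / y[400:])) * 100
print(f"AARE = {aare:.1f}%")
```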

  18. A high-resolution global flood hazard model

    PubMed Central

    Smith, Andrew M.; Bates, Paul D.; Neal, Jeffrey C.; Alfieri, Lorenzo; Freer, Jim E.

    2015-01-01

    Floods are a natural hazard that affect communities worldwide, but to date the vast majority of flood hazard research and mapping has been undertaken by wealthy developed nations. As populations and economies have grown across the developing world, so too has demand from governments, businesses, and NGOs for modeled flood hazard data in these data-scarce regions. We identify six key challenges faced when developing a flood hazard model that can be applied globally and present a framework methodology that leverages recent cross-disciplinary advances to tackle each challenge. The model produces return period flood hazard maps at ~90 m resolution for the whole terrestrial land surface between 56°S and 60°N, and results are validated against high-resolution government flood hazard data sets from the UK and Canada. The global model is shown to capture between two thirds and three quarters of the area determined to be at risk in the benchmark data without generating excessive false positive predictions. When aggregated to ~1 km, mean absolute error in flooded fraction falls to ~5%. The full-complexity global model contains an automatically parameterized subgrid channel network, and comparison to both a simplified 2-D only variant and an independently developed pan-European model shows the explicit inclusion of channels to be a critical contributor to improved model performance. While careful processing of existing global terrain data sets enables reasonable model performance in urban areas, adoption of forthcoming next-generation global terrain data sets will offer the best prospect for a step-change improvement in model performance. PMID:27594719

  19. Ku-Band rendezvous radar performance computer simulation model

    NASA Technical Reports Server (NTRS)

    Griffin, J. W.

    1980-01-01

    The preparation of a real-time computer simulation model of the Ku-band rendezvous radar to be integrated into the shuttle mission simulator (SMS), the shuttle engineering simulator (SES), and the shuttle avionics integration laboratory (SAIL) simulator is described. To meet crew training requirements, a radar tracking performance model and a target modeling method were developed. The parent simulation/radar simulation interface requirements and the method selected to model target scattering properties, including an application of this method to the SPAS spacecraft, are described. The radar search and acquisition mode performance model and the radar track mode signal processor model are examined and analyzed. The angle, angle rate, range, and range rate tracking loops are also discussed.

  20. Representing the effects of alpine grassland vegetation cover on the simulation of soil thermal dynamics by ecosystem models applied to the Qinghai-Tibetan Plateau

    USGS Publications Warehouse

    Yi, S.; Li, N.; Xiang, B.; Wang, X.; Ye, B.; McGuire, A.D.

    2013-01-01

    Soil surface temperature is a critical boundary condition for the simulation of soil temperature by environmental models. It is influenced by atmospheric and soil conditions and by vegetation cover. In sophisticated land surface models, it is simulated iteratively by solving surface energy budget equations. In ecosystem, permafrost, and hydrology models, the consideration of soil surface temperature is generally simple. In this study, we developed a methodology for representing the effects of vegetation cover and atmospheric factors on the estimation of soil surface temperature for alpine grassland ecosystems on the Qinghai-Tibetan Plateau. Our approach integrated measurements from meteorological stations with simulations from a sophisticated land surface model to develop an equation set for estimating soil surface temperature. After implementing this equation set into an ecosystem model and evaluating the performance of the ecosystem model in simulating soil temperature at different depths in the soil profile, we applied the model to simulate interactions among vegetation cover, freeze-thaw cycles, and soil erosion to demonstrate potential applications made possible through the implementation of the methodology developed in this study. Results showed that (1) to properly estimate daily soil surface temperature, algorithms should use air temperature, downward solar radiation, and vegetation cover as independent variables; (2) the equation set developed in this study performed better than soil surface temperature algorithms used in other models; and (3) the ecosystem model performed well in simulating soil temperature throughout the soil profile using the equation set developed in this study. Our application of the model indicates that the representation in ecosystem models of the effects of vegetation cover on the simulation of soil thermal dynamics has the potential to substantially improve our understanding of the vulnerability of alpine grassland ecosystems to changes in climate and grazing regimes.

  1. Representing the effects of alpine grassland vegetation cover on the simulation of soil thermal dynamics by ecosystem models applied to the Qinghai-Tibetan Plateau

    NASA Astrophysics Data System (ADS)

    Yi, S.; Li, N.; Xiang, B.; Wang, X.; Ye, B.; McGuire, A. D.

    2013-07-01

    Soil surface temperature is a critical boundary condition for the simulation of soil temperature by environmental models. It is influenced by atmospheric and soil conditions and by vegetation cover. In sophisticated land surface models, it is simulated iteratively by solving surface energy budget equations. In ecosystem, permafrost, and hydrology models, the consideration of soil surface temperature is generally simple. In this study, we developed a methodology for representing the effects of vegetation cover and atmospheric factors on the estimation of soil surface temperature for alpine grassland ecosystems on the Qinghai-Tibetan Plateau. Our approach integrated measurements from meteorological stations with simulations from a sophisticated land surface model to develop an equation set for estimating soil surface temperature. After implementing this equation set into an ecosystem model and evaluating the performance of the ecosystem model in simulating soil temperature at different depths in the soil profile, we applied the model to simulate interactions among vegetation cover, freeze-thaw cycles, and soil erosion to demonstrate potential applications made possible through the implementation of the methodology developed in this study. Results showed that (1) to properly estimate daily soil surface temperature, algorithms should use air temperature, downward solar radiation, and vegetation cover as independent variables; (2) the equation set developed in this study performed better than soil surface temperature algorithms used in other models; and (3) the ecosystem model performed well in simulating soil temperature throughout the soil profile using the equation set developed in this study. Our application of the model indicates that the representation in ecosystem models of the effects of vegetation cover on the simulation of soil thermal dynamics has the potential to substantially improve our understanding of the vulnerability of alpine grassland ecosystems to changes in climate and grazing regimes.
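
    The equation set itself is not reproduced in these records, but its general form, daily soil surface temperature estimated from air temperature, downward solar radiation, and vegetation cover, can be sketched as a least-squares fit; the data and coefficients below are synthetic placeholders, not the paper's fitted values.

```python
# Least-squares sketch of a soil-surface-temperature equation set with
# air temperature, radiation, and vegetation cover as predictors.
import numpy as np

rng = np.random.default_rng(1)
n = 365
day = np.arange(n)
t_air = 5 + 15 * np.sin(2 * np.pi * day / 365) + rng.normal(0, 2, n)
rad = 150 + 100 * np.sin(2 * np.pi * day / 365) + rng.normal(0, 20, n)
veg = np.clip(0.3 + 0.4 * np.sin(2 * np.pi * day / 365), 0, 1)

# synthetic "truth": vegetation damps the radiative contribution
t_surf = 0.9 * t_air + 0.02 * rad * (1 - veg) + rng.normal(0, 1, n)

A = np.column_stack([t_air, rad * (1 - veg), np.ones(n)])
coef, *_ = np.linalg.lstsq(A, t_surf, rcond=None)   # least-squares fit
print("fitted coefficients (t_air, rad*(1-veg), intercept):", coef.round(3))
```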

  2. Development of an Empirically Based Learning Performances Framework for Third-Grade Students' Model-Based Explanations about Plant Processes

    ERIC Educational Resources Information Center

    Zangori, Laura; Forbes, Cory T.

    2016-01-01

    To develop scientific literacy, elementary students should engage in knowledge building of core concepts through scientific practice (Duschl, Schweingruber, & Schouse, 2007). A core scientific practice is engagement in scientific modeling to build conceptual understanding about discipline-specific concepts. Yet scientific modeling remains…

  3. A Caveat Note on Tuning in the Development of Coupled Climate Models

    NASA Astrophysics Data System (ADS)

    Dommenget, Dietmar; Rezny, Michael

    2018-01-01

    State-of-the-art coupled general circulation models (CGCMs) have substantial errors in their simulations of climate. In particular, these errors can lead to large uncertainties in the simulated climate response (both globally and regionally) to a doubling of CO2. Currently, tuning of the parameterization schemes in CGCMs is a significant part of their development. It is not clear whether such tuning actually improves models. The tuning process is, in general, neither documented nor reproducible. Alternative methods such as flux correction are not used, nor is it clear whether such methods would perform better. In this study, ensembles of perturbed-physics experiments are performed with the Globally Resolved Energy Balance (GREB) model to test the impact of tuning. The work illustrates that tuning has, on average, limited skill given the complexity of the system, the limited computing resources, and the limited observations available to optimize parameters. While tuning may improve model performance (such as reproducing observed past climate), it will not get closer to the "true" physics, nor will it significantly improve future climate change projections. Tuning will introduce artificial compensating error interactions between submodels that will hamper further model development. By contrast, flux corrections perform well in most, but not all, aspects. A main advantage of flux correction is that it is much cheaper, simpler, and more transparent, and it does not introduce artificial error interactions between submodels. These GREB model experiments should be considered a pilot study to motivate further CGCM studies that address the issues of model tuning.

  4. Critical research issues in development of biomathematical models of fatigue and performance.

    PubMed

    Dinges, David F

    2004-03-01

    This article reviews the scientific research needed to ensure the continued development, validation, and operational transition of biomathematical models of fatigue and performance. These models originated from the need to ascertain the formal underlying relationships among sleep and circadian dynamics in the control of alertness and neurobehavioral performance capability. Priority should be given to research that further establishes their basic validity, including the accuracy of the core mathematical formulae and parameters that instantiate the interactions of sleep/wake and circadian processes. Since individuals can differ markedly and reliably in their responses to sleep loss and to countermeasures for it, models must incorporate estimates of these inter-individual differences, and research should identify predictors of them. To ensure models accurately predict recovery of function with sleep of varying durations, dose-response curves for recovery of performance as a function of prior sleep homeostatic load and the number of days of recovery are needed. It is also necessary to establish whether the accuracy of models is affected by using work/rest schedules as surrogates for sleep/wake inputs to models. Given the importance of light as both a circadian entraining agent and an alerting agent, research should determine the extent to which light input could incrementally improve model predictions of performance, especially in persons exposed to night work, jet lag, and prolonged work. Models seek to estimate behavioral capability and/or the relative risk of adverse events in a fatigued state. Research is needed on how best to scale and interpret metrics of behavioral capability, and incorporate factors that amplify or diminish the relationship between model predictions of performance and risk outcomes.
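
    Most of the models the article discusses share the classic two-process structure: a homeostatic sleep pressure combined with a circadian oscillation. A minimal sketch follows; the time constant and amplitude are generic textbook values rather than any specific model's parameters.

```python
# Two-process sketch: homeostatic pressure S rises during continuous wake,
# while a sinusoidal circadian process C modulates alertness on a 24-h cycle.
# Sleep recovery (which decays S) is omitted from this wake-only sketch.
import math

def alertness(clock_h, wake_h, tau=18.2, circ_amp=0.15, circ_peak=18.0):
    s = 1 - math.exp(-wake_h / tau)                                # homeostatic
    c = circ_amp * math.cos(2 * math.pi * (clock_h % 24 - circ_peak) / 24)
    return 1 - s + c                                               # higher = more alert

for h in range(0, 25, 4):   # extended wake starting at 08:00
    print(f"{h:2d} h awake: {alertness(8 + h, h):.2f}")
```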

  5. Linking asphalt binder fatigue to asphalt mixture fatigue performance using viscoelastic continuum damage modeling

    NASA Astrophysics Data System (ADS)

    Safaei, Farinaz; Castorena, Cassie; Kim, Y. Richard

    2016-08-01

    Fatigue cracking is a major form of distress in asphalt pavements. Asphalt binder is the weakest asphalt concrete constituent and, thus, plays a critical role in determining the fatigue resistance of pavements. Therefore, the ability to characterize and model the inherent fatigue performance of an asphalt binder is a necessary first step to design mixtures and pavements that are not susceptible to premature fatigue failure. The simplified viscoelastic continuum damage (S-VECD) model has been used successfully by researchers to predict the damage evolution in asphalt mixtures for various traffic and climatic conditions using limited uniaxial test data. In this study, the S-VECD model, developed for asphalt mixtures, is adapted for asphalt binders tested under cyclic torsion in a dynamic shear rheometer. Derivation of the model framework is presented. The model is verified by producing damage characteristic curves that are both temperature- and loading history-independent based on time sweep tests, given that the effects of plasticity and adhesion loss on the material behavior are minimal. The applicability of the S-VECD model to the accelerated loading that is inherent of the linear amplitude sweep test is demonstrated, which reveals reasonable performance predictions, but with some loss in accuracy compared to time sweep tests due to the confounding effects of nonlinearity imposed by the high strain amplitudes included in the test. The asphalt binder S-VECD model is validated through comparisons to asphalt mixture S-VECD model results derived from cyclic direct tension tests and Accelerated Loading Facility performance tests. The results demonstrate good agreement between the asphalt binder and mixture test results and pavement performance, indicating that the developed model framework is able to capture the asphalt binder's contribution to mixture fatigue and pavement fatigue cracking performance.
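
    As a hedged illustration of how a damage characteristic curve is built in VECD-type models (not the paper's code), the sketch below applies the commonly used discrete damage update to synthetic modulus data standing in for DSR time sweep measurements; the damage exponent and strain amplitude are assumed values.

```python
# Damage characteristic curve sketch for a VECD-type model: pseudo stiffness
# C from the measured-to-LVE modulus ratio, damage S accumulated with the
# standard discrete update. Synthetic data stand in for test measurements.
import numpy as np

alpha = 2.5                      # damage evolution rate (assumed)
strain_amp = 0.05                # pseudo strain amplitude (assumed)
t = np.linspace(0, 100, 2001)
G_lve = 1.0
G_meas = G_lve * (0.3 + 0.7 * np.exp(-t / 60))   # synthetic decaying modulus

C = G_meas / G_lve               # pseudo stiffness
S = np.zeros_like(t)
for i in range(1, len(t)):
    dC = max(C[i - 1] - C[i], 0.0)
    dt = t[i] - t[i - 1]
    # discrete update: dS = (0.5 * eps_R^2 * dC)^(a/(1+a)) * dt^(1/(1+a))
    S[i] = S[i - 1] + (0.5 * strain_amp**2 * dC) ** (alpha / (1 + alpha)) \
                      * dt ** (1 / (1 + alpha))

print("final damage S:", S[-1])  # plotting C against S gives the curve
```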

  6. Development of an online, publicly accessible naive Bayesian decision support tool for mammographic mass lesions based on the American College of Radiology (ACR) BI-RADS lexicon.

    PubMed

    Benndorf, Matthias; Kotter, Elmar; Langer, Mathias; Herda, Christoph; Wu, Yirong; Burnside, Elizabeth S

    2015-06-01

    To develop and validate a decision support tool for mammographic mass lesions based on a standardized descriptor terminology (the BI-RADS lexicon) to reduce variability of practice. We used separate training data (1,276 lesions, 138 malignant) and validation data (1,177 lesions, 175 malignant). We created naïve Bayes (NB) classifiers from the training data with tenfold cross-validation. Our "inclusive model" comprised BI-RADS categories, BI-RADS descriptors, and age as predictive variables; our "descriptor model" comprised BI-RADS descriptors and age. The resulting NB classifiers were applied to the validation data. We evaluated and compared classifier performance with ROC analysis. In the training data, the inclusive model yields an AUC of 0.959; the descriptor model yields an AUC of 0.910 (P < 0.001). The inclusive model is superior to the clinical performance (BI-RADS categories alone, P < 0.001); the descriptor model performs similarly. When applied to the validation data, the inclusive model yields an AUC of 0.935; the descriptor model yields an AUC of 0.876 (P < 0.001). Again, the inclusive model is superior to the clinical performance (P < 0.001); the descriptor model performs similarly. We consider our classifier a step towards a more uniform interpretation of combinations of BI-RADS descriptors. We provide our classifier at www.ebm-radiology.com/nbmm/index.html. • We provide a decision support tool for mammographic masses at www.ebm-radiology.com/nbmm/index.html. • Our tool may reduce variability of practice in BI-RADS category assignment. • A formal analysis of BI-RADS descriptors may enhance radiologists' diagnostic performance.
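
    A sketch of the descriptor-model idea, assuming scikit-learn's CategoricalNB: categorical BI-RADS descriptors plus binned age predict malignancy. The descriptor codings, age bins, and toy data below are illustrative, not the study's dataset or fitted model.

```python
# Naive Bayes over categorical descriptors plus binned age.
import numpy as np
from sklearn.naive_bayes import CategoricalNB

rng = np.random.default_rng(0)
n = 400
shape = rng.integers(0, 3, n)     # 0=oval, 1=lobular, 2=irregular (assumed coding)
margin = rng.integers(0, 3, n)    # 0=circumscribed, 1=obscured, 2=spiculated
age_bin = rng.integers(0, 4, n)   # <40, 40-55, 55-70, >70
# toy ground truth: irregular, spiculated, older lesions more often malignant
p = 1 / (1 + np.exp(-(shape + margin + age_bin - 4)))
y = (rng.random(n) < p).astype(int)

X = np.column_stack([shape, margin, age_bin])
clf = CategoricalNB().fit(X, y)
# posterior malignancy probability for an irregular, spiculated mass, age >70
print(clf.predict_proba([[2, 2, 3]])[0, 1])
```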

  7. ANSI/ASHRAE/IES Standard 90.1-2010 Performance Rating Method Reference Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goel, Supriya; Rosenberg, Michael I.

    This document is intended to be a reference manual for the Appendix G Performance Rating Method (PRM) of ANSI/ASHRAE/IES Standard 90.1-2010 (Standard 90.1-2010). The PRM is used for rating the energy efficiency of commercial and high-rise residential buildings with designs that exceed the requirements of Standard 90.1. The procedures and processes described in this manual are designed to provide consistency and accuracy by filling in gaps and providing additional details needed by users of the PRM. It should be noted that this document was created independently from ASHRAE and SSPC 90.1 and is neither sanctioned nor approved by either of those entities. Potential users of this manual include energy modelers, software developers, and implementers of "beyond code" energy programs. Energy modelers using ASHRAE Standard 90.1-2010 for beyond-code programs can use this document as a reference manual for interpreting requirements of the Performance Rating Method. Software developers building tools for automated creation of the baseline model can use this reference manual as a guideline for developing the rules for the baseline model.

  8. Image processing, geometric modeling and data management for development of a virtual bone surgery system.

    PubMed

    Niu, Qiang; Chi, Xiaoyi; Leu, Ming C; Ochoa, Jorge

    2008-01-01

    This paper describes image processing, geometric modeling, and data management techniques for the development of a virtual bone surgery system. Image segmentation is used to divide CT scan data into different segments representing various regions of the bone. A region-growing algorithm is used to extract cortical and trabecular bone structures systematically and efficiently. Volume modeling is then used to represent the bone geometry based on the CT scan data. Material removal simulation is achieved by continuously performing Boolean subtraction of the surgical tool model from the bone model. A quadtree-based adaptive subdivision technique is developed to handle the large data set in order to achieve the real-time simulation and visualization required for virtual bone surgery. A Marching Cubes algorithm is used to generate polygonal faces from the volumetric data. Rendering of the generated polygons is performed with the publicly available VTK (Visualization Toolkit) software. Implementation of the developed techniques consists of a virtual bone-drilling software program, which allows the user to manipulate a virtual drill with a PHANToM device to make holes in a bone model derived from real CT scan data.
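
    The region-growing step can be sketched compactly; the version below grows from a seed voxel and accepts 6-connected neighbors whose value falls inside a bone intensity window. The thresholds and the toy volume are illustrative assumptions.

```python
# Region-growing sketch: BFS from a seed voxel, accepting neighbors whose
# intensity falls within a bone threshold window.
import numpy as np
from collections import deque

def region_grow(volume, seed, lo, hi):
    grown = np.zeros(volume.shape, dtype=bool)
    queue = deque([seed])
    while queue:
        z, y, x = queue.popleft()
        if grown[z, y, x] or not (lo <= volume[z, y, x] <= hi):
            continue
        grown[z, y, x] = True
        for dz, dy, dx in [(1,0,0), (-1,0,0), (0,1,0), (0,-1,0), (0,0,1), (0,0,-1)]:
            nz, ny, nx = z + dz, y + dy, x + dx
            if 0 <= nz < volume.shape[0] and 0 <= ny < volume.shape[1] \
                    and 0 <= nx < volume.shape[2]:
                queue.append((nz, ny, nx))
    return grown

vol = np.random.default_rng(0).integers(-200, 1500, (32, 32, 32))  # fake HU values
vol[16, 16, 16] = 1200                                             # ensure seed is bone
cortical = region_grow(vol, seed=(16, 16, 16), lo=700, hi=1500)    # cortical window
print(cortical.sum(), "voxels grown")
```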

  9. Development of index based pavement performance models for pavement management system (PMS) of LADOTD : tech summary.

    DOT National Transportation Integrated Search

    2009-03-01

    A research study was initiated by the Louisiana Department of Transportation and Development (LADOTD) in conjunction with the Federal Highway Administration (FHWA) to evaluate the overall performance and effectiveness of LADOTD's Pavement Manage...

  10. Development of safety performance functions for North Carolina.

    DOT National Transportation Integrated Search

    2011-12-06

    "The objective of this effort is to develop safety performance functions (SPFs) for different types of facilities in North Carolina : and illustrate how they can be used to improve the decision making process. The prediction models in Part C of the H...

  11. Active imaging system performance model for target acquisition

    NASA Astrophysics Data System (ADS)

    Espinola, Richard L.; Teaney, Brian; Nguyen, Quang; Jacobs, Eddie L.; Halford, Carl E.; Tofsted, David H.

    2007-04-01

    The U.S. Army RDECOM CERDEC Night Vision & Electronic Sensors Directorate has developed a laser-range-gated imaging system performance model for the detection, recognition, and identification of vehicle targets. The model is based on the established US Army RDECOM CERDEC NVESD sensor performance models of the human system response through an imaging system. The Java-based model, called NVLRG, accounts for the effect of active illumination, atmospheric attenuation, and turbulence effects relevant to LRG imagers, such as speckle and scintillation, and for the critical sensor and display components. This model can be used to assess the performance of recently proposed active SWIR systems through various trade studies. This paper will describe the NVLRG model in detail, discuss the validation of recent model components, present initial trade study results, and outline plans to validate and calibrate the end-to-end model with field data through human perception testing.

  12. An analytic performance model of disk arrays and its application

    NASA Technical Reports Server (NTRS)

    Lee, Edward K.; Katz, Randy H.

    1991-01-01

    As disk arrays become widely used, tools for understanding and analyzing their performance become increasingly important. In particular, performance models can be invaluable in both configuring and designing disk arrays. Accurate analytic performance models are desirable over other types of models because they can be quickly evaluated, are applicable under a wide range of system and workload parameters, and can be manipulated by a range of mathematical techniques. Unfortunately, analytical performance models of disk arrays are difficult to formulate due to the presence of queuing and fork-join synchronization; a disk array request is broken up into independent disk requests which must all complete to satisfy the original request. We develop, validate, and apply an analytic performance model for disk arrays. We derive simple equations for approximating their utilization, response time, and throughput. We then validate the analytic model via simulation and investigate the accuracy of each approximation used in deriving the analytical model. Finally, we apply the analytical model to derive an equation for the optimal unit of data striping in disk arrays.
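
    The paper's derived equations are not reproduced in this record; the sketch below only illustrates the flavor of such analysis with textbook approximations, an M/M/1 queue per disk plus a harmonic-number fork-join correction, under assumed workload parameters.

```python
# Back-of-envelope disk array model: per-disk M/M/1 response time plus a
# crude fork-join correction (wait for the slowest of k disks, approximated
# by the harmonic number H_k). All numbers are assumed.
arrival_rate = 200.0    # array requests per second (assumed)
stripe_width = 4        # disks touched per request (assumed)
disk_service = 0.008    # mean service time per disk request, seconds
num_disks = 16          # disks in the array, balanced load (assumed)

per_disk_rate = arrival_rate * stripe_width / num_disks
rho = per_disk_rate * disk_service          # per-disk utilization
t_disk = disk_service / (1 - rho)           # M/M/1 response time
h_k = sum(1 / i for i in range(1, stripe_width + 1))   # harmonic number H_k
t_request = t_disk * h_k                    # fork-join approximation
print(f"utilization {rho:.2f}, request response time {t_request * 1000:.1f} ms")
```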

  13. An empirical/theoretical model with dimensionless numbers to predict the performance of electrodialysis systems on the basis of operating conditions.

    PubMed

    Karimi, Leila; Ghassemi, Abbas

    2016-07-01

    Among the different technologies developed for desalination, the electrodialysis/electrodialysis reversal (ED/EDR) process is one of the most promising for treating brackish water with low salinity when there is high risk of scaling. Multiple researchers have investigated ED/EDR to optimize the process, determine the effects of operating parameters, and develop theoretical/empirical models. Previously published empirical/theoretical models have evaluated the effect of the hydraulic conditions of the ED/EDR on the limiting current density using dimensionless numbers. The reason for previous studies' emphasis on limiting current density is twofold: 1) to maximize ion removal, most ED/EDR systems are operated close to limiting current conditions if there is not a scaling potential in the concentrate chamber due to a high concentration of less-soluble salts; and 2) for modeling the ED/EDR system with dimensionless numbers, it is more accurate and convenient to use limiting current density, where the boundary layer's characteristics are known at constant electrical conditions. To improve knowledge of ED/EDR systems, ED/EDR models should be also developed for the Ohmic region, where operation reduces energy consumption, facilitates targeted ion removal, and prolongs membrane life compared to limiting current conditions. In this paper, theoretical/empirical models were developed for ED/EDR performance in a wide range of operating conditions. The presented ion removal and selectivity models were developed for the removal of monovalent ions and divalent ions utilizing the dominant dimensionless numbers obtained from laboratory scale electrodialysis experiments. At any system scale, these models can predict ED/EDR performance in terms of monovalent and divalent ion removal. Copyright © 2016 Elsevier Ltd. All rights reserved.
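
    The dimensionless-number approach can be illustrated with a standard film-model relation, where a fitted Sherwood correlation Sh = a * Re^b * Sc^(1/3) yields a mass-transfer coefficient and hence a limiting current density. The coefficients and fluid properties below are placeholders to be fitted from experiments, not the paper's values.

```python
# Film-model limiting current density from a Sherwood-number correlation.
# a, b, and the fluid/membrane properties are illustrative placeholders.
F = 96485.0   # Faraday constant, C/mol

def limiting_current(velocity, conc_mol_m3, d_h=5e-4, D=1.3e-9,
                     nu=1e-6, a=0.5, b=0.6, z=1, t_mem=1.0, t_sol=0.4):
    Re = velocity * d_h / nu            # Reynolds number
    Sc = nu / D                         # Schmidt number
    Sh = a * Re**b * Sc**(1 / 3)        # fitted mass-transfer correlation
    k = Sh * D / d_h                    # mass-transfer coefficient, m/s
    # counter-ion transport number difference drives the boundary layer
    return z * F * k * conc_mol_m3 / (t_mem - t_sol)   # A/m^2

print(limiting_current(velocity=0.05, conc_mol_m3=20.0))
```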

  14. PV_LIB Toolbox

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2012-09-11

    While an organized source of reference information on PV performance modeling is certainly valuable, there is nothing to match the availability of actual examples of modeling algorithms being used in practice. To meet this need, Sandia has developed a PV performance modeling toolbox (PV_LIB) for Matlab. It contains a set of well-documented, open-source functions and example scripts showing the functions being used in practical examples. This toolbox is meant to help make the multi-step process of modeling a PV system more transparent and provide the means for model users to validate and understand the models they use and/or develop. It is fully integrated into Matlab's help and documentation utilities. The PV_LIB Toolbox provides more than 30 functions, sorted into four categories.
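
    The toolbox described here targets Matlab; PV_LIB has since also been ported to Python as the pvlib package. A minimal clear-sky example in the Python port, with an arbitrary location and dates, might look like:

```python
# Clear-sky irradiance with the pvlib Python port of PV_LIB.
# Location and date are arbitrary illustration values.
import pandas as pd
from pvlib.location import Location

site = Location(35.05, -106.54, tz="US/Mountain", altitude=1600)  # Albuquerque
times = pd.date_range("2012-06-21", periods=24, freq="1h", tz=site.tz)
clearsky = site.get_clearsky(times)   # GHI/DNI/DHI, Ineichen model by default
print(clearsky[["ghi", "dni"]].max())
```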

  15. Cognitive Components Underpinning the Development of Model-Based Learning

    PubMed Central

    Potter, Tracey C.S.; Bryce, Nessa V.; Hartley, Catherine A.

    2016-01-01

    Reinforcement learning theory distinguishes “model-free” learning, which fosters reflexive repetition of previously rewarded actions, from “model-based” learning, which recruits a mental model of the environment to flexibly select goal-directed actions. Whereas model-free learning is evident across development, recruitment of model-based learning appears to increase with age. However, the cognitive processes underlying the development of model-based learning remain poorly characterized. Here, we examined whether age-related differences in cognitive processes underlying the construction and flexible recruitment of mental models predict developmental increases in model-based choice. In a cohort of participants aged 9–25, we examined whether the abilities to infer sequential regularities in the environment (“statistical learning”), maintain information in an active state (“working memory”) and integrate distant concepts to solve problems (“fluid reasoning”) predicted age-related improvements in model-based choice. We found that age-related improvements in statistical learning performance did not mediate the relationship between age and model-based choice. Ceiling performance on our working memory assay prevented examination of its contribution to model-based learning. However, age-related improvements in fluid reasoning statistically mediated the developmental increase in the recruitment of a model-based strategy. These findings suggest that gradual development of fluid reasoning may be a critical component process underlying the emergence of model-based learning. PMID:27825732

  16. Cognitive components underpinning the development of model-based learning.

    PubMed

    Potter, Tracey C S; Bryce, Nessa V; Hartley, Catherine A

    2017-06-01

    Reinforcement learning theory distinguishes "model-free" learning, which fosters reflexive repetition of previously rewarded actions, from "model-based" learning, which recruits a mental model of the environment to flexibly select goal-directed actions. Whereas model-free learning is evident across development, recruitment of model-based learning appears to increase with age. However, the cognitive processes underlying the development of model-based learning remain poorly characterized. Here, we examined whether age-related differences in cognitive processes underlying the construction and flexible recruitment of mental models predict developmental increases in model-based choice. In a cohort of participants aged 9-25, we examined whether the abilities to infer sequential regularities in the environment ("statistical learning"), maintain information in an active state ("working memory") and integrate distant concepts to solve problems ("fluid reasoning") predicted age-related improvements in model-based choice. We found that age-related improvements in statistical learning performance did not mediate the relationship between age and model-based choice. Ceiling performance on our working memory assay prevented examination of its contribution to model-based learning. However, age-related improvements in fluid reasoning statistically mediated the developmental increase in the recruitment of a model-based strategy. These findings suggest that gradual development of fluid reasoning may be a critical component process underlying the emergence of model-based learning. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  17. The Use of a Block Diagram Simulation Language for Rapid Model Prototyping

    NASA Technical Reports Server (NTRS)

    Whitlow, Johnathan E.; Engrand, Peter

    1996-01-01

    The research performed this summer was a continuation of work performed during the 1995 NASA/ASEE Summer Fellowship. The focus of the work was to expand previously generated predictive models for liquid oxygen (LOX) loading into the external fuel tank of the shuttle. The models, which were developed using a block diagram simulation language known as VisSim, were evaluated on numerous shuttle flights and found to perform well in most cases. Once the models were refined and validated, the predictive methods were to be integrated into the existing Rockwell propulsion advisory tool (PAT) software. Although time was not sufficient to completely integrate the developed models into PAT, the ability to predict flows and pressures in the orbiter section and graphically display the results was accomplished.

  18. SAM Technical Review Committee Final Report: Summary and Key Recommendations from the Onsite TRC Meeting Held April 22-23, 2013

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blair, N.; Dobos, S.; Janzou, S.

    2013-08-01

    The System Advisor Model (SAM) is a broad and robust set of models and frameworks for analyzing both system performance and system financing across a range of technologies, dominated by solar technologies including photovoltaics (PV) and concentrating solar power (CSP). The U.S. Department of Energy (DOE) Solar Energy Technology Program requested that the SAM development team review the photovoltaic performance modeling with the development community and, specifically, with the independent engineering community. The report summarizes the major effort of this technical review committee (TRC).

  19. High-performance heat pipes for heat recovery applications

    NASA Technical Reports Server (NTRS)

    Saaski, E. W.; Hartl, J. H.

    1980-01-01

    Methods to improve the performance of reflux heat pipes for heat recovery applications were examined both analytically and experimentally. Various models for the estimation of reflux heat pipe transport capacity were surveyed in the literature and compared with experimental data. A high-transport-capacity reflux heat pipe was developed that provides up to a factor of 10 capacity improvement over conventional open-tube designs; analytical models were developed for this device and incorporated into a computer program, HPIPE. Good agreement of the model predictions with data for R-11 and benzene reflux heat pipes was obtained.

  20. Composite panel development at JPL

    NASA Technical Reports Server (NTRS)

    Mcelroy, Paul; Helms, Rich

    1988-01-01

    Parametric computer studies can be used in a cost-effective manner to determine optimized composite mirror panel designs. An InterDisciplinary computer Model (IDM) was created to aid in the development of high-precision reflector panels for LDR. The material properties, thermal responses, structural geometries, and radio/optical precision are synergistically analyzed for specific panel designs. Promising panel designs are fabricated and tested so that comparison with panel test results can be used to verify performance prediction models and guide design refinement. The iterative approach of computer design and model refinement with performance testing and materials optimization has shown good results for LDR panels.

  1. Modelling Pedestrian Travel Time and the Design of Facilities: A Queuing Approach

    PubMed Central

    Rahman, Khalidur; Abdul Ghani, Noraida; Abdulbasah Kamil, Anton; Mustafa, Adli; Kabir Chowdhury, Md. Ahmed

    2013-01-01

    Pedestrian movements are the consequence of several complex and stochastic factors. Modelling pedestrian movements and being able to predict travel time are useful for evaluating the performance of a pedestrian facility. However, only a few studies can be found that incorporate the design of the facility, local pedestrian body dimensions, the delay experienced by the pedestrians, and level of service into pedestrian movement models. In this paper, a queuing based analytical model is developed as a function of relevant determinants and functional factors to predict the travel time on pedestrian facilities. The model can be used to assess the overall serving rate or performance of a facility layout and correlate it to the level of service that it is possible to provide the pedestrians. It also provides clear guidance on the design and sizing of pedestrian facilities. The model is empirically validated and is found to be a robust tool for understanding how well a particular walking facility enables comfortable and convenient pedestrian movements. A sensitivity analysis is also performed to see the impact of some crucial parameters of the developed model on the performance of pedestrian facilities. PMID:23691055
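
    The analytical core of such a queuing approach can be illustrated with a textbook M/M/c calculation: expected travel time is free walking time plus the Erlang-C queuing delay at the facility's service points. The Python sketch below uses invented rates and omits the paper's other determinants (body dimensions, level-of-service factors); it is a minimal illustration of the idea, not the published model.

        import math

        def erlang_c(lam, mu, c):
            """Probability an arriving pedestrian must wait (Erlang C, M/M/c)."""
            a = lam / mu                        # offered load in Erlangs
            rho = a / c                         # utilization; must be < 1
            base = sum(a**k / math.factorial(k) for k in range(c))
            tail = a**c / (math.factorial(c) * (1 - rho))
            return tail / (base + tail)

        def travel_time(lam, mu, c, walk_time):
            """Expected travel time = free walking time + mean queuing delay."""
            wq = erlang_c(lam, mu, c) / (c * mu - lam)
            return walk_time + wq

        # Illustrative: 1.5 ped/s arriving, 5 service points at 0.4 ped/s each,
        # 60 s of unimpeded walking through the facility.
        print(travel_time(1.5, 0.4, 5, 60.0))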

  2. Specialized data analysis of SSME and advanced propulsion system vibration measurements

    NASA Technical Reports Server (NTRS)

    Coffin, Thomas; Swanson, Wayne L.; Jong, Yen-Yi

    1993-01-01

    The basic objectives of this contract were to perform detailed analysis and evaluation of dynamic data obtained during Space Shuttle Main Engine (SSME) test and flight operations, including analytical/statistical assessment of component dynamic performance, and to continue the development and implementation of analytical/statistical models to effectively define nominal component dynamic characteristics, detect anomalous behavior, and assess machinery operational conditions. The goal of this study was to provide timely assessment of engine component operational status, identify probable causes of malfunction, and define feasible engineering solutions. The work was performed under three broad tasks: (1) Analysis, Evaluation, and Documentation of SSME Dynamic Test Results; (2) Data Base and Analytical Model Development and Application; and (3) Development and Application of Vibration Signature Analysis Techniques.

  3. Performance Evaluation of the NASA/KSC Transmission System

    NASA Technical Reports Server (NTRS)

    Christensen, Kenneth J.

    2000-01-01

    NASA-KSC currently uses three bridged 100-Mbps FDDI segments as its backbone for data traffic. The FDDI Transmission System (FTXS) connects the KSC industrial area, KSC launch complex 39 area, and the Cape Canaveral Air Force Station. The report presents a performance modeling study of the FTXS and the proposed ATM Transmission System (ATXS). The focus of the study is on the performance of MPEG video transmission on these networks. Commercial modeling tools - the CACI Predictor and Comnet tools - were used. In addition, custom software tools were developed to characterize conversation pairs in Sniffer trace (capture) files for use as input to these tools. A baseline study of both non-launch and launch day data traffic on the FTXS is presented. MPEG-1 and MPEG-2 video traffic was characterized and the shaping of it was evaluated. It is shown that the characteristics of a video stream have a direct effect on its performance in a network. It is also shown that shaping of video streams is necessary to prevent overflow losses and resulting poor video quality. The developed models can be used to predict when the existing FTXS will 'run out of room' and to optimize the parameters of ATM links used for transmission of MPEG video. Future work with these models can provide useful input and validation to set-top box projects within the Advanced Networks Development group in NASA-KSC Development Engineering.
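
    The shaping the study evaluates is, in essence, rate smoothing of a bursty frame stream. The toy token-bucket sketch below uses invented frame sizes and rates (the study itself used the CACI tools, not this code); it shows how an oversized I-frame incurs a conformance delay instead of overflowing a buffer.

        def token_bucket(frame_bits, rate_bps, bucket_bits, frame_interval=1.0 / 30):
            """Per-frame conformance delay under a token-bucket shaper.
            Simplification: tokens refill once per frame interval, and shaping
            delays are not propagated to later arrival times."""
            tokens = float(bucket_bits)
            delays = []
            for bits in frame_bits:
                tokens = min(bucket_bits, tokens + rate_bps * frame_interval)
                if bits <= tokens:
                    delays.append(0.0)
                    tokens -= bits
                else:
                    delays.append((bits - tokens) / rate_bps)  # wait for missing tokens
                    tokens = 0.0
            return delays

        # Bursty GOP-like pattern: big I-frames, smaller P/B-frames (made-up sizes)
        frames = [300_000, 80_000, 60_000, 80_000, 300_000]
        print(token_bucket(frames, rate_bps=4_000_000, bucket_bits=200_000))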

  4. Evaluation of MPLM Design and Mission 6A Coupled Loads Analyses

    NASA Technical Reports Server (NTRS)

    Bookout, Paul S.; Ricks, Ed

    1999-01-01

    During the development of a space shuttle payload, several coupled loads analyses (CLAs) are usually performed: preliminary design, critical design, final design, and verification loads analysis (VLA). A final design CLA is the last analysis conducted prior to model delivery to the shuttle program for the VLA. The finite element models used in the final design CLA and the VLA are test verified dynamic math models. Mission 6A is the first of many flights of the Multi-Purpose Logistics Module (MPLM). The MPLM was developed by Alenia Spazio S.p.A. (an Italian aerospace company) and houses the International Standard Payload Racks (ISPR) for transportation to the space station in the shuttle. Marshall Space Flight Center (MSFC), the payload integrator of the MPLM for Mission 6A, performed the final design CLA using the M6.OZC shuttle data for liftoff and landing conditions with the proper shuttle cargo manifest. Alenia performed the preliminary and critical design CLAs for the development of the MPLM; however, these CLAs did not use the current Mission 6A cargo manifest. An evaluation of the preliminary and critical design CLAs performed by Alenia and the final design CLA performed by MSFC is presented.

  5. Development of a Systems Engineering Competency Model Tool for the Aviation and Missile Research, Development, And Engineering Center (AMRDEC)

    DTIC Science & Technology

    2017-06-01

    The Naval Postgraduate School has developed a competency model for the systems engineering profession and is implementing a tool to support high-stakes human resource functions for the U.S. Army. A systems engineering career competency model (SECCM), recently developed by the Navy and verified by the Office of Personnel Management (OPM), defines the critical competencies for successful performance as a systems engineer at each general schedule ...

  6. Evaluation and Development of Pavement Scores, Performance Models and Needs Estimates for the TXDOT Pavement Management Information System : Final Report

    DOT National Transportation Integrated Search

    2012-10-01

    This project conducted a thorough review of the existing Pavement Management Information System (PMIS) database, : performance models, needs estimates, utility curves, and scores calculations, as well as a review of District practices : concerning th...

  7. On the Fidelity of Semi-distributed Hydrologic Model Simulations for Large Scale Catchment Applications

    NASA Astrophysics Data System (ADS)

    Ajami, H.; Sharma, A.; Lakshmi, V.

    2017-12-01

    Application of semi-distributed hydrologic modeling frameworks is a viable alternative to fully distributed hyper-resolution hydrologic models, offering computational efficiency while still resolving the fine-scale spatial structure of hydrologic fluxes and states. However, the fidelity of semi-distributed model simulations is impacted by (1) the formulation of hydrologic response units (HRUs), and (2) the aggregation of catchment properties for formulating simulation elements. Here, we evaluate the performance of a recently developed Soil Moisture and Runoff simulation Toolkit (SMART) for large catchment scale simulations. In SMART, topologically connected HRUs are delineated using thresholds obtained from topographic and geomorphic analysis of a catchment, and simulation elements are equivalent cross sections (ECS) representative of a hillslope in first order sub-basins. Earlier investigations have shown that formulation of ECSs at the scale of a first order sub-basin reduces computational time significantly without compromising simulation accuracy. However, this approach has not been fully explored for catchment scale simulations. To assess SMART performance, we set up the model over the Little Washita watershed in Oklahoma. Model evaluations using in-situ soil moisture observations show satisfactory model performance. In addition, we evaluated the performance of a number of soil moisture disaggregation schemes recently developed to provide spatially explicit soil moisture outputs at fine scale resolution. Our results illustrate that the statistical disaggregation scheme performs significantly better than the methods based on topographic data. Future work is focused on assessing the performance of SMART using remotely sensed soil moisture observations and spatially based model evaluation metrics.

  8. Performance Comparison of the European Storm Surge Models and Chaotic Model in Forecasting Extreme Storm Surges

    NASA Astrophysics Data System (ADS)

    Siek, M. B.; Solomatine, D. P.

    2009-04-01

    Storm surge modeling has developed considerably over the past 30 years. A number of significant advances in operational storm surge models have been implemented and tested, including refined computational grids, improved model calibration, better numerical schemes (i.e. more realistic model physics for air-sea interaction), data assimilation, and ensemble model forecasts. This paper addresses the performance comparison between existing European storm surge models and recently developed methods from nonlinear dynamics and chaos theory in forecasting storm surge dynamics. The chaotic model is built using adaptive local models based on the dynamical neighbours in the reconstructed phase space of observed time series data. The comparison focuses on model accuracy in forecasting a recent extreme storm surge in the North Sea on November 9th, 2007 that hit the coastlines of several European countries. The combination of a high tide, north-westerly winds exceeding 50 mph and low pressure produced an exceptional storm tide, with water levels 3 meters above normal sea levels. Flood warnings were issued for the east coast of Britain and the entire Dutch coast. The Maeslant barrier's two arc-shaped steel doors in Europe's biggest port, Rotterdam, were closed for the first time since its construction in 1997 because of this storm surge. For comparison with the chaotic model, forecast data from several European physically-based storm surge models were provided by BSH Germany, DMI Denmark, DNMI Norway, KNMI Netherlands and MUMM Belgium. The performance comparison was made over testing datasets for two periods/conditions: a non-stormy period (1-Sep-2007 till 14-Oct-2007) and a stormy period (15-Oct-2007 till 20-Nov-2007). A scalar chaotic model with optimized parameters was developed using an hourly training dataset of observations (11-Sep-2005 till 31-Aug-2007). The comparison results indicated that the chaotic model yields better forecasts than the existing European storm surge models. The best performance of the European storm surge models for non-storm and storm conditions was achieved by KNMI (with Kalman filter data assimilation) and BSH, with errors of 8.95 cm and 10.92 cm, respectively. The chaotic model provides 6 and 48 hour forecasts with errors of 3.10 cm and 8.55 cm for the non-storm condition and 5.04 cm and 15.21 cm for the storm condition, respectively. The chaotic model forecasts better primarily because its predictions are produced by local models that identify similar developments of storm surges in the past. In practice, the chaotic model can serve as a reliable and accurate model to support decision-makers in operational ship navigation and flood forecasting.
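
    The "adaptive local models based on dynamical neighbours" can be sketched in a few lines: embed the observed series in a reconstructed phase space, find the nearest past states, and average their observed evolution. The sketch below uses a noisy sine as a stand-in series and invented embedding parameters; the operational model's embedding and local-model choices are not reproduced here.

        import numpy as np

        def embed(x, dim, tau):
            """Time-delay embedding of a scalar series into dim dimensions."""
            n = len(x) - (dim - 1) * tau
            return np.array([x[i:i + (dim - 1) * tau + 1:tau] for i in range(n)])

        def local_forecast(x, dim=3, tau=5, k=10, horizon=6):
            """Forecast x[t+horizon] from the k nearest dynamical neighbours
            of the current state in the reconstructed phase space."""
            vecs = embed(x, dim, tau)
            current = vecs[-1]
            cands = vecs[:len(vecs) - horizon]     # neighbours with a known future
            nn = np.argsort(np.linalg.norm(cands - current, axis=1))[:k]
            future_idx = nn + (dim - 1) * tau + horizon
            return float(np.mean(x[future_idx]))

        # Noisy sine as a stand-in for an hourly surge-level series
        t = np.arange(2000)
        x = np.sin(0.1 * t) + 0.05 * np.random.default_rng(1).normal(size=2000)
        print(local_forecast(x, dim=3, tau=5, k=10, horizon=6))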

  9. ToxiM: A Toxicity Prediction Tool for Small Molecules Developed Using Machine Learning and Chemoinformatics Approaches.

    PubMed

    Sharma, Ashok K; Srivastava, Gopal N; Roy, Ankita; Sharma, Vineet K

    2017-01-01

    The experimental methods for the prediction of molecular toxicity are tedious and time-consuming. Thus, computational approaches can be used to develop alternative methods for toxicity prediction. We have developed a tool for the prediction of molecular toxicity along with the aqueous solubility and permeability of any molecule/metabolite. Using a comprehensive and curated set of toxin molecules as a training set, different chemical and structural features such as descriptors and fingerprints were exploited for feature selection, optimization and development of machine learning based classification and regression models. The compositional differences in the distribution of atoms were apparent between toxins and non-toxins, and hence, the molecular features were used for the classification and regression. On 10-fold cross-validation, the descriptor-based, fingerprint-based and hybrid classification models showed similar accuracy (93%) and Matthews correlation coefficient (0.84). The performances of all three models were comparable (Matthews correlation coefficient = 0.84-0.87) on the blind dataset. In addition, the regression models using descriptors as input features were also compared and evaluated on the blind dataset. The random forest based regression model for the prediction of solubility performed better (R2 = 0.84) than the multi-linear regression (MLR) and partial least squares regression (PLSR) models, whereas the partial least squares based regression model for the prediction of permeability (Caco-2) performed better (R2 = 0.68) than the random forest and MLR based regression models. The performance of the final classification and regression models was evaluated using two validation datasets including known toxins and commonly used constituents of health products, which attests to their accuracy. The ToxiM web server would be a highly useful and reliable tool for the prediction of toxicity, solubility, and permeability of small molecules.
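
    The descriptor-based regression step maps directly onto standard tooling. The sketch below assumes scikit-learn and uses a placeholder descriptor matrix and target in place of the curated toxin data; the paper's actual features, curation and tuning are not reproduced.

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        X = rng.normal(size=(500, 20))           # placeholder descriptor matrix
        y = X[:, 0] - 0.5 * X[:, 1] + rng.normal(scale=0.3, size=500)  # placeholder target

        model = RandomForestRegressor(n_estimators=200, random_state=0)
        print(cross_val_score(model, X, y, cv=10, scoring="r2").mean())  # 10-fold CV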

  10. ToxiM: A Toxicity Prediction Tool for Small Molecules Developed Using Machine Learning and Chemoinformatics Approaches

    PubMed Central

    Sharma, Ashok K.; Srivastava, Gopal N.; Roy, Ankita; Sharma, Vineet K.

    2017-01-01

    The experimental methods for the prediction of molecular toxicity are tedious and time-consuming. Thus, computational approaches can be used to develop alternative methods for toxicity prediction. We have developed a tool for the prediction of molecular toxicity along with the aqueous solubility and permeability of any molecule/metabolite. Using a comprehensive and curated set of toxin molecules as a training set, different chemical and structural features such as descriptors and fingerprints were exploited for feature selection, optimization and development of machine learning based classification and regression models. The compositional differences in the distribution of atoms were apparent between toxins and non-toxins, and hence, the molecular features were used for the classification and regression. On 10-fold cross-validation, the descriptor-based, fingerprint-based and hybrid classification models showed similar accuracy (93%) and Matthews correlation coefficient (0.84). The performances of all three models were comparable (Matthews correlation coefficient = 0.84-0.87) on the blind dataset. In addition, the regression models using descriptors as input features were also compared and evaluated on the blind dataset. The random forest based regression model for the prediction of solubility performed better (R2 = 0.84) than the multi-linear regression (MLR) and partial least squares regression (PLSR) models, whereas the partial least squares based regression model for the prediction of permeability (Caco-2) performed better (R2 = 0.68) than the random forest and MLR based regression models. The performance of the final classification and regression models was evaluated using two validation datasets including known toxins and commonly used constituents of health products, which attests to their accuracy. The ToxiM web server would be a highly useful and reliable tool for the prediction of toxicity, solubility, and permeability of small molecules. PMID:29249969

  11. Stanford/NASA-Ames Center of Excellence in model-based human performance

    NASA Technical Reports Server (NTRS)

    Wandell, Brian A.

    1990-01-01

    The human operator plays a critical role in many aeronautic and astronautic missions. The Stanford/NASA-Ames Center of Excellence in Model-Based Human Performance (COE) was initiated in 1985 to further our understanding of the performance capabilities and performance limits of the human component of aeronautic and astronautic projects. Support from the COE is devoted to those areas of experimental and theoretical work designed to summarize and explain human performance by developing computable performance models. The ultimate goal is to make these computable models available to other scientists for use in design and evaluation of aeronautic and astronautic instrumentation. Within vision science, two topics have received particular attention. First, researchers did extensive work analyzing the human ability to recognize object color relatively independent of the spectral power distribution of the ambient lighting (color constancy). The COE has supported a number of research papers in this area, as well as the development of a substantial data base of surface reflectance functions, ambient illumination functions, and an associated software package for rendering and analyzing image data with respect to these spectral functions. Second, the COE supported new empirical studies on the problem of selecting colors for visual display equipment to enhance human performance in discrimination and recognition tasks.

  12. Nonlinear control of linear parameter varying systems with applications to hypersonic vehicles

    NASA Astrophysics Data System (ADS)

    Wilcox, Zachary Donald

    The focus of this dissertation is to design a controller for linear parameter varying (LPV) systems, apply it specifically to air-breathing hypersonic vehicles, and examine the interplay between control performance and structural dynamics design. Specifically, a Lyapunov-based continuous robust controller is developed that yields exponential tracking of a reference model, despite the presence of bounded, nonvanishing disturbances. The hypersonic vehicle has time varying parameters, specifically temperature profiles, and its dynamics can be reduced to an LPV system with additive disturbances. Since the HSV can be modeled as an LPV system, the proposed control design is directly applicable. The control performance is directly examined through simulations. A wide variety of applications exist that can be effectively modeled as LPV systems. In particular, flight systems have historically been modeled as LPV systems, and associated control tools have been applied such as gain-scheduling, linear matrix inequalities (LMIs), linear fractional transformations (LFTs), and mu-synthesis. However, as flight environments and trajectories become more demanding, traditional LPV controllers may no longer be sufficient. In particular, hypersonic flight vehicles (HSVs) present an inherently difficult problem because of the nonlinear aerothermoelastic coupling effects in the dynamics. HSV flight conditions produce temperature variations that can alter both the structural dynamics and flight dynamics. Starting with the full nonlinear dynamics, the aerothermoelastic effects are modeled by a temperature dependent, parameter varying state-space representation with added disturbances. The model includes an uncertain parameter varying state matrix, an uncertain parameter varying non-square (column deficient) input matrix, and an additive bounded disturbance. In this dissertation, a robust dynamic controller is formulated for an uncertain and disturbed LPV system. The developed controller is then applied to an HSV model, and a Lyapunov analysis is used to prove global exponential reference model tracking in the presence of uncertainty in the state and input matrices and exogenous disturbances. Simulations with a spectrum of gains and temperature profiles on the full nonlinear dynamic model of the HSV are used to illustrate the performance and robustness of the developed controller. In addition, this work considers how the performance of the developed controller varies over a wide variety of control gains and temperature profiles, and how it can be optimized with respect to different performance metrics. Specifically, various temperature profile models and related nonlinear temperature dependent disturbances are used to characterize the relative control performance and effort for each model. Examining such metrics as a function of temperature provides a potential inroad for examining the interplay between structural/thermal protection design and control development, and has application for future HSV design and control implementation.
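
    The dissertation's controller is a Lyapunov-based design for a full LPV state-space model; the toy sketch below is not that controller, but it illustrates the basic idea it builds on: a scalar plant with an unknown time-varying parameter and a bounded, nonvanishing disturbance is driven to follow a stable reference model by a feedback term plus a robust switching term. All gains and signals here are invented for illustration.

        import numpy as np

        # Scalar plant xdot = a(t)*x + u + d(t) with unknown a(t), bounded d(t),
        # forced to track the stable reference model xm_dot = -2*xm + 2*r.
        dt, steps = 1e-3, 10_000
        x, xm = 1.0, 0.0
        k, beta, a_hat = 5.0, 1.5, -1.0          # feedback gain, robust gain, nominal a
        err = []
        for i in range(steps):
            t = i * dt
            a = -1.0 + 0.5 * np.sin(0.5 * t)     # unknown parameter variation
            d = 0.8 * np.sin(3.0 * t)            # bounded, nonvanishing disturbance
            r = np.sin(t)                        # reference command
            xm_dot = -2.0 * xm + 2.0 * r
            e = x - xm
            u = xm_dot - a_hat * x - k * e - beta * np.sign(e)  # robust term dominates
            x += dt * (a * x + u + d)            # forward-Euler integration
            xm += dt * xm_dot
            err.append(abs(e))
        print(max(err[-1000:]))                  # tracking error over the last second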

  13. Modeling the Performance of Direct-Detection Doppler Lidar Systems in Real Atmospheres

    NASA Technical Reports Server (NTRS)

    McGill, Matthew J.; Hart, William D.; McKay, Jack A.; Spinhirne, James D.

    1999-01-01

    Previous modeling of the performance of spaceborne direct-detection Doppler lidar systems has assumed extremely idealized atmospheric models. Here we develop a technique for modeling the performance of these systems in a more realistic atmosphere, based on actual airborne lidar observations. The resulting atmospheric model contains cloud and aerosol variability that is absent in other simulations of spaceborne Doppler lidar instruments. To produce a realistic simulation of daytime performance, we include solar radiance values that are based on actual measurements and are allowed to vary as the viewing scene changes. Simulations are performed for two types of direct-detection Doppler lidar systems: the double-edge and the multi-channel techniques. Both systems were optimized to measure winds from Rayleigh backscatter at 355 nm. Simulations show that the measurement uncertainty during daytime is degraded by only about 10-20% compared to nighttime performance, provided a proper solar filter is included in the instrument design.

  14. Modeling the performance of direct-detection Doppler lidar systems including cloud and solar background variability.

    PubMed

    McGill, M J; Hart, W D; McKay, J A; Spinhirne, J D

    1999-10-20

    Previous modeling of the performance of spaceborne direct-detection Doppler lidar systems assumed extremely idealized atmospheric models. Here we develop a technique for modeling the performance of these systems in a more realistic atmosphere, based on actual airborne lidar observations. The resulting atmospheric model contains cloud and aerosol variability that is absent in other simulations of spaceborne Doppler lidar instruments. To produce a realistic simulation of daytime performance, we include solar radiance values that are based on actual measurements and are allowed to vary as the viewing scene changes. Simulations are performed for two types of direct-detection Doppler lidar system: the double-edge and the multichannel techniques. Both systems were optimized to measure winds from Rayleigh backscatter at 355 nm. Simulations show that the measurement uncertainty during daytime is degraded by only approximately 10-20% compared with nighttime performance, provided that a proper solar filter is included in the instrument design.

  15. Incorporating uncertainty in predictive species distribution modelling.

    PubMed

    Beale, Colin M; Lennon, Jack J

    2012-01-19

    Motivated by the need to solve ecological problems (climate change, habitat fragmentation and biological invasions), there has been increasing interest in species distribution models (SDMs). Predictions from these models inform conservation policy, invasive species management and disease-control measures. However, predictions are subject to uncertainty, the degree and source of which are often unrecognized. Here, we review the SDM literature in the context of uncertainty, focusing on three main classes of SDM: niche-based models, demographic models and process-based models. We identify sources of uncertainty for each class and discuss how uncertainty can be minimized or included in the modelling process to give realistic measures of confidence around predictions. Because this has typically not been performed, we conclude that uncertainty in SDMs has often been underestimated and a false precision assigned to predictions of geographical distribution. We identify areas where development of new statistical tools will improve predictions from distribution models, notably the development of hierarchical models that link different types of distribution model and their attendant uncertainties across spatial scales. Finally, we discuss the need to develop more defensible methods for assessing predictive performance, quantifying model goodness-of-fit and assessing the significance of model covariates.

  16. Mathematical model development and simulation of heat pump fruit dryer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Achariyaviriya, S.; Soponronnarit, S.; Terdyothin, A.

    2000-01-01

    A mathematical model of a heat pump fruit dryer was developed to study the performance of heat pump dryers. The model was verified using the moisture content of papaya glace during drying, the refrigerant temperatures at the evaporator and condenser, and the overall performance. The simulated results for the closed loop heat pump dryer were found to be close to the experimental results. The criteria for evaluating the performance were the specific moisture extraction rate and the drying rate. The results showed that ambient conditions significantly affected the performance of the open loop dryer and the partially closed loop dryer. The fraction of evaporator bypass air also markedly affected the performance of all heat pump dryers. In addition, the specific air flow rate and drying air temperature significantly affected the performance of all heat pump dryers.

  17. An approximate theoretical method for modeling the static thrust performance of non-axisymmetric two-dimensional convergent-divergent nozzles. M.S. Thesis - George Washington Univ.

    NASA Technical Reports Server (NTRS)

    Hunter, Craig A.

    1995-01-01

    An analytical/numerical method has been developed to predict the static thrust performance of non-axisymmetric, two-dimensional convergent-divergent exhaust nozzles. Thermodynamic nozzle performance effects due to over- and underexpansion are modeled using one-dimensional compressible flow theory. Boundary layer development and skin friction losses are calculated using an approximate integral momentum method based on the classic Kármán-Pohlhausen solution. Angularity effects are included with these two models in a computational Nozzle Performance Analysis Code, NPAC. In four different case studies, results from NPAC are compared to experimental data obtained from subscale nozzle testing to demonstrate the capabilities and limitations of the NPAC method. In several cases, the NPAC prediction matched experimental gross thrust efficiency data to within 0.1 percent at the design NPR, and to within 0.5 percent at off-design conditions.
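
    The one-dimensional compressible-flow piece of such a method is easy to sketch. The code below computes ideal gross thrust per unit throat area for a convergent-divergent nozzle from NPR and area ratio using standard isentropic relations; the inputs are illustrative, and the boundary-layer and angularity loss models of NPAC are deliberately omitted.

        import math

        G = 1.4  # ratio of specific heats for air

        def area_ratio(m, g=G):
            """Isentropic A/A* as a function of Mach number."""
            t = (2.0 / (g + 1)) * (1.0 + 0.5 * (g - 1) * m * m)
            return t ** ((g + 1) / (2 * (g - 1))) / m

        def exit_mach(ar, g=G):
            """Supersonic exit Mach for a given expansion area ratio (bisection)."""
            lo, hi = 1.000001, 10.0
            for _ in range(80):
                mid = 0.5 * (lo + hi)
                if area_ratio(mid, g) < ar:
                    lo = mid
                else:
                    hi = mid
            return 0.5 * (lo + hi)

        def gross_thrust_per_throat_area(npr, ar, p_amb=101325.0, g=G):
            """Ideal 1-D gross thrust per unit throat area for a C-D nozzle:
            momentum term g*pe*Me^2 plus pressure-area term (pe - p_amb),
            scaled by the exit-to-throat area ratio."""
            me = exit_mach(ar, g)
            pe = npr * p_amb / (1.0 + 0.5 * (g - 1) * me * me) ** (g / (g - 1))
            return ar * (g * pe * me * me + (pe - p_amb))

        # Illustrative: NPR of 8, exit-to-throat area ratio of 1.8
        print(gross_thrust_per_throat_area(npr=8.0, ar=1.8))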

  18. Development of Models for Regional Cardiac Surgery Centers

    PubMed Central

    Park, Choon Seon; Park, Nam Hee; Sim, Sung Bo; Yun, Sang Cheol; Ahn, Hye Mi; Kim, Myunghwa; Choi, Ji Suk; Kim, Myo Jeong; Kim, Hyunsu; Chee, Hyun Keun; Oh, Sanggi; Kang, Shinkwang; Lee, Sok-Goo; Shin, Jun Ho; Kim, Keonyeop; Lee, Kun Sei

    2016-01-01

    Background This study aimed to develop models for regional cardiac surgery centers, which take regional characteristics into consideration, as a policy measure that could alleviate the concentration of cardiac surgery in the metropolitan area and enhance accessibility for patients who reside in the regions. Methods To develop the models and set standards for the necessary personnel and facilities for the initial management plan, we held workshops, debates, and conference meetings with various experts. Results After partitioning the plan into two parts (operational autonomy and functional comprehensiveness), three models were developed: the ‘independent regional cardiac surgery center’ model, the ‘satellite cardiac surgery center within hospitals’ model, and the ‘extended cardiac surgery department within hospitals’ model. Proposals on personnel and facility management for each of the models were also presented. A regional cardiac surgery center model that could be applied to each treatment area was proposed, developed based on the anticipated demand for cardiac surgery. The independent model or the satellite model was proposed for the Chungcheong, Jeolla, North Gyeongsang, and South Gyeongsang areas, where more than 500 cardiac surgeries are performed annually. The extended model was proposed as most effective for the Gangwon and Jeju areas, where more than 200 cardiac surgeries are performed annually. Conclusion The operation of regional cardiac surgery centers with high caliber professionals and quality resources, such as optimal equipment and facility size, should enhance regional healthcare accessibility and the quality of cardiac surgery in South Korea. PMID:28035295

  19. An Analytic Hierarchy Process for School Quality and Inspection: Model Development and Application

    ERIC Educational Resources Information Center

    Al Qubaisi, Amal; Badri, Masood; Mohaidat, Jihad; Al Dhaheri, Hamad; Yang, Guang; Al Rashedi, Asma; Greer, Kenneth

    2016-01-01

    Purpose: The purpose of this paper is to develop an analytic hierarchy planning-based framework to establish criteria weights and to develop a school performance system commonly called school inspections. Design/methodology/approach: The analytic hierarchy process (AHP) model uses pairwise comparisons and a measurement scale to generate the…
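
    Although the abstract is truncated, the AHP step it names is standard: a pairwise comparison matrix on Saaty's 1-9 scale is reduced to criterion weights via its principal eigenvector and checked with a consistency ratio. The sketch below uses an invented 3x3 judgment matrix purely for illustration; the paper's actual criteria and judgments are not reproduced.

        import numpy as np

        # Invented judgment matrix: criterion 1 moderately preferred to 2,
        # strongly preferred to 3, etc. (Saaty 1-9 scale, reciprocal entries)
        A = np.array([[1.0, 3.0, 5.0],
                      [1/3, 1.0, 2.0],
                      [1/5, 1/2, 1.0]])

        vals, vecs = np.linalg.eig(A)
        i = int(np.argmax(vals.real))
        w = np.abs(vecs[:, i].real)
        w /= w.sum()                          # criterion weights

        n = A.shape[0]
        ci = (vals[i].real - n) / (n - 1)     # consistency index
        ri = 0.58                             # Saaty's random index for n = 3
        print(np.round(w, 3), ci / ri)        # weights; consistency ratio < 0.1 is ok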

  20. Developing a new stochastic competitive model regarding inventory and price

    NASA Astrophysics Data System (ADS)

    Rashid, Reza; Bozorgi-Amiri, Ali; Seyedhoseini, S. M.

    2015-09-01

    Within the competition in today's business environment, the design of supply chains has become more complex than before. This paper deals with the retailer's location problem when customers choose their vendors and inventory costs are considered for retailers. In a competitive location problem, the price and location of facilities affect customer demand; consequently, simultaneous optimization of the location and inventory system is needed. To prepare a realistic model, demand and lead time are assumed to be stochastic parameters, and queuing theory is used to develop a comprehensive mathematical model. Due to the complexity of the problem, a branch and bound algorithm was developed, and its performance was validated on several numerical examples, which indicated the effectiveness of the algorithm. A real case is also presented to demonstrate the performance of the model in the real world.

  1. Commercial absorption chiller models for evaluation of control strategies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koeppel, E.A.; Klein, S.A.; Mitchell, J.W.

    1995-08-01

    A steady-state computer simulation model of a direct fired double-effect water-lithium bromide absorption chiller in the parallel-flow configuration was developed from first principles. Unknown model parameters such as heat transfer coefficients were determined by matching the model's calculated state points and coefficient of performance (COP) against nominal full-load operating data and COPs obtained from a manufacturer's catalog. The model compares favorably with the manufacturer's performance ratings for varying water circuit (chilled and cooling) temperatures at full load conditions and for chiller part-load performance. The model was used (1) to investigate the effect of varying the water circuit flow rates with the chiller load and (2) to optimize chiller part-load performance with respect to the distribution and flow of the weak solution.

  2. Monte Carlo simulation of Ray-Scan 64 PET system and performance evaluation using GATE toolkit

    NASA Astrophysics Data System (ADS)

    Li, Suying; Zhang, Qiushi; Vuletic, Ivan; Xie, Zhaoheng; Yang, Kun; Ren, Qiushi

    2017-02-01

    In this study, we aimed to develop a GATE model for the simulation of the Ray-Scan 64 PET scanner and model its performance characteristics. A detailed implementation of the system geometry and physical processes was included in the simulation model. We then modeled the performance characteristics of the Ray-Scan 64 PET system for the first time, based on National Electrical Manufacturers Association (NEMA) NU-2 2007 protocols, and validated the model against experimental measurements, including spatial resolution, sensitivity, counting rates and noise equivalent count rate (NECR). Moreover, an accurate dead time module was investigated to simulate the counting rate performance. Overall, the results showed reasonable agreement between simulation and experimental data. The validation results showed the reliability and feasibility of the GATE model for evaluating the major performance characteristics of the Ray-Scan 64 PET system, providing a useful tool for a wide range of research applications.
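
    Of the NEMA counting-rate figures mentioned, NECR has a simple closed form. The sketch below uses the common definition NECR = T^2 / (T + S + k*R) with made-up rates; the actual Ray-Scan 64 values and the randoms convention used in the paper are not reproduced.

        def necr(trues, scatters, randoms, k=1.0):
            """Noise equivalent count rate, NECR = T**2 / (T + S + k*R).
            k = 1 for smoothed randoms, k = 2 for delayed-window randoms."""
            return trues**2 / (trues + scatters + k * randoms)

        # Illustrative rates in kcps (not measured Ray-Scan 64 values)
        print(necr(trues=120.0, scatters=45.0, randoms=60.0))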

  3. A Model for Sustainable Building Energy Efficiency Retrofit (BEER) Using Energy Performance Contracting (EPC) Mechanism for Hotel Buildings in China

    NASA Astrophysics Data System (ADS)

    Xu, Pengpeng

    Hotel buildings are one of the high-energy-consuming building types, and retrofitting hotel buildings is an untapped solution to help cut carbon emissions, contributing towards sustainable development. Energy Performance Contracting (EPC) has been promulgated as a market mechanism for the delivery of energy efficiency projects. The EPC mechanism was introduced into China relatively recently, and it has not been implemented successfully in building energy efficiency retrofit projects. The aim of this research is to develop a model for achieving the sustainability of Building Energy Efficiency Retrofit (BEER) in hotel buildings under the Energy Performance Contracting (EPC) mechanism. The objectives include: • To identify a set of Key Performance Indicators (KPIs) for measuring the sustainability of BEER in hotel buildings; • To identify Critical Success Factors (CSFs) under the EPC mechanism that have a strong correlation with sustainable BEER projects; • To develop a model explaining the relationships between the CSFs and the sustainability performance of BEER in hotel buildings. Literature reviews revealed the essence of sustainable BEER and EPC, which helped to develop a conceptual framework for analyzing sustainable BEER under the EPC mechanism in hotel buildings. 11 potential KPIs for sustainable BEER and 28 success factors of EPC were selected based on the developed framework. A questionnaire survey was conducted to ascertain the importance of the selected performance indicators and success factors. Fuzzy set theory was adopted in identifying the KPIs, and six KPIs were identified from the 11 selected performance indicators. From the survey, 21 of the 28 success factors were identified as Critical Success Factors (CSFs). Using the factor analysis technique, the 21 identified CSFs were grouped into six clusters to help explain the project success of sustainable BEER. Finally, an AHP/ANP approach was used to develop a model examining the interrelationships among the identified CSFs, KPIs, and sustainability dimensions of BEER. The findings indicate that the success of sustainable BEER in hotel buildings under the EPC mechanism is mainly decided by the project objectives control mechanism, available technology, organizing capacity of the team leader, trust among partners, accurate M&V, and team workers' technical skills.

  4. Driver performance modelling and its practical application to railway safety.

    PubMed

    Hamilton, W Ian; Clarke, Theresa

    2005-11-01

    This paper reports on the development and main features of a model of driver information processing. The work was conducted on behalf of Network Rail to meet a requirement to understand and manage the driver's interaction with the infrastructure through lineside reminder appliances. The model utilises cognitive theory and modelling techniques to describe driver performance in relation to infrastructure features and operational conditions. The model is capable of predicting the performance time, workload and error consequences of different operational conditions. The utility of the model is demonstrated through reports of its application to the following studies: Research on the effect of line speed on driver interaction with signals and signs. Calculation of minimum reading times for signals. Development of a human factors signals passed at danger (SPAD) hazard checklist, and a method to resolve conflicts between signal sighting solutions. Research on the demands imposed on drivers by European train control system (ETCS) driving in a UK context. The paper also reports on a validation of the model's utility as a tool for assessing cab and infrastructure drivability.

  5. NDARC - NASA Design and Analysis of Rotorcraft Validation and Demonstration

    NASA Technical Reports Server (NTRS)

    Johnson, Wayne

    2010-01-01

    Validation and demonstration results from the development of the conceptual design tool NDARC (NASA Design and Analysis of Rotorcraft) are presented. The principal tasks of NDARC are to design a rotorcraft to satisfy specified design conditions and missions, and then analyze the performance of the aircraft for a set of off-design missions and point operating conditions. The aircraft chosen as NDARC development test cases are the UH-60A single main-rotor and tail-rotor helicopter, the CH-47D tandem helicopter, the XH-59A coaxial lift-offset helicopter, and the XV-15 tiltrotor. These aircraft were selected because flight performance data, a weight statement, detailed geometry information, and a correlated comprehensive analysis model are available for each. Validation consists of developing the NDARC models for these aircraft by using geometry and weight information, airframe wind tunnel test data, engine decks, rotor performance tests, and comprehensive analysis results; and then comparing the NDARC results for aircraft and component performance with flight test data. Based on the calibrated models, the capability of the code to size rotorcraft is explored.

  6. Thermal performance - Rangewood Villas. Field monitoring of various conservation construction techniques in the hot-humid area

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1986-06-01

    This report, prepared by researchers at the Florida Solar Energy Center, describes data acquired over a complete year of comprehensive thermal performance monitoring. The construction details of the house and the instrumentation system are clearly documented. Rangewood Villas in Cocoa, Florida, is an innovative townhouse project that incorporates several energy efficient construction techniques developed at FSEC, including vent skin roofs and walls utilizing radiant barriers to substantially lower heat gain through radiant transfer of solar energy. The computer simulation model selected as the basis for data acquisition parameters is the Thermal Analysis Research Program (TARP). The TARP model does not contain humidity correlations, which are very important in predicting thermal performance in the warm humid area. These correlations are developed to enhance the TARP model through extensive relative humidity measurements in various zones and enthalpy measurements of the heat pump. The data acquisition system devised for this program provides a standard instrumentation system which can be adapted by others working in the hot humid area and interested in developing comparative performance data.

  7. A model to teach concomitant patient communication during psychomotor skill development.

    PubMed

    Nicholls, Delwyn; Sweet, Linda; Muller, Amanda; Hyett, Jon

    2018-01-01

    Many health professionals use psychomotor or task-based skills in clinical practice that require concomitant communication with a conscious patient. Verbally engaging with the patient requires highly developed verbal communication skills, enabling the delivery of patient-centred care. Historically, priority has been given to learning the psychomotor skills essential to clinical practice. However, there has been a shift towards also ensuring competent communication with the patient during skill performance. While there is literature outlining the steps to teach and learn verbal communication skills, little is known about the most appropriate instructional approach for teaching how to verbally engage with the patient while also learning to perform a task. A literature review identified no model or proven approach that could be used to integrate the learning of both psychomotor and communication skills. This paper reviews the steps to teach a communication skill and provides a suggested model to guide the acquisition and development of the concomitant communication skills required with a patient at the time a psychomotor skill is performed.

  8. The COPERNIC3 project: how AREVA is successfully developing an advanced global fuel rod performance code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garnier, Ch.; Mailhe, P.; Sontheimer, F.

    2007-07-01

    Fuel performance is a key factor for minimizing operating costs in nuclear plants. One of the important aspects of fuel performance is fuel rod design, based upon reliable tools able to verify the safety of current fuel solutions, prevent potential issues in new core managements, and guide the invention of tomorrow's fuels. AREVA is developing its future global fuel rod code COPERNIC3, which is able to calculate the thermal-mechanical behavior of advanced fuel rods in nuclear plants. Some of the best practices to achieve this goal are described by reviewing the three pillars of a fuel rod code: the database, the modelling, and the computer and numerical aspects. First, the COPERNIC3 database content is described, accompanied by the tools developed to effectively exploit the data. Then an overview of the main modelling aspects is given, emphasizing the thermal, fission gas release and mechanical sub-models. In the last part, numerical solutions are detailed that increase the computational performance of the code, with a presentation of software configuration management solutions.

  9. The development of an integrated Indonesian health care model using Kano's model, quality function deployment and balanced scorecard

    NASA Astrophysics Data System (ADS)

    Jonny, Zagloed, Teuku Yuri M.

    2017-11-01

    This paper aims to present an integrated health care model for the Indonesian health care industry. Based on previous research, there are two health care models in the industry: disease-centered and patient-centered care models. The patient-centered care model is widely applied due to its capability to reduce cost and improve quality simultaneously. However, there is still no comprehensive model that results in cost reduction, quality improvement, patient satisfaction and hospital profitability simultaneously; this research is intended to develop that model. First, a conceptual model using Kano's Model, Quality Function Deployment (QFD) and Balanced Scorecard (BSC) is developed to generate several important elements of the model as required by stakeholders. Then, a case study of an Indonesian hospital is presented to evaluate the validity of the model using correlation analysis. As a result, it can be concluded that the model is validated, implying several managerial insights among its elements: 1) leadership (r=0.85) and context of the organization (r=0.77) improve operations; 2) planning (r=0.96), support process (r=0.87) and continual improvement (r=0.95) also improve operations; 3) operations improve customer satisfaction (r=0.89) and financial performance (r=0.93); and 4) customer satisfaction improves financial performance (r=0.98).

  10. Performance and state-space analyses of systems using Petri nets

    NASA Technical Reports Server (NTRS)

    Watson, James Francis, III

    1992-01-01

    The goal of any modeling methodology is to develop a mathematical description of a system that is accurate in its representation and also permits analysis of structural and/or performance properties. Inherently, trade-offs exist between the level of detail in the model and the ease with which analysis can be performed. Petri nets (PNs), a highly graphical modeling methodology for Discrete Event Dynamic Systems, permit representation of shared resources, finite capacities, conflict, synchronization, concurrency, and timing between state changes. By restricting the state transition time delays to the family of exponential density functions, Markov chain analysis of performance problems is possible. One major drawback of PNs is the tendency for the state-space to grow rapidly (exponential complexity) compared to increases in the PN constructs. It is the state space, or the Markov chain obtained from it, that is needed in the solution of many problems. The theory of state-space size estimation for PNs is introduced: the estimation problem is defined, its complexities are examined, and estimation algorithms are developed. Both top-down and bottom-up approaches are pursued, and the advantages and disadvantages of each are described. Additionally, the author's research in non-exponential transition modeling for PNs is discussed, and an algorithm for approximating non-exponential transitions is developed. Since only basic PN constructs are used in the approximation, theory already developed for PNs remains applicable. Comparison to results from entropy theory shows the transition performance is close to the theoretic optimum. Inclusion of non-exponential transition approximations improves performance results at the expense of increased state-space size. The state-space size estimation theory provides insight and algorithms for evaluating this trade-off.
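
    The object whose growth the estimation theory targets is the reachability graph. A minimal sketch of explicit state-space generation for a small place/transition net (an illustrative producer/consumer net, standard breadth-first enumeration) shows where the explosion comes from: every added place or token multiplies the set of reachable markings.

        from collections import deque

        def reachable(m0, transitions, cap=None):
            """Enumerate the reachable markings of a place/transition net.
            Each transition is a (consume, produce) pair of per-place vectors."""
            seen, frontier = {tuple(m0)}, deque([tuple(m0)])
            while frontier:
                m = frontier.popleft()
                for take, give in transitions:
                    if all(mi >= ti for mi, ti in zip(m, take)):   # enabled?
                        m2 = tuple(mi - ti + gi for mi, ti, gi in zip(m, take, give))
                        if cap and any(x > c for x, c in zip(m2, cap)):
                            continue                               # finite capacities
                        if m2 not in seen:
                            seen.add(m2)
                            frontier.append(m2)
            return seen

        # Two-place producer/consumer net with a buffer of capacity 3
        ts = [((1, 0), (0, 1)),   # produce: move a token from place 0 to place 1
              ((0, 1), (1, 0))]   # consume: move it back
        print(len(reachable((3, 0), ts, cap=(3, 3))))   # 4 reachable markings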

  11. Modeling the target acquisition performance of active imaging systems

    NASA Astrophysics Data System (ADS)

    Espinola, Richard L.; Jacobs, Eddie L.; Halford, Carl E.; Vollmerhausen, Richard; Tofsted, David H.

    2007-04-01

    Recent developments in active imaging system technology in the defense and security community have driven the need for a theoretical understanding of its operation and performance in military applications such as target acquisition. In this paper, the modeling of active imaging systems, developed at the U.S. Army RDECOM CERDEC Night Vision & Electronic Sensors Directorate, is presented with particular emphasis on the impact of coherent effects such as speckle and atmospheric scintillation. Experimental results from human perception tests are in good agreement with the model results, validating the modeling of coherent effects as additional noise sources. Example trade studies on the design of a conceptual active imaging system to mitigate deleterious coherent effects are shown.

  12. Modeling the target acquisition performance of active imaging systems.

    PubMed

    Espinola, Richard L; Jacobs, Eddie L; Halford, Carl E; Vollmerhausen, Richard; Tofsted, David H

    2007-04-02

    Recent developments in active imaging system technology in the defense and security community have driven the need for a theoretical understanding of its operation and performance in military applications such as target acquisition. In this paper, the modeling of active imaging systems, developed at the U.S. Army RDECOM CERDEC Night Vision & Electronic Sensors Directorate, is presented with particular emphasis on the impact of coherent effects such as speckle and atmospheric scintillation. Experimental results from human perception tests are in good agreement with the model results, validating the modeling of coherent effects as additional noise sources. Example trade studies on the design of a conceptual active imaging system to mitigate deleterious coherent effects are shown.

  13. Modeling and simulation of queuing system for customer service improvement: A case study

    NASA Astrophysics Data System (ADS)

    Xian, Tan Chai; Hong, Chai Weng; Hawari, Nurul Nazihah

    2016-10-01

    This study aims to develop a queuing model of UniMall using a discrete event simulation approach, analyzing the service performance measures that affect customer satisfaction. The performance measures considered in this model are the average time in the system, the total number of students served, the number of students in the waiting queue, the waiting time in the queue, and the maximum buffer length. ARENA simulation software is used to develop the simulation model and analyze its output. Based on the analysis, it is recommended that the management of UniMall consider introducing shifts and adding another payment counter in the morning.
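
    The record's model was built in ARENA; as a stand-in, the sketch below codes the same single-counter idea directly in Python (Poisson arrivals, exponential service, FIFO via the Lindley recursion), with invented arrival and service rates rather than the study's measured ones.

        import random

        def simulate(arr_rate, svc_rate, n_customers, seed=1):
            """Single-counter FIFO queue with Poisson arrivals and exponential
            service. Returns mean waiting time and mean time in system."""
            random.seed(seed)
            t_arr, finish_prev = 0.0, 0.0
            waits, sojourns = [], []
            for _ in range(n_customers):
                t_arr += random.expovariate(arr_rate)
                start = max(t_arr, finish_prev)      # wait if the counter is busy
                finish_prev = start + random.expovariate(svc_rate)
                waits.append(start - t_arr)
                sojourns.append(finish_prev - t_arr)
            return sum(waits) / len(waits), sum(sojourns) / len(sojourns)

        # Lunch rush: 0.8 students/min arriving, counter serves 1 student/min;
        # M/M/1 theory predicts roughly 4 min waiting, 5 min in system.
        print(simulate(0.8, 1.0, 50_000))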

  14. Analysis, testing, and evaluation of faulted and unfaulted Wye, Delta, and open Delta connected electromechanical actuators

    NASA Technical Reports Server (NTRS)

    Nehl, T. W.; Demerdash, N. A.

    1983-01-01

    Mathematical models capable of simulating the transient, steady state, and faulted performance characteristics of various brushless dc machine-PSA (power switching assembly) configurations were developed. These systems are intended for possible future use as prime movers in EMAs (electromechanical actuators) for flight control applications. The machine-PSA configurations include wye, delta, and open-delta connected systems. The research performed under this contract was initially broken down into the following tasks: development of mathematical models for the various machine-PSA configurations; experimental validation of the models for failure modes; experimental validation of the mathematical model for shorted-turn failure modes; a tradeoff study; and documentation of results and methodology.

  15. Computation of Turbulent Wake Flows in Variable Pressure Gradient

    NASA Technical Reports Server (NTRS)

    Duquesne, N.; Carlson, J. R.; Rumsey, C. L.; Gatski, T. B.

    1999-01-01

    Transport aircraft performance is strongly influenced by the effectiveness of high-lift systems. Developing wakes generated by the airfoil elements are subjected to strong pressure gradients and can thicken very rapidly, limiting maximum lift. This paper focuses on the effects of various pressure gradients on developing symmetric wakes and on the ability of a linear eddy viscosity model and a non-linear explicit algebraic stress model to accurately predict their downstream evolution. In order to reduce the uncertainties arising from numerical issues when assessing the performance of turbulence models, three different numerical codes with the same turbulence models are used. Results are compared to available experimental data to assess the accuracy of the computational results.

  16. Aggregate modeling of fast-acting demand response and control under real-time pricing

    DOE PAGES

    Chassin, David P.; Rondeau, Daniel

    2016-08-24

    This paper develops and assesses the performance of a short-term demand response (DR) model for utility load control with applications to resource planning and control design. Long term response models tend to underestimate short-term demand response when induced by prices. This has two important consequences. First, planning studies tend to undervalue DR and often overlook its benefits in utility demand management program development. Second, when DR is not overlooked, the open-loop DR control gain estimate may be too low. This can result in overuse of load resources, control instability and excessive price volatility. Our objective is therefore to develop a more accurate and better performing short-term demand response model. We construct the model from first principles about the nature of thermostatic load control and show that the resulting formulation corresponds exactly to the Random Utility Model employed in economics to study consumer choice. The model is tested against empirical data collected from field demonstration projects and is shown to perform better than alternative models commonly used to forecast demand in normal operating conditions. Finally, the results suggest that (1) existing utility tariffs appear to be inadequate to incentivize demand response, particularly in the presence of high renewables, and (2) existing load control systems run the risk of becoming unstable if utilities close the loop on real-time prices.
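
    The paper identifies the aggregate response with the Random Utility Model; in its simplest binary-choice (logit) form, the curtailed fraction of controllable load is a logistic function of price. The sketch below uses invented parameters (p_ref, k, device count) purely to illustrate that shape, not the paper's fitted model.

        import numpy as np

        def curtailed_fraction(price, p_ref=50.0, k=0.15):
            """Logit choice: probability a device curtails at a given price.
            p_ref is the indifference price ($/MWh); k sets the sensitivity."""
            return 1.0 / (1.0 + np.exp(-k * (price - p_ref)))

        # Aggregate response of 10,000 controllable devices (illustrative only)
        for p in (30.0, 50.0, 80.0, 120.0):
            f = curtailed_fraction(p)
            print(f"price {p:>5.0f} $/MWh -> {f:.2f} curtailed ({f * 10_000:.0f} devices)")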

  17. Flamelet Model Application for Non-Premixed Turbulent Combustion

    NASA Technical Reports Server (NTRS)

    Secundov, A.; Bezgin, L.; Buriko, Yu.; Guskov, O.; Kopchenov, V.; Laskin, I.; Lomkov, K.; Tshepin, S.; Volkov, D.; Zaitsev, S.

    1996-01-01

    This Final Report contains the results of a study performed at the Scientific Research Center 'ECOLEN' (Moscow, Russia). The study concerns the development and verification of an inexpensive approach to the modeling of supersonic turbulent diffusion flames based on a flamelet treatment of the chemistry/turbulence interaction (FL approach). The research work included: development of the approach and CFD tests of the flamelet model for supersonic jet flames; development of a simplified procedure for solving the flamelet equations based on a partial equilibrium chemistry assumption; and a study of the flame ignition/extinction predictions provided by the flamelet model. The investigation demonstrated that the FL approach satisfactorily describes the main features of supersonic H2/air jet flames. The model also demonstrated a strong capability for reducing the computational expense of CFD modeling of supersonic flames while taking detailed oxidation chemistry into account. However, some disadvantages and restrictions of the existing version of the approach were found in this study: (1) inaccuracy in predictions of the passive scalar statistics by our turbulence model for one of the considered test cases; and (2) applicability of the available version of the flamelet model only to flames without a large ignition delay distance. Based on the results of this investigation, we formulated and submitted to the National Aeronautics and Space Administration a Project Proposal for next-step research directed toward further improvement of the FL approach.

  18. Aggregate modeling of fast-acting demand response and control under real-time pricing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chassin, David P.; Rondeau, Daniel

    This paper develops and assesses the performance of a short-term demand response (DR) model for utility load control with applications to resource planning and control design. Long term response models tend to underestimate short-term demand response when induced by prices. This has two important consequences. First, planning studies tend to undervalue DR and often overlook its benefits in utility demand management program development. Second, when DR is not overlooked, the open-loop DR control gain estimate may be too low. This can result in overuse of load resources, control instability and excessive price volatility. Our objective is therefore to develop a more accurate and better performing short-term demand response model. We construct the model from first principles about the nature of thermostatic load control and show that the resulting formulation corresponds exactly to the Random Utility Model employed in economics to study consumer choice. The model is tested against empirical data collected from field demonstration projects and is shown to perform better than alternative models commonly used to forecast demand in normal operating conditions. Finally, the results suggest that (1) existing utility tariffs appear to be inadequate to incentivize demand response, particularly in the presence of high renewables, and (2) existing load control systems run the risk of becoming unstable if utilities close the loop on real-time prices.

  20. Multiple model analysis with discriminatory data collection (MMA-DDC): A new method for improving measurement selection

    NASA Astrophysics Data System (ADS)

    Kikuchi, C.; Ferre, P. A.; Vrugt, J. A.

    2011-12-01

    Hydrologic models are developed, tested, and refined based on the ability of those models to explain available hydrologic data. The optimization of model performance based upon mismatch between model outputs and real world observations has been extensively studied. However, identification of plausible models is sensitive not only to the models themselves - including model structure and model parameters - but also to the location, timing, type, and number of observations used in model calibration. Therefore, careful selection of hydrologic observations has the potential to significantly improve the performance of hydrologic models. In this research, we seek to reduce prediction uncertainty through optimization of the data collection process. A new tool - multiple model analysis with discriminatory data collection (MMA-DDC) - was developed to address this challenge. In this approach, multiple hydrologic models are developed and treated as competing hypotheses. Potential new data are then evaluated on their ability to discriminate between competing hypotheses. MMA-DDC is well-suited for use in recursive mode, in which new observations are continuously used in the optimization of subsequent observations. This new approach was applied to a synthetic solute transport experiment, in which ranges of parameter values constitute the multiple hydrologic models, and model predictions are calculated using likelihood-weighted model averaging. MMA-DDC was used to determine the optimal location, timing, number, and type of new observations. From comparison with an exhaustive search of all possible observation sequences, we find that MMA-DDC consistently selects observations which lead to the highest reduction in model prediction uncertainty. We conclude that using MMA-DDC to evaluate potential observations may significantly improve the performance of hydrologic models while reducing the cost associated with collecting new data.
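    As a rough illustration of the discriminatory-data idea, the sketch below scores each candidate observation by the likelihood-weighted spread of the competing models' predictions and selects the point where the model hypotheses disagree most. This scoring rule is one plausible reading of the approach, not necessarily the exact criterion used in MMA-DDC; all arrays, names, and values are illustrative.

```python
import numpy as np

def likelihood_weights(residuals, sigma=1.0):
    """Likelihood-based weights for an ensemble of models, given each model's
    residuals against the observations collected so far (Gaussian errors)."""
    log_l = -0.5 * np.sum((np.asarray(residuals) / sigma) ** 2, axis=1)
    w = np.exp(log_l - log_l.max())          # stabilize before normalizing
    return w / w.sum()

def discriminatory_score(predictions, weights):
    """Weighted variance of the competing models' predictions at each
    candidate measurement: high spread = high power to discriminate
    between the model hypotheses."""
    mean = weights @ predictions             # likelihood-weighted model average
    return weights @ (predictions - mean) ** 2

# Illustrative only: 3 candidate transport models, 4 candidate new observations.
predictions = np.array([[1.0, 2.0, 3.0, 4.0],
                        [1.1, 2.5, 2.0, 4.2],
                        [0.9, 1.5, 4.0, 3.8]])
residuals = np.array([[0.1, -0.2], [0.3, 0.1], [-0.4, 0.2]])  # vs. 2 past obs
w = likelihood_weights(residuals)
best = int(np.argmax(discriminatory_score(predictions, w)))
print(f"most discriminating candidate observation: index {best}")
```

    Run recursively, each new measurement updates the weights, which in turn re-rank the remaining candidates, matching the recursive mode the abstract describes.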
