Science.gov

Sample records for acceptable model performance

  1. Trinity Acceptance Tests Performance Summary.

    SciTech Connect

    Rajan, Mahesh

    2015-12-01

    Ensuring that real applications perform well on Trinity is key to success. The acceptance tests comprise four components: ASC applications, Sustained System Performance (SSP), extra-large mini-application problems, and micro-benchmarks.

  2. A Distributive Model of Treatment Acceptability

    ERIC Educational Resources Information Center

    Carter, Stacy L.

    2008-01-01

    A model of treatment acceptability is proposed that distributes overall treatment acceptability into three separate categories of influence. The categories comprise societal influences, consultant influences, and influences associated with consumers of treatments. Each of these categories is defined and their inter-relationships within…

  3. Integrated Model for E-Learning Acceptance

    NASA Astrophysics Data System (ADS)

    Ramadiani; Rodziah, A.; Hasan, S. M.; Rusli, A.; Noraini, C.

    2016-01-01

    E-learning will not work if the system is not used in accordance with user needs, and the user interface is very important in encouraging use of the application. Many theories have discussed user interface usability evaluation and technology acceptance separately, so it is worth asking why the two are not related in order to enhance the e-learning process. An evaluation model for e-learning interface acceptance is therefore considered important to investigate. The aim of this study is to propose an integrated e-learning user interface acceptance evaluation model. The model combines several theories of e-learning interface measurement, such as user learning style, usability evaluation, and user benefit. These were formulated into a questionnaire administered to 125 English Language School (ELS) students. The statistical analysis used Structural Equation Modeling with LISREL v8.80 and MANOVA.

  4. Model of aircraft passenger acceptance

    NASA Technical Reports Server (NTRS)

    Jacobson, I. D.

    1978-01-01

    A technique developed to evaluate the passenger response to a transportation system environment is described. Reactions to motion, noise, temperature, seating, ventilation, sudden jolts and descents are modeled. Statistics are presented for the age, sex, occupation, and income distributions of the candidates analyzed. Values are noted for the relative importance of system variables such as time savings, on-time arrival, convenience, comfort, safety, the ability to read and write, and onboard services.

  5. Employee Acceptance of BOS and BES Performance Appraisals.

    ERIC Educational Resources Information Center

    Dossett, Dennis L.; Gier, Joseph A.

    Previous research on performance evaluation systems has failed to take into account user acceptance. Employee acceptance of a behaviorally-based performance appraisal system was assessed in a field experiment contrasting user preference for Behavioral Expectations Scales (BES) versus Behavioral Observation Scales (BOS). Non-union sales associates…

  6. ASME PTC 46 -- Acceptance test code for overall plant performance

    SciTech Connect

    Friedman, J.R.; Yost, J.G.

    1999-11-01

    ASME published PTC 46 in 1996 after five years of development. PTC 46 is the first industry standard providing explicit procedures for conducting acceptance tests to determine the overall thermal performance and output of power generating units. It is applicable to any heat cycle power generating unit. This survey paper provides an overview of PTC 46 and discusses how PTC 46 can be used for acceptance testing of new combined cycle and fossil steam power generating units. Several technical papers have been previously presented that provide more detailed information and discussion on the use of PTC 46 in acceptance testing.
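
    For illustration only, the sketch below shows the general corrected-performance idea behind such acceptance tests: measured output is adjusted to reference conditions with additive and multiplicative corrections and then compared against the guarantee. This is a schematic, not the actual PTC 46 procedure, and all correction values and the guarantee figure are hypothetical.

```python
# Schematic sketch of a corrected-performance acceptance check.
# Not the actual PTC 46 equations; all values below are hypothetical.

def corrected_power(measured_kw, additive_corrections, multiplicative_corrections):
    """Correct measured net power to reference conditions using
    additive deltas and multiplicative factors (schematic form)."""
    corrected = measured_kw + sum(additive_corrections)
    for factor in multiplicative_corrections:
        corrected *= factor
    return corrected

measured_net_kw = 248_500.0
additive = [-300.0]              # e.g. auxiliary-load adjustment (hypothetical)
multiplicative = [1.012, 0.997]  # e.g. ambient temperature, pressure (hypothetical)
guaranteed_net_kw = 250_000.0

corrected = corrected_power(measured_net_kw, additive, multiplicative)
print(f"Corrected net power: {corrected:,.0f} kW")
print("Acceptance:", "pass" if corrected >= guaranteed_net_kw else "fail")
```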

  7. Interrelationships among Employee Participation, Individual Differences, Goal Difficulty, Goal Acceptance, Goal Instrumentality, and Performance.

    ERIC Educational Resources Information Center

    Yukl, Gary A.; Latham, Gary P.

    1978-01-01

    Discussed is a model for goal setting, which is based on Locke's theory that difficult but clear and specific goals, if accepted, will result in higher performance than easy goals, nonspecific goals, or no goals at all. (Author/RK)

  8. Performance-based waste acceptance criteria preliminary baseline assumptions

    SciTech Connect

    Not Available

    1994-10-24

    The Department of Energy's (DOE's) strategy for the management of transuranic (TRU) and TRU mixed wastes has focused on the development of the Waste Isolation Pilot Plant (WIPP). The WIPP repository is designated to receive DOE defense wastes that meet the established criteria for acceptance. As a national strategy [DOE, 1993], DOE does not intend to treat candidate wastes unless treatment or processing is necessary to meet the safety, health, and regulatory criteria for transport and disposal at WIPP. The WIPP WAC has evolved over the past 10 years to include criteria and requirements in support of the Waste Characterization program and other related compliance programs. In aggregate, the final health, safety, and regulatory criteria for the waste will be documented in the Disposal WAC. This document serves two purposes. First, it familiarizes regulators and stakeholders with the concept of performance-based waste acceptance criteria as an augmentation to the final WIPP Waste Acceptance Criteria. Second, it preliminarily identifies certain waste characteristics that appear important to the performance assessment process for WIPP; these could therefore become component characteristics in the Performance Based Waste Acceptance Criteria (PBWAC). Identification of the final PBWAC will be accomplished through iterative runs of the System Prioritization Method (SPM). These iterations will serve to more clearly isolate and identify those waste characteristics that directly and predominantly impact the performance assessment.

  9. Measuring Technology Acceptance Level of Turkish Pre-Service English Teachers by Using Technology Acceptance Model

    ERIC Educational Resources Information Center

    Kirmizi, Özkan

    2014-01-01

    The aim of this study is to investigate the technology acceptance of prospective English teachers by using the Technology Acceptance Model (TAM) in the Turkish context. The study is based on Structural Equation Modeling (SEM). The participants were drawn from the English Language Teaching Departments of Hacettepe, Gazi and Baskent Universities. The participants…

  10. The indicator performance estimate approach to determining acceptable wilderness conditions

    NASA Astrophysics Data System (ADS)

    Hollenhorst, Steven; Gardner, Lisa

    1994-11-01

    Using data from a study conducted in the Cranberry Wilderness Area of West Virginia, United States, this paper describes how a modified importance-performance approach can be used to prioritize wilderness indicators and determine how much change from the pristine is acceptable. The approach uses two key types of information: (1) indicator importance, or visitor opinion as to which wilderness indicators have the greatest influence on their experience, and (2) management performance, or the extent to which actual indicator conditions exceed or are within visitor expectations. Performance was represented by calculating indicator performance estimates (IPEs), as defined by standardized differences between actual conditions and visitor preferences for each indicator. The results for each indicator are then presented graphically on a four-quadrant matrix for objective interpretation. Each quadrant represents a management response: keep up the good work, concentrate here, low priority, or possible overkill. The technique allows managers to more systematically and effectively utilize information routinely collected during the limits of acceptable change wilderness planning process.
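
    A minimal sketch of the IPE idea described above: each indicator's performance is the standardized difference between actual conditions and visitor preferences, and indicators are placed in the four-quadrant matrix by importance and performance. The sign convention, quadrant thresholds, and data below are hypothetical.

```python
# Minimal sketch of indicator performance estimates (IPEs) and quadrant
# assignment.  Sign convention, thresholds, and data are hypothetical.
from statistics import mean, stdev

def ipe(actual, preferred):
    """Standardized difference between preferred and actual conditions
    (negative = conditions worse than visitors prefer; convention assumed)."""
    diffs = [p - a for a, p in zip(actual, preferred)]
    spread = stdev(diffs)
    return mean(diffs) / spread if spread else mean(diffs)

def quadrant(importance, performance, imp_mid=3.0, perf_mid=0.0):
    """Assign an indicator to one of the four importance-performance quadrants."""
    if importance >= imp_mid and performance < perf_mid:
        return "concentrate here"
    if importance >= imp_mid:
        return "keep up the good work"
    if performance < perf_mid:
        return "low priority"
    return "possible overkill"

# Hypothetical indicator: encounters with other groups per day.
actual = [4, 6, 5, 7, 3]       # conditions reported by visitors
preferred = [2, 3, 4, 3, 2]    # conditions visitors would prefer
importance = 4.2               # mean importance rating on a 1-5 scale

score = ipe(actual, preferred)
print(f"IPE = {score:.2f} -> {quadrant(importance, score)}")
```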

  11. 49 CFR 41.120 - Acceptable model codes.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 49 Transportation 1 2014-10-01 2014-10-01 false Acceptable model codes. 41.120 Section 41.120 Transportation Office of the Secretary of Transportation SEISMIC SAFETY § 41.120 Acceptable model codes. (a) This... of this part. (b)(1) The following are model codes which have been found to provide a level...

  12. 49 CFR 41.120 - Acceptable model codes.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 1 2010-10-01 2010-10-01 false Acceptable model codes. 41.120 Section 41.120 Transportation Office of the Secretary of Transportation SEISMIC SAFETY § 41.120 Acceptable model codes. (a) This... of this part. (b)(1) The following are model codes which have been found to provide a level...

  13. Technological Diffusion within Educational Institutions: Applying the Technology Acceptance Model.

    ERIC Educational Resources Information Center

    Wolski, Stacy; Jackson, Sally

    Expectancy models of behavior such as the Theory of Reasoned Action (TRA) and the Technology Acceptance Model (TAM) offer guidelines that aid efforts to facilitate use of new technology. These models remind us that both acceptance of and resistance to technology use are grounded in beliefs and norms regarding the technology. Although TAM is widely…

  14. Evaluation of the Acceptance of Audience Response System by Corporations Using the Technology Acceptance Model

    NASA Astrophysics Data System (ADS)

    Chu, Hsing-Hui; Lu, Ta-Jung; Wann, Jong-Wen

    The purpose of this research is to explore enterprises' acceptance of Audience Response Systems (ARS) using the Technology Acceptance Model (TAM). The findings show that (1) IT characteristics and facilitating conditions could be external variables of TAM. (2) The degree of E-business has a significant positive correlation with employees' behavioral intention. (3) TAM is a good model to predict and explain IT acceptance. (4) Demographic variables, industry and firm characteristics have no significant correlation with ARS acceptance. The results provide useful information for managers and ARS providers: (1) ARS providers should focus more on creating different usages to enhance interactivity and employees' intention to use. (2) Managers should pay attention to building sound internal facilitating conditions for introducing IT. (3) According to the degree of E-business, managers should set up strategic stages for introducing IT. (4) Providers should increase product promotion and also leverage academia and government to promote ARS.

  15. Modeling of the charge acceptance of lead-acid batteries

    NASA Astrophysics Data System (ADS)

    Thele, M.; Schiffer, J.; Karden, E.; Surewaard, E.; Sauer, D. U.

    This paper presents a model for flooded and VRLA batteries that is parameterized by impedance spectroscopy and includes overcharging effects to allow charge-acceptance simulations (e.g. for regenerative-braking drive-cycle profiles). The full dynamic behavior and the short-term charge/discharge history are taken into account. This is achieved by detailed modeling of sulfate crystal growth and of the internal gas recombination cycle. The model is applicable over the full realistic temperature and current range of automotive applications. For model validation, several load profiles (covering the dynamics and the current range appearing in electrically assisted or hybrid cars) are examined, and the charge-acceptance-limiting effects are discussed in detail. The validation measurements have been performed for different types of lead-acid batteries (flooded and VRLA). The model is therefore an important tool for the development of automotive power nets, and it also allows analysis of different charging strategies and of the energy gains that can be achieved during regenerative braking.

  16. Development of Performance Acceptance Test Guidelines for Large Commercial Parabolic Trough Solar Fields: Preprint

    SciTech Connect

    Kearney, D.; Mehos, M.

    2010-12-01

    Prior to commercial operation, large solar systems in utility-size power plants need to pass a performance acceptance test conducted by the EPC contractor or owners. In the absence of an engineering code developed for this purpose, NREL has undertaken the development of interim guidelines to provide recommendations for test procedures that can yield results of a high level of accuracy consistent with good engineering knowledge and practice. The fundamental differences between acceptance of a solar power plant and a conventional fossil-fired plant are the transient nature of the energy source and the necessity to utilize an analytical performance model in the acceptance process. These factors bring into play the need to establish methods to measure steady-state performance, potential impacts of transient processes, comparison to performance model results, and the possible requirement to test, or model, multi-day performance within the scope of the acceptance test procedure. The power block and BOP are not within the boundaries of this guideline. The current guideline is restricted to the solar thermal performance of parabolic trough systems and has been critiqued by a broad range of stakeholders in CSP development and technology.

  17. User Acceptance of Long-Term Evolution (LTE) Services: An Application of Extended Technology Acceptance Model

    ERIC Educational Resources Information Center

    Park, Eunil; Kim, Ki Joon

    2013-01-01

    Purpose: The aim of this paper is to propose an integrated path model in order to explore user acceptance of long-term evolution (LTE) services by examining potential causal relationships between key psychological factors and user intention to use the services. Design/methodology/approach: Online survey data collected from 1,344 users are analysed…

  18. Examining Engineering & Technology Students' Acceptance of Network Virtualization Technology Using the Technology Acceptance Model

    ERIC Educational Resources Information Center

    Yousif, Wael K.

    2010-01-01

    This causal and correlational study was designed to extend the Technology Acceptance Model (TAM) and to test its applicability to Valencia Community College (VCC) Engineering and Technology students as the target user group when investigating the factors influencing their decision to adopt and to utilize VMware as the target technology. In…

  19. User Acceptance of YouTube for Procedural Learning: An Extension of the Technology Acceptance Model

    ERIC Educational Resources Information Center

    Lee, Doo Young; Lehto, Mark R.

    2013-01-01

    The present study was framed using the Technology Acceptance Model (TAM) to identify determinants affecting behavioral intention to use YouTube. Most importantly, this research emphasizes the motives for using YouTube, which is notable given its extrinsic task goal of being used for procedural learning tasks. Our conceptual framework included two…

  20. Modeling eBook acceptance: A study on mathematics teachers

    NASA Astrophysics Data System (ADS)

    Jalal, Azlin Abd; Ayub, Ahmad Fauzi Mohd; Tarmizi, Rohani Ahmad

    2014-12-01

    The integration and effectiveness of eBook utilization in Mathematics teaching and learning rely greatly upon the teachers, hence the need to understand their perceptions and beliefs. The eBook, an individual laptop loaded with digitized textbook software, was provided for each student in line with the concept of one student, one laptop. This study focuses on predicting a model of eBook acceptance among Mathematics teachers. Data were collected from 304 mathematics teachers in selected schools using a survey questionnaire. The selection was based on proportionate stratified sampling. Structural Equation Modeling (SEM) was employed; the model was tested and evaluated and was found to have a good fit. The variance explained for the teachers' attitude towards the eBook is approximately 69.1%, where perceived usefulness appeared to be a stronger determinant than perceived ease of use. This study concluded that the attitude of mathematics teachers towards the eBook depends largely on the perception of how useful the eBook is in improving their teaching performance, implying that teachers should be kept updated with the latest mathematical applications and software to use with the eBook to ensure a positive attitude towards using it in class.

  1. Predicting User Acceptance of Collaborative Technologies: An Extension of the Technology Acceptance Model for E-Learning

    ERIC Educational Resources Information Center

    Cheung, Ronnie; Vogel, Doug

    2013-01-01

    Collaborative technologies support group work in project-based environments. In this study, we enhance the technology acceptance model to explain the factors that influence the acceptance of Google Applications for collaborative learning. The enhanced model was empirically evaluated using survey data collected from 136 students enrolled in a…

  2. Acceptance of health information technology in health professionals: an application of the revised technology acceptance model.

    PubMed

    Ketikidis, Panayiotis; Dimitrovski, Tomislav; Lazuras, Lambros; Bath, Peter A

    2012-06-01

    The response of health professionals to the use of health information technology (HIT) is an important research topic that can partly explain the success or failure of any HIT application. The present study applied a modified version of the revised technology acceptance model (TAM) to assess the relevant beliefs and acceptance of HIT systems in a sample of health professionals (n = 133). Structured anonymous questionnaires were used and a cross-sectional design was employed. The main outcome measure was the intention to use HIT systems. ANOVA was employed to examine differences in TAM-related variables between nurses and medical doctors, and no significant differences were found. Multiple linear regression analysis was used to assess the predictors of HIT usage intentions. The findings showed that perceived ease of use, but not usefulness, relevance and subjective norms directly predicted HIT usage intentions. The present findings suggest that a modification of the original TAM approach is needed to better understand health professionals' support and endorsement of HIT. Perceived ease of use, relevance of HIT to the medical and nursing professions, as well as social influences, should be tapped by information campaigns aiming to enhance support for HIT in healthcare settings. PMID:22733680

  3. Involvement in Extracurricular Activities as Related to Academic Performance, Personality, and Peer Acceptance.

    ERIC Educational Resources Information Center

    Fung, Yee-wang; Wong, Ngai-ying

    1991-01-01

    Reveals findings of a survey of 294 Hong Kong secondary school students. Evaluates relationships among involvement in extracurricular activities, academic performance, personality, and peer acceptance. Concludes that activity involvement is positively related to academic performance, personality, and peer acceptance. Suggests that further research…

  4. Modeling of a Parabolic Trough Solar Field for Acceptance Testing: A Case Study

    SciTech Connect

    Wagner, M. J.; Mehos, M. S.; Kearney, D. W.; McMahan, A. C.

    2011-01-01

    As deployment of parabolic trough concentrating solar power (CSP) systems ramps up, the need for reliable and robust performance acceptance test guidelines for the solar field is also amplified. Project owners and/or EPC contractors often require extensive solar field performance testing as part of the plant commissioning process in order to ensure that actual solar field performance satisfies both technical specifications and performance guarantees between the involved parties. Performance test code work is currently underway at the National Renewable Energy Laboratory (NREL) in collaboration with the SolarPACES Task-I activity, and within the ASME PTC-52 committee. One important aspect of acceptance testing is the selection of a robust technology performance model. NREL has developed a detailed parabolic trough performance model within the SAM software tool. This model is capable of predicting solar field, sub-system, and component performance. It has further been modified for this work to support calculation at subhourly time steps. This paper presents the methodology and results of a case study comparing actual performance data for a parabolic trough solar field to the predicted results using the modified SAM trough model. Due to data limitations, the methodology is applied to a single collector loop, though it applies equally to larger subfields and entire solar fields. Special consideration is given to the model formulation, improvements to the model formulation based on comparison with the collected data, and uncertainty associated with the measured data. Additionally, this paper identifies modeling considerations that are of particular importance in the solar field acceptance testing process and uses the model to provide preliminary recommendations regarding acceptable steady-state testing conditions at the single-loop level.
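
    The sketch below illustrates the model-vs-measured comparison idea from this case study: measured loop thermal output is compared with a model prediction at subhourly steps, keeping only quasi-steady intervals. The steady-state filter, acceptance tolerance, and data are hypothetical, and the prediction here is a stand-in, not the SAM trough model.

```python
# Sketch of a model-vs-measured comparison at subhourly steps, keeping only
# quasi-steady intervals.  Filter, tolerance, and data are hypothetical; the
# "modeled" series stands in for a real performance model such as SAM.

def quasi_steady(dni_series, window=3, max_rel_change=0.02):
    """Flag time steps where DNI changed by less than max_rel_change
    over the preceding `window` steps (a simple hypothetical filter)."""
    flags = []
    for i, dni in enumerate(dni_series):
        if i < window or dni == 0:
            flags.append(False)
            continue
        prev = dni_series[i - window]
        flags.append(abs(dni - prev) / prev <= max_rel_change)
    return flags

measured_kwt = [310, 322, 330, 333, 335, 250, 338]   # measured loop output (kWt)
modeled_kwt  = [300, 318, 327, 331, 334, 320, 336]   # model prediction (kWt)
dni          = [850, 870, 880, 882, 884, 640, 886]   # W/m2; the dip is a cloud transient

steady = quasi_steady(dni)
ratios = [m / p for m, p, ok in zip(measured_kwt, modeled_kwt, steady) if ok]
mean_ratio = sum(ratios) / len(ratios)
print(f"Steady-state measured/modeled ratio: {mean_ratio:.3f}")
print("Within +/-3% tolerance:", abs(mean_ratio - 1.0) <= 0.03)
```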

  5. Technology Acceptance and Performance: An Investigation into Requisite Knowledge.

    ERIC Educational Resources Information Center

    Marshall, Thomas E.; Byrd, Terry A.; Gardiner, Lorraine R.; Rainer, R. Kelly, Jr.

    2000-01-01

    Describes an empirical study that investigated how knowledge bases contributed to subjects' attitudes and performance in the use of a computer-assisted software engineering (CASE) tool in database design. Identifies requisite knowledge bases and provides alternatives for organization administration to promote more positive attitudes toward…

  6. The role of acceptance and job control in mental health, job satisfaction, and work performance.

    PubMed

    Bond, Frank W; Bunce, David

    2003-12-01

    Acceptance, the willingness to experience thoughts, feelings, and physiological sensations without having to control them or let them determine one's actions, is a major individual determinant of mental health and behavioral effectiveness in a more recent theory of psychopathology. This 2-wave panel study examined the ability of acceptance also to explain mental health, job satisfaction, and performance in the work domain. The authors hypothesized that acceptance would predict these 3 outcomes 1 year later in a sample of customer service center workers in the United Kingdom (N = 412). Results indicated that acceptance predicted mental health and an objective measure of performance over and above job control, negative affectivity, and locus of control. These beneficial effects of having more job control were enhanced when people had higher levels of acceptance. The authors discuss the theoretical and practical relevance of this individual characteristic to occupational health and performance. PMID:14640816

  7. A Comparative Evaluation of the Technical Performance and User Acceptance of Two Prototype Online Catalog Systems.

    ERIC Educational Resources Information Center

    Siegel, Elliot R.; And Others

    1984-01-01

    Describes research strategy and methods of comparative evaluation conducted by the National Library of Medicine to assess user acceptance and technical performance of two prototype patron accessible online catalog systems within same operational environment. User acceptance studies included sample search experiment, comparison search experiment,…

  8. Factors Influencing the Acceptance of Web-Based Training in Malaysia: Applying the Technology Acceptance Model

    ERIC Educational Resources Information Center

    Hashim, Junaidah

    2008-01-01

    Companies in Malaysia are beginning to use web-based training to reduce the cost of training and to provide employees with greater access to instruction. However, some people are uncomfortable with technology and prefer person-to-person methods of training. This study examines the acceptance of web-based training among a convenience sample of 261…

  9. Effects of acceptance-based coping on task performance and subjective stress.

    PubMed

    Kishita, Naoko; Shimada, Hironori

    2011-03-01

    This paper examines the interactive effects of acceptance-based coping and job control on task performance, subjective stress, and perceived control. Forty-eight undergraduate and graduate students first participated in brief educational programs based on either acceptance or control coping strategies. They then participated in a 30-min high workload task under either high or low job control conditions. The results demonstrated a significant interactive effect of acceptance-based coping and job control on perceived control and task performance. No such effect was found for subjective stress. We conclude that to improve employees' perceived control and job performance, there should be an increase not only in job control through work redesign, but also in psychological acceptance. PMID:21074000

  10. The History of UTAUT Model and Its Impact on ICT Acceptance and Usage by Academicians

    ERIC Educational Resources Information Center

    Oye, N. D.; Iahad, N. A.; Rahim, N. Ab.

    2014-01-01

    This paper started with the review of the history of technology acceptance model from TRA to UTAUT. The expected contribution is to bring to lime light the current development stage of the technology acceptance model. Based on this, the paper examined the impact of UTAUT model on ICT acceptance and usage in HEIs. The UTAUT model theory was…

  11. Influence of Gender and Computer Teaching Efficacy on Computer Acceptance among Malaysian Student Teachers: An Extended Technology Acceptance Model

    ERIC Educational Resources Information Center

    Wong, Kung-Teck; Teo, Timothy; Russo, Sharon

    2012-01-01

    The purpose of this study is to validate the technology acceptance model (TAM) in an educational context and explore the role of gender and computer teaching efficacy as external variables. From the literature, it appeared that only limited studies had developed models to explain statistically the chain of influence of computer teaching efficacy…

  12. Modelling acceptance of sunlight in high and low photovoltaic concentration

    SciTech Connect

    Leutz, Ralf

    2014-09-26

    A simple model incorporating linear radiation characteristics, along with the optical trains and geometrical concentration ratios of solar concentrators, is presented with performance examples for optical trains of HCPV, LCPV and benchmark flat-plate PV.

  13. Modeling Computer Usage Intentions of Tertiary Students in a Developing Country through the Technology Acceptance Model

    ERIC Educational Resources Information Center

    Afari-Kumah, Eben; Achampong, Akwasi Kyere

    2010-01-01

    This study aims to examine the computer usage intentions of Ghanaian Tertiary Students. The Technology Acceptance Model was adopted as the theoretical framework to ascertain whether it could help explain behavioral intentions of individuals to accept and use technology. Factor analysis was used to assess the construct validity of the initial…

  14. Cassini RTG acceptance test results and RTG performance on Galileo and Ulysses

    SciTech Connect

    Kelly, C.E.; Klee, P.M.

    1997-06-01

    Flight acceptance testing has been completed for the RTGs to be used on the Cassini spacecraft, which is scheduled for an October 6, 1997 launch to Saturn. The acceptance test program includes vibration tests, magnetic field measurements, mass properties (weight and c.g.), and thermal vacuum tests. This paper presents the thermal vacuum test results. Three RTGs are to be used: F-2, F-6, and F-7. F-5 is the backup RTG, as it was for the Galileo and Ulysses missions launched in 1989 and 1990, respectively. RTG performance measured during the thermal vacuum tests carried out at the Mound Laboratory facility met all specification requirements. Beginning of mission (BOM) and end of mission (EOM) power predictions have been made based on these test results. BOM power is predicted to be 888 watts compared to the minimum requirement of 826 watts. Degradation models predict the EOM power after 16 years to be 640 watts compared to a minimum requirement of 596 watts. Results of small scale module tests are also shown. The modules contain couples from the qualification and flight production runs. The tests have exceeded 28,000 hours (3.2 years) and are continuing to provide increased confidence in the predicted long-term performance of the Cassini RTGs. All test results indicate that the power requirements of the Cassini spacecraft will be met. BOM and EOM power margins of over five percent are predicted. Power output from telemetry for the two Galileo RTGs is shown from the 1989 launch to the recent Jupiter encounter. Comparisons of predicted, measured, and required performance are shown. Telemetry data are also shown for the RTG on the Ulysses spacecraft, which completed its planned mission in 1995 and is now in the extended mission.

  15. Cassini RTG acceptance test results and RTG performance on Galileo and Ulysses

    SciTech Connect

    Kelly, C.E.; Klee, P.M.

    1997-12-31

    Flight acceptance testing has been completed for the RTGs to be used on the Cassini spacecraft, which is scheduled for an October 6, 1997 launch to Saturn. The acceptance test program includes vibration tests, magnetic field measurements, mass properties (weight and c.g.), and thermal vacuum tests. This paper presents the thermal vacuum test results. Three RTGs are to be used: F-2, F-6, and F-7. F-5 is the backup RTG, as it was for the Galileo and Ulysses missions launched in 1989 and 1990, respectively. RTG performance measured during the thermal vacuum tests carried out at the Mound Laboratory facility met all specification requirements. Beginning of mission (BOM) and end of mission (EOM) power predictions have been made based on these test results. BOM power is predicted to be 888 watts compared to the minimum requirement of 826 watts. Degradation models predict the EOM power after 16 years to be 640 watts compared to a minimum requirement of 596 watts. Results of small scale module tests are also shown. The modules contain couples from the qualification and flight production runs. The tests have exceeded 28,000 hours (3.2 years) and are continuing to provide increased confidence in the predicted long-term performance of the Cassini RTGs. All test results indicate that the power requirements of the Cassini spacecraft will be met. BOM and EOM power margins of over 5% are predicted. Power output from telemetry for the two Galileo RTGs is shown from the 1989 launch to the recent Jupiter encounter. Comparisons of predicted, measured, and required performance are shown. Telemetry data are also shown for the RTG on the Ulysses spacecraft, which completed its planned mission in 1995 and is now in the extended mission.
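
    As a quick back-of-the-envelope check of the figures quoted above (BOM 888 W, predicted EOM 640 W after 16 years, minimum requirements of 826 W and 596 W), the sketch below computes the implied average degradation rate under a simple exponential-decay assumption; the actual degradation models are more detailed.

```python
# Back-of-the-envelope check of the RTG power figures quoted above, assuming
# simple exponential decay.  The real degradation models are more detailed.
import math

bom_w, eom_w, years = 888.0, 640.0, 16.0

# Effective decay constant implied by the quoted endpoints.
k = math.log(bom_w / eom_w) / years
print(f"Implied average degradation: {100 * (1 - math.exp(-k)):.2f}% per year")

# Margins relative to the minimum requirements quoted above.
print(f"BOM margin: {100 * (888 - 826) / 826:.1f}%")
print(f"EOM margin: {100 * (640 - 596) / 596:.1f}%")
```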

  16. Cassini RTG Acceptance Test Results and RTG Performance on Galileo and Ulysses

    DOE R&D Accomplishments Database

    Kelly, C. E.; Klee, P. M.

    1997-06-01

    Flight acceptance testing has been completed for the RTGs to be used on the Cassini spacecraft, which is scheduled for an October 6, 1997 launch to Saturn. The acceptance test program includes vibration tests, magnetic field measurements, mass properties (weight and c.g.), and thermal vacuum tests. This paper presents the thermal vacuum test results. Three RTGs are to be used: F-2, F-6, and F-7. F-5 is the backup RTG, as it was for the Galileo and Ulysses missions launched in 1989 and 1990, respectively. RTG performance measured during the thermal vacuum tests carried out at the Mound Laboratory facility met all specification requirements. Beginning of mission (BOM) and end of mission (EOM) power predictions have been made based on these test results. BOM power is predicted to be 888 watts compared to the minimum requirement of 826 watts. Degradation models predict the EOM power after 16 years to be 640 watts compared to a minimum requirement of 596 watts. Results of small scale module tests are also shown. The modules contain couples from the qualification and flight production runs. The tests have exceeded 28,000 hours (3.2 years) and are continuing to provide increased confidence in the predicted long-term performance of the Cassini RTGs. All test results indicate that the power requirements of the Cassini spacecraft will be met. BOM and EOM power margins of over five percent are predicted. Power output from telemetry for the two Galileo RTGs is shown from the 1989 launch to the recent Jupiter encounter. Comparisons of predicted, measured, and required performance are shown. Telemetry data are also shown for the RTG on the Ulysses spacecraft, which completed its planned mission in 1995 and is now in the extended mission.

  17. IR DIAL performance modeling

    SciTech Connect

    Sharlemann, E.T.

    1994-07-01

    We are developing a DIAL performance model for CALIOPE at LLNL. The intent of the model is to provide quick and interactive parameter sensitivity calculations with immediate graphical output. A brief overview of the features of the performance model is given, along with an example of performance calculations for a non-CALIOPE application.
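
    The abstract does not give the model details, but DIAL performance models are built around the standard two-wavelength retrieval; a minimal sketch is shown below with hypothetical returns, range cell, and differential cross section (this is not the CALIOPE model itself).

```python
# Minimal sketch of the standard two-wavelength DIAL retrieval that a DIAL
# performance model builds on (not the CALIOPE model itself).  Returned powers,
# range cell, and differential cross section are hypothetical.
import math

def dial_number_density(p_on_near, p_on_far, p_off_near, p_off_far,
                        delta_sigma_cm2, delta_range_cm):
    """Mean absorber number density (cm^-3) in a range cell from on/off-line
    backscatter returns, using the standard DIAL equation."""
    ratio = (p_off_far * p_on_near) / (p_on_far * p_off_near)
    return math.log(ratio) / (2.0 * delta_sigma_cm2 * delta_range_cm)

n = dial_number_density(
    p_on_near=1.00, p_on_far=0.62,    # on-line returns (arbitrary units)
    p_off_near=1.00, p_off_far=0.80,  # off-line returns
    delta_sigma_cm2=1.0e-20,          # differential absorption cross section
    delta_range_cm=1.0e4,             # 100 m range cell
)
print(f"Retrieved number density: {n:.3e} cm^-3")
```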

  18. Performance deterioration due to acceptance testing and flight loads; JT9D jet engine diagnostic program

    NASA Technical Reports Server (NTRS)

    Olsson, W. J.

    1982-01-01

    The results of a flight loads test of the JT9D-7 engine are presented. The goals of this test program were to: measure aerodynamic and inertia loads on the engine during flight; explore the effects of airplane gross weight and typical maneuvers on these flight loads; simultaneously measure the changes in engine running clearances and performance resulting from the maneuvers; refine engine performance deterioration prediction models based on analytical results of the tests; and make recommendations to improve propulsion system performance retention. The test program included a typical production airplane acceptance test plus additional flights and maneuvers to encompass the range of flight loads in revenue service. The test results indicated that aerodynamic loads, primarily at take-off, were the major cause of rub-induced deterioration in the cold section of the engine. Differential thermal expansion between rotating and static parts plus aerodynamic loads combined to cause blade-to-seal rubs in the turbine.

  19. User Acceptance of Information Technology: Theories and Models.

    ERIC Educational Resources Information Center

    Dillon, Andrew; Morris, Michael G.

    1996-01-01

    Reviews literature in user acceptance and resistance to information technology design and implementation. Examines innovation diffusion, technology design and implementation, human-computer interaction, and information systems. Concentrates on the determinants of user acceptance and resistance and emphasizes how researchers and developers can…

  20. Do I Have to Learn Something New? Mental Models and the Acceptance of Replacement Technologies

    ERIC Educational Resources Information Center

    Zhang, Wei; Xu, Peng

    2011-01-01

    Few studies in technology acceptance have explicitly addressed the acceptance of replacement technologies, technologies that replace legacy ones that have been in use. This article explores this issue through the theoretical lens of mental models. We contend that accepting replacement technologies entails both mental model maintenance and mental…

  1. Restrictions on TWT Helix Voltage Ripple for Acceptable Notch Filter Performance

    SciTech Connect

    Hyslop, B.

    1984-12-01

    An ac ripple on the helix voltage of the 1-2 GHz TWTs creates FM sidebands that cause amplitude and phase modulation of the microwave TWT output signal. A limit of 16 volts peak-to-peak is required for acceptable superconducting notch filter performance.
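
    A sketch of how a helix-voltage ripple maps to FM sideband levels under a narrowband phase-modulation approximation is shown below; the phase-pushing figure and the filter's sideband requirement are hypothetical, not values from the report.

```python
# Sketch of helix-voltage ripple mapping to FM sidebands under a narrowband
# phase-modulation approximation.  The phase-pushing figure is hypothetical.
import math

def sideband_level_dbc(v_ripple_pp, phase_pushing_deg_per_v):
    """First FM sideband level relative to the carrier (dBc) for a sinusoidal
    ripple at small modulation index: sideband ~ 20*log10(beta/2)."""
    beta = math.radians(phase_pushing_deg_per_v * v_ripple_pp / 2.0)  # peak rad
    return 20.0 * math.log10(beta / 2.0)

pushing = 0.5  # deg of output phase per volt of helix ripple (hypothetical)
for vpp in (16.0, 50.0):
    print(f"{vpp:5.1f} Vpp ripple -> {sideband_level_dbc(vpp, pushing):6.1f} dBc sidebands")
```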

  2. A FORTRAN IV Program for Multiple-choice Tests with Predetermined Minimal Acceptable Performance Levels

    ERIC Educational Resources Information Center

    Noe, Michael J.

    1976-01-01

    A Fortran IV multiple-choice test scoring program for an IBM 370 computer is described that computes minimally acceptable performance levels and compares student scores to these levels. The program accommodates up to 500 items with no more than nine alternatives from a group of examinees numbering less than 10,000. (Author)
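
    A small Python analogue of the scoring task described above (not the original FORTRAN IV program): score multiple-choice responses against a key and compare each examinee's score to a predetermined minimal acceptable performance level. The key, cutoff, and responses are hypothetical.

```python
# Small Python analogue of multiple-choice scoring against a predetermined
# minimal acceptable performance level.  Key, cutoff, and responses are hypothetical.

def score(responses, key):
    """Number of responses matching the answer key."""
    return sum(1 for r, k in zip(responses, key) if r == k)

key = "BADCABDCAB"            # hypothetical 10-item key
minimal_acceptable = 7        # hypothetical mastery cutoff
examinees = {
    "S001": "BADCABDCAD",
    "S002": "CCCCCCCCCC",
    "S003": "BADCABDCAB",
}

for sid, resp in examinees.items():
    s = score(resp, key)
    status = "meets" if s >= minimal_acceptable else "below"
    print(f"{sid}: {s}/{len(key)} ({status} minimal acceptable level)")
```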

  3. Improving International-Level Chess Players' Performance with an Acceptance-Based Protocol: Preliminary Findings

    ERIC Educational Resources Information Center

    Ruiz, Francisco J.; Luciano, Carmen

    2012-01-01

    This study compared an individual, 4-hr intervention based on acceptance and commitment therapy (ACT) versus a no-contact control condition in improving the performance of international-level chess players. Five participants received the brief ACT protocol, with each matched to another chess player with similar characteristics in the control…

  4. Explanation of Police Officers' Information Technology Acceptance Using the Technology Acceptance Model and Social Cognitive Theory

    ERIC Educational Resources Information Center

    Delice, Murat

    2009-01-01

    In the last decades, information technology (IT) has touched every aspect of life. Computers have been used in a great range of fields such as education, government, business, entertainment, and daily life. Similar to other organizations, police organizations use IT systems to improve their effectiveness and performance. However, police…

  5. Photovoltaic array performance model.

    SciTech Connect

    Kratochvil, Jay A.; Boyson, William Earl; King, David L.

    2004-08-01

    This document summarizes the equations and applications associated with the photovoltaic array performance model developed at Sandia National Laboratories over the last twelve years. Electrical, thermal, and optical characteristics for photovoltaic modules are included in the model, and the model is designed to use hourly solar resource and meteorological data. The versatility and accuracy of the model has been validated for flat-plate modules (all technologies) and for concentrator modules, as well as for large arrays of modules. Applications include system design and sizing, 'translation' of field performance measurements to standard reporting conditions, system performance optimization, and real-time comparison of measured versus expected system performance.
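
    One of the applications listed above is 'translation' of field performance measurements to standard reporting conditions. The sketch below shows a simplified irradiance/temperature normalization of that kind; it is not the full Sandia array performance model, and the coefficients and measurements are hypothetical.

```python
# Schematic translation of a field power measurement to standard reporting
# conditions.  A simplified irradiance/temperature normalization, not the full
# Sandia array performance model; coefficients and measurements are hypothetical.

def translate_to_src(p_meas_w, irr_meas, t_cell_meas,
                     irr_ref=1000.0, t_ref=25.0, gamma=-0.0045):
    """Scale measured array power to reference irradiance, then correct to the
    reference cell temperature with a linear power-temperature coefficient."""
    p_at_ref_irr = p_meas_w * (irr_ref / irr_meas)
    return p_at_ref_irr / (1.0 + gamma * (t_cell_meas - t_ref))

p_src = translate_to_src(p_meas_w=42_300.0, irr_meas=910.0, t_cell_meas=48.0)
print(f"Power at standard reporting conditions: {p_src:,.0f} W")
```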

  6. The Effect of a Summer MCAT Performance Improvement Program on Minority Medical Student Acceptance

    PubMed Central

    Medina, Miguel A.

    1987-01-01

    The effect of a commercial Medical College Admission Test (MCAT) review course on MCAT retake scores and acceptance into medical school for a group of minority students is reported. The review course enhanced MCAT performance in all of the subgroups. The increase in total MCAT score was more pronounced in students with an initial MCAT score below 36 or a high undergraduate total or science grade point average. Results suggest a relationship between MCAT performance and medical school admission. PMID:3625795

  7. The Adult Roles Models Program: Feasibility, Acceptability, and Initial Outcomes

    PubMed Central

    Silver, Ellen Johnson; Dean, Randa; Perez, Amanda; Rivera, Angelic

    2014-01-01

    We present the feasibility and acceptability of a parent sexuality education program led by peer educators in community settings. We also report the results of an outcome evaluation with 71 parents who were randomized to the intervention or a control group, and surveyed one month prior to and six months after the 4-week intervention. The program was highly feasible and acceptable to participants, and the curriculum was implemented with a high level of fidelity and facilitator quality. Pilot data show promising outcomes for increasing parental knowledge, communication, and monitoring of their adolescent children. PMID:24883051

  8. Hybrid E-Learning Acceptance Model: Learner Perceptions

    ERIC Educational Resources Information Center

    Ahmed, Hassan M. Selim

    2010-01-01

    E-learning tools and technologies have been used to supplement conventional courses in higher education institutions creating a "hybrid" e-learning module that aims to enhance the learning experiences of students. Few studies have addressed the acceptance of hybrid e-learning by learners and the factors affecting the learners'…

  9. Utility-Scale Power Tower Solar Systems: Performance Acceptance Test Guidelines

    SciTech Connect

    Kearney, D.

    2013-03-01

    The purpose of these Guidelines is to provide direction for conducting performance acceptance testing for large power tower solar systems that can yield results of a high level of accuracy consistent with good engineering knowledge and practice. The recommendations have been developed under a National Renewable Energy Laboratory (NREL) subcontract and reviewed by stakeholders representing concerned organizations and interests throughout the concentrating solar power (CSP) community. An earlier NREL report provided similar guidelines for parabolic trough systems. These Guidelines recommend certain methods, instrumentation, equipment operating requirements, and calculation methods. When tests are run in accordance with these Guidelines, we expect that the test results will yield a valid indication of the actual performance of the tested equipment. But these are only recommendations--to be carefully considered by the contractual parties involved in the Acceptance Tests--and we expect that modifications may be required to fit the particular characteristics of a specific project.

  10. Speculations on Performance Models.

    ERIC Educational Resources Information Center

    Fromkin, Victoria

    1968-01-01

    According to the author, competence and performance and their interrelationships are the concern of linguistics. Performance models must: (1) be based on physical data of speech; (2) describe the phenomena under investigation; (3) predict events which are confirmed by experiment; (4) suggest causal relationships by identifying necessary and…

  11. VENTURI SCRUBBER PERFORMANCE MODEL

    EPA Science Inventory

    The paper presents a new model for predicting the particle collection performance of venturi scrubbers. It assumes that particles are collected by atomized liquid only in the throat section. The particle collection mechanism is inertial impaction, and the model uses a single drop...

  12. Acceptance and Commitment Therapy as a Unified Model of Behavior Change

    ERIC Educational Resources Information Center

    Hayes, Steven C.; Pistorello, Jacqueline; Levin, Michael E.

    2012-01-01

    The present article summarizes the assumptions, model, techniques, evidence, and diversity/social justice commitments of Acceptance and Commitment Therapy (ACT). ACT focuses on six processes (acceptance, defusion, self, now, values, and action) that bear on a single overall target (psychological flexibility). The ACT model of behavior change has…

  13. Ion thruster performance model

    NASA Technical Reports Server (NTRS)

    Brophy, J. R.

    1984-01-01

    A model of ion thruster performance is developed for high flux density, cusped magnetic field thruster designs. This model is formulated in terms of the average energy required to produce an ion in the discharge chamber plasma and the fraction of these ions that are extracted to form the beam. The direct loss of high energy (primary) electrons from the plasma to the anode is shown to have a major effect on thruster performance. The model provides simple algebraic equations enabling one to calculate the beam ion energy cost, the average discharge chamber plasma ion energy cost, the primary electron density, the primary-to-Maxwellian electron density ratio and the Maxwellian electron temperature. Experiments indicate that the model correctly predicts the variation in plasma ion energy cost for changes in propellant gas (Ar, Kr and Xe), grid transparency to neutral atoms, beam extraction area, discharge voltage, and discharge chamber wall temperature. The model and experiments indicate that thruster performance may be described in terms of only four thruster configuration dependent parameters and two operating parameters. The model also suggests that improved performance should be exhibited by thruster designs which extract a large fraction of the ions produced in the discharge chamber, which have good primary electron and neutral atom containment and which operate at high propellant flow rates.
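
    A minimal numeric illustration of the relationship stated above: if producing one ion in the discharge chamber plasma costs eps_p (in eV) and only a fraction f_B of those ions are extracted into the beam, then the energy cost per beam ion is eps_p / f_B. The numbers below are hypothetical, not values from the paper.

```python
# Minimal illustration of the beam ion energy cost relationship described
# above: beam ion cost = plasma ion cost / extracted fraction.  Numbers are
# hypothetical, not values from the paper.

def beam_ion_energy_cost(eps_plasma_ev, extracted_fraction):
    """Average discharge energy expended per beam ion (eV/ion)."""
    return eps_plasma_ev / extracted_fraction

for f_b in (0.3, 0.5, 0.7):
    eps_b = beam_ion_energy_cost(eps_plasma_ev=150.0, extracted_fraction=f_b)
    print(f"f_B = {f_b:.1f} -> beam ion cost = {eps_b:.0f} eV/ion")
```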

  14. Design and performance of AERHA, a high acceptance high resolution soft x-ray spectrometer

    NASA Astrophysics Data System (ADS)

    Chiuzbǎian, Sorin G.; Hague, Coryn F.; Avila, Antoine; Delaunay, Renaud; Jaouen, Nicolas; Sacchi, Maurizio; Polack, François; Thomasset, Muriel; Lagarde, Bruno; Nicolaou, Alessandro; Brignolo, Stefania; Baumier, Cédric; Lüning, Jan; Mariot, Jean-Michel

    2014-04-01

    A soft x-ray spectrometer based on the use of an elliptical focusing mirror and a plane varied line spacing grating is described. It achieves both high resolution and high overall efficiency while remaining relatively compact. The instrument is dedicated to resonant inelastic x-ray scattering studies. We set out how this optical arrangement was judged best able to guarantee performance for the 50 - 1000 eV range within achievable fabrication targets. The AERHA (adjustable energy resolution high acceptance) spectrometer operates with an effective angular acceptance between 100 and 250 μsr (energy dependent) and a resolving power well in excess of 5000 according to the Rayleigh criterion. The high angular acceptance is obtained by means of a collecting pre-mirror. Three scattering geometries are available to enable momentum dependent measurements with 135°, 90°, and 50° scattering angles. The instrument operates on the Synchrotron SOLEIL SEXTANTS beamline which serves as a high photon flux 2 × 200 μm2 focal spot source with full polarization control.

  15. Air Traffic Controller Performance and Acceptability of Multiple UAS in a Simulated NAS Environment

    NASA Technical Reports Server (NTRS)

    Vu, Kim-Phuong L.; Strybel, Thomas; Chiappe, Dan; Morales, Greg; Battiste, Vernol; Shively, Robert Jay

    2014-01-01

    Previously, we showed that air traffic controllers (ATCos) rated UAS pilot verbal response latencies as acceptable when a 1.5 s delay was added to the UAS pilot responses, but a 5 s delay was rated as mostly unacceptable. In the present study we determined whether a 1.5 s added delay in the UAS pilots' verbal communications would affect ATCos interactions with UAS and other conventional aircraft when the number and speed of the UAS were manipulated. Eight radar-certified ATCos participated in this simulation. The ATCos managed a medium altitude sector containing arrival aircraft, en route aircraft, and one to four UAS. The UAS were conducting a surveillance mission and flew at either a "slow" or "fast" speed. We measured both UAS and conventional pilots' verbal communication latencies, and obtained ATCos' acceptability ratings for these latencies. Although the UAS pilot response latencies were longer than those of conventional pilots, the ATCos rated UAS pilot verbal communication latencies to be as acceptable as those of conventional pilots. Because the overall traffic load within the sector was held constant, ATCos only performed slightly worse when multiple UAS were in their sector compared to when only one UAS was in the sector. Implications of these findings for UAS integration in the NAS are discussed.

  16. Design and performance of AERHA, a high acceptance high resolution soft x-ray spectrometer

    SciTech Connect

    Chiuzbăian, Sorin G.; Hague, Coryn F.; Brignolo, Stefania; Baumier, Cédric; Lüning, Jan; Avila, Antoine; Delaunay, Renaud; Mariot, Jean-Michel; Jaouen, Nicolas; Polack, François; Thomasset, Muriel; Lagarde, Bruno; Nicolaou, Alessandro; Sacchi, Maurizio

    2014-04-15

    A soft x-ray spectrometer based on the use of an elliptical focusing mirror and a plane varied line spacing grating is described. It achieves both high resolution and high overall efficiency while remaining relatively compact. The instrument is dedicated to resonant inelastic x-ray scattering studies. We set out how this optical arrangement was judged best able to guarantee performance for the 50 − 1000 eV range within achievable fabrication targets. The AERHA (adjustable energy resolution high acceptance) spectrometer operates with an effective angular acceptance between 100 and 250 μsr (energy dependent) and a resolving power well in excess of 5000 according to the Rayleigh criterion. The high angular acceptance is obtained by means of a collecting pre-mirror. Three scattering geometries are available to enable momentum dependent measurements with 135°, 90°, and 50° scattering angles. The instrument operates on the Synchrotron SOLEIL SEXTANTS beamline which serves as a high photon flux 2 × 200 μm² focal spot source with full polarization control.

  17. Performance feedback: An exploratory study to examine the acceptability and impact for interdisciplinary primary care teams

    PubMed Central

    2011-01-01

    Background: This mixed methods study was designed to explore the acceptability and impact of feedback of team performance data to primary care interdisciplinary teams. Methods: Seven interdisciplinary teams were offered a one-hour, facilitated performance feedback session presenting data from a comprehensive, previously-conducted evaluation, selecting highlights such as performance on chronic disease management, access, patient satisfaction and team function. Results: Several recurrent themes emerged from participants' surveys and two rounds of interviews within three months of the feedback session. Team performance measurement and feedback was welcomed across teams and disciplines. This feedback could build the team, the culture, and the capacity for quality improvement. However, existing performance indicators do not equally reflect the role of different disciplines within an interdisciplinary team. Finally, the effect of team performance feedback on intentions to improve performance was hindered by a poor understanding of how the team could use the data. Conclusions: The findings further our understanding of how performance feedback may engage interdisciplinary team members in improving the quality of primary care and the unique challenges specific to these settings. There is a need to develop a shared sense of responsibility and agenda for quality improvement. Therefore, more efforts to develop flexible and interactive performance-reporting structures (that better reflect contributions from all team members) in which teams could specify the information and audience may assist in promoting quality improvement. PMID:21443806

  18. Expectancies Underlying the Acceptability of Handicaps: The Pervasiveness of the Medical Model

    ERIC Educational Resources Information Center

    Abroms, Kippy; Kodera, Thomas L.

    1978-01-01

    Two groups of undergraduate students with diverse backgrounds ranked the acceptability of 15 handicapping conditions of which some were medical disorders and others were sociopsychological or functional impairments. Students adhered to the medical model, basing their judgments of acceptability on the amenability of a given handicap to medical…

  19. NCCDS performance model

    NASA Technical Reports Server (NTRS)

    Richmond, Eric; Vallone, Antonio

    1994-01-01

    The NASA/GSFC Network Control Center (NCC) provides communication services between ground facilities and spacecraft missions in near-earth orbit that use the Space Network. The NCC Data System (NCCDS) provides computational support and is expected to be highly utilized by the service requests needed in the future years. A performance model of the NCCDS has been developed to assess the future workload and possible enhancements. The model computes message volumes from mission request profiles and SN resource levels and generates the loads for NCCDS configurations as a function of operational scenarios and processing activities. The model has been calibrated using the results of benchmarks performed on the operational NCCDS facility and used to assess some future SN service request scenarios.

  20. Acceptability of quality reporting and pay for performance among primary health centers in Lebanon.

    PubMed

    Saleh, Shadi S; Alameddine, Mohamad S; Natafgi, Nabil M

    2013-01-01

    Primary health care (PHC) is emphasized as the cornerstone of any health care system. Enhancing PHC performance is considered a strategy to enhance effective and equitable access to care. This study assesses the acceptability of and factors associated with quality reporting among PHC centers (PHCCs) in Lebanon. The managers of 132 Lebanese Ministry of Health PHCCs were surveyed using a cross-sectional design. Managers' willingness to report quality, participate in comparative quality assessments, and endorse pay-for-performance schemes was evaluated. Collected data were matched to the infrastructural characteristics and services database. Seventy-six percent of managers responded to the questionnaire, 93 percent of whom were willing to report clinical performance. Most expressed strong support for peer-performance comparison and pay-for-performance schemes. Willingness to report was negatively associated with the religious affiliation of centers and presence of health care facilities in the catchment area and favorably associated with use of information systems and the size of population served. The great willingness of PHCC managers to employ quality-enhancing initiatives flags a policy priority for PHC stakeholders to strengthen PHCC infrastructure and to enable reporting in an easy, standardized, and systematic way. Enhancing equity necessitates education and empowerment of managers in remote areas and those managing religiously affiliated centers. PMID:24397238

  1. Family support and acceptance, gay male identity formation, and psychological adjustment: a path model.

    PubMed

    Elizur, Y; Ziv, M

    2001-01-01

    While heterosexist family undermining has been demonstrated to be a developmental risk factor in the life of persons with same-gender orientation, the issue of protective family factors is both controversial and relatively neglected. In this study of Israeli gay males (N = 114), we focused on the interrelations of family support, family acceptance and family knowledge of gay orientation, and gay male identity formation, and their effects on mental health and self-esteem. A path model was proposed based on the hypotheses that family support, family acceptance, family knowledge, and gay identity formation have an impact on psychological adjustment, and that family support has an effect on gay identity formation that is mediated by family acceptance. The assessment of gay identity formation was based on an established stage model that was streamlined for cross-cultural practice by defining three basic processes of same-gender identity formation: self-definition, self-acceptance, and disclosure (Elizur & Mintzer, 2001). The testing of our conceptual path model demonstrated an excellent fit with the data. An alternative model that hypothesized effects of gay male identity on family acceptance and family knowledge did not fit the data. Interpreting these results, we propose that the main effect of family support/acceptance on gay identity is related to the process of disclosure, and that both general family support and family acceptance of same-gender orientation play a significant role in the psychological adjustment of gay men. PMID:11444052

  2. Electronic Health Record Patient Portal Adoption by Health Care Consumers: An Acceptance Model and Survey

    PubMed Central

    2016-01-01

    Background: The future of health care delivery is becoming more citizen centered, as today's user is more active, better informed, and more demanding. Worldwide governments are promoting online health services, such as electronic health record (EHR) patient portals and, as a result, the deployment and use of these services. Overall, this makes the adoption of patient-accessible EHR portals an important field to study and understand. Objective: The aim of this study is to understand the factors that drive individuals to adopt EHR portals. Methods: We applied a new adoption model using, as a starting point, Venkatesh's Unified Theory of Acceptance and Use of Technology in a consumer context (UTAUT2), integrating a new construct specific to health care, a new moderator, and new relationships. To test the research model, we used the partial least squares (PLS) causal modelling approach. An online questionnaire was administered, and we collected 360 valid responses. Results: The statistically significant drivers of behavioral intention are performance expectancy (beta=.200; t=3.619), effort expectancy (beta=.185; t=2.907), habit (beta=.388; t=7.320), and self-perception (beta=.098; t=2.285). The predictors of use behavior are habit (beta=0.206; t=2.752) and behavioral intention (beta=0.258; t=4.036). The model explained 49.7% of the variance in behavioral intention and 26.8% of the variance in use behavior. Conclusions: Our research helps to understand the desired technology characteristics of EHR portals. By testing an information technology acceptance model, we are able to determine what is more valued by patients when it comes to deciding whether to adopt EHR portals or not. The inclusion of specific constructs and relationships related to the health care consumer area also had a significant impact on understanding the adoption of EHR portals. PMID:26935646
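
    Purely as an illustration, the sketch below combines the standardized path coefficients reported above into a predicted behavioral-intention score for a hypothetical respondent. This linear combination is only a reading aid; the actual PLS model includes the measurement model, moderators, and additional paths.

```python
# Illustrative use of the standardized path coefficients reported above.
# A linear-combination sketch only; the actual PLS model has more structure.

betas = {                     # standardized paths to behavioral intention
    "performance_expectancy": 0.200,
    "effort_expectancy":      0.185,
    "habit":                  0.388,
    "self_perception":        0.098,
}

# Hypothetical respondent, expressed as z-scores on each predictor.
respondent = {
    "performance_expectancy": 1.2,
    "effort_expectancy":      0.4,
    "habit":                 -0.5,
    "self_perception":        0.8,
}

bi_hat = sum(betas[k] * respondent[k] for k in betas)
print(f"Predicted behavioral intention (z-score): {bi_hat:+.2f}")
for name, b in sorted(betas.items(), key=lambda kv: -kv[1]):
    print(f"  {name:24s} beta = {b:.3f}")
```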

  3. Modelling of biogas extraction at an Italian landfill accepting mechanically and biologically treated municipal solid waste.

    PubMed

    Calabrò, Paolo S; Orsi, Sirio; Gentili, Emiliano; Carlo, Meoni

    2011-12-01

    This paper presents the results of the modelling of the biogas extraction in a full-scale Italian landfill by the USEPA LandGEM model and the Andreottola-Cossu approach. The landfill chosen for this research ('Il Fossetto' plant, Monsummano Terme, Italy) had accepted mixed municipal raw waste for about 15 years. Mechanical biological treatment (MBT) was implemented in 2003 and, starting at the end of 2006, recirculation into the landfill of the concentrated leachate from the internal membrane leachate treatment plant was put into practice. The USEPA LandGEM model and the Andreottola-Cossu approach were chosen because they require only input data routinely acquired during landfill management (waste amount and composition) and allow a simplified calibration; they are therefore potentially useful for practical purposes such as landfill gas management. The results given by the models are compared with measured data and analysed in order to verify the impact of MBT on biogas production; moreover, the possible effects of the recirculation of the concentrated leachate are discussed. The results clearly show that both models can adequately fit the measured data even after MBT implementation. Model performance was significantly reduced for the period after the beginning of concentrated leachate recirculation, when the probable inhibition of methane production, due to competition between methanogens and sulfate-reducing bacteria, markedly influenced biogas production and composition. PMID:21930528
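
    LandGEM is a first-order decay model: each year's accepted waste contributes methane that decays exponentially with the age of that waste. The sketch below illustrates that calculation only; the decay rate k, generation potential L0, and the waste series are illustrative placeholders rather than values from the 'Il Fossetto' site, and the Andreottola-Cossu approach is not reproduced.

```python
# Minimal sketch of a LandGEM-style first-order decay estimate of annual
# methane generation. Parameter values (k, L0) and the waste series are
# purely illustrative placeholders, not data from the 'Il Fossetto' site.
import math

def landgem_methane(waste_by_year, k=0.05, L0=100.0):
    """Annual CH4 generation [m3/yr] from yearly waste acceptance [Mg/yr].

    waste_by_year: dict mapping calendar year -> waste mass accepted (Mg).
    k:  first-order decay rate (1/yr), site/climate dependent (assumed).
    L0: methane generation potential (m3 CH4 per Mg waste), assumed;
        mechanical-biological pre-treatment would reduce this value.
    """
    years = sorted(waste_by_year)
    horizon = range(years[0], years[-1] + 41)        # 40 years after last fill
    q = {}
    for t in horizon:
        total = 0.0
        for yi, Mi in waste_by_year.items():
            age = t - yi
            if age > 0:                              # gas is generated only after placement
                total += k * L0 * Mi * math.exp(-k * age)
        q[t] = total
    return q

# Example: constant raw-waste acceptance for 15 years, then MBT waste with a
# (hypothetically) lower generation potential.
raw = {y: 30000 for y in range(1988, 2003)}
mbt = {y: 30000 for y in range(2003, 2011)}
profile_raw = landgem_methane(raw, k=0.05, L0=100.0)
profile_mbt = landgem_methane(mbt, k=0.05, L0=40.0)   # assumed reduced L0 after MBT
print(round(profile_raw[2005]), round(profile_mbt[2005]))
```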

  4. Adolescents' unconditional acceptance by parents and teachers and educational outcomes: A structural model of gender differences.

    PubMed

    Makri-Botsari, Evi

    2015-08-01

    The purpose of this study was to detect gender-specific patterns in the network of relations between unconditionality of parental and teacher acceptance in the form of unconditional positive regard and a range of educational outcomes, as indexed by academic self-perception, academic intrinsic motivation, and academic achievement. To test the role of gender as a moderator, a multi-group analysis was employed within the framework of structural equation modelling with increasing restrictions placed on the structural paths across genders. The results on a sample of 427 adolescents in grades 7-9 showed that conditionality of acceptance undermined the level of perceived acceptance for both social agents. Moreover, unconditionality of teacher acceptance exerted stronger influences on students' educational outcomes than unconditionality of parental acceptance, with effect sizes being larger for girls than for boys. PMID:26057875

  5. Utility-Scale Parabolic Trough Solar Systems: Performance Acceptance Test Guidelines, April 2009 - December 2010

    SciTech Connect

    Kearney, D.

    2011-05-01

    Prior to commercial operation, large solar systems in utility-size power plants need to pass a performance acceptance test conducted by the engineering, procurement, and construction (EPC) contractor or owners. In the present absence of ASME or other international test codes developed for this purpose, the National Renewable Energy Laboratory has undertaken the development of interim guidelines to provide recommendations for test procedures that can yield results of a high level of accuracy consistent with good engineering knowledge and practice. The Guidelines contained here are specifically written for parabolic trough collector systems with a heat-transport system using a high-temperature synthetic oil, but the basic principles are relevant to other CSP systems.

  6. Acceptance Performance Test Guideline for Utility Scale Parabolic Trough and Other CSP Solar Thermal Systems: Preprint

    SciTech Connect

    Mehos, M. S.; Wagner, M. J.; Kearney, D. W.

    2011-08-01

    Prior to commercial operation, large solar systems in utility-size power plants need to pass a performance acceptance test conducted by the engineering, procurement, and construction (EPC) contractor or owners. In the present absence of ASME or other international test codes developed for this purpose, the National Renewable Energy Laboratory has undertaken the development of interim guidelines to provide recommendations for test procedures that can yield results of a high level of accuracy consistent with good engineering knowledge and practice. Progress on interim guidelines was presented at SolarPACES 2010. Significant additions and modifications have been made to the guidelines since that time, resulting in a final report published by NREL in April 2011. This paper summarizes those changes, which emphasize criteria for assuring thermal equilibrium and steady-state conditions within the solar field.

  7. Discrete carbon nanotubes increase lead acid battery charge acceptance and performance

    NASA Astrophysics Data System (ADS)

    Swogger, Steven W.; Everill, Paul; Dubey, D. P.; Sugumaran, Nanjan

    2014-09-01

    Performance demands placed upon lead acid batteries have outgrown the technology's ability to deliver. These demands, typically leading to Negative Active Material (NAM) failure, include: short, high-current surges; prolonged, minimal, overvoltage charging; repeated, Ah deficit charging; and frequent deep discharges. Research shows these failure mechanisms are attenuated by inclusion of carbon allotropes into the NAM. Addition of significant quantities of carbon, however, produces detrimental changes in paste rheology, leading to lowered industrial throughput. Additionally, capacity, cold-cranking performance, and other battery metrics are negatively affected at high carbon loads. Presented here is Molecular Rebar® Lead Negative, a new battery additive comprising discrete carbon nanotubes (dCNT) which uniformly disperse within battery pastes during mixing. NS40ZL batteries containing dCNT show enhanced charge acceptance, reserve capacity, and cold-cranking performance, decreased risk of polarization, and no detrimental changes to paste properties, when compared to dCNT-free controls. This work focuses on the dCNT as NAM additives only, but early-stage research is underway to test their functionality as a PAM additive. Batteries infused with Molecular Rebar® Lead Negative address the needs of modern lead acid battery applications, produce none of the detrimental side effects associated with carbon additives, and require no change to existing production lines.

  8. A proposed model of factors influencing hydrogen fuel cell vehicle acceptance

    NASA Astrophysics Data System (ADS)

    Imanina, N. H. Noor; Kwe Lu, Tan; Fadhilah, A. R.

    2016-03-01

    Issues such as environmental problems and energy insecurity keep worsening as a result of energy use, from households to large industries including the automotive industry. Recently, a new type of zero-emission vehicle, the hydrogen fuel cell vehicle (HFCV), has received attention. Although there are arguments about the feasibility of hydrogen as the future fuel, another important issue is the acceptance of HFCV. Studying technology acceptance at an early stage is a vital key to the successful introduction and penetration of a technology. This paper proposes a model of the factors influencing green vehicle acceptance, specifically HFCV acceptance. The model is built on two technology acceptance theories and other empirical studies of vehicle acceptance. It aims to provide a basis for identifying the key factors influencing acceptance of HFCV, a new sustainable-energy-fuelled vehicle, by explaining intention to accept HFCV. Intention is influenced by attitude, subjective norm, and perceived behavioural control from the Theory of Planned Behaviour, and by personal norm from Norm Activation Theory. In the framework, attitude is influenced by perceptions of benefits and risks, and by social trust. Perceived behavioural control is influenced by government interventions. Personal norm is influenced by outcome efficacy and problem awareness.

  9. A Multivariate Model for the Study of Parental Acceptance-Rejection and Child Abuse.

    ERIC Educational Resources Information Center

    Rohner, Ronald P.; Rohner, Evelyn C.

    This paper proposes a multivariate strategy for the study of parental acceptance-rejection and child abuse and describes a research study on parental rejection and child abuse which illustrates the advantages of using a multivariate, (rather than a simple-model) approach. The multivariate model is a combination of three simple models used to study…

  10. Utilizing the health belief model to assess vaccine acceptance of patients on hemodialysis.

    PubMed

    Adams, Angela; Hall, Mellisa; Fulghum, Janis

    2014-01-01

    Vaccine rates in patients on hemodialysis are substantially lower than the Healthy People 2020 targets. The purpose of this study is to utilize the perceptions and cues for action constructs of the Health Belief Model (HBM) to assess the attitudes of patients receiving outpatient hemodialysis regarding acceptance of the seasonal influenza, pneumococcal, and hepatitis B virus vaccines. Vaccine acceptance is defined as receiving the vaccine. Study findings suggest age, perceived susceptibility, and perceived severity increase the odds of getting some vaccines. Findings have implications for the development of patient education materials, interdisciplinary team assessments, and plan of care strategies to increase vaccine acceptance. PMID:25244894

  11. THE TECHNOLOGY ACCEPTANCE MODEL: ITS PAST AND ITS FUTURE IN HEALTH CARE

    PubMed Central

    HOLDEN, RICHARD J.; KARSH, BEN-TZION

    2009-01-01

    Increasing interest in end users’ reactions to health information technology (IT) has elevated the importance of theories that predict and explain health IT acceptance and use. This paper reviews the application of one such theory, the Technology Acceptance Model (TAM), to health care. We reviewed 16 data sets analyzed in over 20 studies of clinicians using health IT for patient care. Studies differed greatly in samples and settings, health ITs studied, research models, relationships tested, and construct operationalization. Certain TAM relationships were consistently found to be significant, whereas others were inconsistent. Several key relationships were infrequently assessed. Findings show that TAM predicts a substantial portion of the use or acceptance of health IT, but that the theory may benefit from several additions and modifications. Aside from improved study quality, standardization, and theoretically motivated additions to the model, an important future direction for TAM is to adapt the model specifically to the health care context, using beliefs elicitation methods. PMID:19615467

  12. Modeling the acceptance of clinical information systems among hospital medical staff: an extended TAM model.

    PubMed

    Melas, Christos D; Zampetakis, Leonidas A; Dimopoulou, Anastasia; Moustakis, Vassilis

    2011-08-01

    Recent empirical research has utilized the Technology Acceptance Model (TAM) to advance the understanding of doctors' and nurses' technology acceptance in the workplace. However, the majority of the reported studies are either qualitative in nature or use small convenience samples of medical staff. Additionally, in very few studies moderators are either used or assessed despite their importance in TAM based research. The present study focuses on the application of TAM in order to explain the intention to use clinical information systems, in a random sample of 604 medical staff (534 physicians) working in 14 hospitals in Greece. We introduce physicians' specialty as a moderator in TAM and test medical staff's information and communication technology (ICT) knowledge and ICT feature demands, as external variables. The results show that TAM predicts a substantial proportion of the intention to use clinical information systems. Findings make a contribution to the literature by replicating, explaining and advancing the TAM, whereas theory is benefited by the addition of external variables and medical specialty as a moderator. Recommendations for further research are discussed. PMID:21292029

  13. Extending the Technology Acceptance Model to Explore the Intention to Use Second Life for Enhancing Healthcare Education

    ERIC Educational Resources Information Center

    Chow, Meyrick; Herold, David Kurt; Choo, Tat-Ming; Chan, Kitty

    2012-01-01

    Learners need to have good reasons to engage and accept e-learning. They need to understand that unless they do, the outcomes will be less favourable. The technology acceptance model (TAM) is the most widely recognized model addressing why users accept or reject technology. This study describes the development and evaluation of a virtual…

  14. Testing a developmental cascade model of emotional and social competence and early peer acceptance

    PubMed Central

    Blandon, Alysia Y.; Calkins, Susan D.; Grimm, Kevin J.; Keane, Susan P.; O’Brien, Marion

    2011-01-01

    A developmental cascade model of early emotional and social competence predicting later peer acceptance was examined in a community sample of 440 children across the ages of 2 to 7. Children’s externalizing behavior, emotion regulation, social skills within the classroom and peer acceptance were examined utilizing a multitrait-multimethod approach. A series of longitudinal cross-lag models that controlled for shared rater variance were fit using structural equation modeling. Results indicated there was considerable stability in children’s externalizing behavior problems and classroom social skills over time. Contrary to expectations, there were no reciprocal influences between externalizing behavior problems and emotion regulation, though higher levels of emotion regulation were associated with decreases in subsequent levels of externalizing behaviors. Finally, children’s early social skills also predicted later peer acceptance. Results underscore the complex associations among emotional and social functioning across early childhood. PMID:20883578

  15. Stochastic optimization model for order acceptance with multiple demand classes and uncertain demand/supply

    NASA Astrophysics Data System (ADS)

    Yang, Wen; Fung, Richard Y. K.

    2014-06-01

    This article considers an order acceptance problem in a make-to-stock manufacturing system with multiple demand classes in a finite time horizon. Demands in different periods are random variables and are independent of one another, and replenishments of inventory deviate from the scheduled quantities. The objective of this work is to maximize the expected net profit over the planning horizon by deciding the fraction of the demand that is going to be fulfilled. This article presents a stochastic order acceptance optimization model and analyses the existence of the optimal promising policies. An example of a discrete problem is used to illustrate the policies by applying the dynamic programming method. In order to solve the continuous problems, a heuristic algorithm based on stochastic approximation (HASA) is developed. Finally, the computational results of a case example illustrate the effectiveness and efficiency of the HASA approach, and make the application of the proposed model readily acceptable.
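
    As an illustration of the kind of finite-horizon order-acceptance problem described above, the sketch below solves a heavily simplified discrete version by backward induction: two demand classes with different unit profits, random demand and random replenishment each period, and a per-period decision on how much of each class's demand to accept. All prices, distributions, and costs are assumptions for illustration; the paper's continuous formulation and its HASA heuristic are not reproduced.

```python
# Backward-induction sketch of order acceptance with two demand classes over a
# finite horizon. All numbers below are illustrative assumptions.
import itertools

T = 4                                   # planning periods
MAX_INV = 10                            # inventory state space 0..MAX_INV
PRICE = {"A": 10.0, "B": 6.0}           # unit profit per class (assumed)
HOLD = 0.5                              # per-unit holding cost (assumed)
DEMAND = {"A": [(1, 0.5), (3, 0.5)],    # (quantity, probability) per class
          "B": [(2, 0.5), (4, 0.5)]}
SUPPLY = [(2, 0.5), (4, 0.5)]           # replenishment actually received (random)

def solve():
    V = [dict() for _ in range(T + 1)]
    V[T] = {s: 0.0 for s in range(MAX_INV + 1)}       # no salvage value
    for t in reversed(range(T)):
        for s in range(MAX_INV + 1):
            exp_val = 0.0
            for (dA, pA), (dB, pB), (q, pq) in itertools.product(
                    DEMAND["A"], DEMAND["B"], SUPPLY):
                prob = pA * pB * pq
                best = float("-inf")
                # decision: how many units of each class's demand to accept
                for aA in range(min(dA, s) + 1):
                    for aB in range(min(dB, s - aA) + 1):
                        left = min(s - aA - aB + q, MAX_INV)
                        val = (PRICE["A"] * aA + PRICE["B"] * aB
                               - HOLD * left + V[t + 1][left])
                        best = max(best, val)
                exp_val += prob * best
            V[t][s] = exp_val
    return V

V = solve()
print(round(V[0][5], 2))   # expected profit starting with 5 units on hand
```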

  16. The Impact of Trajectory Prediction Uncertainty on Air Traffic Controller Performance and Acceptability

    NASA Technical Reports Server (NTRS)

    Mercer, Joey S.; Bienert, Nancy; Gomez, Ashley; Hunt, Sarah; Kraut, Joshua; Martin, Lynne; Morey, Susan; Green, Steven M.; Prevot, Thomas; Wu, Minghong G.

    2013-01-01

    A Human-In-The-Loop air traffic control simulation investigated the impact of uncertainties in trajectory predictions on NextGen Trajectory-Based Operations concepts, seeking to understand when the automation would become unacceptable to controllers or when performance targets could no longer be met. Retired air traffic controllers staffed two en route transition sectors, delivering arrival traffic to the northwest corner-post of Atlanta approach control under time-based metering operations. Using trajectory-based decision-support tools, the participants worked the traffic under varying levels of wind forecast error and aircraft performance model error, impacting the ground automation's ability to make accurate predictions. Results suggest that the controllers were able to maintain high levels of performance, despite even the highest levels of trajectory prediction errors.

  17. Predictive performance models and multiple task performance

    NASA Technical Reports Server (NTRS)

    Wickens, Christopher D.; Larish, Inge; Contorer, Aaron

    1989-01-01

    Five models that predict how performance of multiple tasks will interact in complex task scenarios are discussed. The models are shown in terms of the assumptions they make about human operator divided attention. The different assumptions about attention are then empirically validated in a multitask helicopter flight simulation. It is concluded from this simulation that the most important assumption relates to the coding of demand level of different component tasks.

  18. A Model Performance

    ERIC Educational Resources Information Center

    Thornton, Bradley D.; Smalley, Robert A.

    2008-01-01

    Building information modeling (BIM) uses three-dimensional modeling concepts, information technology and interoperable software to design, construct and operate a facility. However, BIM can be more than a tool for virtual modeling--it can provide schools with a 3-D walkthrough of a project while it still is on the electronic drawing board. BIM can…

  19. 24 CFR 200.926c - Model code provisions for use in partially accepted code jurisdictions.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 24 Housing and Urban Development 2 2012-04-01 2012-04-01 false Model code provisions for use in partially accepted code jurisdictions. 200.926c Section 200.926c Housing and Urban Development Regulations Relating to Housing and Urban Development (Continued) OFFICE OF ASSISTANT SECRETARY FOR...

  20. Perceived Convenience in an Extended Technology Acceptance Model: Mobile Technology and English Learning for College Students

    ERIC Educational Resources Information Center

    Chang, Chi-Cheng; Yan, Chi-Fang; Tseng, Ju-Shih

    2012-01-01

    Since convenience is one of the features for mobile learning, does it affect attitude and intention of using mobile technology? The technology acceptance model (TAM), proposed by Davis (1989), was extended with perceived convenience in the present study. With regard to English language mobile learning, the variables in the extended TAM and its…

  1. Extended TAM Model: Impacts of Convenience on Acceptance and Use of Moodle

    ERIC Educational Resources Information Center

    Hsu, Hsiao-hui; Chang, Yu-ying

    2013-01-01

    The increasing online access to courses, programs, and information has shifted the control and responsibility of learning process from instructors to learners. Learners' perceptions of and attitudes toward e-learning constitute a critical factor to the success of such system. The purpose of this study is to take TAM (technology acceptance model)…

  2. Examining the Factors That Contribute to Successful Database Application Implementation Using the Technology Acceptance Model

    ERIC Educational Resources Information Center

    Nworji, Alexander O.

    2013-01-01

    Most organizations spend millions of dollars due to the impact of improperly implemented database application systems as evidenced by poor data quality problems. The purpose of this quantitative study was to use, and extend, the technology acceptance model (TAM) to assess the impact of information quality and technical quality factors on database…

  3. 24 CFR 200.926c - Model code provisions for use in partially accepted code jurisdictions.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 2 2010-04-01 2010-04-01 false Model code provisions for use in partially accepted code jurisdictions. 200.926c Section 200.926c Housing and Urban Development Regulations Relating to Housing and Urban Development (Continued) OFFICE OF ASSISTANT SECRETARY FOR HOUSING-FEDERAL HOUSING COMMISSIONER, DEPARTMENT...

  4. An Investigation of the Integrated Model of User Technology Acceptance: Internet User Samples in Four Countries

    ERIC Educational Resources Information Center

    Fusilier, Marcelline; Durlabhji, Subhash; Cucchi, Alain

    2008-01-01

    National background of users may influence the process of technology acceptance. The present study explored this issue with the new, integrated technology use model proposed by Sun and Zhang (2006). Data were collected from samples of college students in India, Mauritius, Reunion Island, and United States. Questionnaire methodology and…

  5. 24 CFR 200.926c - Model code provisions for use in partially accepted code jurisdictions.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 24 Housing and Urban Development 2 2011-04-01 2011-04-01 false Model code provisions for use in partially accepted code jurisdictions. 200.926c Section 200.926c Housing and Urban Development Regulations Relating to Housing and Urban Development (Continued) OFFICE OF ASSISTANT SECRETARY FOR HOUSING-FEDERAL HOUSING COMMISSIONER, DEPARTMENT...

  6. Understanding Student Teachers' Behavioural Intention to Use Technology: Technology Acceptance Model (TAM) Validation and Testing

    ERIC Educational Resources Information Center

    Wong, Kung-Teck; Osman, Rosma bt; Goh, Pauline Swee Choo; Rahmat, Mohd Khairezan

    2013-01-01

    This study sets out to validate and test the Technology Acceptance Model (TAM) in the context of Malaysian student teachers' integration of their technology in teaching and learning. To establish factorial validity, data collected from 302 respondents were tested against the TAM using confirmatory factor analysis (CFA), and structural equation…

  7. Shuttle passenger couch. [design and performance of engineering model

    NASA Technical Reports Server (NTRS)

    Rosener, A. A.; Stephenson, M. L.

    1974-01-01

    Conceptual design and fabrication of a full scale shuttle passenger couch engineering model are reported. The model was utilized to verify anthropometric dimensions, reach dimensions, ingress/egress, couch operation, storage space, restraint locations, and crew acceptability. These data were then incorporated into the design of the passenger couch verification model, which underwent performance tests.

  8. ABSL 18650HC Lot Acceptance Test- Ensuring Consistency And Flight Performance

    NASA Astrophysics Data System (ADS)

    Buckle, Rachel; Thwaite, Carl

    2011-10-01

    ABSL manufactures space batteries using commercial cells. As the battery sizing is based on test data from cells purchased up to 10 years ago, a rigorous programme of testing is carried out to ensure batch consistency and flight quality, and to identify any implications for sizing early. A Lot Acceptance Test (LAT) is carried out on each new batch of cells purchased. A selection of cells undergoes tests of build quality, environmental tolerance, safety devices, and lifetime degradation. This process will be described, along with issues such as setting acceptance criteria, comparison of results between batches, and what happens if a batch fails.

  9. Athletic Performance and Social Behavior as Predictors of Peer Acceptance in Children Diagnosed With Attention-Deficit/Hyperactivity Disorder

    ERIC Educational Resources Information Center

    Lopez-Williams, Andy; Chacko, Anil; Wymbs, Brian T.; Fabiano, Gregory A.; Seymour, Karen E.; Gnagy, Elizabeth M.; Chronis, Andrea M.; Burrows-MacLean, Lisa; Pelham, William E.; Morris, Tracy L.

    2005-01-01

    Sixty-three children between ages 6 and 12 who were enrolled in a summer treatment program for children with attention-deficit/hyperactivity disorder (ADHD) participated in a study designed to measure the relationship between social behaviors, athletic performance, and peer acceptance. Children were assessed on sport-specific skills of three major…

  10. Understanding Intention to Use Electronic Information Resources: A Theoretical Extension of the Technology Acceptance Model (TAM)

    PubMed Central

    Tao, Donghua

    2008-01-01

    This study extended the Technology Acceptance Model (TAM) by examining the roles of two aspects of e-resource characteristics, namely, information quality and system quality, in predicting public health students’ intention to use e-resources for completing research paper assignments. Both focus groups and a questionnaire were used to collect data. Descriptive analysis, data screening, and Structural Equation Modeling (SEM) techniques were used for data analysis. The study found that perceived usefulness played a major role in determining students’ intention to use e-resources. Perceived usefulness and perceived ease of use fully mediated the impact that information quality and system quality had on behavior intention. The research model enriches the existing technology acceptance literature by extending TAM. Representing two aspects of e-resource characteristics provides greater explanatory information for diagnosing problems of system design, development, and implementation. PMID:18999300

  11. Social trust, risk perceptions and public acceptance of recycled water: testing a social-psychological model.

    PubMed

    Ross, Victoria L; Fielding, Kelly S; Louis, Winnifred R

    2014-05-01

    Faced with a severe drought, the residents of the regional city of Toowoomba, in South East Queensland, Australia were asked to consider a potable wastewater reuse scheme to supplement drinking water supplies. As public risk perceptions and trust have been shown to be key factors in acceptance of potable reuse projects, this research developed and tested a social-psychological model of trust, risk perceptions and acceptance. Participants (N = 380) were surveyed a few weeks before a referendum was held in which residents voted against the controversial scheme. Analysis using structural equation modelling showed that the more community members perceived that the water authority used fair procedures (e.g., consulting with the community and providing accurate information), the greater their sense of shared identity with the water authority. Shared social identity in turn influenced trust via increased source credibility, that is, perceptions that the water authority is competent and has the community's interest at heart. The findings also support past research showing that higher levels of trust in the water authority were associated with lower perceptions of risk, which in turn were associated with higher levels of acceptance, and vice versa. The findings have a practical application for improving public acceptance of potable recycled water schemes. PMID:24603028

  12. The acceptance of in silico models for REACH: Requirements, barriers, and perspectives

    PubMed Central

    2011-01-01

    In silico models have prompted considerable interest and debate because of their potential value in predicting the properties of chemical substances for regulatory purposes. The European REACH legislation promotes innovation and encourages the use of alternative methods, but in practice the use of in silico models is still very limited. There are many stakeholders influencing the regulatory trajectory of quantitative structure-activity relationships (QSAR) models, including regulators, industry, model developers and consultants. Here we outline some of the issues and challenges involved in the acceptance of these methods for regulatory purposes. PMID:21982269

  13. Adding Innovation Diffusion Theory to the Technology Acceptance Model: Supporting Employees' Intentions to Use E-Learning Systems

    ERIC Educational Resources Information Center

    Lee, Yi-Hsuan; Hsieh, Yi-Chuan; Hsu, Chia-Ning

    2011-01-01

    This study intends to investigate factors affecting business employees' behavioral intentions to use the e-learning system. Combining the innovation diffusion theory (IDT) with the technology acceptance model (TAM), the present study proposes an extended technology acceptance model. The proposed model was tested with data collected from 552…

  14. The Effects of a Modified Treatment Package with and without Feeder Modeling on One Child's Acceptance of Novel Foods

    ERIC Educational Resources Information Center

    Seiverling, Laura; Harclerode, Whitney; Williams, Keith

    2014-01-01

    The purpose of this study was to examine if sequential presentation with feeder modeling would lead to an increase in bites accepted of new foods compared to sequential presentation without feeder modeling in a typically developing 4-year-old boy with food selectivity. The participant's acceptance of novel foods increased both in the modeling and…

  15. Integrating Health Belief Model and Technology Acceptance Model: An Investigation of Health-Related Internet Use

    PubMed Central

    2015-01-01

    Background Today, people use the Internet to satisfy health-related information and communication needs. In Malaysia, Internet use for health management has become increasingly significant due to the increase in the incidence of chronic diseases, in particular among urban women and their desire to stay healthy. Past studies adopted the Technology Acceptance Model (TAM) and Health Belief Model (HBM) independently to explain Internet use for health-related purposes. Although both the TAM and HBM have their own merits, independently they lack the ability to explain the cognition and the related mechanism in which individuals use the Internet for health purposes. Objective This study aimed to examine the influence of perceived health risk and health consciousness on health-related Internet use based on the HBM. Drawing on the TAM, it also tested the mediating effects of perceived usefulness of the Internet for health information and attitude toward Internet use for health purposes for the relationship between health-related factors, namely perceived health risk and health consciousness on health-related Internet use. Methods Data obtained for the current study were collected using purposive sampling; the sample consisted of women in Malaysia who had Internet access. The partial least squares structural equation modeling method was used to test the research hypotheses developed. Results Perceived health risk (β=.135, t(1999)=2.676) and health consciousness (β=.447, t(1999)=9.168) had a positive influence on health-related Internet use. Moreover, perceived usefulness of the Internet and attitude toward Internet use for health-related purposes partially mediated the influence of health consciousness on health-related Internet use (β=.025, t(1999)=3.234), whereas the effect of perceived health risk on health-related Internet use was fully mediated by perceived usefulness of the Internet and attitude (β=.029, t(1999)=3.609). These results suggest the central role of

  16. Evaluating Models of Human Performance: Safety-Critical Systems Applications

    NASA Technical Reports Server (NTRS)

    Feary, Michael S.

    2012-01-01

    This presentation is part of panel discussion on Evaluating Models of Human Performance. The purpose of this panel is to discuss the increasing use of models in the world today and specifically focus on how to describe and evaluate models of human performance. My presentation will focus on discussions of generating distributions of performance, and the evaluation of different strategies for humans performing tasks with mixed initiative (Human-Automation) systems. I will also discuss issues with how to provide Human Performance modeling data to support decisions on acceptability and tradeoffs in the design of safety critical systems. I will conclude with challenges for the future.

  17. An Empirical Assessment of a Technology Acceptance Model for Apps in Medical Education.

    PubMed

    Briz-Ponce, Laura; García-Peñalvo, Francisco José

    2015-11-01

    The evolution and growth of mobile applications ("apps") in our society is a reality. This general trend is still upward, and app use has also penetrated the medical education community. However, little is known about students' and professionals' points of view on introducing "apps" into the medical school curriculum. The aim of this research is to design, implement, and verify that the Technology Acceptance Model (TAM) can be employed to measure and explain the acceptance of mobile technology and "apps" within medical education. The methodology was based on a survey distributed to students and medical professionals from the University of Salamanca. The model explains 46.7% of the variance in behavioral intention to use mobile devices or "apps" for learning and will help to justify and understand the current situation of introducing "apps" into the medical school curriculum. PMID:26411928

  18. Results of an emergency response atmospheric dispersion model comparison using a state accepted statistical protocol

    SciTech Connect

    Ciolek, J.T. Jr.

    1993-10-01

    The Rocky Flats Plant, located approximately 26 km northwest of downtown Denver, Colorado, has developed an emergency response atmospheric dispersion model for complex terrain applications. Plant personnel would use the model, known as the Terrain-Responsive Atmospheric Code (TRAC) (Hodgin 1985), to project plume impacts and provide off-site protective action recommendations to the State of Colorado should a hazardous material release occur from the facility. The Colorado Department of Health (CDH) entered into an interagency agreement with the Rocky Flats Plant prime contractor, EG&G Rocky Flats, and the US Department of Energy to evaluate TRAC as an acceptable emergency response tool. After exhaustive research of similar evaluation processes from other emergency response and regulatory organizations, the interagency committee devised a formal acceptance process. The process contains an evaluation protocol (Hodgin and Smith 1992), descriptions of responsibilities, an identified experimental data set to use in the evaluation, and judgment criteria for model acceptance. The evaluation protocol is general enough to allow for different implementations. This paper explains one implementation, shows protocol results for a test case, and presents results of a comparison between versions of TRAC with different wind field codes: a two-dimensional mass-consistent code called WINDS (Fosberg et al. 1976) that has been extended to three dimensions, and a fully three-dimensional mass-conserving code called NUATMOS (Ross and Smith 1987; Ross et al. 1988).
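
    The specific CDH protocol and its judgment criteria are not detailed in the abstract, but dispersion-model evaluations of this kind commonly report statistics such as fractional bias, normalized mean square error, and the fraction of predictions within a factor of two of observations. The sketch below computes those standard metrics on hypothetical paired data; it is an illustration of typical acceptance metrics, not the Rocky Flats protocol itself.

```python
# Common dispersion-model evaluation statistics: fractional bias (FB),
# normalized mean square error (NMSE), and fraction within a factor of two
# (FAC2). The paired values below are hypothetical, not tracer data.
import numpy as np

def evaluation_stats(observed, predicted):
    o = np.asarray(observed, dtype=float)
    p = np.asarray(predicted, dtype=float)
    fb = 2.0 * (o.mean() - p.mean()) / (o.mean() + p.mean())   # fractional bias
    nmse = ((o - p) ** 2).mean() / (o.mean() * p.mean())       # norm. mean sq. error
    ratio = p / o
    fac2 = np.mean((ratio >= 0.5) & (ratio <= 2.0))            # fraction within 2x
    return {"FB": fb, "NMSE": nmse, "FAC2": fac2}

# Hypothetical paired observations vs. model predictions (ug/m3).
obs = [1.2, 0.8, 3.4, 0.5, 2.1]
pred = [1.0, 1.1, 2.5, 0.9, 2.6]
print(evaluation_stats(obs, pred))
```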

  19. Applying the Extended Technology Acceptance Model to the Use of Clickers in Student Learning: Some Evidence from Macroeconomics Classes

    ERIC Educational Resources Information Center

    Wu, Xiaoyu; Gao, Yuan

    2011-01-01

    This paper applies the extended technology acceptance model (exTAM) in information systems research to the use of clickers in student learning. The technology acceptance model (TAM) posits that perceived ease of use and perceived usefulness of technology influence users' attitudes toward using and intention to use technology. Research subsequent…

  20. The Acceptance Model of Intuitive Eating: A Comparison of Women in Emerging Adulthood, Early Adulthood, and Middle Adulthood

    ERIC Educational Resources Information Center

    Augustus-Horvath, Casey L.; Tylka, Tracy L.

    2011-01-01

    The acceptance model of intuitive eating (Avalos & Tylka, 2006) posits that body acceptance by others helps women appreciate their body and resist adopting an observer's perspective of their body, which contribute to their eating intuitively/adaptively. We extended this model by integrating body mass index (BMI) into its structure and…

  1. Theory development in nursing and healthcare informatics: a model explaining and predicting information and communication technology acceptance by healthcare consumers.

    PubMed

    An, Ji-Young; Hayman, Laura L; Panniers, Teresa; Carty, Barbara

    2007-01-01

    About 110 million American adults are looking for health information and services on the Internet. Identification of the factors influencing healthcare consumers' technology acceptance is requisite to understanding their acceptance and usage behavior of online health information and related services. The purpose of this article is to describe the development of the Information and Communication Technology Acceptance Model (ICTAM). From the literature reviewed, ICTAM was developed with emphasis on integrating multidisciplinary perspectives from divergent frameworks and empirical findings into a unified model with regard to healthcare consumers' acceptance and usage behavior of information and services on the Internet. PMID:17703115

  2. Multiprocessor performance modeling with ADAS

    NASA Technical Reports Server (NTRS)

    Hayes, Paul J.; Andrews, Asa M.

    1989-01-01

    A graph managing strategy referred to as the Algorithm to Architecture Mapping Model (ATAMM) appears useful for the time-optimized execution of application algorithm graphs in embedded multiprocessors and for the performance prediction of graph designs. This paper reports the modeling of ATAMM in the Architecture Design and Assessment System (ADAS) to make an independent verification of ATAMM's performance prediction capability and to provide a user framework for the evaluation of arbitrary algorithm graphs. Following an overview of ATAMM and its major functional rules are descriptions of the ADAS model of ATAMM, methods to enter an arbitrary graph into the model, and techniques to analyze the simulation results. The performance of a 7-node graph example is evaluated using the ADAS model and verifies the ATAMM concept by substantiating previously published performance results.
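
    ATAMM itself defines detailed operating rules, but one basic quantity in any algorithm-graph performance analysis is the dependency-limited latency, i.e., the longest time-weighted path through the graph. The sketch below computes that lower bound for a small hypothetical 7-node graph; the topology and node times are illustrative and are not the example graph evaluated in the paper.

```python
# Illustrative critical-path (longest-path) calculation for a small algorithm
# graph. Node compute times and topology are hypothetical. The critical path
# bounds single-iteration latency regardless of how many processors are used.
from functools import lru_cache

# graph: node -> list of successor nodes; times: node -> execution time
graph = {"A": ["B", "C"], "B": ["D"], "C": ["D", "E"],
         "D": ["F"], "E": ["F"], "F": ["G"], "G": []}
times = {"A": 2, "B": 3, "C": 1, "D": 4, "E": 2, "F": 3, "G": 1}

@lru_cache(maxsize=None)
def longest_from(node):
    """Longest time-weighted path starting at `node` (inclusive of the node)."""
    successors = graph[node]
    if not successors:
        return times[node]
    return times[node] + max(longest_from(s) for s in successors)

critical_path_latency = max(longest_from(n) for n in graph)
total_work = sum(times.values())
print("latency lower bound:", critical_path_latency)            # dependency limited
print("2-processor lower bound:", max(critical_path_latency, total_work / 2))
```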

  3. Measuring the Moderating Effect of Gender and Age on E-Learning Acceptance in England: A Structural Equation Modeling Approach for an Extended Technology Acceptance Model

    ERIC Educational Resources Information Center

    Tarhini, Ali; Hone, Kate; Liu, Xiaohui

    2014-01-01

    The success of an e-learning intervention depends to a considerable extent on student acceptance and use of the technology. Therefore, it has become imperative for practitioners and policymakers to understand the factors affecting the user acceptance of e-learning systems in order to enhance the students' learning experience. Based on an extended…

  4. TPF-C Performance Modeling

    NASA Technical Reports Server (NTRS)

    Shaklan, Stuart

    2008-01-01

    This slide presentation reviews the performance modeling of the Terrestrial Planet Finder Coronagraph (TPF-C). Included is a chart of the Error Budget Models, definitions of the static and dynamic terms, a chart showing the aberration sensitivity at 2 lambda/D, charts showing the thermal performance models and analysis, surface requirements, high-level requirements, and calculations for the beam walk model. Also included is a description of the control systems, and a flow for the iterative design and analysis cycle.

  5. Development of a prediction model on the acceptance of electronic laboratory notebooks in academic environments.

    PubMed

    Kloeckner, Frederik; Farkas, Robert; Franken, Tobias; Schmitz-Rode, Thomas

    2014-04-01

    Documentation of research data plays a key role in biomedical engineering innovation processes. It makes an important contribution to the protection of intellectual property, the traceability of results, and the fulfilment of regulatory requirements. Because of increasing digitalization in laboratories, an electronic alternative to the commonly used paper-bound notebooks could contribute to more sophisticated documentation. However, compared with industrial environments, the use of electronic laboratory notebooks is not widespread in academic laboratories, and little is known about the acceptance of an electronic documentation system and the underlying reasons for this. Thus, this paper aims to establish a prediction model of scientists' potential preference for, and acceptance of, either paper-based or electronic documentation. The underlying data for the analysis originate from an online survey of 101 scientists in industrial, academic, and clinical environments. Various parameters were analyzed using binary logistic regression to identify the crucial factors for system preference. The analysis showed a significant dependency between documentation system preference and the workload respondents expected the documentation system to impose (p<0.006; odds ratio=58.543), along with an additional personal component. Because system choice depends on specific parameters, it is possible to predict the acceptance of an electronic laboratory notebook before implementation. PMID:24225123
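
    The analysis described above is a binary logistic regression in which the odds ratio quantifies how strongly a predictor (here, the workload a respondent expects the system to impose) shifts the odds of preferring the electronic notebook. A minimal sketch with synthetic data is shown below; the variable names and resulting coefficients are illustrative and do not reproduce the study's data or its odds ratio of 58.5.

```python
# Minimal logistic-regression sketch: predicting preference for an electronic
# notebook (1) vs. paper (0) from expected documentation workload. Synthetic data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
# 1 = respondent expects the electronic system to reduce documentation workload
expects_less_workload = rng.integers(0, 2, size=n)
# synthetic preference: electronic more likely when expected workload is lower
p = 0.25 + 0.5 * expects_less_workload
prefers_electronic = rng.binomial(1, p)

X = sm.add_constant(expects_less_workload.astype(float))
fit = sm.Logit(prefers_electronic, X).fit(disp=False)

odds_ratios = np.exp(fit.params)        # exp(beta) gives the odds ratio
print("odds ratio for workload expectation:", round(float(odds_ratios[1]), 2))
print("p-value:", round(float(fit.pvalues[1]), 4))
```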

  6. Computational model of collective nest selection by ants with heterogeneous acceptance thresholds

    PubMed Central

    Masuda, Naoki; O'shea-Wheller, Thomas A.; Doran, Carolina; Franks, Nigel R.

    2015-01-01

    Collective decision-making is a characteristic of societies ranging from ants to humans. The ant Temnothorax albipennis is known to use quorum sensing to collectively decide on a new home; emigration to a new nest site occurs when the number of ants favouring the new site becomes quorate. There are several possible mechanisms by which ant colonies can select the best nest site among alternatives based on a quorum mechanism. In this study, we use computational models to examine the implications of heterogeneous acceptance thresholds across individual ants in collective nest choice behaviour. We take a minimalist approach to develop a differential equation model and a corresponding non-spatial agent-based model. We show, consistent with existing empirical evidence, that heterogeneity in acceptance thresholds is a viable mechanism for efficient nest choice behaviour. In particular, we show that the proposed models show speed–accuracy trade-offs and speed–cohesion trade-offs when we vary the number of scouts or the quorum threshold. PMID:26543578

  7. Computational model of collective nest selection by ants with heterogeneous acceptance thresholds.

    PubMed

    Masuda, Naoki; O'shea-Wheller, Thomas A; Doran, Carolina; Franks, Nigel R

    2015-06-01

    Collective decision-making is a characteristic of societies ranging from ants to humans. The ant Temnothorax albipennis is known to use quorum sensing to collectively decide on a new home; emigration to a new nest site occurs when the number of ants favouring the new site becomes quorate. There are several possible mechanisms by which ant colonies can select the best nest site among alternatives based on a quorum mechanism. In this study, we use computational models to examine the implications of heterogeneous acceptance thresholds across individual ants in collective nest choice behaviour. We take a minimalist approach to develop a differential equation model and a corresponding non-spatial agent-based model. We show, consistent with existing empirical evidence, that heterogeneity in acceptance thresholds is a viable mechanism for efficient nest choice behaviour. In particular, we show that the proposed models show speed-accuracy trade-offs and speed-cohesion trade-offs when we vary the number of scouts or the quorum threshold. PMID:26543578
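
    A minimal, non-spatial agent-based sketch of the mechanism described above is given below: each scout carries its own acceptance threshold, commits to any site whose quality clears that threshold, and emigration is triggered once the number of scouts favouring one site reaches a quorum. Site qualities, the threshold distribution, and the quorum size are assumptions for illustration, not the parameters of the published model.

```python
# Non-spatial agent-based sketch of quorum-based nest choice with heterogeneous
# acceptance thresholds. All parameter values are illustrative assumptions.
import random

random.seed(1)
SITE_QUALITY = {"site_A": 0.6, "site_B": 0.8}        # higher is better
N_SCOUTS = 50
QUORUM = 15                                          # scouts needed at one site

# heterogeneous acceptance thresholds, one per scout
thresholds = [random.uniform(0.4, 1.0) for _ in range(N_SCOUTS)]
commitment = [None] * N_SCOUTS                       # site each scout currently favours

steps, decision = 0, None
while decision is None and steps < 1000:
    steps += 1
    for i, threshold in enumerate(thresholds):
        site = random.choice(list(SITE_QUALITY))     # scout assesses a random site
        if SITE_QUALITY[site] >= threshold:          # accept only if quality clears
            commitment[i] = site                     #   this scout's own threshold
    counts = {s: commitment.count(s) for s in SITE_QUALITY}
    decision = next((s for s, n in counts.items() if n >= QUORUM), None)

print(f"quorum reached at {decision} after {steps} rounds; commitments = {counts}")
```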

  8. A Mindfulness-Acceptance-Commitment-Based Approach to Athletic Performance Enhancement: Theoretical Considerations

    ERIC Educational Resources Information Center

    Gardner, Frank L.; Moore, Zella E.

    2004-01-01

    While traditional cognitive-behavioral skills-training-based approaches to athletic performance enhancement posit that negative thoughts and emotions must be controlled, eliminated, or replaced for athlete-clients to perform optimally, recent evidence suggests that efforts to control, eliminate, or suppress these internal states may actually have…

  9. An Investigation of Employees' Use of E-Learning Systems: Applying the Technology Acceptance Model

    ERIC Educational Resources Information Center

    Lee, Yi-Hsuan; Hsieh, Yi-Chuan; Chen, Yen-Hsun

    2013-01-01

    The purpose of this study is to apply the technology acceptance model to examine the employees' attitudes and acceptance of electronic learning (e-learning) systems in organisations. This study examines four factors (organisational support, computer self-efficacy, prior experience and task equivocality) that are believed to influence…

  10. An Elaboration Likelihood Model Based Longitudinal Analysis of Attitude Change during the Process of IT Acceptance via Education Program

    ERIC Educational Resources Information Center

    Lee, Woong-Kyu

    2012-01-01

    The principal objective of this study was to gain insight into attitude changes occurring during IT acceptance from the perspective of elaboration likelihood model (ELM). In particular, the primary target of this study was the process of IT acceptance through an education program. Although the Internet and computers are now quite ubiquitous, and…

  11. An Exploration of Student Internet Use in India: The Technology Acceptance Model and the Theory of Planned Behaviour

    ERIC Educational Resources Information Center

    Fusilier, Marcelline; Durlabhji, Subhash

    2005-01-01

    Purpose: The purpose of this paper is to explore behavioral processes involved in internet technology acceptance and use with a sample in India, a developing country that can potentially benefit from greater participation in the web economy. Design/methodology/approach - User experience was incorporated into the technology acceptance model (TAM)…

  12. Nickel cadmium battery performance modelling

    NASA Technical Reports Server (NTRS)

    Clark, K.; Halpert, G.; Timmerman, P.

    1989-01-01

    The development of a model to predict cell/battery behavior from databases of temperature-dependent performance characteristics is described. The model accommodates batteries of various structural as well as thermal designs. Cell internal design modifications can be accommodated as long as the databases reflect the cell's performance characteristics. Operational parameters can be varied to simulate any number of charge or discharge methods under any orbital regime. The flexibility of the model stems from the broad scope of input variables and allows the prediction of battery performance under simulated mission or test conditions.

  13. Testing the Electronic Personal Health Record Acceptance Model by Nurses for Managing Their Own Health

    PubMed Central

    Trinkoff, A.M.; Storr, C.L.; Wilson, M.L.; Gurses, A.P.

    2015-01-01

    Summary Background To our knowledge, no evidence is available on health care professionals’ use of electronic personal health records (ePHRs) for their health management. We therefore focused on nurses’ personal use of ePHRs using a modified technology acceptance model. Objectives To examine (1) the psychometric properties of the ePHR acceptance model, (2) the associations of perceived usefulness, ease of use, data privacy and security protection, and perception of self as health-promoting role models to nurses’ own ePHR use, and (3) the moderating influences of age, chronic illness and medication use, and providers’ use of electronic health record (EHRs) on the associations between the ePHR acceptance constructs and ePHR use. Methods A convenience sample of registered nurses, those working in one of 12 hospitals in the Maryland and Washington, DC areas and members of the nursing informatics community (AMIA and HIMSS), were invited to respond to an anonymous online survey; 847 responded. Multiple logistic regression identified associations between the model constructs and ePHR use, and the moderating effect. Results Overall, ePHRs were used by 47%. Sufficient reliability for all scales was found. Three constructs were significantly related to nurses’ own ePHR use after adjusting for covariates: usefulness, data privacy and security protection, and health-promoting role model. Nurses with providers that used EHRs who perceived a higher level of data privacy and security protection had greater odds of ePHR use than those whose providers did not use EHRs. Older nurses with a higher self-perception as health-promoting role models had greater odds of ePHR use than younger nurses. Conclusions Nurses who use ePHRs for their personal health might promote adoption by the general public by serving as health-promoting role models. They can contribute to improvements in patient education and ePHR design, and serve as crucial resources when working with their

  14. Air Conditioner Compressor Performance Model

    SciTech Connect

    Lu, Ning; Xie, YuLong; Huang, Zhenyu

    2008-09-05

    During the past three years, the Western Electricity Coordinating Council (WECC) Load Modeling Task Force (LMTF) has led the effort to develop the new modeling approach. As part of this effort, the Bonneville Power Administration (BPA), Southern California Edison (SCE), and Electric Power Research Institute (EPRI) Solutions tested 27 residential air-conditioning units to assess their response to delayed voltage recovery transients. After completing these tests, different modeling approaches were proposed, among them a performance modeling approach that proved to be one of the three favored for its simplicity and ability to recreate different SVR events satisfactorily. Funded by the California Energy Commission (CEC) under its load modeling project, researchers at Pacific Northwest National Laboratory (PNNL) led the follow-on task to analyze the motor testing data to derive the parameters needed to develop a performance model for the single-phase air-conditioning (SPAC) unit. To derive the performance model, PNNL researchers first used the motor voltage and frequency ramping test data to obtain the real (P) and reactive (Q) power versus voltage (V) and frequency (f) curves. Then, curve fitting was used to develop the P-V, Q-V, P-f, and Q-f relationships for motor running and stalling states. The resulting performance model ignores the dynamic response of the air-conditioning motor. Because the inertia of the air-conditioning motor is very small (H<0.05), the motor moves from one steady state to another in a few cycles. The performance model is therefore a fair representation of the motor's behavior in both running and stalling states.
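
    The curve-fitting step described above can be illustrated with a short sketch: fit a quadratic P(V) relationship to voltage-ramp data for the running state (the Q-V, P-f, and Q-f fits and the stalled state would be handled the same way). The data points below are synthetic placeholders, not the WECC/BPA/SCE test measurements.

```python
# Minimal curve-fitting sketch for a running-state P(V) relationship.
# The voltage/power samples are synthetic, not measured test data.
import numpy as np

# per-unit voltage and measured real power (kW) during a slow voltage ramp
v = np.array([1.10, 1.05, 1.00, 0.95, 0.90, 0.85, 0.80])
p = np.array([3.55, 3.48, 3.40, 3.34, 3.29, 3.26, 3.24])

coeffs = np.polyfit(v, p, deg=2)        # P(V) ~ a*V^2 + b*V + c
p_model = np.poly1d(coeffs)

print("fitted coefficients (a, b, c):", np.round(coeffs, 3))
print("predicted P at 0.92 pu:", round(p_model(0.92), 3), "kW")
```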

  15. Next generation imager performance model

    NASA Astrophysics Data System (ADS)

    Teaney, Brian; Reynolds, Joseph

    2010-04-01

    The next generation of Army imager performance models is currently under development at NVESD. The aim of this new model is to provide a flexible and extensible engineering tool for system design which encapsulates all of the capabilities of the existing Night Vision model suite (NVThermIP, SSCamIP, etc) along with many new design tools and features including a more intuitive interface, the ability to perform trade studies, and a library of standard and user generated components. By combining the previous model architectures in one interface the new design is better suited to capture emerging technologies such as fusion and new sensor modalities. In this paper we will describe the general structure of the model and some of its current capabilities along with future development plans.

  16. Development of a Health Information Technology Acceptance Model Using Consumers’ Health Behavior Intention

    PubMed Central

    2012-01-01

    Background For effective health promotion using health information technology (HIT), it is mandatory that health consumers have the behavioral intention to measure, store, and manage their own health data. Understanding health consumers’ intention and behavior is needed to develop and implement effective and efficient strategies. Objective To develop and verify the extended Technology Acceptance Model (TAM) in health care by describing health consumers’ behavioral intention of using HIT. Methods This study used a cross-sectional descriptive correlational design. We extended TAM by adding more antecedents and mediating variables to enhance the model’s explanatory power and to make it more applicable to health consumers’ behavioral intention. Additional antecedents and mediating variables were added to the hypothetical model, based on their theoretical relevance, from the Health Belief Model and theory of planned behavior, along with the TAM. We undertook structural equation analysis to examine the specific nature of the relationship involved in understanding consumers’ use of HIT. Study participants were 728 members recruited from three Internet health portals in Korea. Data were collected by a Web-based survey using a structured self-administered questionnaire. Results The overall fitness indices for the model developed in this study indicated an acceptable fit of the model. All path coefficients were statistically significant. This study showed that perceived threat, perceived usefulness, and perceived ease of use significantly affected health consumers’ attitude and behavioral intention. Health consumers’ health status, health belief and concerns, subjective norm, HIT characteristics, and HIT self-efficacy had a strong indirect impact on attitude and behavioral intention through the mediators of perceived threat, perceived usefulness, and perceived ease of use. Conclusions An extended TAM in the HIT arena was found to be valid to describe health

  17. INFLUENCE OF FORAGE SPECIES ON PASTURE PERFORMANCE, CARCASS QUALITY, AND CONSUMER ACCEPTABILITY

    Technology Transfer Automated Retrieval System (TEKTRAN)

    British-type steers of predominantly Angus breeding were used to determine the influence of forage species fed during the final 30 to 45 days of finishing on performance, carcass characteristics, and meat quality. Finishing treatments included: 1) Mixed cool season pasture [bluegrass, orchardgrass,...

  18. The Development of Accepted Performance Items to Demonstrate Competence in Literary Braille

    ERIC Educational Resources Information Center

    Lewis, Sandra; D'Andrea, Frances Mary; Rosenblum, L. Penny

    2012-01-01

    Introduction: This research attempted to establish the content validity of several performance statements that are associated with basic knowledge, production, and reading of braille by beginning teachers. Methods: University instructors (n = 21) and new teachers of students with visual impairments (n = 20) who had taught at least 2 braille…

  19. Post-Graduate Performance, an Academic Comparison Evaluating Situating Learning and Law School Acceptance Scores

    ERIC Educational Resources Information Center

    Traverse, Maria A.

    2012-01-01

    Research on post-graduate performance, pertaining to law school graduates, indicates that success in the legal profession is attributable to more than the theoretical content or cognitive knowledge obtained through educational curricula. Research suggests that the combination of creative and analytic thinking skills contributes to a higher rate of…

  20. MPD Thruster Performance Analytic Models

    NASA Astrophysics Data System (ADS)

    Gilland, James; Johnston, Geoffrey

    2003-01-01

    Magnetoplasmadynamic (MPD) thrusters are capable of accelerating quasi-neutral plasmas to high exhaust velocities using Megawatts (MW) of electric power. These characteristics make such devices worthy of consideration for demanding, far-term missions such as the human exploration of Mars or beyond. Assessment of MPD thrusters at the system and mission level is often difficult due to their status as ongoing experimental research topics rather than developed thrusters. However, in order to assess MPD thrusters' utility in later missions, some adequate characterization of performance, or more exactly, projected performance, and system level definition are required for use in analyses. The most recent physical models of self-field MPD thrusters have been examined, assessed, and reconfigured for use by systems and mission analysts. The physical models allow for rational projections of thruster performance based on physical parameters that can be measured in the laboratory. The models and their implications for the design of future MPD thrusters are presented.
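
    One widely cited analytic relation for self-field MPD thrust is the Maecker formula, T = (mu0 / 4*pi) * J^2 * (ln(ra/rc) + 3/4). The sketch below applies it at an illustrative operating point to obtain thrust, exhaust velocity, and specific impulse; the geometry and operating values are assumptions, and this is not the specific reconfigured model developed in the paper.

```python
# Self-field MPD thrust via the Maecker formula, with assumed geometry and
# operating point. Illustrative only; not the paper's reconfigured model.
import math

MU0 = 4e-7 * math.pi                 # vacuum permeability [H/m]

def maecker_thrust(current_A, r_anode_m, r_cathode_m):
    """Self-field electromagnetic thrust [N] for a coaxial MPD thruster."""
    return (MU0 / (4 * math.pi)) * current_A**2 * (
        math.log(r_anode_m / r_cathode_m) + 0.75)

# illustrative operating point (assumed values)
J = 20e3                             # discharge current [A]
mdot = 4e-3                          # propellant mass flow rate [kg/s]
T = maecker_thrust(J, r_anode_m=0.05, r_cathode_m=0.01)
u_e = T / mdot                       # effective exhaust velocity [m/s]
Isp = u_e / 9.81                     # specific impulse [s]
print(f"thrust = {T:.1f} N, exhaust velocity = {u_e:.0f} m/s, Isp = {Isp:.0f} s")
```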

  1. MPD Thruster Performance Analytic Models

    NASA Technical Reports Server (NTRS)

    Gilland, James; Johnston, Geoffrey

    2007-01-01

    Magnetoplasmadynamic (MPD) thrusters are capable of accelerating quasi-neutral plasmas to high exhaust velocities using Megawatts (MW) of electric power. These characteristics make such devices worthy of consideration for demanding, far-term missions such as the human exploration of Mars or beyond. Assessment of MPD thrusters at the system and mission level is often difficult due to their status as ongoing experimental research topics rather than developed thrusters. However, in order to assess MPD thrusters' utility in later missions, some adequate characterization of performance, or more exactly, projected performance, and system level definition are required for use in analyses. The most recent physical models of self-field MPD thrusters have been examined, assessed, and reconfigured for use by systems and mission analysts. The physical models allow for rational projections of thruster performance based on physical parameters that can be measured in the laboratory. The models and their implications for the design of future MPD thrusters are presented.

  2. MPD Thruster Performance Analytic Models

    NASA Technical Reports Server (NTRS)

    Gilland, James; Johnston, Geoffrey

    2003-01-01

    Magnetoplasmadynamic (MPD) thrusters are capable of accelerating quasi-neutral plasmas to high exhaust velocities using Megawatts (MW) of electric power. These characteristics make such devices worthy of consideration for demanding, far-term missions such as the human exploration of Mars or beyond. Assessment of MPD thrusters at the system and mission level is often difficult due to their status as ongoing experimental research topics rather than developed thrusters. However, in order to assess MPD thrusters' utility in later missions, some adequate characterization of performance, or more exactly, projected performance, and system level definition are required for use in analyses. The most recent physical models of self-field MPD thrusters have been examined, assessed, and reconfigured for use by systems and mission analysts. The physical models allow for rational projections of thruster performance based on physical parameters that can be measured in the laboratory. The models and their implications for the design of future MPD thrusters are presented.

  3. Modelling dose distribution in tubing and cable using CYLTRAN and ACCEPT Monte Carlo simulation code

    SciTech Connect

    Weiss, D.E.; Kensek, R.P.

    1993-12-31

    One of the difficulties in irradiating non-slab geometries, such as a tube, is the uneven penetration of the electrons. A simple model of the dose distribution in a tube or cable, in relation to voltage, composition, wall thickness, and diameter, can be mapped using the cylinder geometry provided in the ITS/CYLTRAN code, complete with automatic subzoning. More complex 3D geometry, including the effects of the window foil, backscattering fixtures, and beam scanning angles, can be accounted for more completely by using the ITS/ACCEPT code with a line-source update and a system of intersecting wedges to define input zones for mapping dose distributions in a tube. Thus, all of the variables that affect dose distribution can be modelled without the need to run time-consuming and costly factory experiments. The effects of composition changes on dose distribution can also be anticipated.
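
    As a toy illustration of why the dose is uneven in a tube (not a substitute for the ITS/CYLTRAN or ACCEPT calculations, whose input decks are not shown here), the sketch below computes the wall path length traversed by parallel rays at different chord offsets; the geometry alone makes some parts of the wall see several times the traversal of others.

        import numpy as np

        def wall_path_length(y, outer_r, inner_r):
            """Chord length through the wall of a tube for a parallel ray at offset y."""
            outer = 2.0 * np.sqrt(np.clip(outer_r**2 - y**2, 0.0, None))
            inner = 2.0 * np.sqrt(np.clip(inner_r**2 - y**2, 0.0, None))
            return outer - inner

        outer_r, wall = 5.0, 1.0          # mm, illustrative tube dimensions
        inner_r = outer_r - wall
        offsets = np.linspace(0.0, outer_r, 11)
        for y, L in zip(offsets, wall_path_length(offsets, outer_r, inner_r)):
            print(f"offset {y:4.1f} mm -> wall path {L:5.2f} mm")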

  4. Development of a Performance and Processing Property Acceptance Region for Cementitious Low-Level Waste Forms at Savannah River Site - 13174

    SciTech Connect

    Staub, Aaron V.; Reigel, Marissa M.

    2013-07-01

    The Saltstone Production and Disposal Facilities (SPF and SDF) at the Savannah River Site (SRS) have been treating decontaminated salt solution, a low-level aqueous waste stream (LLW), since facility commissioning in 1990. In 2012, the Saltstone Facilities implemented a new Performance Assessment (PA) that incorporates an alternate design for the disposal facility to ensure that the performance objectives of DOE Order 435.1 and the National Defense Authorization Act (NDAA) of Fiscal Year 2005 Section 3116 are met. The PA performs long-term modeling of the waste form, disposal facility, and disposal site hydrogeology to determine the transport history of radionuclides disposed in the LLW. Saltstone has been successfully used to dispose of LLW in a grout waste form for 15 years. Numerous waste form property assumptions directly impact the fate and transport modeling performed in the PA. The extent of process variability and its consequences for performance properties are critical to meeting the assumptions of the PA. The SPF has ensured performance property acceptability by implementing control strategies that ensure the process operates within the analyzed limits of variability, but efforts continue to improve the understanding of facility performance in relation to the PA analysis. A similar understanding of the impact of variability on processing parameters is important from the standpoint of the operability of the production facility. The fresh grout slurry properties (particularly slurry rheology and the rate of hydration and structure formation) of the waste form directly impact the pressure and flow rates that can be reliably processed. It is thus equally important to quantify the impact of variability on processing parameters to ensure that the design basis assumptions for the production facility are maintained. Savannah River Remediation (SRR) has been pursuing a process that will ultimately establish a property acceptance region (PAR) to incorporate
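
    A property acceptance region of the kind described reduces, at its simplest, to checking measured fresh-grout properties against bounded ranges. The sketch below is a minimal illustration with hypothetical property names and limits; the actual Saltstone properties and PAR bounds are not given in the abstract.

        # Illustrative property names and limits only; the actual Saltstone PAR
        # limits are not given in the abstract.
        acceptance_region = {
            "yield_stress_pa":      (0.0, 120.0),
            "plastic_viscosity_cp": (20.0, 90.0),
            "gel_time_min":         (30.0, 240.0),
        }

        def within_par(measured, region):
            """Return (ok, violations) for a dict of measured properties."""
            violations = {k: v for k, v in measured.items()
                          if not (region[k][0] <= v <= region[k][1])}
            return (not violations), violations

        batch = {"yield_stress_pa": 95.0, "plastic_viscosity_cp": 110.0, "gel_time_min": 60.0}
        ok, bad = within_par(batch, acceptance_region)
        print("acceptable" if ok else f"outside PAR: {bad}")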

  5. Comparing Cognitive, Metacognitive, and Acceptance and Commitment Therapy Models of Depression: a Longitudinal Study Survey.

    PubMed

    Ruiz, Francisco J; Odriozola-González, Paula

    2015-01-01

    This study analyzed the interrelationships between key constructs of cognitive therapy (CT; depressogenic schemas), metacognitive therapy (MCT; dysfunctional metacognitive beliefs), and acceptance and commitment therapy (ACT; psychological inflexibility) in the prediction of depressive symptoms. With a lapse of nine months, 106 nonclinical participants responded twice to an anonymous online survey containing the following questionnaires: the Depression subscale of the Depression Anxiety and Stress Scales (DASS), the Dysfunctional Attitude Scale Revised (DAS-R), the Positive beliefs, Negative beliefs and Need to control subscales of the Metacognitions Questionnaire-30 (MCQ-30), and the Acceptance and Action Questionnaire - II (AAQ-II). Results showed that when controlling for baseline levels of depressive symptoms and demographic variables, psychological inflexibility longitudinally mediated the effect of depressogenic schemas (path ab = .023, SE = .010; 95% BC CI [.008, .048]) and dysfunctional metacognitive beliefs on depressive symptoms (positive metacognitive beliefs: path ab = .052, SE = .031; 95% BC CI [.005, .134]; negative metacognitive beliefs: path ab = .087, SE = .049; 95% BC CI [.016, .214]; need to control: path ab = .087, SE = .051; 95% BC CI [.013, .220]). Results are discussed emphasizing the role of psychological inflexibility in the CT and MCT models of depression. PMID:26076977
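
    The reported indirect effects (path ab with bias-corrected bootstrap confidence intervals) can be illustrated with a percentile-bootstrap sketch like the one below, which regresses the mediator on the predictor and the outcome on both while controlling for baseline symptoms. The data are simulated stand-ins, and the interval is a plain percentile CI rather than the bias-corrected version used in the study.

        import numpy as np

        rng = np.random.default_rng(0)

        def ols_coef(y, X):
            """Least-squares coefficients with an intercept prepended."""
            X1 = np.column_stack([np.ones(len(y)), X])
            return np.linalg.lstsq(X1, y, rcond=None)[0]

        def indirect_effect(x, m, y, covars):
            a = ols_coef(m, np.column_stack([x, covars]))[1]      # x -> m
            b = ols_coef(y, np.column_stack([m, x, covars]))[1]   # m -> y, controlling for x
            return a * b

        def bootstrap_ci(x, m, y, covars, n_boot=5000, alpha=0.05):
            n = len(y)
            stats = np.empty(n_boot)
            for i in range(n_boot):
                idx = rng.integers(0, n, n)
                stats[i] = indirect_effect(x[idx], m[idx], y[idx], covars[idx])
            return np.quantile(stats, [alpha / 2, 1 - alpha / 2])

        # Simulated stand-in data (the study's survey data are not available here).
        n = 106
        baseline = rng.normal(size=(n, 1))                  # T1 depressive symptoms
        schemas = rng.normal(size=n)                        # DAS-R stand-in
        inflexibility = 0.4 * schemas + rng.normal(size=n)  # AAQ-II stand-in
        symptoms = 0.3 * inflexibility + 0.5 * baseline[:, 0] + rng.normal(size=n)
        print("95% CI for ab:", bootstrap_ci(schemas, inflexibility, symptoms, baseline))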

  6. Consumer acceptance and stability of spray dried betanin in model juices.

    PubMed

    Kaimainen, Mika; Laaksonen, Oskar; Järvenpää, Eila; Sandell, Mari; Huopalahti, Rainer

    2015-11-15

    Spray dried beetroot powder was used to colour model juices, and the consumer acceptance of the juices and stability of the colour during storage at 60 °C, 20 °C, 4 °C, and -20 °C were studied. The majority of the consumers preferred the model juices coloured with anthocyanins or beetroot extract over model juices coloured with spray dried beetroot powder. The consumers preferred more intensely coloured samples over lighter samples. Spray dried betanin samples were described as 'unnatural' and 'artificial', whereas the colour of beetroot extract was described as more 'natural' and as 'real juice'. No beetroot-derived off-odours or off-flavours were perceived in the model juices coloured with beetroot powder. Colour stability in model juices was greatly dependent on storage temperature, with better stability at lower temperatures. Colour stability in the spray dried powder was very good at 20 °C. Betacyanins from beetroot could be a potential colourant for food products that are stored cold. PMID:25977043

  7. Environmental acceptability of high-performance alternatives for depleted uranium penetrators

    SciTech Connect

    Kerley, C.R.; Easterly, C.E.; Eckerman, K.F.

    1996-08-01

    The Army's environmental strategy for investigating material substitution and management is to measure system environmental gains/losses in all phases of the material management life cycle from cradle to grave. This study is the first in a series of new investigations, applying material life cycle concepts, to evaluate whether there are environmental benefits from increasing the use of tungsten as an alternative to depleted uranium (DU) in Kinetic Energy Penetrators (KEPs). Current military armor penetrators use DU and tungsten as base materials. Although DU alloys have provided the highest performance of any high-density alloy deployed against enemy heavy armor, their low-level radioactivity poses a number of environmental risks. These risks include exposures to the military and civilian population from inhalation, ingestion, and injection of particles. Depleted uranium is well known to be chemically toxic (kidney toxicity), and workplace exposure levels are based on its renal toxicity. Waste materials containing DU fragments are classified as low-level radioactive waste and are regulated by the Nuclear Regulatory Commission. These characteristics of DU do not preclude its use in KEPs. However, long-term management challenges associated with KEP deployment and improved public perceptions about environmental risks from military activities might be well served by a serious effort to identify, develop, and substitute alternative materials that meet performance objectives and involve fewer environmental risks. Tungsten, a leading candidate base material for KEPs, is potentially such a material because it is not radioactive. Tungsten is less well studied, however, with respect to health impacts and other environmental risks. The present study is designed to contribute to the understanding of the environmental behavior of tungsten by synthesizing available information that is relevant to its potential use as a penetrator.

  8. Next Generation Balloon Performance Model

    NASA Astrophysics Data System (ADS)

    Pankine, A.; Nock, K.; Heun, M.; Schlaifer, S.

    Global Aerospace Corporation is developing a new trajectory and performance modeling tool for Earth and Planetary Balloons, called Navajo. This tool will advance the state of the art for balloon performance models and assist NASA and commercial balloon designers, campaign and mission planners, and flight operations staff by providing high-accuracy vertical and horizontal trajectory predictions. Nothing like Navajo currently exists. The Navajo design integrates environment, balloon (or Lighter Than Air - LTA), gondola (for ballast and communications), and trajectory control system submodels to provide rapid and exhaustive evaluation of vertical and horizontal balloon and LTA vehicle trajectories. The concept utilizes an extensible computer application architecture to permit definition of additional flight system components and environments. The Navajo architecture decouples the balloon performance and environment models so that users can swap balloon and environment models easily and assess the capabilities of new balloon technologies in a variety of environments. The Navajo design provides integrated capabilities for safety analysis for Earth balloon trajectories, and utilizes improved thermal models. We report on our progress towards the development of Navajo.
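
    For flavor, the sketch below shows the kind of vertical-trajectory submodel such a tool integrates: a fixed-volume balloon driven by buoyancy, weight, and drag in a simple exponential atmosphere. It is a toy under stated assumptions (isothermal atmosphere, constant volume, spherical drag area, illustrative masses), not the Navajo balloon or environment models.

        import math

        def air_density(h):
            """Isothermal exponential atmosphere (rough approximation), kg/m^3."""
            return 1.225 * math.exp(-h / 8500.0)

        def ascend(volume, system_mass, cd=0.5, dt=1.0, t_end=6000.0):
            """Semi-implicit Euler integration of a fixed-volume balloon's vertical motion."""
            g = 9.81
            area = math.pi * (3.0 * volume / (4.0 * math.pi)) ** (2.0 / 3.0)  # equivalent-sphere frontal area
            helium_mass = 0.1785 * volume   # helium at sea-level conditions, illustrative
            mass = system_mass + helium_mass
            h, v = 0.0, 0.0
            for _ in range(int(t_end / dt)):
                rho = air_density(h)
                buoyancy = rho * volume * g
                drag = 0.5 * rho * cd * area * v * abs(v)
                a = (buoyancy - mass * g - drag) / mass
                v += a * dt
                h += v * dt
            return h, v

        h, v = ascend(volume=1000.0, system_mass=900.0)
        print(f"altitude after 100 min: {h:,.0f} m, vertical rate {v:.2f} m/s")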

  9. Evaluation of an Intelligent Tutoring System in Pathology: Effects of External Representation on Performance Gains, Metacognition, and Acceptance

    PubMed Central

    Crowley, Rebecca S.; Legowski, Elizabeth; Medvedeva, Olga; Tseytlin, Eugene; Roh, Ellen; Jukic, Drazen

    2007-01-01

    Objective: Determine effects of computer-based tutoring on diagnostic performance gains, meta-cognition, and acceptance using two different problem representations. Describe impact of tutoring on spectrum of diagnostic skills required for task performance. Identify key features of student-tutor interaction contributing to learning gains. Design: Prospective, between-subjects study, controlled for participant level of training. Resident physicians in two academic pathology programs spent four hours using one of two interfaces, which differed mainly in external problem representation. The case-focused representation provided an open-learning environment in which students were free to explore evidence-hypothesis relationships within a case, but could not visualize the entire diagnostic space. The knowledge-focused representation provided an interactive representation of the entire diagnostic space, which more tightly constrained student actions. Measurements: Metrics included results of pretest, post-test and retention-test for multiple choice and case diagnosis tests, ratios of performance to student reported certainty, results of participant survey, learning curves, and interaction behaviors during tutoring. Results: Students had highly significant learning gains after one tutoring session. Learning was retained at one week. There were no differences between the two interfaces in learning gains on post-test or retention test. Only students in the knowledge-focused interface exhibited significant metacognitive gains from pretest to post-test and pretest to retention test. Students rated the knowledge-focused interface significantly higher than the case-focused interface. Conclusions: Cognitive tutoring is associated with improved diagnostic performance in a complex medical domain. The effect is retained at one-week post-training. Knowledge-focused external problem representation shows an advantage over case-focused representation for metacognitive effects and user

  10. Where there's smoke: Cigarette use, social acceptability, and spatial approaches to multilevel modeling.

    PubMed

    O'Connell, Heather A

    2015-09-01

    I contribute to understandings of how context is related to individual outcomes by assessing the added value of combining multilevel and spatial modeling techniques. This methodological approach leads to substantive contributions to the smoking literature, including improved clarity on the central contextual factors and the examination of one manifestation of the social acceptability hypothesis. For this analysis I use restricted-use natality data from the Vital Statistics, and county-level data from the 2005-9 ACS. Critically, the results suggest that spatial considerations are still relevant in a multilevel framework. In addition, I argue that spatial processes help explain the relationships linking racial/ethnic minority concentration to lower overall odds of smoking. PMID:26188587

  11. WHERE THERE’S SMOKE: CIGARETTE USE, SOCIAL ACCEPTABILITY, AND SPATIAL APPROACHES TO MULTILEVEL MODELING

    PubMed Central

    O’Connell, Heather A.

    2015-01-01

    I contribute to understandings of how context is related to individual outcomes by assessing the added value of combining multilevel and spatial modeling techniques. This methodological approach leads to substantive contributions to the smoking literature, including improved clarity on the central contextual factors and the examination of one manifestation of the social acceptability hypothesis. For this analysis I use restricted-use natality data from the Vital Statistics, and county-level data from the 2005–9 ACS. Critically, the results suggest that spatial considerations are still relevant in a multilevel framework. In addition, I argue that spatial processes help explain the relationships linking racial/ethnic minority concentration to lower overall odds of smoking. PMID:26188587

  12. Flight Crew Workload, Acceptability, and Performance When Using Data Comm in a High-Density Terminal Area Simulation

    NASA Technical Reports Server (NTRS)

    Norman, R. Michael; Baxley, Brian T.; Adams, Cathy A.; Ellis, Kyle K. E.; Latorella, Kara A.; Comstock, James R., Jr.

    2013-01-01

    This document describes a collaborative FAA/NASA experiment using 22 commercial airline pilots to determine the effect of using Data Comm to issue messages during busy, terminal area operations. Four conditions were defined that span current day to future flight deck equipage: Voice communication only, Data Comm only, Data Comm with Moving Map Display, and Data Comm with Moving Map displaying taxi route. Each condition was used in an arrival and a departure scenario at Boston Logan Airport. Of particular interest was the flight crew response to D-TAXI, the use of Data Comm by Air Traffic Control (ATC) to send taxi instructions. Quantitative data was collected on subject reaction time, flight technical error, operational errors, and eye tracking information. Questionnaires collected subjective feedback on workload, situation awareness, and acceptability to the flight crew for using Data Comm in a busy terminal area. Results showed that 95% of the Data Comm messages were responded to by the flight crew within one minute and 97% of the messages within two minutes. However, post experiment debrief comments revealed almost unanimous consensus that two minutes was a reasonable expectation for crew response. Flight crews reported that Expected D-TAXI messages were useful, and employment of these messages acceptable at all altitude bands evaluated during arrival scenarios. Results also indicate that the use of Data Comm for all evaluated message types in the terminal area was acceptable during surface operations, and during arrivals at any altitude above the Final Approach Fix, in terms of response time, workload, situation awareness, and flight technical performance. The flight crew reported the use of Data Comm as implemented in this experiment as unacceptable in two instances: in clearances to cross an active runway, and D-TAXI messages between the Final Approach Fix and 80 knots during landing roll. Critical cockpit tasks and the urgency of out-the window scan made the

  13. Data management system performance modeling

    NASA Technical Reports Server (NTRS)

    Kiser, Larry M.

    1993-01-01

    This paper discusses analytical techniques that have been used to gain a better understanding of the Space Station Freedom's (SSF's) Data Management System (DMS). The DMS is a complex, distributed, real-time computer system that has been redesigned numerous times. The implications of these redesigns have not been fully analyzed. This paper discusses the advantages and disadvantages of static analytical techniques such as Rate Monotonic Analysis (RMA) and also provides a rationale for dynamic modeling. Factors such as system architecture, processor utilization, bus architecture, queuing, etc. are well suited for analysis with a dynamic model. The significance of performance measures for a real-time system is discussed.
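
    Rate Monotonic Analysis, mentioned above as a static technique, is often applied through the Liu and Layland utilization bound: a periodic task set is schedulable under rate-monotonic priorities if the total utilization does not exceed n*(2^(1/n) - 1). The sketch below applies that sufficient (not necessary) test to a hypothetical task set, not the actual DMS workload.

        def rma_schedulable(tasks):
            """Liu & Layland sufficient test: tasks are (compute_time, period) pairs."""
            n = len(tasks)
            utilization = sum(c / t for c, t in tasks)
            bound = n * (2 ** (1.0 / n) - 1)
            return utilization, bound, utilization <= bound

        # Hypothetical periodic task set (times in ms), not the actual DMS workload.
        tasks = [(5, 20), (10, 50), (20, 100)]
        u, bound, ok = rma_schedulable(tasks)
        print(f"U = {u:.3f}, bound = {bound:.3f}, schedulable by RMA bound: {ok}")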

  14. The development of a model for predicting passenger acceptance of short-haul air transportation systems

    NASA Technical Reports Server (NTRS)

    Kuhlthau, A. R.; Jacobson, I. D.

    1977-01-01

    Meaningful criteria and methodology for assessing, particularly in the area of ride quality, the potential acceptability to the traveling public of present and future transportation systems were investigated. Ride quality was found to be one of the important variables affecting the decision of users of air transportation, and to be influenced by several environmental factors, especially motion, noise, pressure, temperature, and seating. Models were developed to quantify the relationship of subjective comfort to all of these parameters and then were exercised for a variety of situations. Passenger satisfaction was found to be strongly related to ride quality and was so modeled. A computer program was developed to assess the comfort and satisfaction levels of passengers on aircraft subjected to arbitrary flight profiles over arbitrary terrain. A model was deduced of the manner in which passengers integrate isolated segments of a flight to obtain an overall trip comfort rating. A method was established for assessing the influence of other links (e.g., access, terminal conditions) in the overall passenger trip.

  15. Performance.

    PubMed

    Chambers, David W

    2006-01-01

    High performance is difficult to maintain because it is dynamic and not well understood. Based on a synthesis of many sources, a model is proposed where performance is a function of the balance between capacity and challenge. Too much challenge produces coping (or a crash); excess capacity results in boredom. Over time, peak performance drifts toward boredom. Performance can be managed by adjusting our level of ability, our effort, the opportunity to perform, and the challenge we agree to take on. Coping, substandard but acceptable performance, is common among professionals and its long-term side effects can be debilitating. A crash occurs when coping mechanisms fail. PMID:17020177

  16. Coupled inverse and forward modelling to assess the range of acceptable thermal histories, a case study from SE Brazil

    NASA Astrophysics Data System (ADS)

    Cogné, N.; Gallagher, K.; Cobbold, P. R.

    2012-04-01

    We performed a new thermochronological study (fission track analysis and (U-Th)/He dating on apatite) in SE Brazil and integrated those data with inverse and forward modelling via the QTQt software (Gallagher, 2012) to obtain thermal histories. The inversion results were used to characterize the general thermal histories and the associated uncertainties. For most of the samples we found a first phase of cooling during the Late Cretaceous or Early Tertiary, with subsequent reheating followed by Neogene cooling. The inverse modelling does not provide a unique solution, and the associated uncertainties can be quite significant. Moreover, the Tertiary parts of the thermal histories were usually near the accepted resolution of the thermochronometric methods (~50-40°C). Therefore we performed deterministic forward modelling within the range of uncertainties to assess which solution is most consistent with the data and independent geological information. These results are always conditional on the assumed kinetics for fission track annealing and diffusion of He, so we do not test the validity of that aspect. However, we can look at the range of predictions for the different forward models tested. This approach implies that the reheating is required only for the samples around onshore Tertiary basins. For the other samples we cannot conclude, but geological information argues against this hypothesis. However, the Neogene cooling is required for all the samples. The combination of forward and inverse modelling allows us to better constrain the thermal histories for each sample by exploring the range of uncertainties and to reconcile a range of possible thermal histories with independent geological information. It also provides new information on the contrasting thermal evolution between different regions of the onshore SE Brazilian margin. Gallagher, K. 2012, Transdimensional inverse thermal history modeling for quantitative thermochronology, Journal of Geophysical Research, in press.

  17. Preservice Teachers' Acceptance of ICT Integration in the Classroom: Applying the UTAUT Model

    ERIC Educational Resources Information Center

    Birch, A.; Irvine, V.

    2009-01-01

    In this study, the researchers explore the factors that influence preservice teachers' acceptance of information and communication technology (ICT) integration in the classroom. The Unified Theory of Acceptance and Use of Technology (UTAUT) was developed by Venkatesh et al. ["MIS Quarterly, 27"(3), 425-478] in 2003 and shown to outperform eight…

  18. Determinants of Intention to Use eLearning Based on the Technology Acceptance Model

    ERIC Educational Resources Information Center

    Punnoose, Alfie Chacko

    2012-01-01

    The purpose of this study was to find some of the predominant factors that determine the intention of students to use eLearning in the future. Since eLearning is not just a technology acceptance decision but also involves cognition, this study extended its search beyond the normal technology acceptance variables into variables that could affect…

  19. A Quantitative Examination of User Experience as an Antecedent to Student Perception in Technology Acceptance Modeling

    ERIC Educational Resources Information Center

    Butler, Rory

    2013-01-01

    Internet-enabled mobile devices have increased the accessibility of learning content for students. Given the ubiquitous nature of mobile computing technology, a thorough understanding of the acceptance factors that impact a learner's intention to use mobile technology as an augment to their studies is warranted. Student acceptance of mobile…

  20. Empirical Testing of a Theoretical Extension of the Technology Acceptance Model: An Exploratory Study of Educational Wikis

    ERIC Educational Resources Information Center

    Liu, Xun

    2010-01-01

    This study extended the technology acceptance model and empirically tested the new model with wikis, a new type of educational technology. Based on social cognitive theory and the theory of planned behavior, three new variables, wiki self-efficacy, online posting anxiety, and perceived behavioral control, were added to the original technology…

  1. Learning with Interactive Whiteboards: Determining the Factors on Promoting Interactive Whiteboards to Students by Technology Acceptance Model

    ERIC Educational Resources Information Center

    Kilic, Eylem; Güler, Çetin; Çelik, H. Eray; Tatli, Cemal

    2015-01-01

    Purpose: The purpose of this study is to investigate the factors which might affect the intention to use interactive whiteboards (IWBs) by university students, using Technology Acceptance Model by the structural equation modeling approach. The following hypothesis guided the current study: H1. There is a positive relationship between IWB…

  2. Physician Acceptance of a Physician-Pharmacist Collaborative Treatment Model for Hypertension Management in Primary Care.

    PubMed

    Smith, Steven M; Hasan, Michaela; Huebschmann, Amy G; Penaloza, Richard; Schorr-Ratzlaff, Wagner; Sieja, Amber; Roscoe, Nicholai; Trinkley, Katy E

    2015-09-01

    Physician-pharmacist collaborative care (PPCC) is effective in improving blood pressure (BP) control, but primary care provider (PCP) engagement in such models has not been well-studied. The authors analyzed data from PPCC referrals to 108 PCPs, for patients with uncontrolled hypertension, assessing the proportion of referral requests approved, disapproved, and not responded to, and reasons for disapproval. Of 2232 persons with uncontrolled hypertension, PPCC referral requests were sent for 1516 (67.9%): 950 (62.7%) were approved, 406 (26.8%) were disapproved, and 160 (10.6%) received no response. Approval rates differed widely by PCP with a median approval rate of 75% (interquartile range, 41%-100%). The most common reasons for disapproval were: PCP prefers to manage hypertension (19%), and BP controlled per PCP (18%); 8% of cases were considered too complex for PPCC. Provider acceptance of a PPCC hypertension clinic was generally high and sustained but varied widely among PCPs. No single reason for disapproval predominated. PMID:26032586

  3. Effect of annealing treatment on the performance of organic photovoltaic devices using SPFGraphene as electron-accepter material

    NASA Astrophysics Data System (ADS)

    Wang, HaiTeng; He, DaWei; Wang, YongSheng; Liu, ZhiYong; Wu, HongPeng; Wang, JiGang; Zhao, Yu

    2012-08-01

    We have investigated the performance of organic photovoltaic devices with the bulk heterojunction (BHJ) structure using organic solution-processable functionalized graphene (SPFGraphene) as the electron-accepter material and P3OT as the donor material. The structural configuration of the device is ITO/PEDOT:PSS/P3OT:PCBM-SPFGraphene/LiF/Al. For the P3OT/PCBM (1:1) mixture with 8 wt% of SPFGraphene, the open-circuit voltage (Voc) of the device reaches 0.64 V, the short-circuit current density (Jsc) reaches 5.7 mA/cm2, the fill factor (FF) reaches 0.42, and the power conversion efficiency (η) reaches 1.53% under 100 mW/cm2 AM1.5 illumination. We further studied the reasons for the improvement in device performance. In the P3OT:PCBM-SPFGraphene composite, the SPFGraphene material acts as exciton dissociation sites and provides transport pathways of the form lowest unoccupied molecular orbital (LUMO)-SPFGraphene-Al. Furthermore, adding SPFGraphene to P3OT results in an appropriate energetic distance between the highest occupied molecular orbital (HOMO) and LUMO of the donor/acceptor and provides a higher exciton dissociation volume and mobility for carrier transport. We have also studied the effect of annealing treatment on the devices and found that devices annealed at 180°C show better performance than devices without annealing treatment; the annealed devices show the best performance, with an enhancement of the power conversion efficiency from 1.53% to 1.75%.
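
    The reported efficiency follows directly from the standard relation: efficiency = Voc x Jsc x FF / Pin. The snippet below reproduces the 1.53% figure from the values quoted in the abstract.

        def power_conversion_efficiency(voc_v, jsc_ma_cm2, ff, pin_mw_cm2=100.0):
            """PCE (%) from open-circuit voltage, short-circuit current density, and fill factor."""
            return voc_v * jsc_ma_cm2 * ff / pin_mw_cm2 * 100.0

        # Values reported in the abstract (AM1.5, 100 mW/cm^2 illumination):
        print(power_conversion_efficiency(0.64, 5.7, 0.42))  # ~1.53 %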

  4. Generic hypersonic vehicle performance model

    NASA Technical Reports Server (NTRS)

    Chavez, Frank R.; Schmidt, David K.

    1993-01-01

    An integrated computational model of a generic hypersonic vehicle was developed for the purpose of determining the vehicle's performance characteristics, which include the lift, drag, thrust, and moment acting on the vehicle at specified altitude, flight condition, and vehicular configuration. The lift, drag, thrust, and moment are developed for the body fixed coordinate system. These forces and moments arise from both aerodynamic and propulsive sources. SCRAMjet engine performance characteristics, such as fuel flow rate, can also be determined. The vehicle is assumed to be a lifting body with a single aerodynamic control surface. The body shape and control surface location are arbitrary and must be defined. The aerodynamics are calculated using either 2-dimensional Newtonian or modified Newtonian theory and approximate high-Mach-number Prandtl-Meyer expansion theory. Skin-friction drag was also accounted for. The skin-friction drag coefficient is a function of the freestream Mach number. The data for the skin-friction drag coefficient values were taken from NASA Technical Memorandum 102610. The modeling of the vehicle's SCRAMjet engine is based on quasi 1-dimensional gas dynamics for the engine diffuser, nozzle, and the combustor with heat addition. The engine has three variable inputs for control: the engine inlet diffuser area ratio, the total temperature rise through the combustor due to combustion of the fuel, and the engine internal expansion nozzle area ratio. The pressure distribution over the vehicle's lower aft body surface, which acts as an external nozzle, is calculated using a combination of quasi 1-dimensional gas dynamic theory and Newtonian or modified Newtonian theory. The exhaust plume shape is determined by matching the pressure inside the plume, calculated from the gas dynamic equations, with the freestream pressure, calculated from Newtonian or Modified Newtonian theory. In this manner, the pressure distribution along the vehicle after body
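
    The Newtonian surface-pressure rule the abstract refers to assigns each windward panel a pressure coefficient Cp = Cp_max * sin^2(theta), with Cp_max = 2 for classical Newtonian theory and the stagnation-point value for the modified form, and Cp = 0 on shadowed panels. The sketch below applies it to a flat plate at angle of attack; the geometry is illustrative, not the generic hypersonic vehicle of the paper.

        import math

        def newtonian_cp(deflection_rad, cp_max=2.0):
            """Local pressure coefficient: cp_max*sin^2(theta) on windward surfaces, 0 in shadow.
            cp_max = 2 gives classical Newtonian theory; using the stagnation-point value
            gives the modified Newtonian form."""
            if deflection_rad <= 0.0:
                return 0.0
            return cp_max * math.sin(deflection_rad) ** 2

        # Flat-plate example at angle of attack (illustrative geometry only):
        alpha = math.radians(10.0)
        cp_lower = newtonian_cp(alpha)       # windward side sees the flow
        cp_upper = newtonian_cp(-alpha)      # leeward side is shadowed
        cn = cp_lower - cp_upper             # normal-force coefficient per unit area
        cl = cn * math.cos(alpha)
        cd = cn * math.sin(alpha)
        print(f"CL = {cl:.4f}, pressure CD = {cd:.4f}")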

  5. Integrating Social Capital Theory, Social Cognitive Theory, and the Technology Acceptance Model to Explore a Behavioral Model of Telehealth Systems

    PubMed Central

    Tsai, Chung-Hung

    2014-01-01

    Telehealth has become an increasingly applied solution to delivering health care to rural and underserved areas by remote health care professionals. This study integrated social capital theory, social cognitive theory, and the technology acceptance model (TAM) to develop a comprehensive behavioral model for analyzing the relationships among social capital factors (social capital theory), technological factors (TAM), and system self-efficacy (social cognitive theory) in telehealth. The proposed framework was validated with 365 respondents from Nantou County, located in Central Taiwan. Structural equation modeling (SEM) was used to assess the causal relationships that were hypothesized in the proposed model. The finding indicates that elderly residents generally reported positive perceptions toward the telehealth system. Generally, the findings show that social capital factors (social trust, institutional trust, and social participation) significantly positively affect the technological factors (perceived ease of use and perceived usefulness respectively), which influenced usage intention. This study also confirmed that system self-efficacy was the salient antecedent of perceived ease of use. In addition, regarding the samples, the proposed model fitted considerably well. The proposed integrative psychosocial-technological model may serve as a theoretical basis for future research and can also offer empirical foresight to practitioners and researchers in the health departments of governments, hospitals, and rural communities. PMID:24810577

  6. Behavior model for performance assessment.

    SciTech Connect

    Brown-VanHoozer, S. A.

    1999-07-23

    Every individual channels information differently, based on the sensory modality or representational system (visual, auditory, or kinesthetic) we tend to favor most (our primary representational system, or PRS). Therefore, some of us access and store our information primarily visually, some auditorily, and others kinesthetically (through feel and touch), which in turn establishes our information-processing patterns and strategies and our external-to-internal (and subsequently vice versa) experiential language representation. Because of the different ways we channel our information, each of us will respond differently to a task--the way we gather and process the external information (input), our response time (process), and the outcome (behavior). Traditional human models of decision making and response time focus on perception, cognitive, and motor systems stimulated and influenced by the three sensory modalities: visual, auditory, and kinesthetic. For us, these are the building blocks to knowing how someone is thinking. Being aware of what is taking place and how to ask questions is essential in assessing performance and reducing human errors. Existing models give predictions based on time values or response times for a particular event, which may be summed and averaged for a generalization of behavior(s). However, without a basic understanding of how a behavior was generated through a decision-making strategy process, predictive models are inefficient overall in their analysis of the means by which behavior was generated. What is seen is only the end result.

  7. UGV acceptance testing

    NASA Astrophysics Data System (ADS)

    Kramer, Jeffrey A.; Murphy, Robin R.

    2006-05-01

    With over 100 models of unmanned vehicles now available for military and civilian safety, security, or rescue applications, it is important for agencies to establish acceptance testing. However, there appear to be no general guidelines for what constitutes a reasonable acceptance test. This paper describes i) a preliminary method for acceptance testing by a customer of the mechanical and electrical components of an unmanned ground vehicle system, ii) how it has been applied to a man-packable micro-robot, and iii) the value of testing both to ensure that the customer has a workable system and to improve design. The test method automated the operation of the robot to repeatedly exercise all aspects and combinations of components on the robot for 6 hours. The acceptance testing process uncovered many failures consistent with those shown to occur in the field, showing that testing by the user does predict failures. The process also demonstrated that the testing by the manufacturer can provide important design data that can be used to identify, diagnose, and prevent long-term problems. Also, the structured testing environment showed that sensor systems can be used to predict errors and changes in performance, as well as uncovering unmodeled behavior in subsystems.

  8. Larval Exposure to the Juvenile Hormone Analog Pyriproxyfen Disrupts Acceptance of and Social Behavior Performance in Adult Honeybees

    PubMed Central

    Fourrier, Julie; Deschamps, Matthieu; Droin, Léa; Alaux, Cédric; Fortini, Dominique; Beslay, Dominique; Le Conte, Yves; Devillers, James; Aupinel, Pierrick; Decourtye, Axel

    2015-01-01

    Background: Juvenile hormone (JH) plays an important role in honeybee development and the regulation of age-related division of labor. However, honeybees can be exposed to insect growth regulators (IGRs), such as JH analogs developed for insect pest and vector control. Although their side effects as endocrine disruptors on honeybee larval or adult stages have been studied, little is known about the subsequent effects on adults of a sublethal larval exposure. We therefore studied the impact of the JH analog pyriproxyfen on larvae and resulting adults within a colony under semi-field conditions by combining recent laboratory larval tests with chemical analysis and behavioral observations. Oral and chronic larval exposure at cumulative doses of 23 or 57 ng per larva were tested. Results: Pyriproxyfen-treated bees emerged earlier than control bees and the highest dose led to a significant rate of malformed adults (atrophied wings). Young pyriproxyfen-treated bees were more frequently rejected by nestmates from the colony, inducing a shorter life span. This could be linked to differences in cuticular hydrocarbon (CHC) profiles between control and pyriproxyfen-treated bees. Finally, pyriproxyfen-treated bees exhibited fewer social behaviors (ventilation, brood care, contacts with nestmates or food stocks) than control bees. Conclusion: Larval exposure to sublethal doses of pyriproxyfen affected several life history traits of the honeybees. Our results especially showed changes in social integration (acceptance by nestmates and performance of social behaviors) that could potentially affect population growth and balance of the colony. PMID:26171610

  9. Potentially acceptable substitutes for the chlorofluorocarbons: properties and performance features of HFC-134a, HCFC-123, and HCFC-141b

    NASA Astrophysics Data System (ADS)

    Sukornick, B.

    1989-05-01

    Potentially acceptable substitutes are known for CFC-11 and CFC-12, the most important chlorofluorocarbons. HFC-134a could replace CFC-12 in air-conditioning and refrigeration, and both HCFC-123 and HCFC-141b show promise as CFC-11 substitutes. The replacement molecules all have significantly reduced greenhouse and ozone depletion potentials compared to their fully halogenated counterparts. HCFC-123 is theoretically a less efficient blowing agent than CFC-11, but 141b is more efficient. Results from experimental foaming tests confirm these relationships and show that initial insulating values are slightly lower for 141b and 123 than for 11. Both substitutes are nonflammable liquids. Based on its physical properties, HFC-134a is an excellent replacement candidate for CFC-12. In addition, it is more thermally stable than CFC-12. A new family of HFC-134a compatible lubricant oils will be required. The estimated coefficient of performance (COP) of 134a is 96-98% that of CFC-12. Subacute toxicity tests show HFC-134a to have a low order of toxicity. HCFC-123 reveals no serious side effects at a concentration of 0.1% in subchronic tests and the inhalation toxicity of 141b is lower than that of CFC-11 based on a 6-h exposure. Chronic tests on all the new candidates will have to be completed for large-scale commercial use. Allied-Signal is conducting process development at a highly accelerated pace, and we plan to begin commercialization of substitutes within 5 years.

  10. Potentially acceptable substitutes for the chlorofluorocarbons: Properties and performance features of HFC-134a, HCFC-123, and HCFC-141b

    SciTech Connect

    Sukornick, B.

    1989-05-01

    Potentially acceptable substitutes are known for CFC-11 and CFC-12, the most important chlorofluorocarbons. HFC-134a could replace CFC-12 in air-conditioning and refrigeration, and both HCFC-123 and HCFC-141b show promise as CFC-11 substitutes. The replacement molecules all have significantly reduced greenhouse and ozone depletion potentials compared to their fully halogenated counterparts. HCFC-123 is theoretically a less efficient blowing agent than CFC-11, but 141b is more efficient. Results from experimental foaming tests confirm these relationships and show that initial insulating values are slightly lower for 141b and 123 than for 11. Both substitutes are nonflammable liquids. Based on its physical properties, HFC-134a is an excellent replacement candidate for CFC-12. In addition, it is more thermally stable than CFC-12. A new family of HFC-134a compatible lubricant oils will be required. The estimated coefficient of performance (COP) of 134a is 96-98% that of CFC-12. Subacute toxicity tests show HFC-134a to have a low order of toxicity. HCFC-123 reveals no serious side effects at a concentration of 0.1% in subchronic tests and the inhalation toxicity of 141b is lower than that of CFC-11 based on a 6-h exposure. Chronic tests on all the new candidates will have to be completed for large-scale commercial use. Allied-Signal is conducting process development at a highly accelerated pace, and they plan to begin commercialization of substitutes within 5 years.

  11. The Theory of Planned Behavior (TPB) and Pre-Service Teachers' Technology Acceptance: A Validation Study Using Structural Equation Modeling

    ERIC Educational Resources Information Center

    Teo, Timothy; Tan, Lynde

    2012-01-01

    This study applies the theory of planned behavior (TPB), a theory that is commonly used in commercial settings, to the educational context to explain pre-service teachers' technology acceptance. It is also interested in examining its validity when used for this purpose. It has found evidence that the TPB is a valid model to explain pre-service…

  12. Factors of Online Learning Adoption: A Comparative Juxtaposition of the Theory of Planned Behaviour and the Technology Acceptance Model

    ERIC Educational Resources Information Center

    Ndubisi, Nelson

    2006-01-01

    Organisational investments in information technologies have increased significantly in the past few decades. All around the globe and in Malaysia particularly, a number of educational institutions are experimenting with e-learning. Adopting the theory of planned behaviour (TPB) and the technology acceptance model (TAM) this article tries to…

  13. Modeling nurses' acceptance of bar coded medication administration technology at a pediatric hospital

    PubMed Central

    Brown, Roger L; Scanlon, Matthew C; Karsh, Ben-Tzion

    2012-01-01

    Objective: To identify predictors of nurses' acceptance of bar coded medication administration (BCMA). Design: Cross-sectional survey of registered nurses (N=83) at an academic pediatric hospital that recently implemented BCMA. Methods: Surveys assessed seven BCMA-related perceptions: ease of use; usefulness for the job; social influence from non-specific others to use BCMA; training; technical support; usefulness for patient care; and social influence from patients/families. An all possible subset regression procedure with five goodness-of-fit indicators was used to identify which set of perceptions best predicted BCMA acceptance (intention to use, satisfaction). Results: Nurses reported a moderate perceived ease of use and low perceived usefulness of BCMA. Nurses perceived moderate-or-higher social influence to use BCMA and had moderately positive perceptions of BCMA-related training and technical support. Behavioral intention to use BCMA was high, but satisfaction was low. Behavioral intention to use was best predicted by perceived ease of use, perceived social influence from non-specific others, and perceived usefulness for patient care (56% of variance explained). Satisfaction was best predicted by perceived ease of use, perceived usefulness for patient care, and perceived social influence from patients/families (76% of variance explained). Discussion: Variation in and low scores on ease of use and usefulness are concerning, especially as these variables often correlate with acceptance, as found in this study. Predicting acceptance benefited from using a broad set of perceptions and adapting variables to the healthcare context. Conclusion: Success with BCMA and other technologies can benefit from assessing end-user acceptance and elucidating the factors promoting acceptance and use. PMID:22661559
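
    The "all possible subset regression" procedure mentioned above can be sketched as follows: fit an OLS model for every non-empty subset of predictors and rank the subsets by a goodness-of-fit indicator (adjusted R-squared here, whereas the study combined five indicators). The column names and simulated data are placeholders, not the survey variables.

        import itertools
        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        def best_subset(df, outcome, predictors):
            """Fit OLS for every non-empty predictor subset; rank by adjusted R^2."""
            results = []
            for k in range(1, len(predictors) + 1):
                for subset in itertools.combinations(predictors, k):
                    X = sm.add_constant(df[list(subset)])
                    fit = sm.OLS(df[outcome], X).fit()
                    results.append((fit.rsquared_adj, subset))
            return sorted(results, reverse=True)

        # Simulated stand-in data; column names loosely mirror the perceptions in the abstract.
        rng = np.random.default_rng(1)
        n = 83
        df = pd.DataFrame(rng.normal(size=(n, 4)),
                          columns=["ease_of_use", "usefulness_patient_care",
                                   "social_influence", "training"])
        df["intention"] = (0.5 * df["ease_of_use"] + 0.4 * df["usefulness_patient_care"]
                           + rng.normal(scale=0.5, size=n))
        for r2_adj, subset in best_subset(df, "intention", list(df.columns[:-1]))[:3]:
            print(f"{r2_adj:.3f}  {subset}")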

  14. Investigating IT Faculty Resistance to Learning Management System Adoption Using Latent Variables in an Acceptance Technology Model

    PubMed Central

    Bousbahi, Fatiha; Alrazgan, Muna Saleh

    2015-01-01

    To enhance instruction in higher education, many universities in the Middle East have chosen to introduce learning management systems (LMS) to their institutions. However, this new educational technology is not being used at its full potential and faces resistance from faculty members. To investigate this phenomenon, we conducted an empirical research study to uncover factors influencing faculty members' acceptance of LMS. Thus, in the Fall semester of 2014, Information Technology faculty members were surveyed to better understand their perceptions of the incorporation of LMS into their courses. The results showed that personal factors such as motivation, load anxiety, and organizational support play important roles in the perception of the usefulness of LMS among IT faculty members. These findings suggest adding these constructs in order to extend the Technology acceptance model (TAM) for LMS acceptance, which can help stakeholders of the university to implement the use of this system. This may assist in planning and evaluating the use of e-learning. PMID:26491712

  15. Investigating IT Faculty Resistance to Learning Management System Adoption Using Latent Variables in an Acceptance Technology Model.

    PubMed

    Bousbahi, Fatiha; Alrazgan, Muna Saleh

    2015-01-01

    To enhance instruction in higher education, many universities in the Middle East have chosen to introduce learning management systems (LMS) to their institutions. However, this new educational technology is not being used at its full potential and faces resistance from faculty members. To investigate this phenomenon, we conducted an empirical research study to uncover factors influencing faculty members' acceptance of LMS. Thus, in the Fall semester of 2014, Information Technology faculty members were surveyed to better understand their perceptions of the incorporation of LMS into their courses. The results showed that personal factors such as motivation, load anxiety, and organizational support play important roles in the perception of the usefulness of LMS among IT faculty members. These findings suggest adding these constructs in order to extend the Technology acceptance model (TAM) for LMS acceptance, which can help stakeholders of the university to implement the use of this system. This may assist in planning and evaluating the use of e-learning. PMID:26491712

  16. Optical Performance Modeling of FUSE Telescope Mirror

    NASA Technical Reports Server (NTRS)

    Saha, Timo T.; Ohl, Raymond G.; Friedman, Scott D.; Moos, H. Warren

    2000-01-01

    We describe the Metrology Data Processor (METDAT), the Optical Surface Analysis Code (OSAC), and their application to the image evaluation of the Far Ultraviolet Spectroscopic Explorer (FUSE) mirrors. The FUSE instrument, designed and developed by the Johns Hopkins University and launched in June 1999, is an astrophysics satellite which provides high resolution spectra (lambda/Delta(lambda) = 20,000 - 25,000) in the wavelength region from 90.5 to 118.7 nm. The FUSE instrument is comprised of four co-aligned, normal incidence, off-axis parabolic mirrors, four Rowland circle spectrograph channels with holographic gratings, and delay line microchannel plate detectors. The OSAC code provides a comprehensive analysis of optical system performance, including the effects of optical surface misalignments, low spatial frequency deformations described by discrete polynomial terms, mid- and high-spatial frequency deformations (surface roughness), and diffraction due to the finite size of the aperture. Both normal incidence (traditionally infrared, visible, and near ultraviolet mirror systems) and grazing incidence (x-ray mirror systems) systems can be analyzed. The code also properly accounts for reflectance losses on the mirror surfaces. Low frequency surface errors are described in OSAC by using Zernike polynomials for normal incidence mirrors and Legendre-Fourier polynomials for grazing incidence mirrors. The scatter analysis of the mirror is based on scalar scatter theory. The program accepts simple autocovariance (ACV) function models or power spectral density (PSD) models derived from mirror surface metrology data as input to the scatter calculation. The end product of the program is a user-defined pixel array containing the system Point Spread Function (PSF). The METDAT routine is used in conjunction with the OSAC program. This code reads in laboratory metrology data in a normalized format. The code then fits the data using Zernike polynomials for normal incidence
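
    The core calculation behind this kind of image evaluation can be sketched in a few lines: form a pupil function from a circular aperture plus a low-order Zernike phase error, then take the squared modulus of its Fourier transform to get the point spread function. The example below uses a defocus term of 0.05 waves RMS purely for illustration; it is not the OSAC or METDAT code and ignores obscuration, roughness scatter, and reflectance losses.

        import numpy as np

        n = 256
        y, x = np.mgrid[-1:1:n * 1j, -1:1:n * 1j]
        rho = np.hypot(x, y)
        aperture = (rho <= 1.0).astype(float)

        # Low-order figure error: Zernike defocus, sqrt(3)*(2*rho^2 - 1),
        # scaled to 0.05 waves RMS over the unit disk (illustrative amplitude).
        defocus_waves = 0.05 * np.sqrt(3.0) * (2.0 * rho**2 - 1.0)
        pupil = aperture * np.exp(2j * np.pi * defocus_waves)

        # Point spread function = |FFT of the pupil function|^2 (zero-padded for sampling).
        psf = np.abs(np.fft.fft2(pupil, s=(4 * n, 4 * n))) ** 2
        psf_perfect = np.abs(np.fft.fft2(aperture, s=(4 * n, 4 * n))) ** 2
        print("Strehl ratio estimate:", psf.max() / psf_perfect.max())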

  17. Integrating Telemedicine for Disaster Response: Testing the Emergency Telemedicine Technology Acceptance Model

    ERIC Educational Resources Information Center

    Davis, Theresa M.

    2013-01-01

    Background: There is little evidence that technology acceptance is well understood in healthcare. The hospital environment is complex and dynamic creating a challenge when new technology is introduced because it impacts current processes and workflows which can significantly affect patient care delivery and outcomes. This study tested the effect…

  18. Using the UTAUT Model to Examine the Acceptance Behavior of Synchronous Collaboration to Support Peer Translation

    ERIC Educational Resources Information Center

    Liu, Yi Chun; Huang, Yong-Ming

    2015-01-01

    The teaching of translation has received considerable attention in recent years. Research on translation in collaborative learning contexts, however, has been less studied. In this study, we use a tool of synchronous collaboration to assist students in experiencing a peer translation process. Afterward, the unified theory of acceptance and use of…

  19. WebCT--The Quasimoderating Effect of Perceived Affective Quality on an Extending Technology Acceptance Model

    ERIC Educational Resources Information Center

    Sanchez-Franco, Manuel J.

    2010-01-01

    Perceived affective quality is an attractive area of research in Information System. Specifically, understanding the intrinsic and extrinsic individual factors and interaction effects that influence Information and Communications Technology (ICT) acceptance and adoption--in higher education--continues to be a focal interest in learning research.…

  20. Perceived Playfulness, Gender Differences and Technology Acceptance Model in a Blended Learning Scenario

    ERIC Educational Resources Information Center

    Padilla-Melendez, Antonio; del Aguila-Obra, Ana Rosa; Garrido-Moreno, Aurora

    2013-01-01

    The importance of technology for education is increasing year-by-year at all educational levels and particularly for Universities. This paper reexamines one important determinant of technology acceptance and use, such as perceived playfulness in the context of a blended learning setting and reveals existing gender differences. After a literature…

  1. An Investigation of University Student Readiness Towards M-Learning Using Technology Acceptance Model

    ERIC Educational Resources Information Center

    Iqbal, Shakeel; Bhatti, Zeeshan Ahmed

    2015-01-01

    M-learning is learning delivered via mobile devices and mobile technology. The research indicates that this medium of learning has potential to enhance formal as well as informal learning. However, acceptance of m-learning greatly depends upon the personal attitude of students towards this medium; therefore this study focuses only on the…

  2. Exploring Students' Intention to Use LINE for Academic Purposes Based on Technology Acceptance Model

    ERIC Educational Resources Information Center

    Van De Bogart, Willard; Wichadee, Saovapa

    2015-01-01

    The LINE application is often conceived as purely social space; however, the authors of this paper wanted to determine if it could be used for academic purposes. In this study, we examined how undergraduate students accepted LINE in terms of using it for classroom-related activities (e.g., submit homework, follow up course information queries,…

  3. Adult Role Models: Feasibility, Acceptability, and Initial Outcomes for Sex Education

    ERIC Educational Resources Information Center

    Colarossi, Lisa; Silver, Ellen Johnson; Dean, Randa; Perez, Amanda; Rivera, Angelic

    2014-01-01

    The authors present the feasibility and acceptability of a parent sexuality education program led by peer educators in community settings. They also report the results of an outcome evaluation with 71 parents who were randomized to the intervention or a control group and surveyed one month prior to and six months after the four-week intervention.…

  4. Preventing repetition of attempted suicide--I. Feasibility (acceptability, adherence, and effectiveness) of a Baerum-model like aftercare.

    PubMed

    Hvid, Marianne; Wang, August G

    2009-01-01

    Repetition after attempted suicide is high, yet only limited research has gone into effect studies. The Baerum-model from Norway offers a practical and affordable intervention. Our aim was to study the acceptability and effectiveness of a Baerum-model-like intervention after attempted suicide using a quasi-experimental design. During a period in 2004, attempted suicide patients were offered follow-up care by a rapid-response outreach programme, an intervention lasting 6 months; a control group was established prospectively from a similar period in 2002. The design was an intent-to-treat analysis. The outcome was measured by: 1) participation by acceptance and adherence, 2) repetition of suicide attempt and suicide, and 3) the number of repetitive acts in the year after the attempted suicide episode. Follow-up period was 1 year. Participation was 70%. There was a significantly lower repetition rate in the intervention group, where the proportion of repetitive patients fell from 34% to 14%. There were also fewer suicidal acts, in total 37 acts in 58 patients in the control group and 22 acts in 93 patients in the intervention group. We have concluded that the outreach programme has good feasibility because of high acceptability and adherence, and acceptable effectiveness in the follow-up period of 1 year. We have therefore initiated a similar study using a randomization design in order to study efficacy. PMID:19016074

  5. Stress exposure and generation: A conjoint longitudinal model of body dysmorphic symptoms, peer acceptance, popularity, and victimization.

    PubMed

    Webb, Haley J; Zimmer-Gembeck, Melanie J; Mastro, Shawna

    2016-09-01

    This study examined the bidirectional (conjoint) longitudinal pathways linking adolescents' body dysmorphic disorder (BDD) symptoms with self- and peer-reported social functioning. Participants were 367 Australian students (45.5% boys, mean age=12.01 years) who participated in two waves of a longitudinal study with a 12-month lag between assessments. Participants self-reported their symptoms characteristic of BDD, and perception of peer acceptance. Classmates reported who was popular and victimized in their grade, and rated their liking (acceptance) of their classmates. In support of both stress exposure and stress generation models, T1 victimization was significantly associated with more symptoms characteristic of BDD at T2 relative to T1, and higher symptom level at T1 was associated with lower perceptions of peer acceptance at T2 relative to T1. These results support the hypothesized bidirectional model, whereby adverse social experiences negatively impact symptoms characteristic of BDD over time, and symptoms also exacerbate low perceptions of peer-acceptance. PMID:27236472

  6. A predictive model of human performance.

    NASA Technical Reports Server (NTRS)

    Walters, R. F.; Carlson, L. D.

    1971-01-01

    An attempt is made to develop a model describing the overall responses of humans to exercise and environmental stresses for predicting exhaustion as a function of an individual's physical characteristics. The principal components of the model are a steady-state description of circulation and a dynamic description of thermal regulation. The circulatory portion of the system accepts changes in workload and oxygen pressure, while the thermal portion is influenced by external factors of ambient temperature, humidity and air movement, affecting skin blood flow. The operation of the model is discussed and its structural details are given.

  7. Public Education Resources and Pupil Performance Models.

    ERIC Educational Resources Information Center

    Spottheim, David; And Others

    This report details three models quantifying the relationships between educational means (resources) and ends (pupil achievements) to analyze resource allocation problems within school districts: (1) the Pupil Performance Model; (2) the Goal Programming Model; and (3) the Operational Structure of a School and Pupil Performance Model. These models…

  8. Rehabilitation Counseling for Athletes Prior to Retirement: A Preventative Approach Using Self-Acceptance To Enhance Performance before and after Retirement.

    ERIC Educational Resources Information Center

    Mills, Brett D.

    This paper suggests that collegiate and professional athletes preparing to retire should be provided with preretirement and postretirement rehabilitation counseling. The counseling should involve a preventative approach centered around self-acceptance, to enhance the athlete's performance before and after retirement. The development of…

  9. Human Performance Models of Pilot Behavior

    NASA Technical Reports Server (NTRS)

    Foyle, David C.; Hooey, Becky L.; Byrne, Michael D.; Deutsch, Stephen; Lebiere, Christian; Leiden, Ken; Wickens, Christopher D.; Corker, Kevin M.

    2005-01-01

    Five modeling teams from industry and academia were chosen by the NASA Aviation Safety and Security Program to develop human performance models (HPM) of pilots performing taxi operations and runway instrument approaches with and without advanced displays. One representative from each team will serve as a panelist to discuss their team's model architecture, augmentations and advancements to HPMs, and aviation-safety related lessons learned. Panelists will discuss how modeling results are influenced by a model's architecture and structure, the role of the external environment, specific modeling advances and future directions and challenges for human performance modeling in aviation.

  10. Performance and Architecture Lab Modeling Tool

    SciTech Connect

    2014-06-19

    Analytical application performance models are critical for diagnosing performance-limiting resources, optimizing systems, and designing machines. Creating models, however, is difficult. Furthermore, models are frequently expressed in forms that are hard to distribute and validate. The Performance and Architecture Lab Modeling tool, or Palm, is a modeling tool designed to make application modeling easier. Palm provides a source code modeling annotation language. Not only does the modeling language divide the modeling task into subproblems, it formally links an application's source code with its model. This link is important because a model's purpose is to capture application behavior. Furthermore, this link makes it possible to define rules for generating models according to source code organization. Palm generates hierarchical models according to well-defined rules. Given an application, a set of annotations, and a representative execution environment, Palm will generate the same model. A generated model is an executable program whose constituent parts directly correspond to the modeled application. Palm generates models by combining top-down (human-provided) semantic insight with bottom-up static and dynamic analysis. A model's hierarchy is defined by static and dynamic source code structure. Because Palm coordinates models and source code, Palm's models are 'first-class' and reproducible. Palm automates common modeling tasks. For instance, Palm incorporates measurements to focus attention, represent constant behavior, and validate models. Palm's workflow is as follows. The workflow's input is source code annotated with Palm modeling annotations. The most important annotation models an instance of a block of code. Given annotated source code, the Palm Compiler produces executables and the Palm Monitor collects a representative performance profile. The Palm Generator synthesizes a model based on the static and dynamic mapping of annotations to program behavior.
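
    The hierarchical, executable models described above can be illustrated with a small sketch. This is a hypothetical illustration in the spirit of the record, not Palm's actual annotation language or API: each annotated code block becomes a node whose predicted time is either a measured constant or an analytic expression, and a parent's prediction is the sum of its children.

    ```python
    # Hypothetical sketch of a hierarchical analytical performance model.
    # Node names, formulas, and parameter values are illustrative only.

    class ModelNode:
        def __init__(self, name, time_fn=None, children=None):
            self.name = name
            self.time_fn = time_fn          # measured constant or analytic expression
            self.children = children or []  # sub-blocks of the modeled code

        def predict(self, params):
            """Leaf nodes evaluate their expression; interior nodes sum their children."""
            if self.children:
                return sum(child.predict(params) for child in self.children)
            return self.time_fn(params) if self.time_fn else 0.0

    # Example: a solver whose per-iteration cost scales with problem size n.
    model = ModelNode("app", children=[
        ModelNode("setup", time_fn=lambda p: 0.8),  # measured constant (seconds)
        ModelNode("solve", children=[
            ModelNode("iteration", time_fn=lambda p: 1e-6 * p["n"] * p["iters"]),
        ]),
    ])

    print(model.predict({"n": 1_000_000, "iters": 50}))  # total predicted seconds
    ```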

  12. METAPHOR (version 1): Users guide. [performability modeling

    NASA Technical Reports Server (NTRS)

    Furchtgott, D. G.

    1979-01-01

    General information concerning METAPHOR, an interactive software package to facilitate performability modeling and evaluation, is presented. Example systems are studied and their performabilities are calculated. Each available METAPHOR command and array generator is described. Complete METAPHOR sessions are included.

  13. Performance Evaluation of Dense Gas Dispersion Models.

    NASA Astrophysics Data System (ADS)

    Touma, Jawad S.; Cox, William M.; Thistle, Harold; Zapert, James G.

    1995-03-01

    This paper summarizes the results of a study to evaluate the performance of seven dense gas dispersion models using data from three field experiments. Two models (DEGADIS and SLAB) are in the public domain and the other five (AIRTOX, CHARM, FOCUS, SAFEMODE, and TRACE) are proprietary. The field data used are the Desert Tortoise pressurized ammonia releases, Burro liquefied natural gas spill tests, and the Goldfish anhydrous hydrofluoric acid spill experiments. Desert Tortoise and Goldfish releases were simulated as horizontal jet releases, and Burro as a liquid pool. Performance statistics were used to compare maximum observed concentrations and plume half-width to those predicted by each model. Model performance varied and no model exhibited consistently good performance across all three databases. However, when combined across the three databases, all models performed within a factor of 2. Problems encountered are discussed in order to help future investigators.
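
    A common way to express the "within a factor of 2" result quoted above is the FAC2 statistic, the fraction of predicted/observed pairs whose ratio lies between 0.5 and 2. A minimal sketch, assuming paired arrays of observed and predicted maximum concentrations (the numbers below are illustrative, not the Desert Tortoise, Burro, or Goldfish data):

    ```python
    import numpy as np

    def fac2(observed, predicted):
        """Fraction of pairs with 0.5 <= predicted/observed <= 2.0."""
        obs = np.asarray(observed, dtype=float)
        pred = np.asarray(predicted, dtype=float)
        ratio = pred / obs
        return np.mean((ratio >= 0.5) & (ratio <= 2.0))

    # Illustrative concentration pairs only.
    obs = [12.0, 4.5, 30.0, 8.2]
    pred = [10.0, 9.5, 18.0, 7.0]
    print(f"FAC2 = {fac2(obs, pred):.2f}")
    ```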

  14. Summary of photovoltaic system performance models

    SciTech Connect

    Smith, J. H.; Reiter, L. J.

    1984-01-15

    The purpose of this study is to provide a detailed overview of photovoltaics (PV) performance modeling capabilities that have been developed during recent years for analyzing PV system and component design and policy issues. A set of 10 performance models have been selected which span a representative range of capabilities from generalized first-order calculations to highly specialized electrical network simulations. A set of performance modeling topics and characteristics is defined and used to examine some of the major issues associated with photovoltaic performance modeling. Next, each of the models is described in the context of these topics and characteristics to assess its purpose, approach, and level of detail. Then each of the issues is discussed in terms of the range of model capabilities available and summarized in tabular form for quick reference. Finally, the models are grouped into categories to illustrate their purposes and perspectives.
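
    The "generalized first-order calculations" at one end of this range can be illustrated with a minimal sketch; the derate factor and insolation values below are assumptions for illustration, not taken from any of the ten models reviewed:

    ```python
    def pv_energy_first_order(array_area_m2, efficiency, derate, insolation_kwh_m2):
        """First-order annual energy estimate: area x efficiency x derate x insolation."""
        return array_area_m2 * efficiency * derate * insolation_kwh_m2

    # Example: 50 m2 array, 15% module efficiency, 0.80 system derate,
    # 1800 kWh/m2-yr plane-of-array insolation (illustrative numbers).
    print(pv_energy_first_order(50, 0.15, 0.80, 1800), "kWh/yr")
    ```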

  15. Summary of photovoltaic system performance models

    NASA Technical Reports Server (NTRS)

    Smith, J. H.; Reiter, L. J.

    1984-01-01

    A detailed overview of photovoltaics (PV) performance modeling capabilities developed for analyzing PV system and component design and policy issues is provided. A set of 10 performance models are selected which span a representative range of capabilities from generalized first order calculations to highly specialized electrical network simulations. A set of performance modeling topics and characteristics is defined and used to examine some of the major issues associated with photovoltaic performance modeling. Each of the models is described in the context of these topics and characteristics to assess its purpose, approach, and level of detail. The issues are discussed in terms of the range of model capabilities available and summarized in tabular form for quick reference. The models are grouped into categories to illustrate their purposes and perspectives.

  16. An Empirical Analysis of Citizens' Acceptance Decisions of Electronic-Government Services: A Modification of the Unified Theory of Acceptance and Use of Technology (UTAUT) Model to Include Trust as a Basis for Investigation

    ERIC Educational Resources Information Center

    Awuah, Lawrence J.

    2012-01-01

    Understanding citizens' adoption of electronic-government (e-government) is an important topic, as the use of e-government has become an integral part of governance. Success of such initiatives depends largely on the efficient use of e-government services. The unified theory of acceptance and use of technology (UTAUT) model has provided a…

  17. HYDROLOGIC EVALUATION OF LANDFILL PERFORMANCE (HELP) MODEL: USER'S GUIDE FOR VERSION 3

    EPA Science Inventory

    The Hydrologic Evaluation of Landfill Performance (HELP) computer program is a quasi-two-dimensional hydrologic model of water movement across, into, through and out of landfills. The model accepts weather, soil and design data. Landfill systems including various combinations of ve...

  18. Virginia Higher Education Performance Funding Model.

    ERIC Educational Resources Information Center

    Virginia State Council of Higher Education, Richmond.

    This report reviews the proposed Virginia Higher Education Performance Funding Model. It includes an overview of the proposed funding model, examples of likely funding scenarios (including determination of block grants, assumptions underlying performance funding for four-year and two-year institutions); information on deregulation/decentralization…

  19. Acceptability of the Nestorone®/Ethinyl Estradiol Contraceptive Vaginal Ring: Development of a Model; Implications for Introduction

    PubMed Central

    Merkatz, Ruth B.; Plagianos, Marlena; Hoskin, Elena; Cooney, Michael; Hewett, Paul C; Mensch, Barbara S.

    2015-01-01

    Objectives Develop and test a theoretical acceptability model for the Nestorone®/ethinyl estradiol (NES/EE) contraceptive vaginal ring (CVR); explore whether domains of use within the model predict satisfaction, method adherence and CVR continuation. Study Design Four domains of use were considered relative to outcome markers of acceptability, i.e. method satisfaction, adherence, and continuation. A questionnaire to evaluate subjects' experiences relative to the domains, their satisfaction (Likert scale), and adherence to instructions for use was developed and administered to 1036 women enrolled in a 13-cycle Phase 3 trial. Method continuation was documented from the trial database. Stepwise logistic regression (LR) analysis was conducted and odds ratios calculated to assess associations of satisfaction with questions from the 4 domains. Fisher's exact test was used to determine the association of satisfaction with outcome measures. Results A final acceptability model was developed based on the following determinants of CVR satisfaction: ease of use, side effects, expulsions/feeling the CVR, and sexual activity including physical effects during intercourse. Satisfaction was high (89%) and related to higher method adherence [OR 2.6 (1.3, 5.2)] and continuation [OR 5.5 (3.5, 8.4)]. According to the LR analysis, attributes of CVR use representing items from the 4 domains — finding it easy to remove, not complaining of side effects, not feeling the CVR while wearing it, and experiencing no change or an increase in sexual pleasure and/or frequency — were associated with higher odds of satisfaction. Conclusion Hypothesized domains of CVR use were related to satisfaction, which was associated with adherence and continuation. Results provide a scientific basis for introduction and future research. PMID:24993487
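
    The association between satisfaction and a binary outcome marker such as continuation can be tested from a 2x2 table as described; a minimal sketch using SciPy's Fisher's exact test (the counts below are illustrative, not the trial's data):

    ```python
    from scipy.stats import fisher_exact

    # Rows: satisfied / not satisfied; columns: continued / discontinued (illustrative counts).
    table = [[800, 120],
             [ 60,  56]]

    odds_ratio, p_value = fisher_exact(table)
    print(f"OR = {odds_ratio:.1f}, p = {p_value:.3g}")
    ```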

  20. Using combined hydrological variables for extracting functional signatures of catchments to better assess the acceptability of model structures in conceptual catchment modelling

    NASA Astrophysics Data System (ADS)

    Fovet, O.; Hrachowitz, M.; Ruiz, L.; Gascuel-Odoux, C.; Savenije, H.

    2013-12-01

    While most hydrological models reproduce the general flow dynamics of a system, they frequently fail to adequately mimic system-internal processes. This is likely to make them inadequate for simulating solute transport. For example, the hysteresis between storage and discharge, which is often observed in shallow hard-rock aquifers, is rarely well reproduced by models. One main reason is that this hysteresis has little weight in the calibration because objective functions are based on time series of individual variables. This reduces the ability of classical calibration/validation procedures to assess the relevance of the conceptual hypotheses associated with hydrological models. Calibrating models on variables derived from the combination of different individual variables (like stream discharge and groundwater levels) is a way to ensure that models will be accepted based on their consistency. Here we therefore test the value of this more systems-like approach for testing different hypotheses about the behaviour of a small experimental lowland catchment in French Brittany (ORE AgrHys), where a strong hysteresis is observed in the stream flow vs. shallow groundwater level relationship. Several conceptual models were applied to this site and calibrated using objective functions based on metrics of this hysteresis. The tested model structures differed with respect to the storage function in each reservoir, the storage-discharge function in each reservoir, the deep loss expressions (as constant or variable fraction), the number of reservoirs (from 1 to 4) and their organization (parallel, series). The observed hysteretic groundwater level-discharge relationship was not satisfactorily reproduced by most of the tested models, except for the most complex ones. These models were thus more consistent, and their underlying hypotheses are probably more realistic, even though their performance in simulating observed stream flow decreased. Selecting models based on such a systems-like approach is
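
    One way to turn the observed storage-discharge hysteresis into a combined-variable objective function, in the spirit described above, is to quantify the area of the loop traced in the groundwater-level versus discharge plane over an event and compare observed and simulated values. The loop-area metric below is an assumption for illustration, not the authors' exact formulation:

    ```python
    import numpy as np

    def loop_area(discharge, groundwater_level):
        """Area enclosed by the (Q, h) trajectory over one event, assumed to be
        approximately closed (shoelace formula on the polygon of samples)."""
        q = np.asarray(discharge, dtype=float)
        h = np.asarray(groundwater_level, dtype=float)
        return 0.5 * np.abs(np.dot(q, np.roll(h, -1)) - np.dot(h, np.roll(q, -1)))

    def hysteresis_objective(obs_q, obs_h, sim_q, sim_h):
        """Relative error of the simulated loop area against the observed one."""
        a_obs = loop_area(obs_q, obs_h)
        a_sim = loop_area(sim_q, sim_h)
        return abs(a_sim - a_obs) / a_obs
    ```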

  1. Transmutation Fuel Performance Code Thermal Model Verification

    SciTech Connect

    Gregory K. Miller; Pavel G. Medvedev

    2007-09-01

    The FRAPCON fuel performance code is being modified to model the performance of nuclear fuels of interest to the Global Nuclear Energy Partnership (GNEP). The present report documents the effort to verify the FRAPCON thermal model. It was found that, with minor modifications, the FRAPCON thermal model's temperature calculation agrees with that of the commercial software ABAQUS (Version 6.4-4). This report outlines the methodology of the verification, code input, and calculation results.
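
    A typical verification case for a fuel thermal model of this kind is steady-state radial conduction in a pellet with uniform volumetric heating, for which an analytic solution exists. This is the generic textbook case with assumed parameter values, not FRAPCON's or ABAQUS's actual input:

    ```python
    def pellet_temperature(r, radius, q_vol, k, t_surface):
        """Steady-state temperature at radius r in a cylindrical pellet with uniform
        volumetric heat rate q_vol (W/m^3) and constant conductivity k (W/m-K):
        T(r) = T_s + q_vol * (R^2 - r^2) / (4k)."""
        return t_surface + q_vol * (radius**2 - r**2) / (4.0 * k)

    # Centerline temperature for an illustrative pellet (all values are assumptions).
    print(pellet_temperature(r=0.0, radius=0.0041, q_vol=3.0e8, k=3.0, t_surface=700.0), "K")
    ```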

  2. Generic CSP Performance Model for NREL's System Advisor Model: Preprint

    SciTech Connect

    Wagner, M. J.; Zhu, G.

    2011-08-01

    The suite of concentrating solar power (CSP) modeling tools in NREL's System Advisor Model (SAM) includes technology performance models for parabolic troughs, power towers, and dish-Stirling systems. Each model provides the user with unique capabilities that are catered to typical design considerations seen in each technology. Since the scope of the various models is generally limited to common plant configurations, new CSP technologies, component geometries, and subsystem combinations can be difficult to model directly in the existing SAM technology models. To overcome the limitations imposed by representative CSP technology models, NREL has developed a 'Generic Solar System' (GSS) performance model for use in SAM. This paper discusses the formulation and performance considerations included in this model and verifies the model by comparing its results with more detailed models.

  3. Intern Performance in Three Supervisory Models

    ERIC Educational Resources Information Center

    Womack, Sid T.; Hanna, Shellie L.; Callaway, Rebecca; Woodall, Peggy

    2011-01-01

    Differences in intern performance, as measured by a Praxis III-similar instrument, were found between interns supervised in three supervisory models: the traditional triad model, the cohort model, and distance supervision. Candidates in this study's particular form of distance supervision were not as effective as teachers as candidates in…

  4. Photovoltaic performance models - A report card

    NASA Technical Reports Server (NTRS)

    Smith, J. H.; Reiter, L. R.

    1985-01-01

    Models for the analysis of photovoltaic (PV) systems' designs, implementation policies, and economic performance, have proliferated while keeping pace with rapid changes in basic PV technology and extensive empirical data compiled for such systems' performance. Attention is presently given to the results of a comparative assessment of ten well documented and widely used models, which range in complexity from first-order approximations of PV system performance to in-depth, circuit-level characterizations. The comparisons were made on the basis of the performance of their subsystem, as well as system, elements. The models fall into three categories in light of their degree of aggregation into subsystems: (1) simplified models for first-order calculation of system performance, with easily met input requirements but limited capability to address more than a small variety of design considerations; (2) models simulating PV systems in greater detail, encompassing types primarily intended for either concentrator-incorporating or flat plate collector PV systems; and (3) models not specifically designed for PV system performance modeling, but applicable to aspects of electrical system design. Models ignoring subsystem failure or degradation are noted to exclude operating and maintenance characteristics as well.

  5. Using the Technology Acceptance Model (TAM) to Conduct an Analysis of User Perceptions

    ERIC Educational Resources Information Center

    Young, Cheryl E.

    2010-01-01

    Contractor performance evaluation enables agents to make informed purchasing decisions, provide feedback to contractors/vendors, and help improve service quality and customer satisfaction. However, both public and private organizations sometimes fail to do so, as is the case at a division of a government organization located in a northeastern U.S.…

  6. Assessing the Determinants of Information Technology Adoption in Jamaica's Public Sector Using the Technology Acceptance Model

    ERIC Educational Resources Information Center

    Thompson, Thomas, II.

    2010-01-01

    Superior performance improvement and productivity gains are normally achieved when labor or ordinary capital is substituted by information technology (IT) in organizations. Consequently, on average, organizations have spent more than 50% of their total capital budget on IT, but have not gained commensurate return on their investments, partly due…

  7. Modeling time-lagged reciprocal psychological empowerment-performance relationships.

    PubMed

    Maynard, M Travis; Luciano, Margaret M; D'Innocenzo, Lauren; Mathieu, John E; Dean, Matthew D

    2014-11-01

    Employee psychological empowerment is widely accepted as a means for organizations to compete in increasingly dynamic environments. Previous empirical research and meta-analyses have demonstrated that employee psychological empowerment is positively related to several attitudinal and behavioral outcomes including job performance. While this research positions psychological empowerment as an antecedent influencing such outcomes, a close examination of the literature reveals that this relationship is primarily based on cross-sectional research. Notably, evidence supporting the presumed benefits of empowerment has failed to account for potential reciprocal relationships and endogeneity effects. Accordingly, using a multiwave, time-lagged design, we model reciprocal relationships between psychological empowerment and job performance using a sample of 441 nurses from 5 hospitals. Incorporating temporal effects in a staggered research design and using structural equation modeling techniques, our findings provide support for the conventional positive correlation between empowerment and subsequent performance. Moreover, accounting for the temporal stability of variables over time, we found support for empowerment levels as positive influences on subsequent changes in performance. Finally, we also found support for the reciprocal relationship, as performance levels were shown to relate positively to changes in empowerment over time. Theoretical and practical implications of the reciprocal psychological empowerment-performance relationships are discussed. (PsycINFO Database Record (c) 2014 APA, all rights reserved). PMID:25111249

  8. Improving Technology Acceptance Modeling for Disadvantaged Communities Using a Systems Engineering Approach

    ERIC Educational Resources Information Center

    Fletcher, Jordan L.

    2013-01-01

    Developing nations are poised to spend billions on information and communication technology (ICT) innovation in 2020. A study of the historical adoption of ICT in developing nations has indicated that their adoption patterns do not follow typical technology innovation adoption models. This study addressed the weaknesses found in existing…

  9. Parental modelling and prompting effects on acceptance of a novel fruit in 2-4-year-old children are dependent on children's food responsiveness.

    PubMed

    Blissett, Jackie; Bennett, Carmel; Fogel, Anna; Harris, Gillian; Higgs, Suzanne

    2016-02-14

    Few children consume the recommended portions of fruit or vegetables. This study examined the effects of parental physical prompting and parental modelling on children's acceptance of a novel fruit (NF) and examined the role of children's food-approach and food-avoidance traits in NF engagement and consumption. A total of 120 caregiver-child dyads (fifty-four girls, sixty-six boys) participated in this study. Dyads were allocated to one of the following three conditions: physical prompting but no modelling, physical prompting and modelling, or a modelling-only control condition. Dyads ate a standardised meal containing a portion of a fruit new to the child. Parents completed measures of children's food approach and avoidance. Willingness to try the NF was observed, and the amount of the NF consumed was measured. Physical prompting but no modelling resulted in greater physical refusal of the NF. There were main effects of enjoyment of food and food fussiness on acceptance. Food responsiveness interacted with condition such that children who were more food responsive had greater NF acceptance in the prompting and modelling conditions in comparison with the modelling-only condition. In contrast, children with low food responsiveness had greater acceptance in the modelling-only control condition than in the prompting but no modelling condition. Physical prompting in the absence of modelling is likely to be detrimental to NF acceptance. Parental use of physical prompting strategies, in combination with modelling of NF intake, may facilitate acceptance of NF, but only in food-responsive children. Modelling consumption best promotes acceptance in children with low food responsiveness. PMID:26603382

  10. Performance testing of the Silo Flow Model

    SciTech Connect

    Stadler, S.P.; O'Connor, D.; Gould, A.F.

    1994-12-31

    Several instruments are commercially available for on-line analysis of coal properties such as total moisture, ash, sulfur, and mineral matter content. These instruments have found use in coal cleaning and coal-fired utility applications. However, in many instances, the coal is stored in large bunkers or silos after on-line analysis, making the data gathered from on-line analysis a poor predictor of short-term coal quality due to the flow pattern and mixing within the silo. A computerized model, the Silo Flow Model, has been developed to model the flow of coal through a silo or bunker, thus providing a prediction of the output coal quality based on on-line measurements of the quality of coal entering the silo. A test procedure was developed and demonstrated to test the performance of the Silo Flow Model. The testing was performed using controlled addition of silver nitrate to the coal, in conjunction with surface profile measurements using an array of ultrasonic gauges and data acquired from plant instrumentation. Results obtained from initial testing provided estimates of flow-related parameters used in the Silo Flow Model. Similar test techniques are also used to compare predicted and actual silver content at the silo outlet as a measure of model performance. This paper describes test procedures used to validate the Silo Flow Model, the testing program, and the results obtained to date. The Silo Flow Model performance is discussed and compared against other modeling approaches.

  11. Analytical Ion Thruster Discharge Performance Model

    NASA Technical Reports Server (NTRS)

    Goebel, Dan M.; Wirz, Richard E.; Katz, Ira

    2006-01-01

    A particle and energy balance model of the plasma discharge in magnetic ring-cusp ion thrusters has been developed. The model follows the original work of Brophy in the development of global 0-D discharge models that utilize conservation of particles into and out of the thruster and conservation of energy into the discharge and out of the plasma in the form of charged particles to the walls and beam and plasma radiation. The present model is significantly expanded over Brophy's original work by including self-consistent calculations of the internal neutral pressure, electron temperature, primary electron density, electrostatic ion confinement (due to the ring-cusp fields), plasma potential, discharge stability, and time-dependent behavior during recycling. The model only requires information on the thruster geometry, ion optics performance and electrical inputs such as discharge voltage and currents, etc. to produce accurate performance curves of discharge loss versus mass utilization efficiency. The model has been benchmarked against the NEXIS Laboratory Model (LM) and Development Model (DM) thrusters, and successfully predicts the thruster discharge loss as a function of mass utilization efficiency for a variety of thrusters. The discharge performance model will be presented, and results showing ion thruster performance and stability will be given.
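
    The two axes of the performance curve mentioned above follow from simple definitions: discharge loss is the discharge power spent per ampere of beam current, and mass utilization is the fraction of propellant atoms leaving as beam ions. A minimal sketch, assuming xenon propellant and singly charged ions, with an illustrative operating point (not NEXIS data):

    ```python
    ELEMENTARY_CHARGE = 1.602e-19  # C
    XENON_ATOM_MASS = 2.18e-25     # kg

    def discharge_loss_ev_per_ion(discharge_current_a, discharge_voltage_v, beam_current_a):
        """Discharge loss = discharge power per beam ampere (W/A, numerically eV/ion)."""
        return discharge_current_a * discharge_voltage_v / beam_current_a

    def mass_utilization(beam_current_a, propellant_flow_kg_s):
        """Fraction of the propellant flow that leaves as (singly charged) beam ions."""
        ion_flow_kg_s = beam_current_a * XENON_ATOM_MASS / ELEMENTARY_CHARGE
        return ion_flow_kg_s / propellant_flow_kg_s

    print(discharge_loss_ev_per_ion(12.0, 25.0, 1.5))  # ~200 eV/ion
    print(mass_utilization(1.5, 2.3e-6))               # ~0.89
    ```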

  12. Cost and Performance Model for Photovoltaic Systems

    NASA Technical Reports Server (NTRS)

    Borden, C. S.; Smith, J. H.; Davisson, M. C.; Reiter, L. J.

    1986-01-01

    Lifetime cost and performance (LCP) model assists in assessment of design options for photovoltaic systems. LCP is simulation of performance, cost, and revenue streams associated with photovoltaic power systems connected to electric-utility grid. LCP provides user with substantial flexibility in specifying technical and economic environment of application.

  13. A comparison of model performance between AERMOD and AUSTAL2000.

    PubMed

    Langner, Christian; Klemm, Otto

    2011-06-01

    In this study the performance of the American Meteorological Society and U.S. Environmental Protection Agency Regulatory Model (AERMOD), a Gaussian plume model, is compared in five test cases with the German Dispersion Model according to the Technical Instructions on Air Quality Control (Ausbreitungsmodell gemäß der Technischen Anleitung zur Reinhaltung der Luft) (AUSTAL2000), a Lagrangian model. The test cases include different source types, rural and urban conditions, and flat and complex terrain. The predicted concentrations are analyzed and compared with field data. For evaluation, quantile-quantile plots were used. Further, a performance measure is applied that refers to the upper end of concentration levels, because this is the concentration range of utmost importance and interest for regulatory purposes. AERMOD generally predicted concentrations closer to the field observations. AERMOD and AUSTAL2000 performed considerably better when they included the emitting power plant building, indicating that the downwash effect near a source is an important factor. Although AERMOD handled mountainous terrain well, AUSTAL2000 significantly underestimated the concentrations under these conditions. In the urban test case AUSTAL2000 did not perform satisfactorily. This may be because AUSTAL2000, in contrast to AERMOD, does not use any algorithm for nightly turbulence as caused by the urban heat island effect. Both models performed acceptably for a nonbuoyant volume source. AUSTAL2000 had difficulties in stable conditions, resulting in severe underpredictions. This analysis indicates that AERMOD is the stronger model compared with AUSTAL2000 in cases with complex and urban terrain. The reasons for that seem to be AUSTAL2000's simplification of the meteorological input parameters and imprecision because of rounding errors. PMID:21751580
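
    The quantile-quantile comparison used for the evaluation pairs the ranked, rather than time-matched, observed and predicted concentrations. A minimal sketch of that pairing plus a simple upper-end ratio (the specific upper-end measure used in the study may differ):

    ```python
    import numpy as np

    def qq_pairs(observed, predicted):
        """Sort both samples so the i-th largest prediction is compared with the
        i-th largest observation (unpaired in time, paired in rank)."""
        return np.sort(np.asarray(observed, dtype=float)), np.sort(np.asarray(predicted, dtype=float))

    def upper_end_ratio(observed, predicted, quantile=0.9):
        """Ratio of predicted to observed concentration at an upper quantile,
        reflecting the regulatory focus on the highest concentrations."""
        return np.quantile(predicted, quantile) / np.quantile(observed, quantile)
    ```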

  14. Regulatory acceptance and use of 3R models for pharmaceuticals and chemicals: expert opinions on the state of affairs and the way forward.

    PubMed

    Schiffelers, Marie-Jeanne W A; Blaauboer, Bas J; Bakker, Wieger E; Beken, Sonja; Hendriksen, Coenraad F M; Koëter, Herman B W M; Krul, Cyrille

    2014-06-01

    Pharmaceuticals and chemicals are subjected to regulatory safety testing, which accounts for approximately 25% of laboratory animal use in Europe. This testing meets various objections and has led to the development of a range of 3R models to Replace, Reduce or Refine the animal models. However, these models must overcome many barriers before being accepted for regulatory risk management purposes. This paper describes the barriers, drivers, and options for optimizing this acceptance process, as identified by two expert panels, one on pharmaceuticals and one on chemicals. To untangle the complex acceptance process, the multilevel perspective on technology transitions is applied. This perspective defines influences at the micro-, meso- and macro level which need alignment to induce regulatory acceptance of a 3R model. This paper shows that there are many similar mechanisms within both sectors that prevent 3R models from becoming accepted for regulatory risk assessment and management. Shared barriers include the uncertainty about the value of the new 3R models (micro level), the lack of harmonization of regulatory requirements and acceptance criteria (meso level) and the high levels of risk aversion (macro level). In optimizing the process, commitment, communication, cooperation and coordination are identified as critical drivers. PMID:24534000

  15. Performance Engineering in the Community Atmosphere Model

    SciTech Connect

    Worley, P; Mirin, A; Drake, J; Sawyer, W

    2006-05-30

    The Community Atmosphere Model (CAM) is the atmospheric component of the Community Climate System Model (CCSM) and is the primary consumer of computer resources in typical CCSM simulations. Performance engineering has been an important aspect of CAM development throughout its existence. This paper briefly summarizes these efforts and their impacts over the past five years.

  16. Performance modeling of nonconcentrating solar detoxification systems

    SciTech Connect

    March, M.; Martin, A.; Saltiel, C.

    1995-03-01

    A detailed simulation model is developed for predicting the performance of solar detoxification systems. Concentration profiles are determined via a method of lines approach during sunlight hours for acquired and synthetic (simulating clear and cloudy days) ultraviolet radiation intensity data. Verification of the model is performed with comparison against indoor laboratory and outdoor field test results. Simulations are performed over a range of design parameters to examine system sensitivity. Discussions are focused on the determination of optimal sizing and operating conditions. 17 refs., 8 figs.

  17. The Effects of a Brief Acceptance-based Behavior Therapy vs. Traditional Cognitive Behavior Therapy for Public Speaking Anxiety: Differential Effects on Performance and Verbal Working Memory

    NASA Astrophysics Data System (ADS)

    Glassman, Lisa Hayley

    Individuals with public speaking phobia experience fear and avoidance that can cause extreme distress, impaired speaking performance, and associated problems in psychosocial functioning. Most extant interventions for public speaking phobia focus on the reduction of anxiety and avoidance, but neglect performance. Additionally, very little is known about the relationship between verbal working memory and social performance under conditions of high anxiety. The current study compared the efficacy of two cognitive behavioral treatments, traditional Cognitive Behavioral Therapy (tCBT) and acceptance-based behavior therapy (ABBT), in enhancing public speaking performance via coping with anxiety. Verbal working memory performance, as measured by the backwards digit span (BDS), was measured to explore the relationships between treatment type, anxiety, performance, and verbal working memory. We randomized 30 individuals with high public speaking anxiety to a 90-minute ABBT or tCBT intervention. As this pilot study was underpowered, results are examined in terms of effect sizes as well as statistical significance. Assessments took place at pre and post-intervention and included self-rated and objective anxiety measurements, a behavioral assessment, ABBT and tCBT process measures, and backwards digit span verbal working memory tests. In order to examine verbal working memory during different levels of anxiety and performance pressure, we gave each participant a backwards digit span task three times during each assessment: once under calm conditions, then again while experiencing anticipatory anxiety, and finally under conditions of acute social performance anxiety in front of an audience. Participants were asked to give a video-recorded speech in front of the audience at pre- and post-intervention to examine speech performance. Results indicated that all participants experienced a very large and statistically significant decrease in anxiety (both during the speech and BDS

  18. Exercise motives and positive body image in physically active college women and men: Exploring an expanded acceptance model of intuitive eating.

    PubMed

    Tylka, Tracy L; Homan, Kristin J

    2015-09-01

    The acceptance model of intuitive eating posits that body acceptance by others facilitates body appreciation and internal body orientation, which contribute to intuitive eating. Two domains of exercise motives (functional and appearance) may also be linked to these variables, and thus were integrated into the model. The model fit the data well for 406 physically active U.S. college students, although some pathways were stronger for women. Body acceptance by others directly contributed to higher functional exercise motives and indirectly contributed to lower appearance exercise motives through higher internal body orientation. Functional exercise motives positively, and appearance exercise motives inversely, contributed to body appreciation. Whereas body appreciation positively, and appearance exercise motives inversely, contributed to intuitive eating for women, only the latter association was evident for men. To benefit positive body image and intuitive eating, efforts should encourage body acceptance by others and emphasize functional and de-emphasize appearance exercise motives. PMID:26281958

  19. Model Performance Evaluation and Scenario Analysis (MPESA) Tutorial

    EPA Science Inventory

    This tool consists of two parts: model performance evaluation and scenario analysis (MPESA). The model performance evaluation consists of two components: model performance evaluation metrics and model diagnostics. These metrics provide modelers with statistical goodness-of-fit m...
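
    Typical goodness-of-fit metrics of the kind referred to here can be computed directly from paired observed and simulated series; a minimal sketch (the specific metrics implemented in MPESA may differ):

    ```python
    import numpy as np

    def goodness_of_fit(obs, sim):
        """Root-mean-square error, percent bias, and Nash-Sutcliffe efficiency."""
        obs = np.asarray(obs, dtype=float)
        sim = np.asarray(sim, dtype=float)
        rmse = np.sqrt(np.mean((sim - obs) ** 2))
        pbias = 100.0 * np.sum(sim - obs) / np.sum(obs)
        nse = 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)
        return {"RMSE": rmse, "PBIAS": pbias, "NSE": nse}

    print(goodness_of_fit([1.0, 2.0, 3.0, 4.0], [1.1, 1.8, 3.3, 3.9]))
    ```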

  20. Modeling and analysis of web portals performance

    NASA Astrophysics Data System (ADS)

    Abdul Rahim, Rahela; Ibrahim, Haslinda; Syed Yahaya, Sharipah Soaad; Khalid, Khairini

    2011-10-01

    The main objective of this study is to develop a queuing-theory-based model of web portal performance at the system level for a university. A system-level performance model views the system being modeled as a 'black box', characterized by the arrival rate of packets to the portal server and the service rate of the portal server. These two parameters are the key inputs for web portal performance metrics such as server utilization, average server throughput, average number of packets in the server and mean response time. The study assumes an infinite population and a finite queue. The proposed analytical model is simple, in the sense that it is easy to define and its results are quick to interpret, yet it still represents the real situation.
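
    With an infinite calling population and a finite queue, the two parameters named above (packet arrival rate and server service rate), together with a queue capacity, give the usual closed-form M/M/1/K results. A minimal sketch, assuming Poisson arrivals, exponential service, a capacity K (an assumed additional parameter), and utilization not exactly equal to 1:

    ```python
    def mm1k_metrics(arrival_rate, service_rate, capacity):
        """Closed-form M/M/1/K results: utilization, throughput, mean number in
        system, and mean response time (requires arrival_rate != service_rate)."""
        rho = arrival_rate / service_rate
        k = capacity
        p0 = (1.0 - rho) / (1.0 - rho ** (k + 1))      # probability of an empty system
        p_block = p0 * rho ** k                        # probability an arriving packet is dropped
        lam_eff = arrival_rate * (1.0 - p_block)       # accepted (effective) arrival rate
        mean_in_system = rho / (1.0 - rho) - (k + 1) * rho ** (k + 1) / (1.0 - rho ** (k + 1))
        mean_response = mean_in_system / lam_eff       # Little's law on accepted traffic
        utilization = lam_eff / service_rate           # fraction of time the server is busy
        return {"utilization": utilization, "throughput": lam_eff,
                "mean_in_system": mean_in_system, "mean_response_time": mean_response}

    # Example: 80 packets/s arriving, 100 packets/s service capacity, room for 20 packets.
    print(mm1k_metrics(80.0, 100.0, 20))
    ```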

  1. "Acceptance of the Limits of Knowability in Oneself and Others": Performative Politics and Relational Ethics in the Primary School Classroom

    ERIC Educational Resources Information Center

    Teague, Laura

    2015-01-01

    This paper takes up Judith Butler's calls to suspend the desire to completely know the other, and discusses these in relation to the pedagogic relationship in the classroom. It draws upon existing accounts of performative reinscription as a politics to disrupt exclusionary schooling practices and discusses these alongside Butler's theories of…

  2. 3D-manufactured patient-specific models of congenital heart defects for communication in clinical practice: feasibility and acceptability

    PubMed Central

    Biglino, Giovanni; Capelli, Claudio; Wray, Jo; Schievano, Silvia; Leaver, Lindsay-Kay; Khambadkone, Sachin; Giardini, Alessandro; Derrick, Graham; Jones, Alexander; Taylor, Andrew M

    2015-01-01

    Objectives To assess the communication potential of three-dimensional (3D) patient-specific models of congenital heart defects and their acceptability in clinical practice for cardiology consultations. Design This was a questionnaire-based study in which participants were randomised into two groups: the ‘model group’ received a 3D model of the cardiac lesion(s) being discussed during their appointment, while the ‘control group’ had a routine visit. Setting Outpatient clinic, cardiology follow-up visits. Participants 103 parents of children with congenital heart disease were recruited (parental age: 43±8 years; patient age: 12±6 years). In order to have a 3D model made, patients needed to have a recent cardiac MRI examination; this was the crucial inclusion criterion. Interventions Questionnaires were administered to the participants before and after the visits and an additional questionnaire was administered to the attending cardiologist. Main outcome measures Rating (1–10) for the liking of the 3D model, its usefulness and the clarity of the explanation received were recorded, as well as rating (1–10) of the parental understanding and their engagement according to the cardiologist. Furthermore, parental knowledge was assessed by asking them to mark diagrams, tick keywords and provide free text answers. The duration of consultations was recorded and parent feedback collected. Results Parents and cardiologists both found the models to be very useful and helpful in engaging the parents in discussing congenital heart defects. Parental knowledge was not associated with their level of education (p=0.2) and did not improve following their visit. Consultations involving 3D models lasted on average 5 min longer (p=0.02). Conclusions Patient-specific models can enhance engagement with parents and improve communication between cardiologists and parents, potentially impacting on parent and patient psychological adjustment following treatment. However, in

  3. Towards Systematic Benchmarking of Climate Model Performance

    NASA Astrophysics Data System (ADS)

    Gleckler, P. J.

    2014-12-01

    The process by which climate models are evaluated has evolved substantially over the past decade, with the Coupled Model Intercomparison Project (CMIP) serving as a centralizing activity for coordinating model experimentation and enabling research. Scientists with a broad spectrum of expertise have contributed to the CMIP model evaluation process, resulting in many hundreds of publications that have served as a key resource for the IPCC process. For several reasons, efforts are now underway to further systematize some aspects of the model evaluation process. First, some model evaluation can now be considered routine and should not require "re-inventing the wheel" or a journal publication simply to update results with newer models. Second, the benefit of CMIP research to model development has not been optimal because the publication of results generally takes several years and is usually not reproducible for benchmarking newer model versions. And third, there are now hundreds of model versions and many thousands of simulations, but there is no community-based mechanism for routinely monitoring model performance changes. An important change in the design of CMIP6 can help address these limitations. CMIP6 will include a small set of standardized experiments as an ongoing exercise (CMIP "DECK": ongoing Diagnostic, Evaluation and Characterization of Klima), so that modeling groups can submit them at any time and not be overly constrained by deadlines. In this presentation, efforts to establish routine benchmarking of existing and future CMIP simulations will be described. To date, some benchmarking tools have been made available to all CMIP modeling groups to enable them to readily compare with CMIP5 simulations during the model development process. A natural extension of this effort is to make results from all CMIP simulations widely available, including the results from newer models as soon as the simulations become available for research. Making the results from routine

  4. Regulatory acceptance of animal models of disease to support clinical trials of medicines and advanced therapy medicinal products.

    PubMed

    Cavagnaro, Joy; Silva Lima, Beatriz

    2015-07-15

    The utility of animal models of disease for assessing the safety of novel therapeutic modalities has become an increasingly important topic of discussion as research and development efforts focus on improving the predictive value of animal studies to support accelerated clinical development. Medicines are approved for marketing based upon a determination that their benefits outweigh foreseeable risks in specific indications, specific populations, and at specific dosages and regimens. No medicine is 100% safe. A medicine is less safe if the actual risks are greater than the predicted risks. The purpose of preclinical safety assessment is to understand the potential risks to aid clinical decision-making. Ideally preclinical studies should identify potential adverse effects and design clinical studies that will minimize their occurrence. Most regulatory documents delineate the utilization of conventional "normal" animal species to evaluate the safety risk of new medicines (i.e., new chemical entities and new biological entities). Animal models of human disease are commonly utilized to gain insight into the pathogenesis of disease and to evaluate efficacy but less frequently utilized in preclinical safety assessment. An understanding of the limitations of the animal disease models together with a better understanding of the disease and how toxicity may be impacted by the disease condition should allow for a better prediction of risk in the intended patient population. Importantly, regulatory authorities are becoming more willing to accept and even recommend data from experimental animal disease models that combine efficacy and safety to support clinical development. PMID:25814257

  5. Acceptance, values, and probability.

    PubMed

    Steel, Daniel

    2015-10-01

    This essay makes a case for regarding personal probabilities used in Bayesian analyses of confirmation as objects of acceptance and rejection. That in turn entails that personal probabilities are subject to the argument from inductive risk, which aims to show non-epistemic values can legitimately influence scientific decisions about which hypotheses to accept. In a Bayesian context, the argument from inductive risk suggests that value judgments can influence decisions about which probability models to accept for likelihoods and priors. As a consequence, if the argument from inductive risk is sound, then non-epistemic values can affect not only the level of evidence deemed necessary to accept a hypothesis but also degrees of confirmation themselves. PMID:26386533

  6. Performance modeling for large database systems

    NASA Astrophysics Data System (ADS)

    Schaar, Stephen; Hum, Frank; Romano, Joe

    1997-02-01

    One of the unique approaches Science Applications International Corporation took to meet performance requirements was to start the modeling effort during the proposal phase of the Interstate Identification Index/Federal Bureau of Investigations (III/FBI) project. The III/FBI Performance Model uses analytical modeling techniques to represent the III/FBI system. Inputs to the model include workloads for each transaction type, record size for each record type, number of records for each file, hardware envelope characteristics, engineering margins and estimates for software instructions, memory, and I/O for each transaction type. The model uses queuing theory to calculate the average transaction queue length. The model calculates a response time and the resources needed for each transaction type. Outputs of the model include the total resources needed for the system, a hardware configuration, and projected inherent and operational availability. The III/FBI Performance Model is used to evaluate what-if scenarios and allows a rapid response to engineering change proposals and technical enhancements.
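
    The per-transaction response-time and resource arithmetic described can be sketched with the standard open-system queueing approximation (total utilization from arrival rates and service demands, then R = S / (1 - U) for each class). The transaction names and numbers below are illustrative only, not the III/FBI workload:

    ```python
    def transaction_response_times(workloads, service_demands):
        """Per-transaction response times on a single shared server using
        R = S / (1 - U), where U is total utilization (valid for U < 1)."""
        utilization = sum(workloads[t] * service_demands[t] for t in workloads)
        if utilization >= 1.0:
            raise ValueError("offered load exceeds server capacity")
        return {t: service_demands[t] / (1.0 - utilization) for t in workloads}

    # Hypothetical arrival rates (tx/s) and service demands (s/tx).
    workloads = {"identity_search": 20.0, "record_update": 5.0}
    service_demands = {"identity_search": 0.02, "record_update": 0.05}
    print(transaction_response_times(workloads, service_demands))
    ```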

  7. Bay-annulated indigo (BAI) as an excellent electron accepting building block for high performance organic semiconductors

    DOEpatents

    Liu, Yi; He, Bo; Pun, Andrew

    2015-11-24

    A novel electron acceptor based on bay-annulated indigo (BAI) was synthesized and used for the preparation of a series of high performance donor-acceptor small molecules and polymers. The resulting materials possess low-lying LUMO energy level and small HOMO-LUMO gaps, while their films exhibited high crystallinity upon thermal treatment, commensurate with high field effect mobilities and ambipolar transfer characteristics.

  8. Bay-annulated indigo (BAI) as an excellent electron accepting building block for high performance organic semiconductors

    DOEpatents

    Liu, Yi; He, Bo; Pun, Andrew

    2016-04-19

    A novel electron acceptor based on bay-annulated indigo (BAI) was synthesized and used for the preparation of a series of high performance donor-acceptor small molecules and polymers. The resulting materials possess low-lying LUMO energy level and small HOMO-LUMO gaps, while their films exhibited high crystallinity upon thermal treatment, commensurate with high field effect mobilities and ambipolar transfer characteristics.

  9. Critical review of glass performance modeling

    SciTech Connect

    Bourcier, W.L.

    1994-07-01

    Borosilicate glass is to be used for permanent disposal of high-level nuclear waste in a geologic repository. Mechanistic chemical models are used to predict the rate at which radionuclides will be released from the glass under repository conditions. The most successful and useful of these models link reaction path geochemical modeling programs with a glass dissolution rate law that is consistent with transition state theory. These models have been used to simulate several types of short-term laboratory tests of glass dissolution and to predict the long-term performance of the glass in a repository. Although mechanistically based, the current models are limited by a lack of unambiguous experimental support for some of their assumptions. The most severe problem of this type is the lack of an existing validated mechanism that controls long-term glass dissolution rates. Current models can be improved by performing carefully designed experiments and using the experimental results to validate the rate-controlling mechanisms implicit in the models. These models should be supported with long-term experiments to be used for model validation. The mechanistic basis of the models should be explored by using modern molecular simulations such as molecular orbital and molecular dynamics to investigate both the glass structure and its dissolution process.
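
    The transition-state-theory style rate law that such models typically couple to reaction-path codes can be written as rate = k0 · exp(-Ea/RT) · a_H+^η · (1 - Q/K). A minimal sketch of that expression; all parameter values below are placeholders, not fitted glass data:

    ```python
    import math

    R_GAS = 8.314  # J/(mol K)

    def glass_dissolution_rate(k0, ea, temp_k, ph, eta, q_over_k):
        """TST-style rate law: intrinsic rate constant, Arrhenius temperature
        dependence, pH dependence, and chemical-affinity term (1 - Q/K)."""
        a_h = 10.0 ** (-ph)  # hydrogen-ion activity
        return k0 * math.exp(-ea / (R_GAS * temp_k)) * a_h ** eta * (1.0 - q_over_k)

    # Illustrative trend: the rate drops toward zero as the solution approaches
    # saturation with respect to the rate-controlling phase (Q -> K).
    for q_over_k in (0.0, 0.5, 0.99):
        print(q_over_k, glass_dissolution_rate(k0=1e8, ea=7.5e4, temp_k=363.0,
                                               ph=9.0, eta=0.4, q_over_k=q_over_k))
    ```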

  10. Examining the Intention to Use Technology among Pre-Service Teachers: An Integration of the Technology Acceptance Model and Theory of Planned Behavior

    ERIC Educational Resources Information Center

    Teo, Timothy

    2012-01-01

    This study examined pre-service teachers' self-reported intention to use technology. One hundred fifty-seven participants completed a survey questionnaire measuring their responses to six constructs from a research model that integrated the Technology Acceptance Model (TAM) and Theory of Planned Behavior (TPB). Structural equation modeling was…

  11. A statistical model for predicting muscle performance

    NASA Astrophysics Data System (ADS)

    Byerly, Diane Leslie De Caix

    The objective of these studies was to develop a capability for predicting muscle performance and fatigue to be utilized for both space- and ground-based applications. To develop this predictive model, healthy test subjects performed a defined, repetitive dynamic exercise to failure using a Lordex spinal machine. Throughout the exercise, surface electromyography (SEMG) data were collected from the erector spinae using a Mega Electronics ME3000 muscle tester and surface electrodes placed on both sides of the back muscle. These data were analyzed using a 5th order Autoregressive (AR) model and statistical regression analysis. It was determined that an AR derived parameter, the mean average magnitude of AR poles, significantly correlated with the maximum number of repetitions (designated Rmax) that a test subject was able to perform. Using the mean average magnitude of AR poles, a test subject's performance to failure could be predicted as early as the sixth repetition of the exercise. This predictive model has the potential to provide a basis for improving post-space flight recovery, monitoring muscle atrophy in astronauts and assessing the effectiveness of countermeasures, monitoring astronaut performance and fatigue during Extravehicular Activity (EVA) operations, providing pre-flight assessment of the ability of an EVA crewmember to perform a given task, improving the design of training protocols and simulations for strenuous International Space Station assembly EVA, and enabling EVA work task sequences to be planned enhancing astronaut performance and safety. Potential ground-based, medical applications of the predictive model include monitoring muscle deterioration and performance resulting from illness, establishing safety guidelines in the industry for repetitive tasks, monitoring the stages of rehabilitation for muscle-related injuries sustained in sports and accidents, and enhancing athletic performance through improved training protocols while reducing
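
    The AR-derived predictor described here, the mean magnitude of the poles of a 5th-order autoregressive model fitted to an SEMG window, can be sketched as follows. This is a generic least-squares AR fit for illustration, not the ME3000 software or the original analysis code:

    ```python
    import numpy as np

    def ar_pole_mean_magnitude(signal, order=5):
        """Fit an AR(order) model x[t] = a1*x[t-1] + ... + ap*x[t-p] + e[t]
        by least squares and return the mean magnitude of its poles."""
        x = np.asarray(signal, dtype=float)
        x = x - x.mean()
        y = x[order:]                                        # targets
        X = np.column_stack([x[order - j: len(x) - j]        # lagged regressors
                             for j in range(1, order + 1)])
        coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
        # Poles are the roots of z^p - a1*z^(p-1) - ... - ap = 0.
        poles = np.roots(np.concatenate(([1.0], -coeffs)))
        return np.mean(np.abs(poles))

    # Example with a synthetic oscillatory signal (illustrative only).
    t = np.arange(0, 2.0, 0.001)
    sig = np.sin(2 * np.pi * 50 * t) + 0.1 * np.random.default_rng(0).standard_normal(t.size)
    print(ar_pole_mean_magnitude(sig, order=5))
    ```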

  12. Estuarine modeling: Does a higher grid resolution improve model performance?

    EPA Science Inventory

    Ecological models are useful tools to explore cause-effect relationships, test hypotheses and perform management scenarios. A mathematical model, the Gulf of Mexico Dissolved Oxygen Model (GoMDOM), has been developed and applied to the Louisiana continental shelf of the northern ...

  13. Performance Benchmarking Tsunami Models for NTHMP's Inundation Mapping Activities

    NASA Astrophysics Data System (ADS)

    Horrillo, Juan; Grilli, Stéphan T.; Nicolsky, Dmitry; Roeber, Volker; Zhang, Joseph

    2015-03-01

    The coastal states and territories of the United States (US) are vulnerable to devastating tsunamis from near-field or far-field coseismic and underwater/subaerial landslide sources. Following the catastrophic 2004 Indian Ocean tsunami, the National Tsunami Hazard Mitigation Program (NTHMP) accelerated the development of public safety products for the mitigation of these hazards. In response to this initiative, US coastal states and territories sped up the process of developing/enhancing/adopting tsunami models that can be used for developing inundation maps and evacuation plans. One of NTHMP's requirements is that all operational and inundation-based numerical (O&I) models used for such purposes be properly validated against established standards to ensure the reliability of tsunami inundation maps as well as to achieve a basic level of consistency between parallel efforts. The validation of several O&I models was considered during a workshop held in 2011 at Texas A&M University (Galveston). This validation was performed based on the existing standard (OAR-PMEL-135), which provides a list of benchmark problems (BPs) covering various tsunami processes that models must meet to be deemed acceptable. Here, we summarize key approaches followed, results, and conclusions of the workshop. Eight distinct tsunami models were validated and cross-compared by using a subset of the BPs listed in the OAR-PMEL-135 standard. Of the several BPs available, only two, based on laboratory experiments, are detailed here for the sake of brevity, since they are considered sufficiently comprehensive. Average relative errors associated with expected parameter values such as maximum surface amplitude/runup are estimated. The level of agreement with the reference data, reasons for discrepancies between model results, and some of the limitations are discussed. In general, dispersive models were found to perform better than nondispersive models, but differences were relatively small, in part

  14. Modeling Windows in Energy Plus with Simple Performance Indices

    SciTech Connect

    Arasteh, Dariush; Kohler, Christian; Griffith, Brent

    2009-10-12

    The building energy simulation program, Energy Plus (E+), cannot use standard window performance indices (U, SHGC, VT) to model window energy impacts. Rather, E+ uses more accurate methods which require a physical description of the window. E+ needs to be able to accept U and SHGC indices as window descriptors because, often, these are all that is known about a window and because building codes, standards, and voluntary programs are developed using these terms. This paper outlines a procedure, developed for E+, which will allow it to use standard window performance indices to model window energy impacts. In this 'Block' model, a given U, SHGC, VT are mapped to the properties of a fictitious 'layer' in E+. For thermal conductance calculations, the 'Block' functions as a single solid layer. For solar optical calculations, the model begins by defining a solar transmittance (Ts) at normal incidence based on the SHGC. For properties at non-normal incidence angles, the 'Block' takes on the angular properties of multiple glazing layers; the number and type of layers defined by the U and SHGC. While this procedure is specific to E+, parts of it may have applicability to other window/building simulation programs.

  15. PV performance modeling workshop summary report.

    SciTech Connect

    Stein, Joshua S.; Tasca, Coryne Adelle; Cameron, Christopher P.

    2011-05-01

    During the development of a solar photovoltaic (PV) energy project, predicting expected energy production from a system is a key part of understanding system value. System energy production is a function of the system design and location, the mounting configuration, the power conversion system, and the module technology, as well as the solar resource. Even if all other variables are held constant, annual energy yield (kWh/kWp) will vary among module technologies because of differences in response to low-light levels and temperature. A number of PV system performance models have been developed and are in use, but little has been published on validation of these models or the accuracy and uncertainty of their output. With support from the U.S. Department of Energy's Solar Energy Technologies Program, Sandia National Laboratories organized a PV Performance Modeling Workshop in Albuquerque, New Mexico, September 22-23, 2010. The workshop was intended to address the current state of PV system models, develop a path forward for establishing best practices on PV system performance modeling, and set the stage for standardization of testing and validation procedures for models and input parameters. This report summarizes discussions and presentations from the workshop, as well as examines opportunities for collaborative efforts to develop objective comparisons between models and across sites and applications.

  16. Factors That Influence the Acceptance of Telemetry by Emergency Medical Technicians in Ambulances: An Application of the Extended Technology Acceptance Model

    PubMed Central

    Hwang, Ji Young; Kim, Ki Young

    2014-01-01

    Abstract Objective: The aim of the study was to verify the effects of patient factors perceived by emergency medical technicians (EMTs) as well as their social and organizational factors on prehospital telemetry use intention based on the technology use intention and elaboration likelihood models. Materials and Methods: This is a retrospective empirical study. Questionnaires were developed on the basis of clinical factors of 72,907 patients assessed by prehospital telemetry from January 1, 2009 to April 30, 2012 by reviewing their prehospital medical care records and in-hospital medical records. Questionnaires regarding the social and organizational factors of EMTs were created on the basis of a literature review. To verify which factors affect the utilization of telemetry, we developed a partial least-squares route model on the basis of each characteristic. In total, 136 EMTs who had experience in using prehospital telemetry were surveyed from April 1 to April 7, 2013. Reliability, validity, hypotheses, and the model goodness of fit of the study tools were tested. Results: The clinical factors of the patients (path coefficient=−0.12; t=2.38), subjective norm (path coefficient=0.18; t=2.63), and job fit (path coefficient=0.45; t=5.29) positively affected the perceived usefulness (p<0.010). Meanwhile, the clinical factors of the patients (path coefficients=−0.19; t=4.46), subjective norm (path coefficient=0.08; t=1.97), loyalty incentives (path coefficient=−0.17; t=3.83), job fit (path coefficient=−0.32; t=7.06), organizational facilitations (path coefficient=0.08; t=1.99), and technical factors (i.e., usefulness and ease of use) positively affected attitudes (path coefficient=0.10, 0.58; t=2.62, 5.81; p<0.010). Attitudes and perceived usefulness significantly positively affected use intention. Conclusions: Factors that influence the use of telemetry by EMTs in ambulances included patients' clinical factors, as well as complex organizational and

  17. Attributing spatial patterns of hydrological model performance

    NASA Astrophysics Data System (ADS)

    Eisner, S.; Malsy, M.; Flörke, M.

    2013-12-01

    Global hydrological models and land surface models are used to understand and simulate the global terrestrial water cycle. They are, in particular, applied to assess the current state of global water resources, to identify anthropogenic pressures on the global water system, and to assess impacts of global and climate change on water resources. Especially in data-scarce regions, the growing availability of remote sensing products, e.g. GRACE estimates of changes in terrestrial water storage, evaporation or soil moisture estimates, has added valuable information to force and constrain these models as they facilitate the calibration and validation of simulated states and fluxes other than stream flow at large spatial scales. Nevertheless, observed discharge records provide important evidence to evaluate the quality of water availability estimates and to quantify the uncertainty associated with these estimates. Most large scale modelling approaches are constrained by simplified physical process representations and they implicitly rely on the assumption that the same model structure is valid and can be applied globally. It is therefore important to understand why large scale hydrological models perform well or poorly in reproducing observed runoff and discharge fields in certain regions, and to explore and explain spatial patterns of model performance. We present an extensive evaluation of the global water model WaterGAP (Water - Global Assessment and Prognosis) to simulate 20th century discharges. The WaterGAP modeling framework comprises a hydrology model and several water use models and operates in its current version, WaterGAP3, on a 5 arc minute global grid. Runoff generated on the individual grid cells is routed along a global drainage direction map taking into account retention in natural surface water bodies, i.e. lakes and wetlands, as well as anthropogenic impacts, i.e. flow regulation and water abstraction for agriculture, industry and domestic purposes as

  18. User acceptance of mobile commerce: an empirical study in Macau

    NASA Astrophysics Data System (ADS)

    Lai, Ivan K. W.; Lai, Donny C. F.

    2014-06-01

    This study aims to examine the positive and negative factors that can significantly explain user acceptance of mobile commerce (m-commerce) in Macau. A technology acceptance model for m-commerce with five factors is constructed. The proposed model is tested using data collected from 219 respondents. Confirmatory factor analysis is performed to examine the reliability and validity of the model, and structural equation modelling is performed to assess the relationship between behaviour intention and each factor. The acceptance of m-commerce is influenced by factors including performance expectancy, social influence, facilitating conditions and privacy concern, while effort expectancy is insignificant in this case. The results of the study are useful for m-commerce service providers to adjust their strategies for promoting m-commerce services. This study contributes to the practice by providing a user technology acceptance model for m-commerce that can be used as a foundation for future research.
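
    As a small illustration of the reliability side of such a measurement check, the sketch below computes Cronbach's alpha for a block of survey items; this is a generic internal-consistency statistic, not necessarily the exact reliability criterion used in the study, and the data are hypothetical.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: (n_respondents, n_items) matrix of Likert-scale answers for one factor."""
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

# Hypothetical responses from 219 participants to three "performance expectancy" items.
rng = np.random.default_rng(0)
latent = rng.normal(size=(219, 1))
answers = np.clip(np.rint(3 + latent + rng.normal(0, 0.7, (219, 3))), 1, 5)
print(f"Cronbach's alpha: {cronbach_alpha(answers):.2f}")  # > 0.7 is commonly deemed acceptable
```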

  19. Cross-industry Performance Modeling: Toward Cooperative Analysis

    SciTech Connect

    Reece, Wendy Jane; Blackman, Harold Stabler

    1998-10-01

    One of the current unsolved problems in human factors is the difficulty in acquiring information from lessons learned and data collected among human performance analysts in different domains. There are several common concerns and generally accepted issues of importance for human factors, psychology and industry analysts of performance and safety. Among these are the need to incorporate lessons learned in design, to carefully consider implementation of new designs and automation, and the need to reduce human performance-based contributions to risk. In spite of shared concerns, there are several roadblocks to widespread sharing of data and lessons learned from operating experience and simulation, including the fact that very few publicly accessible data bases exist (Gertman & Blackman, 1994, and Kirwan, 1997). There is a need to draw together analysts and analytic methodologies to comprise a centralized source of data with sufficient detail to be meaningful while ensuring source anonymity. We propose that a generic source of performance data and a multi-domain data store may provide the first steps toward cooperative performance modeling and analysis across industries.

  20. Cross-Industry Performance Modeling: Toward Cooperative Analysis

    SciTech Connect

    H. S. Blackman; W. J. Reece

    1998-10-01

    One of the current unsolved problems in human factors is the difficulty in acquiring information from lessons learned and data collected among human performance analysts in different domains. There are several common concerns and generally accepted issues of importance for human factors, psychology and industry analysts of performance and safety. Among these are the need to incorporate lessons learned in design, to carefully consider implementation of new designs and automation, and the need to reduce human performance-based contributions to risk. In spite of shared concerns, there are several roadblocks to widespread sharing of data and lessons learned from operating experience and simulation, including the fact that very few publicly accessible data bases exist (Gertman & Blackman, 1994, and Kirwan, 1997). There is a need to draw together analysts and analytic methodologies to comprise a centralized source of data with sufficient detail to be meaningful while ensuring source anonymity. We propose that a generic source of performance data and a multi-domain data store may provide the first steps toward cooperative performance modeling and analysis across industries.

  1. Electrochemical Lithium Ion Battery Performance Model

    2007-03-29

    The Electrochemical Lithium Ion Battery Performance Model allows for the computer prediction of the basic thermal, electrical, and electrochemical performance of a lithium ion cell with simplified geometry. The model solves governing equations describing the movement of lithium ions within and between the negative and positive electrodes. The governing equations were first formulated by Fuller, Doyle, and Newman and published in J. Electrochemical Society in 1994. The present model solves the partial differential equations governing charge transfer kinetics and charge, species, and heat transports in a computationally-efficient manner using the finite volume method, with special consideration given for solving the model under conditions of applied current, voltage, power, and load resistance.
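
    To illustrate the finite volume discretization mentioned above in the simplest possible setting, the sketch below advances a 1-D Fickian diffusion equation (a stand-in for solid-phase lithium transport) with explicit time stepping; the actual model couples this kind of transport with charge-transfer kinetics, charge and heat transport, and the operating-condition constraints listed above.

```python
import numpy as np

def diffuse_fv(c0, D, dx, dt, steps):
    """Explicit finite-volume update for 1-D diffusion with zero-flux boundaries."""
    c = c0.copy()
    for _ in range(steps):
        flux = -D * np.diff(c) / dx   # Fickian flux at interior cell faces
        div = np.zeros_like(c)
        div[:-1] -= flux              # flux leaving cell i through its right face
        div[1:] += flux               # flux entering cell i through its left face
        c += dt / dx * div
    return c

# Hypothetical concentration step relaxing by diffusion (units are illustrative).
c0 = np.where(np.linspace(0, 1, 50) < 0.5, 1.0, 0.0)
D, dx = 1e-3, 1.0 / 50
dt = 0.4 * dx**2 / D                  # respect the explicit stability limit dt <= dx^2/(2D)
print(diffuse_fv(c0, D, dx, dt, steps=2000).round(2))
```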

  2. Measurement-based reliability/performability models

    NASA Technical Reports Server (NTRS)

    Hsueh, Mei-Chen

    1987-01-01

    Measurement-based models based on real error-data collected on a multiprocessor system are described. Model development from the raw error-data to the estimation of cumulative reward is also described. A workload/reliability model is developed based on low-level error and resource usage data collected on an IBM 3081 system during its normal operation in order to evaluate the resource usage/error/recovery process in a large mainframe system. Thus, both normal and erroneous behavior of the system are modeled. The results provide an understanding of the different types of errors and recovery processes. The measured data show that the holding times in key operational and error states are not simple exponentials and that a semi-Markov process is necessary to model the system behavior. A sensitivity analysis is performed to investigate the significance of using a semi-Markov process, as opposed to a Markov process, to model the measured system.
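
    The point about non-exponential holding times can be made concrete with a toy simulation: in the sketch below, a hypothetical two-state system alternates between a normal and an error/recovery state with Weibull-distributed holding times, which a plain Markov model (exponential holding times) could not represent.

```python
import numpy as np

rng = np.random.default_rng(0)
transition = {"normal": "error", "error": "normal"}   # toy two-state cycle

def holding_time(state):
    # Weibull holding times (shape < 1) are one simple non-exponential choice;
    # the scales are hypothetical: long normal periods, short error/recovery periods.
    scale = 100.0 if state == "normal" else 5.0
    return scale * rng.weibull(0.7)

t, state, history = 0.0, "normal", []
while t < 10_000.0:
    dt = holding_time(state)
    history.append((state, dt))
    t += dt
    state = transition[state]

up = sum(dt for s, dt in history if s == "normal")
print(f"fraction of time in the 'normal' state: {up / t:.3f}")
```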

  3. Hierarchical Model Validation of Symbolic Performance Models of Scientific Kernels

    SciTech Connect

    Alam, Sadaf R; Vetter, Jeffrey S

    2006-08-01

    Multi-resolution validation of hierarchical performance models of scientific applications is critical primarily for two reasons. First, the step-by-step validation determines the correctness of all essential components or phases in a science simulation. Second, a model that is validated at multiple resolution levels is the very first step to generate predictive performance models, for not only existing systems but also for emerging systems and future problem sizes. We present the design and validation of hierarchical performance models of two scientific benchmarks using a new technique called the modeling assertions (MA). Our MA prototype framework generates symbolic performance models that can be evaluated efficiently by generating the equivalent model representations in Octave and MATLAB. The multi-resolution modeling and validation is conducted on two contemporary, massively-parallel systems, XT3 and Blue Gene/L system. The workload distribution and the growth rates predictions generated by the MA models are confirmed by the experimental data collected on the MPP platforms. In addition, the physical memory requirements that are generated by the MA models are verified by the runtime values on the Blue Gene/L system, which has 512 MBytes and 256 MBytes physical memory capacity in its two unique execution modes.
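
    A symbolic performance model of the kind described above can be as simple as a closed-form expression in problem-size and machine parameters that is evaluated for any configuration of interest (here with SymPy rather than Octave/MATLAB); the expression below is a generic illustration, not one of the MA models from the paper.

```python
import sympy as sp

# Symbolic problem-size and machine parameters (generic, illustrative names).
N, P, t_flop, t_byte = sp.symbols("N P t_flop t_byte", positive=True)

# A toy symbolic model: O(N^3) floating-point work split across P processes,
# plus a memory-traffic term, analogous to how MA expressions capture workload growth.
time_model = (2 * N**3 / P) * t_flop + (8 * N**2) * t_byte
memory_model = 3 * 8 * N**2 / P       # bytes per process for three N x N arrays

# Evaluate the model for a concrete configuration, as one would before a run.
config = {N: 4096, P: 512, t_flop: 1e-10, t_byte: 5e-10}
print("predicted time (s):", float(time_model.subs(config)))
print("predicted memory per process (MB):", float(memory_model.subs(config)) / 1e6)
```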

  4. Pain acceptance and personal control in pain relief in two maternity care models: a cross-national comparison of Belgium and the Netherlands

    PubMed Central

    2010-01-01

    Background A cross-national comparison of Belgian and Dutch childbearing women allows us to gain insight into the relative importance of pain acceptance and personal control in pain relief in 2 maternity care models. Although Belgium and the Netherlands are neighbouring countries sharing the same language, political system and geography, they are characterised by a different organisation of health care, particularly in maternity care. In Belgium the medical risks of childbirth are emphasised but neutralised by a strong belief in the merits of the medical model. Labour pain is perceived as a needless inconvenience easily resolved by means of pain medication. In the Netherlands the midwifery model of care defines childbirth as a normal physiological process and family event. Labour pain is perceived as an ally in the birth process. Methods Women were invited to participate in the study by independent midwives and obstetricians during antenatal visits in 2004-2005. Two questionnaires were filled out by 611 women, one at 30 weeks of pregnancy and one within the first 2 weeks after childbirth either at home or in a hospital. However, only women having a hospital birth without obstetric intervention (N = 327) were included in this analysis. A logistic regression analysis has been performed. Results Labour pain acceptance and personal control in pain relief render pain medication use during labour less likely, especially if they occur together. Apart from this general result, we also find large country differences. Dutch women with a normal hospital birth are six times less likely to use pain medication during labour, compared to their Belgian counterparts. This country difference cannot be explained by labour pain acceptance, since - in contrast to our working hypothesis - Dutch and Belgian women giving birth in a hospital setting are characterised by a similar labour pain acceptance. Our findings suggest that personal control in pain relief can partially explain the
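
    A sketch of the kind of logistic regression reported above, using statsmodels on synthetic data; the variable names, coding, and coefficients are hypothetical, and only the structure (medication use regressed on pain acceptance, personal control, their interaction, and country) mirrors the abstract.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 327  # women with a non-interventional hospital birth, as in the analysis above
df = pd.DataFrame({
    "pain_acceptance": rng.integers(1, 6, n),   # hypothetical 1-5 scale
    "personal_control": rng.integers(1, 6, n),
    "netherlands": rng.integers(0, 2, n),       # 1 = Dutch, 0 = Belgian
})
# Synthetic outcome so the example runs; real coefficients come from the survey data.
logit_p = 1.5 - 0.4 * df.pain_acceptance - 0.3 * df.personal_control - 1.8 * df.netherlands
df["pain_medication"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

model = smf.logit("pain_medication ~ pain_acceptance * personal_control + netherlands",
                  data=df).fit()
print(model.summary())
print("odds ratios:\n", np.exp(model.params))
```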

  5. Measuring the Performance of Neural Models.

    PubMed

    Schoppe, Oliver; Harper, Nicol S; Willmore, Ben D B; King, Andrew J; Schnupp, Jan W H

    2016-01-01

    Good metrics of the performance of a statistical or computational model are essential for model comparison and selection. Here, we address the design of performance metrics for models that aim to predict neural responses to sensory inputs. This is particularly difficult because the responses of sensory neurons are inherently variable, even in response to repeated presentations of identical stimuli. In this situation, standard metrics (such as the correlation coefficient) fail because they do not distinguish between explainable variance (the part of the neural response that is systematically dependent on the stimulus) and response variability (the part of the neural response that is not systematically dependent on the stimulus, and cannot be explained by modeling the stimulus-response relationship). As a result, models which perfectly describe the systematic stimulus-response relationship may appear to perform poorly. Two metrics have previously been proposed which account for this inherent variability: Signal Power Explained (SPE, Sahani and Linden, 2003), and the normalized correlation coefficient (CCnorm, Hsu et al., 2004). Here, we analyze these metrics, and show that they are intimately related. However, SPE has no lower bound, and we show that, even for good models, SPE can yield negative values that are difficult to interpret. CCnorm is better behaved in that it is effectively bounded between -1 and 1, and values below zero are very rare in practice and easy to interpret. However, it was hitherto not possible to calculate CCnorm directly; instead, it was estimated using imprecise and laborious resampling techniques. Here, we identify a new approach that can calculate CCnorm quickly and accurately. As a result, we argue that it is now a better choice of metric than SPE to accurately evaluate the performance of neural models. PMID:26903851
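
    A compact sketch of the normalized correlation coefficient, assuming the Sahani-Linden definition of signal power and the CCnorm definition attributed here to this paper; treat it as an illustration rather than the authors' reference implementation.

```python
import numpy as np

def cc_norm(resp, pred):
    """resp: (n_trials, T) responses to repeated stimuli; pred: (T,) model prediction."""
    n = resp.shape[0]
    ybar = resp.mean(axis=0)                                  # trial-averaged response
    # Sahani-Linden signal power: the stimulus-locked part of the response variance.
    sp = (n * ybar.var(ddof=1) - resp.var(axis=1, ddof=1).mean()) / (n - 1)
    return np.cov(ybar, pred)[0, 1] / np.sqrt(pred.var(ddof=1) * sp)

# Toy check: a "perfect" model of the underlying signal scores close to 1
# despite trial-to-trial noise, whereas the raw correlation coefficient does not.
rng = np.random.default_rng(0)
signal = np.sin(np.linspace(0, 8 * np.pi, 400))
trials = signal + rng.normal(0, 1.0, size=(20, 400))
print("CC_abs :", np.corrcoef(trials.mean(axis=0), signal)[0, 1])
print("CC_norm:", cc_norm(trials, signal))
```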

  6. Measuring the Performance of Neural Models

    PubMed Central

    Schoppe, Oliver; Harper, Nicol S.; Willmore, Ben D. B.; King, Andrew J.; Schnupp, Jan W. H.

    2016-01-01

    Good metrics of the performance of a statistical or computational model are essential for model comparison and selection. Here, we address the design of performance metrics for models that aim to predict neural responses to sensory inputs. This is particularly difficult because the responses of sensory neurons are inherently variable, even in response to repeated presentations of identical stimuli. In this situation, standard metrics (such as the correlation coefficient) fail because they do not distinguish between explainable variance (the part of the neural response that is systematically dependent on the stimulus) and response variability (the part of the neural response that is not systematically dependent on the stimulus, and cannot be explained by modeling the stimulus-response relationship). As a result, models which perfectly describe the systematic stimulus-response relationship may appear to perform poorly. Two metrics have previously been proposed which account for this inherent variability: Signal Power Explained (SPE, Sahani and Linden, 2003), and the normalized correlation coefficient (CCnorm, Hsu et al., 2004). Here, we analyze these metrics, and show that they are intimately related. However, SPE has no lower bound, and we show that, even for good models, SPE can yield negative values that are difficult to interpret. CCnorm is better behaved in that it is effectively bounded between −1 and 1, and values below zero are very rare in practice and easy to interpret. However, it was hitherto not possible to calculate CCnorm directly; instead, it was estimated using imprecise and laborious resampling techniques. Here, we identify a new approach that can calculate CCnorm quickly and accurately. As a result, we argue that it is now a better choice of metric than SPE to accurately evaluate the performance of neural models. PMID:26903851

  7. Calibration Modeling Methodology to Optimize Performance for Low Range Applications

    NASA Technical Reports Server (NTRS)

    McCollum, Raymond A.; Commo, Sean A.; Parker, Peter A.

    2010-01-01

    Calibration is a vital process in characterizing the performance of an instrument in an application environment and seeks to obtain acceptable accuracy over the entire design range. Often, project requirements specify a maximum total measurement uncertainty, expressed as a percent of full-scale. However in some applications, we seek to obtain enhanced performance at the low range, therefore expressing the accuracy as a percent of reading should be considered as a modeling strategy. For example, it is common to desire to use a force balance in multiple facilities or regimes, often well below its designed full-scale capacity. This paper presents a general statistical methodology for optimizing calibration mathematical models based on a percent of reading accuracy requirement, which has broad application in all types of transducer applications where low range performance is required. A case study illustrates the proposed methodology for the Mars Entry Atmospheric Data System that employs seven strain-gage based pressure transducers mounted on the heatshield of the Mars Science Laboratory mission.
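
    One simple way to realize a percent-of-reading objective in a calibration fit is to weight each point inversely by its reading, so that relative rather than absolute residuals are minimized; the sketch below uses NumPy's weighted polynomial fit with hypothetical data and only illustrates the idea, not the statistical methodology developed in the paper.

```python
import numpy as np

def fit_percent_of_reading(raw, reference, degree=2):
    # np.polyfit minimizes sum (w_i * residual_i)^2, so w ~ 1/reading
    # penalizes relative (percent-of-reading) error rather than absolute error.
    w = 1.0 / np.maximum(np.abs(reference), 1e-9)
    return np.polyfit(raw, reference, degree, w=w)

raw = np.array([0.01, 0.05, 0.1, 0.3, 0.6, 1.0])     # hypothetical sensor output
ref = np.array([0.10, 0.52, 1.01, 3.05, 6.02, 10.1]) # hypothetical standard values
coeffs = fit_percent_of_reading(raw, ref)
fitted = np.polyval(coeffs, raw)
print(np.abs(fitted - ref) / ref * 100)              # percent-of-reading residuals
```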

  8. Computer modeling of heat pipe performance

    NASA Technical Reports Server (NTRS)

    Peterson, G. P.

    1983-01-01

    A parametric study of the defining equations which govern the steady state operational characteristics of the Grumman monogroove dual passage heat pipe is presented. These defining equations are combined to develop a mathematical model which describes and predicts the operational and performance capabilities of a specific heat pipe given the necessary physical characteristics and working fluid. Included is a brief review of the current literature, a discussion of the governing equations, and a description of both the mathematical and computer model. Final results of preliminary test runs of the model are presented and compared with experimental tests on actual prototypes.

  9. Climate Modeling using High-Performance Computing

    SciTech Connect

    Mirin, A A

    2007-02-05

    The Center for Applied Scientific Computing (CASC) and the LLNL Climate and Carbon Science Group of Energy and Environment (E and E) are working together to improve predictions of future climate by applying the best available computational methods and computer resources to this problem. Over the last decade, researchers at the Lawrence Livermore National Laboratory (LLNL) have developed a number of climate models that provide state-of-the-art simulations on a wide variety of massively parallel computers. We are now developing and applying a second generation of high-performance climate models. Through the addition of relevant physical processes, we are developing an earth systems modeling capability as well.

  10. Optical Storage Performance Modeling and Evaluation.

    ERIC Educational Resources Information Center

    Behera, Bailochan; Singh, Harpreet

    1990-01-01

    Evaluates different types of storage media for long-term archival storage of large amounts of data. Existing storage media are reviewed, including optical disks, optical tape, magnetic storage, and microfilm; three models are proposed based on document storage requirements; performance analysis is considered; and cost effectiveness is discussed.…

  11. Acceptance speech.

    PubMed

    Carpenter, M

    1994-01-01

    In Bangladesh, the assistant administrator of USAID gave an acceptance speech at an awards ceremony on the occasion of the 25th anniversary of oral rehydration solution (ORS). The ceremony celebrated the key role of the International Centre for Diarrhoeal Disease Research, Bangladesh (ICDDR,B) in the discovery of ORS. Its research activities over the last 25 years have brought ORS to every village in the world, preventing more than a million deaths each year. ORS is the most important medical advance of the 20th century. It is affordable and client-oriented, a true appropriate technology. USAID has provided more than US$ 40 million to ICDDR,B for diarrheal disease and measles research, urban and rural applied family planning and maternal and child health research, and vaccine development. ICDDR,B began as the relatively small Cholera Research Laboratory and has grown into an acclaimed international center for health, family planning, and population research. It leads the world in diarrheal disease research. ICDDR,B is the leading center for applied health research in South Asia. It trains public health specialists from around the world. The government of Bangladesh and the international donor community have actively joined in support of ICDDR,B. The government applies the results of ICDDR,B research to its programs to improve the health and well-being of Bangladeshis. ICDDR,B now also studies acute respiratory diseases and measles. Population and health comprise 1 of USAID's 4 strategic priorities, the others being economic growth, environment, and democracy, USAID promotes people's participation in these 4 areas and in the design and implementation of development projects. USAID is committed to the use and improvement of ORS and to complementary strategies that further reduce diarrhea-related deaths. Continued collaboration with a strong user perspective and integrated services will lead to sustainable development. PMID:12345470

  12. Acceptance speech.

    PubMed

    Yusuf, C K

    1994-01-01

    I am proud and honored to accept this award on behalf of the Government of Bangladesh, and the millions of Bangladeshi children saved by oral rehydration solution. The Government of Bangladesh is grateful for this recognition of its commitment to international health and population research and cost-effective health care for all. The Government of Bangladesh has already made remarkable strides forward in the health and population sector, and this was recognized in UNICEF's 1993 "State of the World's Children". The national contraceptive prevalence rate, at 40%, is higher than that of many developed countries. It is appropriate that Bangladesh, where ORS was discovered, has the largest ORS production capacity in the world. It was remarkable that after the devastating cyclone in 1991, the country was able to produce enough ORS to meet the needs and remain self-sufficient. Similarly, Bangladesh has one of the most effective, flexible and efficient control of diarrheal disease and epidemic response program in the world. Through the country, doctors have been trained in diarrheal disease management, and stores of ORS are maintained ready for any outbreak. Despite grim predictions after the 1991 cyclone and the 1993 floods, relatively few people died from diarrheal disease. This is indicative of the strength of the national program. I want to take this opportunity to acknowledge the contribution of ICDDR, B and the important role it plays in supporting the Government's efforts in the health and population sector. The partnership between the Government of Bangladesh and ICDDR, B has already borne great fruit, and I hope and believe that it will continue to do so for many years in the future. Thank you. PMID:12345479

  13. Health research access to personal confidential data in England and Wales: assessing any gap in public attitude between preferable and acceptable models of consent.

    PubMed

    Taylor, Mark J; Taylor, Natasha

    2014-12-01

    England and Wales are moving toward a model of 'opt out' for use of personal confidential data in health research. Existing research does not make clear how acceptable this move is to the public. While people are typically supportive of health research, when asked to describe the ideal level of control there is a marked lack of consensus over the preferred model of consent (e.g. explicit consent, opt out etc.). This study sought to investigate a relatively unexplored difference between the consent model that people prefer and that which they are willing to accept. It also sought to explore any reasons for such acceptance. A mixed methods approach was used to gather data, incorporating a structured questionnaire and in-depth focus group discussions led by an external facilitator. The sampling strategy was designed to recruit people with different involvement in the NHS but typically with experience of NHS services. Three separate focus groups were carried out over three consecutive days. The central finding is that people are typically willing to accept models of consent other than that which they would prefer. Such acceptance is typically conditional upon a number of factors, including: security and confidentiality, no inappropriate commercialisation or detrimental use, transparency, independent overview, the ability to object to any processing considered to be inappropriate or particularly sensitive. This study suggests that most people would find research use without the possibility of objection to be unacceptable. However, the study also suggests that people who would prefer to be asked explicitly before data were used for purposes beyond direct care may be willing to accept an opt out model of consent if the reasons for not seeking explicit consent are accessible to them and they trust that data is only going to be used under conditions, and with safeguards, that they would consider to be acceptable even if not preferable. PMID:26085451

  14. Performance model of molten carbonate fuel cell

    SciTech Connect

    Matsumoto, S.; Sasaki, A.; Urushibata, H.; Tanaka, T. )

    1990-06-01

    A performance model of a molten carbonate fuel cell (MCFC), an electrochemical energy conversion device for electric power generation, is discussed. The authors' purpose is to improve the predictive ability of the MCFC model and to investigate the impact of MCFC characteristics in fuel cell system simulations. Basic data are obtained experimentally from single-cell tests. The authors pay special attention to the overall MCFC characteristics with respect to oxidant composition. A correlation formula relating cell voltage to oxygen and carbon dioxide partial pressures is derived from the experimental data. After three types of MCFC system option are assumed, trade-off studies are made based on the performance models.
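
    In the spirit of the correlation described above, the sketch below fits a generic log-linear relation between cell voltage and the cathode-gas partial pressures by least squares; the data and coefficients are synthetic, and the authors' actual correlation form is not reproduced here.

```python
import numpy as np

# Hypothetical single-cell test matrix and a synthetic voltage response.
rng = np.random.default_rng(0)
p_o2 = rng.uniform(0.05, 0.25, 40)    # atm, illustrative oxidant compositions
p_co2 = rng.uniform(0.05, 0.30, 40)
v = 0.80 + 0.03 * np.log(p_o2) + 0.04 * np.log(p_co2) + rng.normal(0, 0.002, 40)

# Least-squares fit of V = a + b*ln(P_O2) + c*ln(P_CO2).
A = np.column_stack([np.ones_like(v), np.log(p_o2), np.log(p_co2)])
coeff, *_ = np.linalg.lstsq(A, v, rcond=None)
print("V ~ {:.3f} + {:.3f}*ln(P_O2) + {:.3f}*ln(P_CO2)".format(*coeff))
```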

  15. Testing a model for parental acceptance of human papillomavirus vaccine in 9- to 18-year-old girls: a theory-guided study.

    PubMed

    Reynolds, Diane; O'Connell, Kathleen A

    2012-12-01

    Gardasil is the first vaccine developed to prevent cervical cancer and other diseases caused by certain types of genital human papillomavirus in females, but little is known about parental acceptance of this vaccine. The purpose of this study was to test a model that predicts intention to vaccinate that includes constructs from the health belief model and the theory of reasoned action. PMID:22020360

  16. High temperature furnace modeling and performance verifications

    NASA Technical Reports Server (NTRS)

    Smith, James E., Jr.

    1992-01-01

    Analytical, numerical, and experimental studies were performed on two classes of high temperature materials processing sources for their potential use as directional solidification furnaces. The research concentrated on a commercially available high temperature furnace using a zirconia ceramic tube as the heating element and an Arc Furnace based on a tube welder. The first objective was to assemble the zirconia furnace and construct parts needed to successfully perform experiments. The second objective was to evaluate the zirconia furnace performance as a directional solidification furnace element. The third objective was to establish a data base on materials used in the furnace construction, with particular emphasis on emissivities, transmissivities, and absorptivities as functions of wavelength and temperature. 1-D and 2-D spectral radiation heat transfer models were developed for comparison with standard modeling techniques and were used to predict wall and crucible temperatures. The fourth objective addressed the development of a SINDA model for the Arc Furnace, which was used to design sample holders and to estimate cooling media temperatures for the steady state operation of the furnace. Finally, the fifth objective addressed the initial performance evaluation of the Arc Furnace and associated equipment for directional solidification. Results of these objectives are presented.

  17. Laboratories for the 21st Century: Best Practices; Modeling Exhaust Dispersion for Specifying Acceptable Exhaust/Intake Design (Brochure)

    SciTech Connect

    Not Available

    2011-09-01

    This guide provides general information on specifying acceptable exhaust and intake designs. It also provides various quantitative approaches that can be used to determine expected concentration levels resulting from exhaust system emissions. In addition, the guide describes methodologies that can be employed to operate laboratory exhaust systems in a safe and energy efficient manner by using variable air volume (VAV) technology. The guide, one in a series on best practices for laboratories, was produced by Laboratories for the 21st Century (Labs21), a joint program of the U.S. Environmental Protection Agency (EPA) and the U.S. Department of Energy (DOE). Geared toward architects, engineers, and facility managers, the guides contain information about technologies and practices to use in designing, constructing, and operating safe, sustainable, high-performance laboratories. Studies show a direct relationship between indoor air quality and the health and productivity of building occupants. Historically, the study and protection of indoor air quality focused on emission sources emanating from within the building. For example, to ensure that the worker is not exposed to toxic chemicals, 'as manufactured' and 'as installed' containment specifications are required for fume hoods. However, emissions from external sources, which may be re-ingested into the building through closed circuiting between the building's exhaust stacks and air intakes, are an often overlooked aspect of indoor air quality.

  18. Performance comparison for Barnes model 12-1000, Exotech model 100, and Ideas Inc. Biometer Mark 2

    NASA Technical Reports Server (NTRS)

    Robinson, B. (Principal Investigator)

    1981-01-01

    Results of tests show that all channels of all instruments, except channel 3 of the Biometer Mark 2, were stable, were linear in their response to input signals, and were adequately stable in response to temperature changes. The Biometer Mark 2 is labelled with an inappropriate description of the units measured, and its dynamic range is inappropriate for field measurements, causing unnecessarily high fractional errors. This instrument is, therefore, quantization limited. The dynamic range and noise performance of the Model 12-1000 are appropriate for remote sensing field research. The field of view and performance of the Model 100A and the Model 12-1000 are satisfactory. The Biometer Mark 2 has not, as yet, been satisfactorily equipped with an acceptable field-of-view determining device. Neither the widely used aperture plate nor the 24 deg cone is acceptable.

  19. Modeling Topaz-II system performance

    SciTech Connect

    Lee, H.H.; Klein, A.C. )

    1993-01-01

    The US acquisition of the Topaz-II in-core thermionic space reactor test system from Russia provides a good opportunity to perform a comparison of the Russian-reported data and the results from computer codes such as MCNP (Ref. 3) and TFEHX (Ref. 4). The comparison study includes both neutronic and thermionic performance analyses. The Topaz II thermionic reactor is modeled with MCNP using actual Russian dimensions and parameters. The computation of the neutronic performance considers several important aspects such as the fuel enrichment and location of the thermionic fuel elements (TFEs) in the reactor core. The neutronic analysis included the calculation of both radial and axial power distributions, which are then used in the TFEHX code for electrical performance. The reactor modeled consists of 37 single-cell TFEs distributed in a 13-cm-radius zirconium hydride block surrounded by 8 cm of beryllium metal reflector. The TFEs use 90% enriched ²³⁵U and molybdenum coated with a thin layer of ¹⁸⁴W for the emitter surface. Electrons emitted are captured by a collector surface, with a gap filled with cesium vapor between the collector and emitter surfaces. The collector surface is electrically insulated with alumina. Liquid NaK provides the cooling system for the TFEs. The axial thermal power distribution is obtained by dividing the TFE into 40 axial nodes. Comparison of the true axial power distribution with that produced by electrical heaters was also performed.

  20. Statistical label fusion with hierarchical performance models

    PubMed Central

    Asman, Andrew J.; Dagley, Alexander S.; Landman, Bennett A.

    2014-01-01

    Label fusion is a critical step in many image segmentation frameworks (e.g., multi-atlas segmentation) as it provides a mechanism for generalizing a collection of labeled examples into a single estimate of the underlying segmentation. In the multi-label case, typical label fusion algorithms treat all labels equally – fully neglecting the known, yet complex, anatomical relationships exhibited in the data. To address this problem, we propose a generalized statistical fusion framework using hierarchical models of rater performance. Building on the seminal work in statistical fusion, we reformulate the traditional rater performance model from a multi-tiered hierarchical perspective. This new approach provides a natural framework for leveraging known anatomical relationships and accurately modeling the types of errors that raters (or atlases) make within a hierarchically consistent formulation. Herein, we describe several contributions. First, we derive a theoretical advancement to the statistical fusion framework that enables the simultaneous estimation of multiple (hierarchical) performance models within the statistical fusion context. Second, we demonstrate that the proposed hierarchical formulation is highly amenable to the state-of-the-art advancements that have been made to the statistical fusion framework. Lastly, in an empirical whole-brain segmentation task we demonstrate substantial qualitative and significant quantitative improvement in overall segmentation accuracy. PMID:24817809
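
    For orientation, the sketch below shows the most basic, non-hierarchical form of label fusion (per-voxel weighted voting over rater label maps); the paper's contribution is a hierarchical rater-performance model that generalizes exactly this kind of fusion step.

```python
import numpy as np

def weighted_vote(labels, weights, n_classes):
    """labels: (n_raters, n_voxels) integer label maps; weights: per-rater trust."""
    votes = np.zeros((n_classes, labels.shape[1]))
    for r, w in enumerate(weights):
        votes[labels[r], np.arange(labels.shape[1])] += w
    return votes.argmax(axis=0)

# Three toy atlases labeling five voxels with classes {0, 1, 2}.
raters = np.array([[0, 1, 1, 2, 2],
                   [0, 1, 2, 2, 2],
                   [0, 0, 1, 2, 1]])
print(weighted_vote(raters, weights=[1.0, 1.0, 0.5], n_classes=3))
```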

  1. System Advisor Model: Flat Plate Photovoltaic Performance Modeling Validation Report

    SciTech Connect

    Freeman, J.; Whitmore, J.; Kaffine, L.; Blair, N.; Dobos, A. P.

    2013-12-01

    The System Advisor Model (SAM) is a free software tool that performs detailed analysis of both system performance and system financing for a variety of renewable energy technologies. This report provides detailed validation of the SAM flat plate photovoltaic performance model by comparing SAM-modeled PV system generation data to actual measured production data for nine PV systems ranging from 75 kW to greater than 25 MW in size. The results show strong agreement between SAM predictions and field data, with annualized prediction error below 3% for all fixed tilt cases and below 8% for all one axis tracked cases. The analysis concludes that snow cover and system outages are the primary sources of disagreement, and other deviations resulting from seasonal biases in the irradiation models and one axis tracking issues are discussed in detail.

  2. A Patient Survey Assessing the Awareness and Acceptability of the Emergency Care Summary and Its Consent Model in Scotland

    PubMed Central

    Johnstone, Chris; McCartney, Gerry

    2010-01-01

    Background The Emergency Care Summary (ECS) was introduced in 2006 to allow aspects of the general practitioner (GP; family doctor, equivalent to primary care physician) medical record to be viewed in hospitals and out-of-hours centers in Scotland. Records were automatically uploaded unless patients actively opted out. This study investigated patient awareness and acceptance of this process. Methods This was a questionnaire survey of patients in a GP surgery (office) in Paisley, Scotland. Results Survey results indicated that 42 percent of patients were aware of the ECS, and 16 percent said that they recognized the leaflet posted to households. Of those who recognized the leaflet, 92 percent said they were happy for their record to be part of the system, while the others did not realize their record was to be included. Having read the leaflet, 97 percent said that they were happy for their record to be included in the ECS. Conclusions This study shows that most patients were not aware of the Emergency Care Summary or did not remember seeing the leaflet posted to households. Having read the leaflet, the vast majority of patients were happy for their records to be included in the system. The low awareness of the ECS calls into question the validity of an implied consent model using an information leaflet distributed by post. PMID:20697469

  3. Acceptance and Commitment Therapy and Contextual Behavioral Science: Examining the Progress of a Distinctive Model of Behavioral and Cognitive Therapy

    PubMed Central

    Hayes, Steven C.; Levin, Michael E.; Plumb-Vilardaga, Jennifer; Villatte, Jennifer L.; Pistorello, Jacqueline

    2012-01-01

    A number of recent authors have compared acceptance and commitment therapy (ACT) and traditional cognitive behavior therapy (CBT). The present article describes ACT as a distinct and unified model of behavior change, linked to a specific strategy of scientific development, which we term “contextual behavioral science.” We outline the empirical progress of ACT and describe its distinctive development strategy. A contextual behavioral science approach is an inductive attempt to build more adequate psychological systems based on philosophical clarity; the development of basic principles and theories; the development of applied theories linked to basic ones; techniques and components linked to these processes and principles; measurement of theoretically key processes; an emphasis on mediation and moderation in the analysis of applied impact; an interest in effectiveness, dissemination, and training; empirical testing of the research program across a broad range of areas and levels of analysis; and the creation of a more effective scientific and clinical community. We argue that this is a reasonable approach, focused on long-term progress, and that in broad terms it seems to be working. ACT is not hostile to traditional CBT, and is not directly buoyed by whatever weaknesses traditional CBT may have. ACT should be measured at least in part against its own goals as specified by its own developmental strategy. PMID:23611068

  4. Quantitative and qualitative variation of fat in model vanilla custard desserts: effects on sensory properties and consumer acceptance.

    PubMed

    Tomaschunas, Maja; Köhn, Ehrhard; Bennwitz, Petra; Hinrichs, Jörg; Busch-Stockfisch, Mechthild

    2013-06-01

    The effects of variation in fat content (0.1% to 15.8%) and type of fat, using different types of milk, dairy cream, or vegetable fat cream, on sensory characteristics and consumer acceptance of starch-based vanilla model custards were studied. Descriptive analysis with trained panelists and consumer testing with untrained assessors were applied. Descriptive data were related to hedonic data using principal component analysis to determine drivers of liking and disliking. Results demonstrated an increasing effect of fat concerning visual and oral thickness, creamy flavor, and fat-related texture properties, as well as a decreasing effect concerning yellow color and surface shine. A lack of fat caused moderate intensities in pudding-like flavor attributes and an intensive jelly texture. Adding a vegetable fat cream led to lower intensities in attributes yellow color, cooked flavor, thick, and jelly texture, whereas intensities in vegetable fat flavor and fat-related texture properties increased. All consumers favored custards with medium fat contents, being high in pudding-like and vegetable fat flavor as well as in fat-related texture attributes. Nonfat custards were rejected due to jelly texture and moderate intensities in pudding-flavor attributes. High-fat samples were liked by some consumers, but their high intensities in thickness, white color, and creamy flavor also drove disliking for others. PMID:23772708

  5. Human visual performance model for crewstation design

    NASA Technical Reports Server (NTRS)

    Larimer, James; Prevost, Michael; Arditi, Aries; Azueta, Steven; Bergen, James; Lubin, Jeffrey

    1991-01-01

    An account is given of a Visibility Modeling Tool (VMT) which furnishes a crew-station designer with the means to assess configurational tradeoffs, with a view to the impact of various options on the unambiguous access of information to the pilot. The interactive interface of the VMT allows the manipulation of cockpit geometry, ambient lighting, pilot ergonomics, and the displayed symbology. Performance data can be displayed in the form of 3D contours into the crewstation graphic model, thereby yielding an indication of the operator's visual capabilities.

  6. Performance Modeling of Experimental Laser Lightcrafts

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See; Chen, Yen-Sen; Liu, Jiwen; Myrabo, Leik N.; Mead, Franklin B., Jr.; Turner, Jim (Technical Monitor)

    2001-01-01

    A computational plasma aerodynamics model is developed to study the performance of a laser propelled Lightcraft. The computational methodology is based on a time-accurate, three-dimensional, finite-difference, chemically reacting, unstructured grid, pressure-based formulation. The underlying physics are added and tested systematically using a building-block approach. The physics modeled include non-equilibrium thermodynamics, non-equilibrium air-plasma finite-rate kinetics, specular ray tracing, laser beam energy absorption and refraction by plasma, non-equilibrium plasma radiation, and plasma resonance. A series of transient computations are performed at several laser pulse energy levels and the simulated physics are discussed and compared with those of tests and literatures. The predicted coupling coefficients for the Lightcraft compared reasonably well with those of tests conducted on a pendulum apparatus.

  7. Modeling segmentation performance in NV-IPM

    NASA Astrophysics Data System (ADS)

    Lies, Micah J.; Jacobs, Eddie L.; Brown, Jeremy B.

    2014-05-01

    Imaging sensors produce images whose primary use is to convey information to human operators. However, their proliferation has resulted in an overload of information. As a result, computational algorithms are being increasingly implemented to simplify an operator's task or to eliminate the human operator altogether. Predicting the effect of algorithms on task performance is currently cumbersome requiring estimates of the effects of an algorithm on the blurring and noise, and "shoe-horning" these effects into existing models. With the increasing use of automated algorithms with imaging sensors, a fully integrated approach is desired. While specific implementation algorithms differ, general tasks can be identified that form building blocks of a wide range of possible algorithms. Those tasks are segmentation of objects from the spatio-temporal background, object tracking over time, feature extraction, and transformation of features into human usable information. In this paper research is conducted with the purpose of developing a general performance model for segmentation algorithms based on image quality. A database of pristine imagery has been developed in which there is a wide variety of clearly defined regions with respect to shape, size, and inherent contrast. Both synthetic and "natural" images make up the database. Each image is subjected to various amounts of blur and noise. Metrics for the accuracy of segmentation have been developed and measured for each image and segmentation algorithm. Using the computed metric values and the known values of blur and noise, a model of performance for segmentation is being developed. Preliminary results are reported.
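
    The kind of segmentation-accuracy metric that can be tracked against blur and noise level is illustrated below with a generic intersection-over-union score on a toy mask; the specific metrics developed in the paper are not spelled out in the abstract, so this is only a representative example.

```python
import numpy as np

def iou(pred_mask: np.ndarray, true_mask: np.ndarray) -> float:
    """Intersection over union between a predicted and a reference binary mask."""
    pred, true = pred_mask.astype(bool), true_mask.astype(bool)
    union = np.logical_or(pred, true).sum()
    return np.logical_and(pred, true).sum() / union if union else 1.0

# Toy ground-truth region and a slightly offset predicted segmentation.
truth = np.zeros((64, 64), dtype=bool); truth[16:48, 16:48] = True
pred = np.zeros_like(truth);            pred[20:52, 20:52] = True
print(f"IoU = {iou(pred, truth):.3f}")
```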

  8. Model pump performance program. Data report. [PWR

    SciTech Connect

    Swift, W.L.

    1982-05-01

    A 1/20-scale model of a reactor coolant pump has been tested under single-phase and two-phase flow conditions. Air/water and steam/water mixtures have been used to obtain two-phase pump performance and information about flow regime effects throughout three quadrants of pump operation. This report contains extensive pump performance data from low pressure air/water and high pressure steam/water steady state tests, results from cavitation tests at temperatures from 100°F to 420°F, and results from transient blowdown tests in which flow through the pump was two-phase. The data should be useful for: formulating empirical models of two-phase pump performance, examining scaling relations for two-phase flow in pumps, unifying air/water and steam/water data, determining relationships between steady-state and transient performance of pumps in two-phase flow, and developing an understanding of two-phase flow physics in pumps.

  9. Perceptions of a Specific Family Communication Application among Grandparents and Grandchildren: An Extension of the Technology Acceptance Model

    PubMed Central

    Tsai, Tsai-Hsuan; Chang, Hsien-Tsung; Ho, Yi-Lun

    2016-01-01

    Many studies have noted that the use of social networks sites (SNSs) can enhance social interaction among the elderly and that the motivation for the elderly to use SNSs is to keep in contact with remote friends and family or the younger generation. Memotree is designed to promote intergenerational family communication. The system incorporates the Family Tree design concept and provides family communication mechanisms based on the Family Communication Scale. In addition, the system optimizes hardware and interface use to conform to the specific needs of older and substantially younger individuals. Regarding the impact of variables on SNS with respect to the interaction of usability variables in the construction of a cross-generational communication platform, we adopted the TAM model and Chung et al.’s suggestions to promote user acceptance of the proposed Memotree system. A total of 39 grandchildren and 39 grandparents met the criteria and were included in the study. The elderly and young respondents revealed substantial willingness to use and/or satisfaction with using the Memotree system. Empirical results indicate that technology affordances and perceived ease of use have a positive impact on perceived usefulness, while perceived ease of use is affected by technology affordances. Internet self-efficacy and perceived usefulness have a positive impact on the user’s behavioral intention toward the system. In addition, this study investigated age as a moderating variable in the model. The results indicate that grandchildren have a larger significant effect on the path between perceived usefulness and behavioral intention than grandparents. This study proposes a more complete framework for investigating the user’s behavioral intention and provides a more appropriate explanation of related services for cross-generational interaction with SNS services. PMID:27270915

  10. Perceptions of a Specific Family Communication Application among Grandparents and Grandchildren: An Extension of the Technology Acceptance Model.

    PubMed

    Tsai, Tsai-Hsuan; Chang, Hsien-Tsung; Ho, Yi-Lun

    2016-01-01

    Many studies have noted that the use of social networks sites (SNSs) can enhance social interaction among the elderly and that the motivation for the elderly to use SNSs is to keep in contact with remote friends and family or the younger generation. Memotree is designed to promote intergenerational family communication. The system incorporates the Family Tree design concept and provides family communication mechanisms based on the Family Communication Scale. In addition, the system optimizes hardware and interface use to conform to the specific needs of older and substantially younger individuals. Regarding the impact of variables on SNS with respect to the interaction of usability variables in the construction of a cross-generational communication platform, we adopted the TAM model and Chung et al.'s suggestions to promote user acceptance of the proposed Memotree system. A total of 39 grandchildren and 39 grandparents met the criteria and were included in the study. The elderly and young respondents revealed substantial willingness to use and/or satisfaction with using the Memotree system. Empirical results indicate that technology affordances and perceived ease of use have a positive impact on perceived usefulness, while perceived ease of use is affected by technology affordances. Internet self-efficacy and perceived usefulness have a positive impact on the user's behavioral intention toward the system. In addition, this study investigated age as a moderating variable in the model. The results indicate that grandchildren have a larger significant effect on the path between perceived usefulness and behavioral intention than grandparents. This study proposes a more complete framework for investigating the user's behavioral intention and provides a more appropriate explanation of related services for cross-generational interaction with SNS services. PMID:27270915

  11. Hybrid Modeling Improves Health and Performance Monitoring

    NASA Technical Reports Server (NTRS)

    2007-01-01

    Scientific Monitoring Inc. was awarded a Phase I Small Business Innovation Research (SBIR) project by NASA's Dryden Flight Research Center to create a new, simplified health-monitoring approach for flight vehicles and flight equipment. The project developed a hybrid physical model concept that provided a structured approach to simplifying complex design models for use in health monitoring, allowing the output or performance of the equipment to be compared to what the design models predicted, so that deterioration or impending failure could be detected before there would be an impact on the equipment's operational capability. Based on the original modeling technology, Scientific Monitoring released I-Trend, a commercial health- and performance-monitoring software product named for its intelligent trending, diagnostics, and prognostics capabilities, as part of the company's complete ICEMS (Intelligent Condition-based Equipment Management System) suite of monitoring and advanced alerting software. I-Trend uses the hybrid physical model to better characterize the nature of health or performance alarms that result in "no fault found" false alarms. Additionally, the use of physical principles helps I-Trend identify problems sooner. I-Trend technology is currently in use in several commercial aviation programs, and the U.S. Air Force recently tapped Scientific Monitoring to develop next-generation engine health-management software for monitoring its fleet of jet engines. Scientific Monitoring has continued the original NASA work, this time under a Phase III SBIR contract with a joint NASA-Pratt & Whitney aviation security program on propulsion-controlled aircraft under missile-damaged aircraft conditions.
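
    The comparison of measured output against a physical-model prediction can be sketched very simply: compute the relative residual, smooth it, and alert when the trend exceeds a tolerance. Everything below (the model, the data, and the 5% threshold) is hypothetical and far simpler than the I-Trend product described above.

```python
import numpy as np

def physical_model(speed_rpm):
    # Assumed nominal relation between shaft speed and delivered flow (illustrative only).
    return 0.02 * speed_rpm

speeds = np.linspace(1000, 5000, 200)
measured = physical_model(speeds) * (1 - np.linspace(0, 0.08, 200))  # slow synthetic degradation
residual_pct = 100 * (physical_model(speeds) - measured) / physical_model(speeds)

# Smooth the residual so single noisy points do not trigger "no fault found" alarms.
window = 20
trend = np.convolve(residual_pct, np.ones(window) / window, mode="valid")
print("alert" if trend[-1] > 5.0 else "healthy", f"(trend = {trend[-1]:.1f}% deviation)")
```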

  12. Computer acceptance of older adults.

    PubMed

    Nägle, Sibylle; Schmidt, Ludger

    2012-01-01

    Even though computers play a massive role in the everyday life of modern societies, older adults, and especially older women, are less likely to use a computer, and they perform fewer activities on it than younger adults. To get a better understanding of the factors affecting older adults' intention towards and usage of computers, the Unified Theory of Acceptance and Usage of Technology (UTAUT) was applied as part of a more extensive study with 52 users and non-users of computers, ranging in age from 50 to 90 years. The model covers various aspects of computer usage in old age via four key constructs, namely performance expectancy, effort expectancy, social influences, and facilitating conditions, as well as the variables gender, age, experience, and voluntariness of use. Interestingly, next to performance expectancy, facilitating conditions showed the strongest correlation with use as well as with intention. Effort expectancy showed no significant correlation with the intention of older adults to use a computer. PMID:22317258

  13. Modelling fuel cell performance using artificial intelligence

    NASA Astrophysics Data System (ADS)

    Ogaji, S. O. T.; Singh, R.; Pilidis, P.; Diacakis, M.

    Over the last few years, fuel cell technology has been promisingly increasing its share in the generation of stationary power. Numerous pilot projects are operating worldwide, continuously increasing the number of operating hours, either as stand-alone devices or as part of gas turbine combined cycles. An essential tool for the adequate and dynamic analysis of such systems is a software model that enables the user to assess a large number of alternative options in the least possible time. On the other hand, the sphere of application of artificial neural networks has widened, covering such fields as medicine, finance and, unsurprisingly, engineering (diagnostics of faults in machines). Artificial neural networks have been described as diagrammatic representations of a mathematical equation that receives values (inputs) and gives out results (outputs). Artificial neural network systems have the capacity to recognise and associate patterns and, because of their inherent design features, they can be applied to linear and non-linear problem domains. In this paper, the performance of the fuel cell is modelled using artificial neural networks. The inputs to the network are variables that are critical to the performance of the fuel cell, while the outputs are the effects of changes in any one or all of the fuel cell design variables on its performance. Critical parameters for the cell include the geometrical configuration as well as the operating conditions. For the neural network, various network design parameters such as the network size, training algorithm and activation functions, and their effects on the effectiveness of the performance modelling, are discussed. Results from the analysis as well as the limitations of the approach are presented and discussed.
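
    As an illustration of the surrogate-modelling approach described above, the following Python sketch trains a small feed-forward network on synthetic data. The input variables, their ranges, and the voltage relation are hypothetical placeholders rather than the authors' fuel cell model; only the overall pattern (map operating variables to a performance output, then vary network size and activation) mirrors the paper.

    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    # Hypothetical inputs: current density (A/cm^2), temperature (K), fuel pressure (bar).
    X = rng.uniform([0.1, 600.0, 1.0], [1.0, 1100.0, 5.0], size=(500, 3))

    # Synthetic "cell voltage" target standing in for measured or simulated performance data.
    y = (1.0 - 0.4 * X[:, 0] + 1e-4 * (X[:, 1] - 600.0) + 0.02 * np.log(X[:, 2])
         + rng.normal(0.0, 0.01, size=500))

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # Network size and activation function are the design parameters the paper discusses.
    model = MLPRegressor(hidden_layer_sizes=(16, 16), activation="tanh",
                         max_iter=5000, random_state=0)
    model.fit(X_train, y_train)
    print("R^2 on held-out data:", model.score(X_test, y_test))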

  14. Spike Decomposition Technique: Modeling and Performance Tests

    NASA Astrophysics Data System (ADS)

    Nita, Gelu M.; Fleishman, Gregory D.; Gary, Dale E.

    2008-12-01

    We develop an automated technique for fitting the spectral components of solar microwave spike bursts, which are characterized by narrowband spectral features. The algorithm is especially useful for periods when the spikes occur in densely packed clusters, where the algorithm is capable of decomposing overlapping spike structures into individual spectral components. To test the performance and applicability limits of this data reduction tool, we perform comprehensive modeling of spike clusters characterized by various typical bandwidths, spike densities, and amplitude distributions. We find that, for a wide range of favorable combinations of input parameters, the algorithm is able to recover the characteristic features of the modeled distributions within reasonable confidence intervals. Having model-tested the algorithm against spike overlap, broadband spectral background, noise contamination, and possible malfunction of some spectral channels, we apply the technique to a spike cluster recorded by the Chinese Purple Mountain Observatory (PMO) spectrometer, operating above 4.5 GHz. We study the variation of the spike distribution parameters, such as amplitude, bandwidth, and related derived physical parameters, as a function of time. The method can be further applied to observations from other instruments and to other types of fine structures.
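
    The decomposition idea can be illustrated with a minimal Python sketch that forward-fits two overlapping Gaussian spectral components to a synthetic spectrum. The frequency band, amplitudes, widths, and noise level are invented for illustration; the actual algorithm handles many spikes, spectral background, and malfunctioning channels.

    import numpy as np
    from scipy.optimize import curve_fit

    def two_spikes(f, a1, c1, w1, a2, c2, w2, bg):
        """Sum of two Gaussian spectral components plus a flat background."""
        g1 = a1 * np.exp(-0.5 * ((f - c1) / w1) ** 2)
        g2 = a2 * np.exp(-0.5 * ((f - c2) / w2) ** 2)
        return g1 + g2 + bg

    freq = np.linspace(4500.0, 4600.0, 400)      # MHz, hypothetical band
    truth = two_spikes(freq, 120.0, 4530.0, 5.0, 80.0, 4545.0, 8.0, 10.0)
    data = truth + np.random.default_rng(1).normal(0.0, 3.0, freq.size)  # noise contamination

    p0 = [100.0, 4525.0, 6.0, 70.0, 4550.0, 6.0, 5.0]   # initial guesses
    popt, _ = curve_fit(two_spikes, freq, data, p0=p0)
    print("fitted amplitudes, centers, widths, background:", np.round(popt, 2))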

  15. Performance Evaluation Modeling of Network Sensors

    NASA Technical Reports Server (NTRS)

    Clare, Loren P.; Jennings, Esther H.; Gao, Jay L.

    2003-01-01

    Substantial benefits are promised by operating many spatially separated sensors collectively. Such systems are envisioned to consist of sensor nodes that are connected by a communications network. A simulation tool is being developed to evaluate the performance of networked sensor systems, incorporating such metrics as target detection probabilities, false alarm rates, and classification confusion probabilities. The tool will be used to determine configuration impacts associated with such aspects as spatial laydown, mixture of different types of sensors (acoustic, seismic, imaging, magnetic, RF, etc.), and fusion architecture. The QualNet discrete-event simulation environment serves as the underlying basis for model development and execution. This platform is recognized for its capabilities in efficiently simulating networking among mobile entities that communicate via wireless media. We are extending QualNet's communications modeling constructs to capture the sensing aspects of multi-target sensing (analogous to multiple access communications), unimodal multi-sensing (broadcast), and multi-modal sensing (multiple channels and correlated transmissions). Methods are also being developed for modeling the sensor signal sources (transmitters), signal propagation through the media, and sensors (receivers) that are consistent with the discrete event paradigm needed for performance determination of sensor network systems. This work is supported under the Microsensors Technical Area of the Army Research Laboratory (ARL) Advanced Sensors Collaborative Technology Alliance.

  16. Human visual performance model for crewstation design

    NASA Astrophysics Data System (ADS)

    Larimer, James O.; Prevost, Michael P.; Arditi, Aries R.; Azueta, Steven; Bergen, James R.; Lubin, Jeffrey

    1991-08-01

    In a cockpit, the crewstation of an airplane, the ability of the pilot to unambiguously perceive rapidly changing information both internal and external to the crewstation is critical. To assess the impact of crewstation design decisions on the pilot's ability to perceive information, the designer needs a means of evaluating the trade-offs that result from different designs. The Visibility Modeling Tool (VMT) provides the designer with a CAD tool for assessing these trade-offs. It combines the technologies of computer graphics, computational geometry, human performance modeling and equipment modeling into a computer-based interactive design tool. Through a simple interactive interface, a designer can manipulate design parameters such as the geometry of the cockpit, environmental factors such as ambient lighting, pilot parameters such as point of regard and adaptation state, and equipment parameters such as the location of displays, their size and the contrast of displayed symbology. VMT provides an end-to-end analysis that answers questions such as 'Will the pilot be able to read the display?' Performance data can be projected, in the form of 3D contours, into the crewstation graphic model, providing the designer with a footprint of the operator's visual capabilities, defining, for example, the regions in which fonts of a particular type, size and contrast can be read without error. Geometrical data such as the pilot's volume field of view, occlusions caused by facial geometry, helmet margins, and objects in the crewstation can also be projected into the crewstation graphic model with respect to the coordinates of the aviator's eyes and fixation point. The intersections of the projections with objects in the crewstation delineate the area of coverage, masking, or occlusion associated with the objects. Objects in the crewstation space can be projected onto models of the operator's retinas. These projections can be used to provide the designer with the

  17. Modeling the Interrelationships among Pre-Service Science Teachers' Understanding and Acceptance of Evolution, Their Views on Nature of Science and Self-Efficacy Beliefs regarding Teaching Evolution

    ERIC Educational Resources Information Center

    Akyol, Gulsum; Tekkaya, Ceren; Sungur, Semra; Traynor, Anne

    2012-01-01

    This study proposed a path model of relationships among understanding and acceptance of evolution, views on nature of science, and self-efficacy beliefs regarding teaching evolution. A total of 415 pre-service science teachers completed a series of self-report instruments for the specified purpose. After the estimation of scale scores using…

  18. Assessing the Intention to Use Technology among Pre-Service Teachers in Singapore and Malaysia: A Multigroup Invariance Analysis of the Technology Acceptance Model (TAM)

    ERIC Educational Resources Information Center

    Teo, Timothy; Lee, Chwee Beng; Chai, Ching Sing; Wong, Su Luan

    2009-01-01

    This study assesses the pre-service teachers' self-reported future intentions to use technology in Singapore and Malaysia. A survey was employed to validate items from past research. Using the Technology Acceptance Model (TAM) as a research framework, 495 pre-service teachers from Singapore and Malaysia responded to an 11-item questionnaire…

  19. Analysis of Utility and Use of a Web-Based Tool for Digital Signal Processing Teaching by Means of a Technological Acceptance Model

    ERIC Educational Resources Information Center

    Toral, S. L.; Barrero, F.; Martinez-Torres, M. R.

    2007-01-01

    This paper presents an exploratory study about the development of a structural and measurement model for the technological acceptance (TAM) of a web-based educational tool. The aim consists of measuring not only the use of this tool, but also the external variables with a significant influence in its use for planning future improvements. The tool,…

  20. Computer modeling of thermoelectric generator performance

    NASA Technical Reports Server (NTRS)

    Chmielewski, A. B.; Shields, V.

    1982-01-01

    Features of the DEGRA 2 computer code for simulating the operations of a spacecraft thermoelectric generator are described. The code models the physical processes occurring during operation. Input variables include the thermoelectric couple geometry and composition, the thermoelectric materials' properties, interfaces and insulation in the thermopile, the heat source characteristics, mission trajectory, and generator electrical requirements. Time steps can be specified and sublimation of the leg and hot shoe is accounted for, as are shorts between legs. Calculations are performed for conduction, Peltier, Thomson, and Joule heating, the cold junction can be adjusted for solar radiation, and the legs of the thermoelectric couple are segmented to enhance the approximation accuracy. A trial run covering 18 couple modules yielded data with 0.3% accuracy with regard to test data. The model has been successful with selenide materials, SiGe, and SiN4, with output of all critical operational variables.
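
    The heat-balance terms mentioned above (Peltier, conduction, and Joule heating) can be combined into a very simple steady-state estimate of couple output. The sketch below uses textbook relations with illustrative material values; it is not the DEGRA 2 code and ignores leg segmentation, sublimation, and time stepping.

    def couple_performance(seebeck, resistance, conductance, t_hot, t_cold, r_load):
        """Return (power_out, efficiency) for one thermoelectric couple."""
        d_t = t_hot - t_cold
        current = seebeck * d_t / (resistance + r_load)          # A
        power_out = current ** 2 * r_load                        # W delivered to the load
        # Heat drawn from the hot junction: Peltier + conduction - half the Joule heating.
        q_hot = seebeck * t_hot * current + conductance * d_t - 0.5 * current ** 2 * resistance
        return power_out, power_out / q_hot

    # Hypothetical couple values (Seebeck in V/K, resistance in ohm, conductance in W/K).
    p, eta = couple_performance(seebeck=400e-6, resistance=0.01, conductance=0.05,
                                t_hot=1275.0, t_cold=575.0, r_load=0.01)
    print(f"power = {p:.2f} W, efficiency = {eta:.1%}")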

  1. Evaluating the performance of copula models in phase I-II clinical trials under model misspecification

    PubMed Central

    2014-01-01

    Background Traditionally, phase I oncology trials are designed to determine the maximum tolerated dose (MTD), defined as the highest dose with an acceptable probability of dose limiting toxicities (DLT), of a new treatment via a dose escalation study. An alternate approach is to jointly model toxicity and efficacy and allow dose escalation to depend on a pre-specified efficacy/toxicity tradeoff in a phase I-II design. Several phase I-II trial designs have been discussed in the literature; while these model-based designs are attractive in their performance, they are potentially vulnerable to model misspecification. Methods Phase I-II designs often rely on copula models to specify the joint distribution of toxicity and efficacy, which include an additional correlation parameter that can be difficult to estimate. We compare and contrast three models for the joint probability of toxicity and efficacy, including two copula models that have been proposed for use in phase I-II clinical trials and a simple model that assumes the two outcomes are independent. We evaluate the performance of the various models through simulation both when the models are correct and under model misspecification. Results Both copula models exhibited similar performance, as measured by the probability of correctly identifying the optimal dose and the number of subjects treated at the optimal dose, regardless of whether the data were generated from the correct or incorrect copula, even when there is substantial correlation between the two outcomes. Similar results were observed for a simple model that assumes independence, even in the presence of strong correlation. Further simulation results indicate that estimating the correlation parameter in copula models is difficult with the sample sizes used in Phase I-II clinical trials. Conclusions Our simulation results indicate that the operating characteristics of phase I-II clinical trials are robust to misspecification of the copula model but that a simple
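
    To make the copula idea concrete, the sketch below builds the joint distribution of two binary outcomes (efficacy and toxicity) from their marginal probabilities using a Gumbel-Hougaard copula. The marginal values and the choice of copula are illustrative assumptions, not necessarily the models compared in the paper; theta = 1 recovers the independence model.

    import numpy as np

    def gumbel_copula(u, v, theta):
        """Gumbel-Hougaard copula C(u, v); theta = 1 gives independence."""
        return np.exp(-(((-np.log(u)) ** theta + (-np.log(v)) ** theta) ** (1.0 / theta)))

    p_eff, p_tox = 0.55, 0.20              # hypothetical marginal probabilities at one dose
    for theta in (1.0, 2.0):               # independence vs. positive association
        p11 = gumbel_copula(p_eff, p_tox, theta)   # P(efficacy and toxicity)
        p10 = p_eff - p11                          # efficacy without toxicity
        p01 = p_tox - p11                          # toxicity without efficacy
        p00 = 1.0 - p_eff - p_tox + p11            # neither outcome
        print(f"theta={theta}: p11={p11:.3f}, p10={p10:.3f}, p01={p01:.3f}, p00={p00:.3f}")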

  2. Duration Model-Based Post-processing for the Performance Improvement of a Keyword Spotting System

    NASA Astrophysics Data System (ADS)

    Lee, Min Ji; Yoon, Jae Sam; Oh, Yoo Rhee; Kim, Hong Kook; Choi, Song Ha; Kim, Ji Woon; Kim, Myeong Bo

    In this paper, we propose a post-processing method based on a duration model to improve the performance of a keyword spotting system. The proposed duration model-based post-processing method is performed after detecting a keyword. To detect the keyword, we first combine a keyword model, a non-keyword model, and a silence model. Using the information on the detected keyword, the proposed post-processing method is then applied to determine whether or not the correct keyword is detected. To this end, we generate the duration model using a Gaussian distribution in order to accommodate the different duration characteristics of each phoneme. Comparing the performance of the proposed method with those of conventional anti-keyword scoring methods, it is shown that the false acceptance and the false rejection rates are reduced.
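
    A minimal sketch of this post-processing step is shown below: each phoneme in the detected keyword is scored against a Gaussian duration model and the summed log-likelihood is thresholded to accept or reject the detection. The phoneme set, duration statistics, and threshold are hypothetical illustration values, not those of the proposed system.

    import math

    duration_model = {            # phoneme -> (mean duration, standard deviation), in frames
        "h": (6.0, 2.0),
        "e": (9.0, 3.0),
        "l": (7.0, 2.5),
        "o": (11.0, 3.5),
    }

    def duration_log_score(segmentation):
        """Sum of Gaussian log-likelihoods of the observed phoneme durations."""
        score = 0.0
        for phoneme, observed in segmentation:
            mean, std = duration_model[phoneme]
            score += -0.5 * ((observed - mean) / std) ** 2 - math.log(std * math.sqrt(2.0 * math.pi))
        return score

    detected = [("h", 5), ("e", 10), ("l", 6), ("o", 40)]   # last duration is implausibly long
    threshold = -12.0                                       # assumed to be tuned on held-out data
    print("accept keyword" if duration_log_score(detected) > threshold else "reject as false alarm")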

  3. Spike Decomposition Technique: Modeling and Performance Tests

    NASA Astrophysics Data System (ADS)

    Nita, G. M.; Fleishman, G. D.; Gary, D. E.

    2008-05-01

    We develop an automated technique for fitting the spectral components of solar microwave spike bursts characterized by narrow-band (1-50~MHz) features of 1-10~ms duration, which are thought to be due to Electron-Cyclotron Maser emission. The algorithm is especially useful for periods when the spikes occur in densely packed clusters, where the algorithm is capable of decomposing overlapping spike structures into individual spectral components. To test the performance and applicability limits of this forward fitting algorithm, we perform comprehensive modeling of spike clusters characterized by various typical bandwidths, spike densities, and amplitude distributions. We find that, for a wide range of input parameters, the algorithm is able to recover the characteristic features of the modeled distributions within reasonable confidence intervals. Having model-tested the algorithm comprehensively against spike overlap, broadband spectral background, noise contamination, and possible contamination of cross-channel polarization, we apply the technique to observational data obtained from different instruments in different frequency ranges. Specifically, we studied spike clusters recorded by a Chinese Purple Mountain Observatory (PMO) spectrometer above 4.5 GHz and by Owens Valley Solar Array's FASR Subsystem Testbed instrument above 1 GHz. We study variation of the spike distribution parameters, such as amplitude, bandwidth and related derived physical parameters as a function of frequency and time. We discuss the implications of our results for the choice between competing models of spike generation and underlying physical processes. The method can be further applied to observations from other instruments and to other types of radio spectral fine structures. This work was supported in part by NSF grants AST-0607544 and ATM-0707319 and NASA grant NNG06GJ40G to New Jersey Institute of Technology.

  4. Modeling colloid transport for performance assessment.

    PubMed

    Contardi, J S; Turner, D R; Ahn, T M

    2001-02-01

    The natural system is expected to contribute to isolation at the proposed high-level nuclear waste (HLW) geologic repository at Yucca Mountain, NV (YM). In developing performance assessment (PA) computer models to simulate long-term behavior at YM, colloidal transport of radionuclides has been proposed as a critical factor because of the possible reduced interaction with the geologic media. Site-specific information on the chemistry and natural colloid concentration of saturated zone groundwaters in the vicinity of YM is combined with a surface complexation sorption model to evaluate the impact of natural colloids on calculated retardation factors (RF) for several radioelements of concern in PA. Inclusion of colloids into the conceptual model can reduce the calculated effective retardation significantly. Strongly sorbed radionuclides such as americium and thorium are most affected by pseudocolloid formation and transport, with a potential reduction in RF of several orders of magnitude. Radioelements that are less strongly sorbed under YM conditions, such as uranium and neptunium, are not affected significantly by colloid transport, and transport of plutonium in the valence state is only moderately enhanced. Model results showed no increase in the peak mean annual total effective dose equivalent (TEDE) within a compliance period of 10,000 years, although this is strongly dependent on container life in the base case scenario. At longer times, simulated container failures increase and the TEDE from the colloidal models increased by a factor of 60 from the base case. By using mechanistic models and sensitivity analyses to determine what parameters and transport processes affect the TEDE, colloidal transport in future versions of the TPA code can be represented more accurately. PMID:11288586

  5. Re-examining the role of attitude in information system acceptance: a model from the satisfaction-dissatisfaction perspective

    NASA Astrophysics Data System (ADS)

    Guo, Bin; Zhou, Shasha

    2016-05-01

    This study attempts to re-examine the role of attitude in voluntary information system (IS) acceptance and usage, which has often been discounted in the previous technology acceptance research. We extend the unidimensional view of attitude into a bidimensional one, because of the simultaneous existence of both positive and negative evaluation towards IS in technology acceptance behaviour. In doing so, attitude construct is divided into two components: satisfaction as the positive attitudinal component and dissatisfaction as the negative attitudinal component. We argue that satisfaction and dissatisfaction will interactively affect technology usage intention. Besides, we explore the predictors of satisfaction and dissatisfaction based on the disconfirmation theory. Empirical results from a longitudinal study on bulletin board system (BBS) usage confirm the interaction effect of satisfaction and dissatisfaction on usage intention. Moreover, perceived task-related value has a significant effect on satisfaction, while perceived personal value has a significant effect on dissatisfaction. We also discuss the theoretical and managerial implications of our findings.

  6. Performance model assessment for multi-junction concentrating photovoltaic systems.

    SciTech Connect

    Riley, Daniel M.; McConnell, Robert.; Sahm, Aaron; Crawford, Clark; King, David L.; Cameron, Christopher P.; Foresi, James S.

    2010-03-01

    Four approaches to modeling multi-junction concentrating photovoltaic system performance are assessed by comparing modeled performance to measured performance. Measured weather, irradiance, and system performance data were collected on two systems over a one month period. Residual analysis is used to assess the models and to identify opportunities for model improvement.
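
    A residual analysis of the kind described can be summarised in a few lines; the measured and modeled values below are placeholders rather than the study's data.

    import numpy as np

    measured = np.array([5.1, 6.3, 7.0, 6.6, 5.8])   # kW, hypothetical measured system output
    modeled = np.array([5.0, 6.5, 7.4, 6.4, 5.5])    # kW, corresponding model predictions

    residuals = modeled - measured
    print("mean bias error:", residuals.mean().round(3), "kW")
    print("RMSE:", np.sqrt((residuals ** 2).mean()).round(3), "kW")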

  7. A Model of Acceptance of Web 2.0 in Learning in Higher Education: A Case Study of Two Cultures

    ERIC Educational Resources Information Center

    Usoro, Abel; Echeng, Razep; Majewski, Grzegorz

    2014-01-01

    Though a few empirical studies on acceptance of Web 2.0 as a social networking tool in teaching and learning exist, apparently none consider students' and faculties' views from different cultures, which is the focus of this study. This article reports on a pilot study that begins to fill this gap by investigating the perceptions,…

  8. Personal Learning Environments Acceptance Model: The Role of Need for Cognition, e-Learning Satisfaction and Students' Perceptions

    ERIC Educational Resources Information Center

    del Barrio-García, Salvador; Arquero, José L.; Romero-Frías, Esteban

    2015-01-01

    As long as students use Web 2.0 tools extensively for social purposes, there is an opportunity to improve students' engagement in Higher Education by using these tools for academic purposes under a Personal Learning Environment approach (PLE 2.0). The success of these attempts depends upon the reactions and acceptance of users towards e-learning…

  9. Learner Differences in Perceived Satisfaction of an Online Learning: An Extension to the Technology Acceptance Model in an Arabic Sample

    ERIC Educational Resources Information Center

    Al-Azawei, Ahmed; Lundqvist, Karsten

    2015-01-01

    Online learning constitutes the most popular distance-learning method, with flexibility, accessibility, visibility, manageability and availability as its core features. However, current research indicates that its efficacy is not consistent across all learners. This study aimed to modify and extend the factors of the Technology Acceptance Model…

  10. E-Learning and the University of Huelva: A Study of WebCT and the Technological Acceptance Model

    ERIC Educational Resources Information Center

    Sanchez, R. Arteaga; Hueros, A. Duarte; Ordaz, M. Garcia

    2013-01-01

    Purpose: The purpose of this paper is to investigate the factors that determine the acceptance of the WebCT learning system among students of the faculties of Business and Education Sciences at the University of Huelva, and to verify the direct and indirect effects of these factors. Design/methodology/approach: A total of 226 students at the…

  11. DKIST Polarization Modeling and Performance Predictions

    NASA Astrophysics Data System (ADS)

    Harrington, David

    2016-05-01

    Calibrating the Mueller matrices of large aperture telescopes and associated coude instrumentation requires astronomical sources and several modeling assumptions to predict the behavior of the system polarization with field of view, altitude, azimuth and wavelength. The Daniel K. Inouye Solar Telescope (DKIST) polarimetric instrumentation requires very high accuracy calibration of a complex coude path with an off-axis f/2 primary mirror, time dependent optical configurations and substantial field of view. Polarization predictions across a diversity of optical configurations, tracking scenarios, slit geometries and vendor coating formulations are critical to both construction and continued operations efforts. Recent daytime sky based polarization calibrations of the 4m AEOS telescope and HiVIS spectropolarimeter on Haleakala have provided system Mueller matrices over full telescope articulation for a 15-reflection coude system. AEOS and HiVIS are a DKIST analog with a many-fold coude optical feed and similar mirror coatings creating 100% polarization cross-talk with altitude, azimuth and wavelength. Polarization modeling predictions using Zemax have successfully matched the altitude-azimuth-wavelength dependence on HiVIS with the few percent amplitude limitations of several instrument artifacts. Polarization predictions for coude beam paths depend greatly on modeling the angle-of-incidence dependences in powered optics and the mirror coating formulations. A 6 month HiVIS daytime sky calibration plan has been analyzed for accuracy under a wide range of sky conditions and data analysis algorithms. Predictions of polarimetric performance for the DKIST first-light instrumentation suite have been created under a range of configurations. These new modeling tools and polarization predictions have substantial impact for the design, fabrication and calibration process in the presence of manufacturing issues, science use-case requirements and ultimate system calibration

  12. Performance of an INTEGRAL spectrometer model

    NASA Technical Reports Server (NTRS)

    Jean, P.; Naya, J. E.; vonBallmoos, P.; Vedrenne, G.; Teegarden, B.

    1997-01-01

    Model calculations for the INTEGRAL spectrometer (SPI) onboard the future INTErnational Gamma Ray Astrophysics Laboratory (INTEGRAL) are presented, where the sensitivity for narrow lines is based on estimates of the background level and the detection efficiency. The instrumental background rates are explained as the sum of various components that depend on the cosmic ray intensity and the spectrometer characteristics, such as the mass distribution around the Ge detectors, the passive material, the characteristics of the detector system and the background reduction techniques. Extended background calculations were performed with Monte Carlo simulations and using semi-empirical and calculated neutron and proton cross sections. In order to improve the INTEGRAL spectrometer sensitivity, several designs and background reduction techniques were compared for an instrument with a fixed detector volume.

  13. Testing the Technology Acceptance Model: HIV Case Managers' Intention to Use a Continuity of Care Record with Context-specific Links

    PubMed Central

    Bakken, Suzanne

    2014-01-01

    Objective The goal of this study was to examine the applicability of the Technology Acceptance Model (TAM) in explaining Human Immunodeficiency Virus (HIV) case managers’ acceptance of a prototype Continuity of Care Record (CCR) with context-specific links designed to meet their information needs. Design An online survey, based on the constructs of the Technology Acceptance Model (TAM), of 94 case managers who provide care to persons living with HIV (PLWH). To assess the consistency, reliability and fit of the model factors, three methods were used: principal components factor analysis, Cronbach’s alpha, and regression analysis. Results Principal components factor analysis resulted in three factors (Perceived Ease of Use, Perceived Usefulness, and Barriers to Use) that explained 84.88% of the variance. Internal consistency reliability estimates ranged from .69–.91. In a linear regression model, Perceived Ease of Use, Perceived Usefulness, and Barriers to Use scores explained 43.6% (p <.001) of the variance in Behavioral Intention to use a CCR with context-specific links. Conclusion Our study validated the use of the TAM in health information technology. Results from our study demonstrated that Perceived Ease of Use, Perceived Usefulness, and Barriers to Use are predictors of Behavioral Intention to use a CCR with context-specific links to web-based information resources. PMID:21848452
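
    The regression step reported above can be sketched with synthetic scale scores; the score distributions and coefficients are invented for illustration, and only the general analysis pattern (ordinary least squares of Behavioral Intention on the three factors) mirrors the study.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(7)
    n = 94                                        # same sample size as the study; data synthetic

    peou = rng.normal(4.0, 0.8, n)                # hypothetical Perceived Ease of Use scores
    pu = rng.normal(3.8, 0.9, n)                  # hypothetical Perceived Usefulness scores
    barriers = rng.normal(2.5, 0.7, n)            # hypothetical Barriers to Use scores
    intention = 0.4 * peou + 0.5 * pu - 0.3 * barriers + rng.normal(0.0, 0.8, n)

    X = sm.add_constant(np.column_stack([peou, pu, barriers]))
    result = sm.OLS(intention, X).fit()
    print("variance explained (R^2):", round(result.rsquared, 3))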

  14. Performance of an integrated network model

    PubMed Central

    Lehmann, François; Dunn, David; Beaulieu, Marie-Dominique; Brophy, James

    2016-01-01

    Objective To evaluate the changes in accessibility, patients’ care experiences, and quality-of-care indicators following a clinic’s transformation into a fully integrated network clinic. Design Mixed-methods study. Setting Verdun, Que. Participants Data on all patient visits were used, in addition to 2 distinct patient cohorts: 134 patients with chronic illness (ie, diabetes, arteriosclerotic heart disease, or both); and 450 women between the ages of 20 and 70 years. Main outcome measures Accessibility was measured by the number of walk-in visits, scheduled visits, and new patient enrolments. With the first cohort, patients’ care experiences were measured using validated serial questionnaires; and quality-of-care indicators were measured using biologic data. With the second cohort, quality of preventive care was measured using the number of Papanicolaou tests performed as a surrogate marker. Results Despite a negligible increase in the number of physicians, there was an increase in accessibility after the clinic’s transition to an integrated network model. During the first 4 years of operation, the number of scheduled visits more than doubled, nonscheduled visits (walk-in visits) increased by 29%, and enrolment of vulnerable patients (those with chronic illnesses) at the clinic remained high. Patient satisfaction with doctors was rated very highly at all points of time that were evaluated. While the number of Pap tests done did not increase with time, the proportion of patients meeting hemoglobin A1c and low-density lipoprotein guideline target levels increased, as did the number of patients tested for microalbuminuria. Conclusion Transformation to an integrated network model of care led to increased efficiency and enhanced accessibility with no negative effects on the doctor-patient relationship. Improvements in biologic data also suggested better quality of care. PMID:27521410

  15. Baby-Crying Acceptance

    NASA Astrophysics Data System (ADS)

    Martins, Tiago; de Magalhães, Sérgio Tenreiro

    A baby's crying is its most important means of communication. Crying monitoring performed by existing devices does not ensure the complete safety of the child. These technological resources need to be combined with means of communicating the results to the responsible caregivers, which would involve digital processing of the information available from crying. The survey carried out made it possible to understand the level of adoption, in the continental territory of Portugal, of a technology able to perform such digital processing. The TAM was used as the theoretical framework. The statistical analysis showed that there is a good probability of acceptance of such a system.

  16. "It's all about acceptance": A qualitative study exploring a model of positive body image for people with spinal cord injury.

    PubMed

    Bailey, K Alysse; Gammage, Kimberley L; van Ingen, Cathy; Ditor, David S

    2015-09-01

    Using modified constructivist grounded theory, the purpose of the present study was to explore positive body image experiences in people with spinal cord injury. Nine participants (five women, four men) varying in age (21-63 years), type of injury (C3-T7; complete and incomplete), and years post-injury (4-36 years) were recruited. The following main categories were found: body acceptance, body appreciation and gratitude, social support, functional gains, independence, media literacy, broadly conceptualizing beauty, inner positivity influencing outer demeanour, finding others who have a positive body image, unconditional acceptance from others, religion/spirituality, listening to and taking care of the body, managing secondary complications, minimizing pain, and respect. Interestingly, there was consistency in positive body image characteristics reported in this study with those found in previous research, demonstrating universality of positive body image. However, unique characteristics (e.g., resilience, functional gains, independence) were also reported demonstrating the importance of exploring positive body image in diverse groups. PMID:26002149

  17. Performance Improvement/HPT Model: Guiding the Process

    ERIC Educational Resources Information Center

    Dessinger, Joan Conway; Moseley, James L.; Van Tiem, Darlene M.

    2012-01-01

    This commentary is part of an ongoing dialogue that began in the October 2011 special issue of "Performance Improvement"--Exploring a Universal Performance Model for HPT: Notes From the Field. The performance improvement/HPT (human performance technology) model represents a unifying process that helps accomplish successful change, create…

  18. New Metacognitive Model for Human Performance Technology

    ERIC Educational Resources Information Center

    Turner, John R.

    2011-01-01

    Addressing metacognitive functions has been shown to improve performance at the individual, team, group, and organizational levels. Metacognition is beginning to surface as an added cognate discipline for the field of human performance technology (HPT). Advances from research in the fields of cognition and metacognition offer a place for HPT to…

  19. Comparison between Utsu's and Vere-Jones' aftershocks model by means of a computer simulation based on the acceptance-rejection sampling of von Neumann

    NASA Astrophysics Data System (ADS)

    Reyes, J.; Morales-Esteban, A.; González, E.; Martínez-Álvarez, F.

    2016-07-01

    In this research, a new algorithm for generating a stochastic earthquake catalog is presented. The algorithm is based on the acceptance-rejection sampling of von Neumann. The result is a computer simulation of earthquakes based on the calculated statistical properties of each zone. Vere-Jones states that an earthquake sequence can be modeled as a series of random events. This is the model used in the proposed simulation. Contrariwise, Utsu indicates that the mainshocks are special geophysical events. The algorithm has been applied to zones of Chile, China, Spain, Japan, and the USA. This allows classifying the zones according to Vere-Jones' or Utsu's model. The results have been quantified by relating the mainshock to the largest aftershock within the next 5 days (which has been named the Bath event). The results show that some zones fit Utsu's model and others Vere-Jones'. Finally, the fraction of seismic events that satisfy certain properties of magnitude and occurrence is analyzed.
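
    The von Neumann acceptance-rejection step itself is straightforward. The sketch below uses it to draw magnitudes from a truncated Gutenberg-Richter density as a stand-in for the zone-specific statistical properties used in the catalog simulation; the b-value and magnitude range are illustrative.

    import numpy as np

    rng = np.random.default_rng(42)

    def gr_density(m, b=1.0, m_min=4.0, m_max=8.0):
        """Truncated exponential (Gutenberg-Richter) magnitude density, unnormalised."""
        beta = b * np.log(10.0)
        return np.where((m >= m_min) & (m <= m_max), beta * np.exp(-beta * (m - m_min)), 0.0)

    def sample_magnitudes(n, m_min=4.0, m_max=8.0):
        envelope = gr_density(m_min)               # the density is maximal at m_min
        samples = []
        while len(samples) < n:
            m = rng.uniform(m_min, m_max)          # candidate from a uniform proposal
            u = rng.uniform(0.0, envelope)         # uniform height under the envelope
            if u <= gr_density(m):                 # von Neumann accept/reject test
                samples.append(m)
        return np.array(samples)

    mags = sample_magnitudes(1000)
    print("mean simulated magnitude:", mags.mean().round(2))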

  20. Information Model for Machine-Tool-Performance Tests

    PubMed Central

    Lee, Y. Tina; Soons, Johannes A.; Donmez, M. Alkan

    2001-01-01

    This report specifies an information model of machine-tool-performance tests in the EXPRESS [1] language. The information model provides a mechanism for describing the properties and results of machine-tool-performance tests. The objective of the information model is a standardized, computer-interpretable representation that allows for efficient archiving and exchange of performance test data throughout the life cycle of the machine. The report also demonstrates the implementation of the information model using three different implementation methods.

  1. Coopersmith Self-Esteem: Two Different Hypothesized Factor Models--Both Acceptable for the Same Data Structure.

    ERIC Educational Resources Information Center

    Hofmann, Rich; Sherman, Larry

    Using data from 135 sixth-, seventh-, and eighth-graders between 11 and 15 years old attending a middle school in a suburban Southwest Ohio school district, two hypothesized models of the factor structures for the Coopersmith Self-Esteem Inventory were tested. One model represents the original Coopersmith factor structure, and the other model is…

  2. Detailed Performance Model for Photovoltaic Systems: Preprint

    SciTech Connect

    Tian, H.; Mancilla-David, F.; Ellis, K.; Muljadi, E.; Jenkins, P.

    2012-07-01

    This paper presents a modified current-voltage relationship for the single diode model. The single-diode model has been derived from the well-known equivalent circuit for a single photovoltaic cell. The modification presented in this paper accounts for both parallel and series connections in an array.
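
    For reference, the standard (unmodified) single-diode current-voltage relationship can be solved numerically as sketched below; the cell parameters and the simple series/parallel scaling are illustrative assumptions rather than the paper's modified formulation.

    import numpy as np
    from scipy.optimize import brentq

    def cell_current(v, i_ph=8.0, i_0=1e-9, rs=0.005, rsh=200.0, n=1.3, t=298.15):
        """Solve I = Iph - I0*(exp((V + I*Rs)/(n*Vt)) - 1) - (V + I*Rs)/Rsh for I."""
        vt = 1.380649e-23 * t / 1.602176634e-19          # thermal voltage kT/q
        def residual(i):
            return (i_ph - i_0 * (np.exp((v + i * rs) / (n * vt)) - 1.0)
                    - (v + i * rs) / rsh - i)
        return brentq(residual, -1.0, i_ph + 1.0)        # bracket safely contains the root

    def array_current(v_array, n_series=60, n_parallel=2, **cell_kwargs):
        """Idealised array of identical cells: divide voltage by Ns, multiply current by Np."""
        return n_parallel * cell_current(v_array / n_series, **cell_kwargs)

    v = 30.0                                             # array voltage, V
    print(f"array current at {v} V: {array_current(v):.2f} A")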

  3. Data Mining of Hydrological Model Performances

    NASA Astrophysics Data System (ADS)

    Vitolo, Claudia; Buytaert, Wouter

    2013-04-01

    Multi-objective criteria have long been used to infer hydrological simulations and fit the natural world. On the other hand, modelling frameworks are also becoming more and more popular, as identification of the processes occurring in a catchment is still a very uncertain matter. In theory, multi-objective criteria and multi-model frameworks should be used in combination so that the 'representation' of the catchment is fitted to the observations, not only the simulated results. In practice those approaches are highly computationally demanding. The modeller is often obliged to find a compromise, reducing either the number of objective functions or the model structures taken into consideration. This compromise is becoming obsolete using parallel computing. In the present study we investigate the extent to which model selection algorithms and regionalisation techniques can be improved by such facilities and highlight the challenges that still need to be addressed. The model simulations are obtained using an ensemble of conceptual lumped models (FUSE by Clark et al. 2008), but the techniques and suggestions are of general use and applicable to any modelling framework. In particular, we developed a novel model selection algorithm tuned to drastically reduce the subjectivity in the analysis. The procedure was automated and coupled with redundancy reduction techniques such as PCA and cluster analysis. Results show that the actual model 'representation' has the shape of a set of complementing model structures. It is also possible to capture intra-annum dynamics of the response, as the algorithm recognises subtle variations in the selected model structures in different seasons. Similar variations can be found analysing different catchments. This suggests the same methodology would be suitable for analysing spatial patterns in the distribution of suitable model structures and maybe long-term dynamics in relation with expedited climate modifications. Although the mentioned methodology
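
    The redundancy-reduction step (PCA followed by clustering of model-performance scores) can be sketched as follows; the score matrix is random placeholder data rather than FUSE output, and the numbers of components and clusters are arbitrary choices for the example.

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(3)
    # Rows: 20 candidate model structures; columns: 6 objective functions (placeholders).
    scores = rng.normal(size=(20, 6))

    components = PCA(n_components=2).fit_transform(scores)   # compress correlated metrics
    labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(components)

    for cluster in range(3):
        members = np.where(labels == cluster)[0]
        print(f"cluster {cluster}: model structures {members.tolist()}")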

  4. Gender and Acceptance of E-Learning: A Multi-Group Analysis Based on a Structural Equation Model among College Students in Chile and Spain

    PubMed Central

    2015-01-01

    The scope of this study was to evaluate whether the adoption of e-learning in two universities, and in particular, the relationship between the perception of external control and perceived ease of use, is different because of gender differences. The study was carried out with participating students in two different universities, one in Chile and one in Spain. The Technology Acceptance Model was used as a theoretical framework for the study. A multi-group analysis method in partial least squares was employed to relate differences between groups. The four main conclusions of the study are: (1) a version of the Technology Acceptance Model has been successfully used to explain the process of adoption of e-learning at an undergraduate level of study; (2) the finding of a strong and significant relationship between perception of external control and perception of ease of use of the e-learning platform; (3) a significant relationship between perceived enjoyment and perceived ease of use and between results demonstrability and perceived usefulness is found; (4) the study indicates a few statistically significant differences between males and females when adopting an e-learning platform, according to the tested model. PMID:26465895

  5. Gender and Acceptance of E-Learning: A Multi-Group Analysis Based on a Structural Equation Model among College Students in Chile and Spain.

    PubMed

    Ramírez-Correa, Patricio E; Arenas-Gaitán, Jorge; Rondán-Cataluña, F Javier

    2015-01-01

    The scope of this study was to evaluate whether the adoption of e-learning in two universities, and in particular, the relationship between the perception of external control and perceived ease of use, is different because of gender differences. The study was carried out with participating students in two different universities, one in Chile and one in Spain. The Technology Acceptance Model was used as a theoretical framework for the study. A multi-group analysis method in partial least squares was employed to relate differences between groups. The four main conclusions of the study are: (1) a version of the Technology Acceptance Model has been successfully used to explain the process of adoption of e-learning at an undergraduate level of study; (2) the finding of a strong and significant relationship between perception of external control and perception of ease of use of the e-learning platform; (3) a significant relationship between perceived enjoyment and perceived ease of use and between results demonstrability and perceived usefulness is found; (4) the study indicates a few statistically significant differences between males and females when adopting an e-learning platform, according to the tested model. PMID:26465895

  6. Model validation protocol for determining the performance of the terrain-responsive atmospheric code against the Rocky Flats Plant Winter Validation Study

    SciTech Connect

    Hodgin, C.R.; Smith, M.L.

    1992-04-23

    The objective for this Model Validation Protocol is to establish a plan for quantifying the performance (accuracy and precision) of the Terrain-Responsive Atmospheric Code (TRAC) model. The performance will be determined by comparing model predictions against tracer characteristics observed in the free atmosphere. The Protocol will also be applied to other "reference" dispersion models. The performance of the TRAC model will be compared to the performance of these reference models in order to establish TRAC's acceptance for use in applications at the Rocky Flats Plant.

  7. Developing an Energy Performance Modeling Startup Kit

    SciTech Connect

    none,

    2012-10-01

    In 2011, the NAHB Research Center began assessing the needs and motivations of residential remodelers regarding energy performance remodeling. This report outlines: the current remodeling industry and the role of energy efficiency; gaps and barriers to adding energy efficiency into remodeling; and support needs of professional remodelers to increase sales and projects involving improving home energy efficiency.

  8. Performance acceptance test of a portable instrument to detect uranium in water at the DOE Advanced Waste Water Treatment Plant, Fernald, Ohio

    SciTech Connect

    Anderson, M.S.; Weeks, S.J.

    1997-03-28

    The Eppendorf-Biotronik Model IC 2001-2, a portable field ruggedized ion chromatography instrument, was rigorously tested at the DOE Advanced Waste Water Treatment Plant, Fernald, Ohio. This instrument rapidly detected the uranium concentration in water, and has a detection limit in the low ppb range without using the sample concentrating feature. The test set of samples analyzed included: "real world" water samples from the AWWT containing uranium concentrations in the 9-110 ppb range, a sample blank, and a performance evaluation sample. The AWWT samples contained sets of both raw water and acid-preserved water samples. Selected samples were analyzed in quadruplicate to assess the instrument's precision, and these results were compared with the results from an off-site confirmatory laboratory to assess the instrument's accuracy. Additional comparisons with on-site laboratory instruments, the Chemcheck KPA-11 and Scintrex UA-3, are reported. Overall, the Eppendorf-Biotronik IC 2001-2 performed exceptionally well, providing a detection limit in the low ppb region (< 10 ppb) and giving rapid (< 5 minutes), accurate and reproducible analytical results for the AWWT "real world" water samples with uranium concentrations in the region of interest (10-40 ppb). The per sample operating cost for this instrument is equivalent to the per sample cost for the currently used KPA. The time required to analyze a sample and provide a result is approximately the same for the IC 2001-2, KPA, and Scintrex instruments.

  9. Model approach to estimate the probability of accepting a lot of heterogeneously contaminated powdered food using different sampling strategies.

    PubMed

    Valero, Antonio; Pasquali, Frédérique; De Cesare, Alessandra; Manfreda, Gerardo

    2014-08-01

    Current sampling plans assume a random distribution of microorganisms in food. However, food-borne pathogens are estimated to be heterogeneously distributed in powdered foods. This spatial distribution, together with very low levels of contamination, raises concern about the efficiency of current sampling plans for the detection of food-borne pathogens like Cronobacter and Salmonella in powdered foods such as powdered infant formula or powdered eggs. An alternative approach based on a Poisson distribution of the contaminated part of the lot (Habraken approach) was used in order to evaluate the probability of falsely accepting a contaminated lot of powdered food when different sampling strategies were simulated, considering variables such as lot size, sample size, microbial concentration in the contaminated part of the lot and proportion of contaminated lot. The simulated results suggest that a sample size of 100 g or more corresponds to the lowest number of samples to be tested in comparison with sample sizes of 10 or 1 g. Moreover, the number of samples to be tested decreases greatly if the microbial concentration is 1 CFU/g instead of 0.1 CFU/g or if the proportion of contamination is 0.05 instead of 0.01. Mean contaminations higher than 1 CFU/g or proportions higher than 0.05 did not affect the number of samples. The Habraken approach represents a useful tool for risk management in order to design a fit-for-purpose sampling plan for the detection of low levels of food-borne pathogens in heterogeneously contaminated powdered food. However, it must be noted that although effective in detecting pathogens, these sampling plans are difficult to apply because of the huge number of samples that need to be tested. Sampling does not seem an effective measure to control pathogens in powdered food. PMID:24462218
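
    Under the Poisson assumption described above, the probability of falsely accepting a contaminated lot has a simple closed form; the sketch below evaluates it for illustrative numbers (the paper's simulations considered additional variables such as lot size).

    import math

    def prob_accept(n_samples, sample_mass_g, conc_cfu_per_g, contaminated_fraction):
        """Probability that all n samples test negative, i.e. the lot is falsely accepted."""
        # A sample is negative if it comes from the clean fraction, or from the contaminated
        # fraction but happens to contain zero cells (Poisson probability of zero counts).
        p_negative = (1.0 - contaminated_fraction) \
                     + contaminated_fraction * math.exp(-conc_cfu_per_g * sample_mass_g)
        return p_negative ** n_samples

    for mass in (1.0, 10.0, 100.0):
        p = prob_accept(n_samples=30, sample_mass_g=mass,
                        conc_cfu_per_g=0.1, contaminated_fraction=0.01)
        print(f"{mass:5.0f} g samples: P(falsely accepting the lot) = {p:.3f}")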

  10. Palm: Easing the Burden of Analytical Performance Modeling

    SciTech Connect

    Tallent, Nathan R.; Hoisie, Adolfy

    2014-06-01

    Analytical (predictive) application performance models are critical for diagnosing performance-limiting resources, optimizing systems, and designing machines. Creating models, however, is difficult because they must be both accurate and concise. To ease the burden of performance modeling, we developed Palm, a modeling tool that combines top-down (human-provided) semantic insight with bottom-up static and dynamic analysis. To express insight, Palm defines a source code modeling annotation language. By coordinating models and source code, Palm's models are 'first-class' and reproducible. Unlike prior work, Palm formally links models, functions, and measurements. As a result, Palm (a) uses functions to either abstract or express complexity; (b) generates hierarchical models (representing an application's static and dynamic structure); and (c) automatically incorporates measurements to focus attention, represent constant behavior, and validate models. We discuss generating models for three different applications.

  11. Thermodynamic performance for a chemical reactions model

    NASA Astrophysics Data System (ADS)

    Gonzalez-Narvaez, R. E.; Sánchez-Salas, N.; Chimal-Eguía, J. C.

    2015-01-01

    This paper presents an efficiency analysis of a four-state chemical reaction model in which the activated states can occur at any point (fixed but arbitrary) of the transition from one state to another. This mechanism operates at a single heat reservoir temperature, unlike internal combustion engines, where there are two thermal sources. Different efficiencies, corresponding to different optimum engine regimes, are compared for this model. Analytical methods are used to give approximate expressions, facilitating the comparison between them. Finally, the result is compared with that obtained by other authors who considered a general model of an isothermal molecular machine. Taking the above into account, the results seem to follow a similar behaviour for all the optimized engines, resembling that observed in the case of heat engine efficiencies.

  12. An Empirical Study of a Solo Performance Assessment Model

    ERIC Educational Resources Information Center

    Russell, Brian E.

    2015-01-01

    The purpose of this study was to test a hypothesized model of solo music performance assessment. Specifically, this study investigates the influence of technique and musical expression on perceptions of overall performance quality. The Aural Musical Performance Quality (AMPQ) measure was created to measure overall performance quality, technique,…

  13. Space Station Freedom electrical performance model

    NASA Technical Reports Server (NTRS)

    Hojnicki, Jeffrey S.; Green, Robert D.; Kerslake, Thomas W.; Mckissock, David B.; Trudell, Jeffrey J.

    1993-01-01

    The baseline Space Station Freedom electric power system (EPS) employs photovoltaic (PV) arrays and nickel hydrogen (NiH2) batteries to supply power to housekeeping and user electrical loads via a direct current (dc) distribution system. The EPS was originally designed for an operating life of 30 years through orbital replacement of components. As the design and development of the EPS continues, accurate EPS performance predictions are needed to assess design options, operating scenarios, and resource allocations. To meet these needs, NASA Lewis Research Center (LeRC) has, over a 10 year period, developed SPACE (Station Power Analysis for Capability Evaluation), a computer code designed to predict EPS performance. This paper describes SPACE, its functionality, and its capabilities.

  14. Modeling and Performance Simulation of the Mass Storage Network Environment

    NASA Technical Reports Server (NTRS)

    Kim, Chan M.; Sang, Janche

    2000-01-01

    This paper describes the application of modeling and simulation in evaluating and predicting the performance of the mass storage network environment. Network traffic is generated to mimic the realistic pattern of file transfer, electronic mail, and web browsing. The behavior and performance of the mass storage network and a typical client-server Local Area Network (LAN) are investigated by modeling and simulation. Performance characteristics in throughput and delay demonstrate the important role of modeling and simulation in network engineering and capacity planning.

  15. Modeling the Mechanical Performance of Die Casting Dies

    SciTech Connect

    R. Allen Miller

    2004-02-27

    The following report covers work performed at Ohio State on modeling the mechanical performance of dies. The focus of the project was the development, and particularly the verification, of finite element techniques used to model and predict displacements and stresses in die casting dies. The work entails a major case study performed with an industrial partner on a production die and laboratory experiments performed at Ohio State.

  16. Performance Modeling: Understanding the Present and Predicting theFuture

    SciTech Connect

    Bailey, David H.; Snavely, Allan

    2005-11-30

    We present an overview of current research in performance modeling, focusing on efforts underway in the Performance Evaluation Research Center (PERC). Using some new techniques, we are able to construct performance models that can be used to project the sustained performance of large-scale scientific programs on different systems, over a range of job and system sizes. Such models can be used by vendors in system designs, by computing centers in system acquisitions, and by application scientists to improve the performance of their codes.

  17. MODIS Solar Diffuser: Modelled and Actual Performance

    NASA Technical Reports Server (NTRS)

    Waluschka, Eugene; Xiong, Xiao-Xiong; Esposito, Joe; Wang, Xin-Dong; Krebs, Carolyn (Technical Monitor)

    2001-01-01

    The Moderate Resolution Imaging Spectroradiometer (MODIS) instrument's solar diffuser is used in its radiometric calibration for the reflective solar bands (VIS, NIR, and SWIR) ranging from 0.41 to 2.1 micron. The sun illuminates the solar diffuser either directly or through an attenuation screen. The attenuation screen consists of a regular array of pin holes. The attenuated illumination pattern on the solar diffuser is not uniform, but consists of a multitude of pin-hole images of the sun. This non-uniform illumination produces small, but noticeable radiometric effects. A description of the computer model used to simulate the effects of the attenuation screen is given and the predictions of the model are compared with actual, on-orbit, calibration measurements.

  18. Using Theoretical Models to Examine the Acceptance Behavior of Mobile Phone Messaging to Enhance Parent-Teacher Interactions

    ERIC Educational Resources Information Center

    Ho, Li-Hsing; Hung, Chang-Liang; Chen, Hui-Chun

    2013-01-01

    Student academic performance and social competence are influenced positively by parent involvement; effective parent-teacher communication not only builds parent reliance on a school, it also enhances parent knowledge of raising children. As information technology develops rapidly, it is already a trend that e-communication is replacing traditional paper…

  19. Results of model intercomparison : predicted vs. measured system performance.

    SciTech Connect

    Stein, Joshua S.

    2010-10-01

    This is a blind modeling study to illustrate the variability expected between PV performance model results. Objectives are to answer: (1) What is the modeling uncertainty; (2) Do certain models do better than others; (3) How can performance modeling be improved; and (4) What are the sources of uncertainty? Some preliminary conclusions are: (1) Large variation seen in model results; (2) Variation not entirely consistent across systems; (3) Uncertainty in assigning derates; (4) Discomfort when components are not included in database - Is there comfort when the components are in the database?; and (5) Residual analysis will help to uncover additional patterns in the models.

  20. Developing an Energy Performance Modeling Startup Kit

    SciTech Connect

    Wood, A.

    2012-10-01

    In 2011, the NAHB Research Center began the first part of the multi-year effort by assessing the needs and motivations of residential remodelers regarding energy performance remodeling. The scope is multifaceted - all perspectives will be sought related to remodeling firms ranging in size from small-scale, sole proprietor to national. This will allow the Research Center to gain a deeper understanding of the remodeling and energy retrofit business and the needs of contractors when offering energy upgrade services. To determine the gaps and the motivation for energy performance remodeling, the NAHB Research Center conducted (1) an initial series of focus groups with remodelers at the 2011 International Builders' Show, (2) a second series of focus groups with remodelers at the NAHB Research Center in conjunction with the NAHB Spring Board meeting in DC, and (3) quantitative market research with remodelers based on the findings from the focus groups. The goal was threefold, to: Understand the current remodeling industry and the role of energy efficiency; Identify the gaps and barriers to adding energy efficiency into remodeling; and Quantify and prioritize the support needs of professional remodelers to increase sales and projects involving improving home energy efficiency. This report outlines all three of these tasks with remodelers.

  1. Analytical Performance Models for Geologic Repositories

    SciTech Connect

    Chambre, P.L.; Pigford, T.H.; Fujita, A.; Kanki, T.; Kobayashi,A.; Lung, H.; Ting, D.; Sato, Y.; Savoshy, S.J.

    1982-10-01

    This report presents analytical solutions of the dissolution and hydrogeologic transport of radionuclides in geologic repositories. Numerical examples are presented to demonstrate the equations resulting from these analyses. The subjects treated in the present report are: (a) Solubility-limited transport with transverse dispersion (Chapter 2); (b) Transport of a radionuclide chain with nonequilibrium chemical reactions (Chapter 3); (c) Advective transport in a two-dimensional flow field (Chapter 4); (d) Radionuclide transport in fractured media (Chapter 5); (e) A mathematical model for EPA's analysis of generic repositories (Chapter 6); and (f) Dissolution of radionuclides from solid waste (Chapter 7).
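
    For orientation only (the report contains the full derivations), analyses of this kind build on the standard one-dimensional advection-dispersion equation for a sorbing, decaying radionuclide, which in LaTeX notation reads

      R \frac{\partial C}{\partial t} = D \frac{\partial^{2} C}{\partial x^{2}} - v \frac{\partial C}{\partial x} - \lambda R C

    where C is the dissolved concentration, v the pore-water velocity, D the dispersion coefficient, R the retardation factor, and \lambda the radioactive decay constant.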

  2. New model performance index for engineering design of control systems

    NASA Technical Reports Server (NTRS)

    1970-01-01

    Performance index includes a model representing linear control-system design specifications. Based on a geometric criterion for approximation of the model by the actual system, the index can be interpreted directly in terms of the desired system response model without actually having the model's time response.

  3. Determination of biogenic amines by high-performance liquid chromatography (HPLC-DAD) in probiotic cow's and goat's fermented milks and acceptance.

    PubMed

    Costa, Marion P; Balthazar, Celso F; Rodrigues, Bruna L; Lazaro, Cesar A; Silva, Adriana C O; Cruz, Adriano G; Conte Junior, Carlos A

    2015-05-01

    This study evaluated the presence of biogenic amines in fermented cow's and goat's milks containing probiotic bacteria during the first 10 days of chilled storage (4 ± 2°C), when the probiotic strains are most viable. The overall acceptance of both fermented milks, produced using the same starter culture and probiotics, was tested. In both products, the initially high levels of tyramine (mean 560 mg kg(-1) for both fermented milks), the predominant biogenic amine, increased during the storage period, suggesting that this amine may be considered a quality index for fermented milks. The other principal biogenic amines (putrescine, cadaverine, histamine, and spermidine) were produced on days 1-5 of storage and thereafter decreased. At the end of the 10th day, these amines showed values of 20.26, 29.09, 17.97, and 82.07 mg kg(-1), respectively, in fermented cow's milk, and 22.92, 29.09, 34.85, and 53.85 mg kg(-1) in fermented goat's milk. Fermented cow's milk was better accepted than fermented goat's milk. The results suggested that the content of biogenic amines may be a criterion for selecting lactic acid bacteria used to produce fermented milks. PMID:25987991

  4. Determination of biogenic amines by high-performance liquid chromatography (HPLC-DAD) in probiotic cow's and goat's fermented milks and acceptance

    PubMed Central

    Costa, Marion P; Balthazar, Celso F; Rodrigues, Bruna L; Lazaro, Cesar A; Silva, Adriana C O; Cruz, Adriano G; Conte Junior, Carlos A

    2015-01-01

    This study evaluated the presence of biogenic amines in fermented cow's and goat's milks containing probiotic bacteria during the first 10 days of chilled storage (4 ± 2°C), when the probiotic strains are most viable. The overall acceptance of both fermented milks, produced using the same starter culture and probiotics, was tested. In both products, the initially high levels of tyramine (mean 560 mg kg−1 for both fermented milks), the predominant biogenic amine, increased during the storage period, suggesting that this amine may be considered a quality index for fermented milks. The other principal biogenic amines (putrescine, cadaverine, histamine, and spermidine) were produced on days 1–5 of storage and thereafter decreased. At the end of the 10th day, these amines showed values of 20.26, 29.09, 17.97, and 82.07 mg kg−1, respectively, in fermented cow's milk, and 22.92, 29.09, 34.85, and 53.85 mg kg−1 in fermented goat's milk. Fermented cow's milk was better accepted than fermented goat's milk. The results suggested that the content of biogenic amines may be a criterion for selecting lactic acid bacteria used to produce fermented milks. PMID:25987991

  5. Performance model assessment for multi-junction concentrating photovoltaic systems.

    SciTech Connect

    Stein, Joshua S.; Riley, Daniel M.; McConnell, Robert.; Sahm, Aaron; Crawford, Clark; King, David L.; Cameron, Christopher P.; Foresi, James S.

    2010-03-01

    Four approaches to modeling multi-junction concentrating photovoltaic system performance are assessed by comparing modeled performance to measured performance. Measured weather, irradiance, and system performance data were collected on two systems over a one month period. Residual analysis is used to assess the models and to identify opportunities for model improvement. Large photovoltaic systems are typically developed as projects which supply electricity to a utility and are owned by independent power producers. Obtaining financing at favorable rates and attracting investors requires confidence in the projected energy yield from the plant. In this paper, various performance models for projecting annual energy yield from Concentrating Photovoltaic (CPV) systems are assessed by comparing measured system output to model predictions based on measured weather and irradiance data. The results are statistically analyzed to identify systematic error sources.
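
    The report gives the full statistical treatment; as a minimal illustration of the residual-analysis step (synthetic numbers and hypothetical variable names, not data from the study), comparing modeled to measured output and summarizing bias and RMSE can be done as follows.

      # Minimal residual analysis: compare modeled to measured system output.
      # Numbers are synthetic; handling of the real data sets is omitted.
      import numpy as np

      measured = np.array([41.2, 38.7, 45.1, 30.4, 27.9])   # kW, measured AC output
      modeled  = np.array([40.1, 39.5, 44.0, 32.2, 26.8])   # kW, model prediction

      residuals = modeled - measured
      bias = residuals.mean()                                # systematic over/under-prediction
      rmse = np.sqrt((residuals ** 2).mean())                # overall error magnitude

      print(f"bias = {bias:+.2f} kW, RMSE = {rmse:.2f} kW")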

  6. Aspen: A Domain Specific Language for Performance Modeling

    SciTech Connect

    Spafford, Kyle L; Vetter, Jeffrey S

    2012-01-01

    We present a new approach to analytical performance modeling using Aspen, a domain specific language. Aspen (Abstract Scalable Performance Engineering Notation) fills an important gap in existing performance modeling techniques and is designed to enable rapid exploration of new algorithms and architectures. It includes a formal specification of an application's performance behavior and an abstract machine model. We provide an overview of Aspen's features and demonstrate how it can be used to express a performance model for a three dimensional Fast Fourier Transform. We then demonstrate the composability and modularity of Aspen by importing and reusing the FFT model in a molecular dynamics model. We have also created a number of tools that allow scientists to balance application and system factors quickly and accurately.
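
    Aspen's actual syntax is not reproduced in the abstract; as a loose Python analogue (not Aspen code), an analytical model of a three-dimensional FFT can combine a flop count with assumed machine parameters in a few lines. The peak-flop and bandwidth figures below are illustrative placeholders.

      # Hypothetical analytical model of a 3-D FFT, in the spirit of (but not in) Aspen.
      import math

      def fft3d_time(n, peak_flops=1.0e12, bandwidth=1.0e11, bytes_per_elem=16):
          """Estimate runtime (s) of an n^3 complex FFT from flop and traffic counts."""
          elems = n ** 3
          flops = 5.0 * elems * math.log2(elems)   # classic 5 N log2 N operation estimate
          traffic = 3 * elems * bytes_per_elem     # three 1-D sweeps over the volume
          return max(flops / peak_flops, traffic / bandwidth)

      print(f"estimated 1024^3 FFT time: {fft3d_time(1024):.3f} s")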

  7. High temperature furnace modeling and performance verifications

    NASA Technical Reports Server (NTRS)

    Smith, James E., Jr.

    1988-01-01

    Analytical, numerical and experimental studies were performed on two classes of high temperature materials processing furnaces. The research concentrates on a commercially available high temperature furnace using zirconia as the heating element and an arc furnace based on a ST International tube welder. The zirconia furnace was delivered and work is progressing on schedule. The work on the arc furnace was initially stalled due to the unavailability of the NASA prototype, which is actively being tested aboard the KC-135 experimental aircraft. A proposal was written and funded to purchase an additional arc welder to alleviate this problem. The ST International weld head and power supply were received and testing will begin in early November. The first 6 months of the grant are covered.

  8. Student Attitudes towards and Use of ICT in Course Study, Work and Social Activity: A Technology Acceptance Model Approach

    ERIC Educational Resources Information Center

    Edmunds, Rob; Thorpe, Mary; Conole, Grainne

    2012-01-01

    The increasing use of information and communication technology (ICT) in higher education has been explored largely in relation to student experience of coursework and university life. Students' lives and experience beyond the university have been largely unexplored. Research into student experience of ICT used a validated model--the technology…

  9. The Adoption of Blended E-Learning Technology in Vietnam Using a Revision of the Technology Acceptance Model

    ERIC Educational Resources Information Center

    Tran, Khanh Ngo Nhu

    2016-01-01

    This study examines factors that determine the attitudes of learners toward a blended e-learning system (BELS) using data collected by questionnaire from a sample of 396 students involved in a BELS environment in Vietnam. A theoretical model is derived from previous studies and is analyzed and developed using structural equation modeling…

  10. Predicting the Use of Paired Programming: Applying the Attitudes of Application Development Managers through the Technology Acceptance Model

    ERIC Educational Resources Information Center

    Zecca, Mark S.

    2010-01-01

    Business managers who look for ways to cut costs face difficult questions about the efficiency and effectiveness of software engineering practices that are used to complete projects on time, on specification, and within budget (Johnson, 1995; Lindstrom & Jeffries, 2004). Theoretical models such as the Theory of Reasoned Action (TRA) have linked…

  11. High Performance Geostatistical Modeling of Biospheric Resources

    NASA Astrophysics Data System (ADS)

    Pedelty, J. A.; Morisette, J. T.; Smith, J. A.; Schnase, J. L.; Crosier, C. S.; Stohlgren, T. J.

    2004-12-01

    We are using parallel geostatistical codes to study spatial relationships among biospheric resources in several study areas. For example, spatial statistical models based on large- and small-scale variability have been used to predict species richness of both native and exotic plants (hot spots of diversity) and patterns of exotic plant invasion. However, broader use of geostatistics in natural resource modeling, especially at regional and national scales, has been limited due to the large computing requirements of these applications. To address this problem, we implemented parallel versions of the kriging spatial interpolation algorithm. The first uses the Message Passing Interface (MPI) in a master/slave paradigm on an open source Linux Beowulf cluster, while the second is implemented with the new proprietary Xgrid distributed processing system on an Xserve G5 cluster from Apple Computer, Inc. These techniques are proving effective and provide the basis for a national decision support capability for invasive species management that is being jointly developed by NASA and the US Geological Survey.
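
    The abstract does not show the decomposition used; a minimal mpi4py scatter/gather sketch in the spirit of the master/slave layout, with a placeholder standing in for the kriging solve, might look like the following. All names and the toy data are hypothetical.

      # Minimal sketch of distributing kriging prediction blocks with MPI.
      # Only the work distribution is shown; krige_block is a stand-in.
      from mpi4py import MPI
      import numpy as np

      def krige_block(block):
          # Placeholder for an ordinary-kriging solve over one block of locations.
          return np.full(len(block), block.mean())

      comm = MPI.COMM_WORLD
      rank, size = comm.Get_rank(), comm.Get_size()

      if rank == 0:
          points = np.random.rand(10_000)            # prediction locations (toy data)
          blocks = np.array_split(points, size)      # one block per process
      else:
          blocks = None

      my_block = comm.scatter(blocks, root=0)        # master hands out work
      my_result = krige_block(my_block)              # each process interpolates its block
      results = comm.gather(my_result, root=0)       # master collects the results

      if rank == 0:
          predictions = np.concatenate(results)
          print("interpolated", predictions.size, "locations on", size, "processes")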

  12. Acceptability of GM foods among Pakistani consumers.

    PubMed

    Ali, Akhter; Rahut, Dil Bahadur; Imtiaz, Muhammad

    2016-04-01

    In Pakistan the majority of consumers do not have information about genetically modified (GM) foods. In developing countries, particularly Pakistan, few studies have focused on consumers' acceptability of GM foods. Using a comprehensive primary dataset collected from 320 consumers in Pakistan in 2013, this study analyzes the determinants of consumers' acceptability of GM foods. The data were analyzed by employing bivariate probit and censored least absolute deviation (CLAD) models. The empirical results indicated that urban consumers are more aware of GM foods than rural consumers. Acceptance of GM foods was higher among female consumers than among male consumers. In addition, older consumers were more willing to accept GM foods than younger consumers. The acceptability of GM foods was also higher among wealthier households. Low price is the key factor leading to the acceptability of GM foods. The acceptability of GM foods also reduces the risks among Pakistani consumers. PMID:27494790

  13. Performance modeling of launch vehicle imaging telescopes

    NASA Astrophysics Data System (ADS)

    Harvey, James E.; Krywonos, Andrey; Houston, Joseph B., Jr.

    2005-09-01

    The implementation plan for the "return-to-flight" of the space shuttle after the spectacular Columbia disaster upon re-entering the earth's atmosphere on February 1, 2003 included significant upgrades to the Ground Camera Ascent Imagery assets at Kennedy Space Center (KSC) and Cape Canaveral Air Force Station. The accident was due to damage incurred when a piece of insulating foam debris from the external fuel tank struck the left wing during take-off. The Ground Camera Ascent Imagery Project encompasses a wide variety of launch vehicle tracking telescopes and cameras at the Eastern Range. Most of these launch vehicle imaging telescopes are manually tracked and fitted with video and 35 mm film cameras, and many of them are fixed-focus (i.e., focused at the hyperfocal distance for the duration of the launch). In this paper we describe a systems engineering analysis approach for obtaining performance predictions of these aging launch vehicle imaging telescopes. Recommendations for a continuing maintenance and refurbishment program that closes the loop around the KSC photo-interpreter are included.

  14. Establishment and metabolic analysis of a model microbial community for understanding trophic and electron accepting interactions of subsurface anaerobic environments

    PubMed Central

    2010-01-01

    Background Communities of microorganisms control the rates of key biogeochemical cycles, and are important for biotechnology, bioremediation, and industrial microbiological processes. For this reason, we constructed a model microbial community comprised of three species dependent on trophic interactions. The three species microbial community was comprised of Clostridium cellulolyticum, Desulfovibrio vulgaris Hildenborough, and Geobacter sulfurreducens and was grown under continuous culture conditions. Cellobiose served as the carbon and energy source for C. cellulolyticum, whereas D. vulgaris and G. sulfurreducens derived carbon and energy from the metabolic products of cellobiose fermentation and were provided with sulfate and fumarate respectively as electron acceptors. Results qPCR monitoring of the culture revealed C. cellulolyticum to be dominant as expected and confirmed the presence of D. vulgaris and G. sulfurreducens. Proposed metabolic modeling of carbon and electron flow of the three-species community indicated that the growth of C. cellulolyticum and D. vulgaris were electron donor limited whereas G. sulfurreducens was electron acceptor limited. Conclusions The results demonstrate that C. cellulolyticum, D. vulgaris, and G. sulfurreducens can be grown in coculture in a continuous culture system in which D. vulgaris and G. sulfurreducens are dependent upon the metabolic byproducts of C. cellulolyticum for nutrients. This represents a step towards developing a tractable model ecosystem comprised of members representing the functional groups of a trophic network. PMID:20497531

  15. Hydrologic Evaluation of Landfill Performance (HELP) Model: B (Set Includes, A- User's Guide for Version 3 w/disks, B-Engineering Documentation for Version 3

    EPA Science Inventory

    The Hydrologic Evaluation of Landfill Performance (HELP) computer program is a quasi-two-dimensional hydrologic model of water movement across, into, through and out of landfills. The model accepts weather, soil and design data. Landfill systems including various combinations o...

  16. A Systemic Cause Analysis Model for Human Performance Technicians

    ERIC Educational Resources Information Center

    Sostrin, Jesse

    2011-01-01

    This article presents a systemic, research-based cause analysis model for use in the field of human performance technology (HPT). The model organizes the most prominent barriers to workplace learning and performance into a conceptual framework that explains and illuminates the architecture of these barriers that exist within the fabric of everyday…

  17. Some Observations on Specifying Models of Group Performance.

    ERIC Educational Resources Information Center

    Goodman, Paul; And Others

    The purpose of this paper is to identify some critical dimensions in specifying a model of group performance. In the first section, the boundaries of the paper, e.g., work groups that produce some identifiable good or service, are discussed. In the second section some models of group performance are explored in order to illustrate theories of…

  18. The Effects of a Brief Acceptance-Based Behavioral Treatment Versus Traditional Cognitive-Behavioral Treatment for Public Speaking Anxiety: An Exploratory Trial Examining Differential Effects on Performance and Neurophysiology.

    PubMed

    Glassman, Lisa H; Forman, Evan M; Herbert, James D; Bradley, Lauren E; Foster, Elizabeth E; Izzetoglu, Meltem; Ruocco, Anthony C

    2016-09-01

    Individuals with public speaking anxiety (PSA) experience fear and avoidance that can cause extreme distress, impaired speaking performance, and associated problems in psychosocial functioning. Most extant interventions for PSA emphasize anxiety reduction rather than enhancing behavioral performance. We compared the efficacy of two brief cognitive-behavioral interventions, a traditional cognitive-behavior treatment (tCBT) and an acceptance-based behavior treatment (ABBT), on public speaking performance and anxiety in a clinical sample of persons with PSA. The effects of treatment on prefrontal brain activation were also examined. Participants (n = 21) were randomized to 90 min of an ABBT or a tCBT intervention. Assessments took place at pre- and post-treatment and included self-rated anxiety and observer-rated performance measures, a behavioral assessment, and prefrontal cortical activity measurements using functional near-infrared spectroscopy (fNIRS). Exploratory results indicated that participants in the ABBT condition experienced greater improvements in observer-rated performance relative to those in the tCBT condition, while those in the tCBT condition experienced greater reductions in subjective anxiety levels. Individuals in the ABBT condition also exhibited a trend toward greater treatment-related reductions in blood volume in the left dorsolateral prefrontal cortex relative to those who received tCBT. Overall, these findings preliminarily suggest that acceptance-based treatments may free more cognitive resources in comparison with tCBT, possibly resulting in greater improvements in objectively rated behavioral performances for ABBT interventions. PMID:26872958

  19. Analytic Ballistic Performance Model of Whipple Shields

    NASA Technical Reports Server (NTRS)

    Miller, J. E.; Bjorkman, M. D.; Christiansen, E. L.; Ryan, S. J.

    2015-01-01

    The dual-wall, Whipple shield is the shield of choice for lightweight, long-duration flight. The shield uses an initial sacrificial wall to initiate fragmentation and melt an impacting threat that expands over a void before hitting a subsequent shield wall of a critical component. The key parameters to this type of shield are the rear wall and its mass which stops the debris, as well as the minimum shock wave strength generated by the threat particle impact of the sacrificial wall and the amount of room that is available for expansion. Ensuring the shock wave strength is sufficiently high to achieve large scale fragmentation/melt of the threat particle enables the expansion of the threat and reduces the momentum flux of the debris on the rear wall. Three key factors in the shock wave strength achieved are the thickness of the sacrificial wall relative to the characteristic dimension of the impacting particle, the density and material cohesion contrast of the sacrificial wall relative to the threat particle and the impact speed. The mass of the rear wall and the sacrificial wall are desirable to minimize for launch costs making it important to have an understanding of the effects of density contrast and impact speed. An analytic model is developed here, to describe the influence of these three key factors. In addition this paper develops a description of a fourth key parameter related to fragmentation and its role in establishing the onset of projectile expansion.

  20. Performance of Trajectory Models with Wind Uncertainty

    NASA Technical Reports Server (NTRS)

    Lee, Alan G.; Weygandt, Stephen S.; Schwartz, Barry; Murphy, James R.

    2009-01-01

    Typical aircraft trajectory predictors use wind forecasts but do not account for the forecast uncertainty. A method for generating estimates of wind prediction uncertainty is described and its effect on aircraft trajectory prediction uncertainty is investigated. The procedure for estimating the wind prediction uncertainty relies on a time-lagged ensemble of weather model forecasts from the hourly updated Rapid Update Cycle (RUC) weather prediction system. Forecast uncertainty is estimated using measures of the spread amongst various RUC time-lagged ensemble forecasts. This proof of concept study illustrates the estimated uncertainty and the actual wind errors, and documents the validity of the assumed ensemble-forecast accuracy relationship. Aircraft trajectory predictions are made using RUC winds with provision for the estimated uncertainty. Results for a set of simulated flights indicate this simple approach effectively translates the wind uncertainty estimate into an aircraft trajectory uncertainty. A key strength of the method is the ability to relate uncertainty to specific weather phenomena (contained in the various ensemble members) allowing identification of regional variations in uncertainty.
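
    The paper's exact spread measure is not given in the abstract; one simple proxy, computing the spread of time-lagged forecasts valid at the same time, could look like the sketch below (synthetic values, hypothetical names; RUC-specific handling omitted).

      # Sketch: estimate wind forecast uncertainty from a time-lagged ensemble's spread.
      import numpy as np

      # rows = ensemble members (forecasts issued 1, 2, ... h earlier), columns = grid points
      lagged_forecasts = np.array([
          [12.1, 8.4, 15.0],
          [11.3, 9.0, 14.2],
          [13.0, 8.1, 15.9],
      ])  # u-wind (m/s) valid at the same time

      ensemble_mean = lagged_forecasts.mean(axis=0)
      spread = lagged_forecasts.std(axis=0, ddof=1)   # sample std. dev. as uncertainty proxy

      for m, s in zip(ensemble_mean, spread):
          print(f"forecast {m:5.1f} m/s  +/- {s:4.2f} m/s")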

  1. Students Perception towards the Implementation of Computer Graphics Technology in Class via Unified Theory of Acceptance and Use of Technology (UTAUT) Model

    NASA Astrophysics Data System (ADS)

    Binti Shamsuddin, Norsila

    Technology advancement and development in a higher learning institution give students a chance to be motivated to learn the information technology areas in depth. Students should take advantage of the opportunity to build their skills in these technologies in preparation for graduation. The curriculum itself can raise students' interest and encourage them to be directly involved in the evolution of the technology. The aim of this study is to see how deep students' involvement is, as well as their acceptance of the technology used in Computer Graphics and Image Processing subjects. The study covers Bachelor students in the Faculty of Industrial Information Technology (FIIT), Universiti Industri Selangor (UNISEL): Bac. in Multimedia Industry, BSc. Computer Science, and BSc. Computer Science (Software Engineering). This study utilizes the new Unified Theory of Acceptance and Use of Technology (UTAUT) to further validate the model and enhance our understanding of the adoption of Computer Graphics and Image Processing technologies. Four (4) of the eight (8) independent factors in UTAUT will be studied against the dependent factor.

  2. Active imaging system performance model for target acquisition

    NASA Astrophysics Data System (ADS)

    Espinola, Richard L.; Teaney, Brian; Nguyen, Quang; Jacobs, Eddie L.; Halford, Carl E.; Tofsted, David H.

    2007-04-01

    The U.S. Army RDECOM CERDEC Night Vision & Electronic Sensors Directorate has developed a laser-range-gated imaging system performance model for the detection, recognition, and identification of vehicle targets. The model is based on the established US Army RDECOM CERDEC NVESD sensor performance models of the human system response through an imaging system. The Java-based model, called NVLRG, accounts for the effect of active illumination, atmospheric attenuation, and turbulence effects relevant to LRG imagers, such as speckle and scintillation, and for the critical sensor and display components. This model can be used to assess the performance of recently proposed active SWIR systems through various trade studies. This paper will describe the NVLRG model in detail, discuss the validation of recent model components, present initial trade study results, and outline plans to validate and calibrate the end-to-end model with field data through human perception testing.

  3. Modeling and performance assessment in QinetiQ of EO and IR airborne reconnaissance systems

    NASA Astrophysics Data System (ADS)

    Williams, John W.; Potter, Gary E.

    2002-11-01

    QinetiQ are the technical authority responsible for specifying the performance requirements for the procurement of airborne reconnaissance systems, on behalf of the UK MoD. They are also responsible for acceptance of delivered systems, overseeing and verifying the installed system performance as predicted and then assessed by the contractor. Measures of functional capability are central to these activities. The conduct of these activities utilises the broad technical insight and wide range of analysis tools and models available within QinetiQ. This paper focuses on the tools, methods and models that are applicable to systems based on EO and IR sensors. The tools, methods and models are described, and representative output for systems that QinetiQ has been responsible for is presented. The principal capability applicable to EO and IR airborne reconnaissance systems is the STAR (Simulation Tools for Airborne Reconnaissance) suite of models. STAR generates predictions of performance measures such as GRD (Ground Resolved Distance) and GIQE (General Image Quality Equation) NIIRS (National Imagery Interpretation Rating Scales). It also generates images representing sensor output, using the scene generation software CAMEO-SIM and the imaging sensor model EMERALD. The simulated image 'quality' is fully correlated with the predicted non-imaging performance measures. STAR also generates image and table data that is compliant with STANAG 7023, which may be used to test ground station functionality.

  4. Performance measurement and modeling of component applications in a high performance computing environment : a case study.

    SciTech Connect

    Armstrong, Robert C.; Ray, Jaideep; Malony, A.; Shende, Sameer; Trebon, Nicholas D.

    2003-11-01

    We present a case study of performance measurement and modeling of a CCA (Common Component Architecture) component-based application in a high performance computing environment. We explore issues peculiar to component-based HPC applications and propose a performance measurement infrastructure for HPC based loosely on recent work done for Grid environments. A prototypical implementation of the infrastructure is used to collect data for three components in a scientific application and construct performance models for two of them. Both computational and message-passing performance are addressed.

  5. Establishment and metabolic analysis of a model microbial community for understanding trophic and electron accepting interactions of subsurface anaerobic environments

    SciTech Connect

    Miller, Lance D; Mosher, Jennifer J; Venkateswaran, Amudhan; Yang, Zamin Koo; Palumbo, Anthony Vito; Phelps, Tommy Joe; Podar, Mircea; Schadt, Christopher Warren; Keller, Martin

    2010-01-01

    Communities of microorganisms control the rates of key biogeochemical cycles, and are important for biotechnology, bioremediation, and industrial microbiological processes. For this reason, we constructed a model microbial community comprised of three species dependent on trophic interactions. The three species microbial community was comprised of Clostridium cellulolyticum, Desulfovibrio vulgaris Hildenborough, and Geobacter sulfurreducens and was grown under continuous culture conditions. Cellobiose served as the carbon and energy source for C. cellulolyticum, whereas D. vulgaris and G. sulfurreducens derived carbon and energy from the metabolic products of cellobiose fermentation and were provided with sulfate and fumarate respectively as electron acceptors.

  6. Multitasking TORT Under UNICOS: Parallel Performance Models and Measurements

    SciTech Connect

    Azmy, Y.Y.; Barnett, D.A.

    1999-09-27

    The existing parallel algorithms in the TORT discrete ordinates code were updated to function in a UNICOS environment. A performance model for the parallel overhead was derived for the existing algorithms. The largest contributors to the parallel overhead were identified and a new algorithm was developed. A parallel overhead model was also derived for the new algorithm. The parallel performance models were compared against applications of the code to two TORT standard test problems and a large production problem. The parallel performance models agree well with the measured parallel overhead.

  7. Multitasking TORT under UNICOS: Parallel performance models and measurements

    SciTech Connect

    Barnett, A.; Azmy, Y.Y.

    1999-09-27

    The existing parallel algorithms in the TORT discrete ordinates code were updated to function in a UNICOS environment. A performance model for the parallel overhead was derived for the existing algorithms. The largest contributors to the parallel overhead were identified and a new algorithm was developed. A parallel overhead model was also derived for the new algorithm. The parallel performance models were compared against applications of the code to two TORT standard test problems and a large production problem. The parallel performance models agree well with the measured parallel overhead.

  8. Models for evaluating the performability of degradable computing systems

    NASA Technical Reports Server (NTRS)

    Wu, L. T.

    1982-01-01

    Recent advances in multiprocessor technology established the need for unified methods to evaluate computing systems performance and reliability. In response to this modeling need, a general modeling framework that permits the modeling, analysis and evaluation of degradable computing systems is considered. Within this framework, several user oriented performance variables are identified and shown to be proper generalizations of the traditional notions of system performance and reliability. Furthermore, a time varying version of the model is developed to generalize the traditional fault tree reliability evaluation methods of phased missions.

  9. Evaluating Organic Aerosol Model Performance: Impact of two Embedded Assumptions

    NASA Astrophysics Data System (ADS)

    Jiang, W.; Giroux, E.; Roth, H.; Yin, D.

    2004-05-01

    Organic aerosols are important due to their abundance in the polluted lower atmosphere and their impact on human health and vegetation. However, modeling organic aerosols is a very challenging task because of the complexity of aerosol composition, structure, and formation processes. Assumptions and their associated uncertainties in both models and measurement data make model performance evaluation a truly demanding job. Although some assumptions are obvious, others are hidden and embedded, and can significantly impact modeling results, possibly even changing conclusions about model performance. This paper focuses on analyzing the impact of two embedded assumptions on evaluation of organic aerosol model performance. One assumption is about the enthalpy of vaporization widely used in various secondary organic aerosol (SOA) algorithms. The other is about the conversion factor used to obtain ambient organic aerosol concentrations from measured organic carbon. These two assumptions reflect uncertainties in the model and in the ambient measurement data, respectively. For illustration purposes, various choices of the assumed values are implemented in the evaluation process for an air quality model based on CMAQ (the Community Multiscale Air Quality Model). Model simulations are conducted for the Lower Fraser Valley covering Southwest British Columbia, Canada, and Northwest Washington, United States, for a historical pollution episode in 1993. To understand the impact of the assumed enthalpy of vaporization on modeling results, its impact on instantaneous organic aerosol yields (IAY) through partitioning coefficients is analysed first. The analysis shows that utilizing different enthalpy of vaporization values causes changes in the shapes of IAY curves and in the response of SOA formation capability of reactive organic gases to temperature variations. These changes are then carried into the air quality model and cause substantial changes in the organic aerosol modeling
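
    The partitioning equations themselves are not reproduced in the abstract; the point where the assumed enthalpy of vaporization enters is the common Clausius-Clapeyron-type temperature correction of the partitioning coefficient, which in its usual textbook form (not quoted from this paper) reads, in LaTeX notation,

      K_{p}(T) = K_{p}(T_{\mathrm{ref}}) \, \frac{T}{T_{\mathrm{ref}}} \exp\left[ \frac{\Delta H_{\mathrm{vap}}}{R} \left( \frac{1}{T} - \frac{1}{T_{\mathrm{ref}}} \right) \right]

    so that a larger assumed \Delta H_{\mathrm{vap}} makes gas-particle partitioning, and hence the instantaneous aerosol yield, more sensitive to temperature, which is the effect examined in the paper.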

  10. Breathing air trailer acceptance test procedure

    SciTech Connect

    Kostelnik, A.J.

    1994-09-14

    This Acceptance Test Procedure (ATP) will document compliance with the requirements of WHC-S-0251 Rev. 0 and ECNs 613530 and 606113. The equipment being tested is a Breathing Air Supply Trailer purchased as a Design and Fabrication procurement activity for use in the core sampling program. The ATP was written by the Seller and will be performed by the Seller with representatives of the Westinghouse Hanford Company witnessing the test at the Seller's location. This test procedure is to verify that the American Bristol Industries, Inc., Model 5014-0001 low pressure Mobile Breathing Air Trailer meets or exceeds the requirements of the Westinghouse Hanford specification.

  11. Cost and Performance Assumptions for Modeling Electricity Generation Technologies

    SciTech Connect

    Tidball, Rick; Bluestein, Joel; Rodriguez, Nick; Knoke, Stu

    2010-11-01

    The goal of this project was to compare and contrast utility scale power plant characteristics used in data sets that support energy market models. Characteristics include both technology cost and technology performance projections to the year 2050. Cost parameters include installed capital costs and operation and maintenance (O&M) costs. Performance parameters include plant size, heat rate, capacity factor or availability factor, and plant lifetime. Conventional, renewable, and emerging electricity generating technologies were considered. Six data sets, each associated with a different model, were selected. Two of the data sets represent modeled results, not direct model inputs. These two data sets include cost and performance improvements that result from increased deployment as well as resulting capacity factors estimated from particular model runs; other data sets represent model input data. For the technologies contained in each data set, the levelized cost of energy (LCOE) was also evaluated, according to published cost, performance, and fuel assumptions.
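
    The LCOE comparison follows the standard discounted-cost definition, written here in its common textbook form (the report's exact conventions for discount rate and cost categories may differ):

      \mathrm{LCOE} = \frac{\sum_{t=0}^{N} C_{t} / (1+r)^{t}}{\sum_{t=1}^{N} E_{t} / (1+r)^{t}}

    where C_t is the total cost in year t (capital, O&M, and fuel), E_t the electricity generated in year t, r the discount rate, and N the plant lifetime.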

  12. A strategic management model for evaluation of health, safety and environmental performance.

    PubMed

    Abbaspour, Majid; Toutounchian, Solmaz; Roayaei, Emad; Nassiri, Parvin

    2012-05-01

    A strategic health, safety, and environmental management system (HSE-MS) involves systematic and cooperative planning in each phase of the lifecycle of a project to ensure that interaction among the industry group, client, contractor, stakeholder, and host community exists with the highest level of health, safety, and environmental standard performance. Therefore, it seems necessary to assess the HSE-MS performance of contractor(s) by a comparative strategic management model with the aim of continuous improvement. The present Strategic Management Model (SMM) has been illustrated by a case study and the results show that the model is a suitable management tool for decision making in a contract environment, especially in oil and gas fields, based on accepted international standards and within the framework of the Deming management cycle. To develop this model, a data bank has been created, which includes the statistical data calculated by converting the HSE performance qualitative data into quantitative values. Based on this fact, the structure of the model has been formed by defining HSE performance indicators according to the HSE-MS model. Therefore, 178 indicators have been selected which have been grouped into four attributes. Model output provides quantitative measures of HSE-MS performance as a percentage of an ideal level with a maximum possible score for each attribute. Defining the strengths and weaknesses of the contractor(s) is another capability of this model. On the other hand, this model provides a ranking that could be used as the basis for decision making at the contractors' pre-qualification phase or during the execution of the project. PMID:21739281

  13. Reactive puff model SCICHEM: Model enhancements and performance studies

    NASA Astrophysics Data System (ADS)

    Chowdhury, B.; Karamchandani, P. K.; Sykes, R. I.; Henn, D. S.; Knipping, E.

    2015-09-01

    The SCICHEM model incorporates complete gas phase, aqueous and aerosol phase chemistry within a state-of-the-science Gaussian puff model SCIPUFF (Second-order Closure Integrated Puff). The model is a valuable tool that can be used to calculate the impacts of a single source or a small number of sources on downwind ozone and PM2.5. The model has flexible data requirements: it can be run with routine surface and upper air observations or with prognostic meteorological model outputs and source emissions are specified in a simple text format. This paper describes significant advances to the dispersion and chemistry components of the model in the latest release, SCICHEM 3.0. Some of the major advancements include modeling of skewed turbulence for convective boundary layer and updated chemistry schemes (CB05 gas phase chemical mechanism; AERO5 aerosol and aqueous modules). The results from SCICHEM 3.0 are compared with observations from a tracer study as well as aircraft measurements of reactive species in power plant plumes from two field studies. The results with the tracer experiment (Copenhagen study) show that the incorporation of skewed turbulence improves the calculation of tracer dispersion and transport. The comparisons with the Cumberland and Dolet Hills power plume measurements show good correlation between the observed and predicted concentrations of reactive gaseous species at most downwind distances from the source.

  14. Numerical Modeling of Pulse Detonation Rocket Engine Gasdynamics and Performance

    NASA Technical Reports Server (NTRS)

    2003-01-01

    This paper presents viewgraphs on the numerical modeling of pulse detonation rocket engines (PDRE), with an emphasis on the Gasdynamics and performance analysis of these engines. The topics include: 1) Performance Analysis of PDREs; 2) Simplified PDRE Cycle; 3) Comparison of PDRE and Steady-State Rocket Engines (SSRE) Performance; 4) Numerical Modeling of Quasi 1-D Rocket Flows; 5) Specific PDRE Geometries Studied; 6) Time-Accurate Thrust Calculations; 7) PDRE Performance (Geometries A B C and D); 8) PDRE Blowdown Gasdynamics (Geom. A B C and D); 9) PDRE Geometry Performance Comparison; 10) PDRE Blowdown Time (Geom. A B C and D); 11) Specific SSRE Geometry Studied; 12) Effect of F-R Chemistry on SSRE Performance; 13) PDRE/SSRE Performance Comparison; 14) PDRE Performance Study; 15) Grid Resolution Study; and 16) Effect of F-R Chemistry on SSRE Exit Species Mole Fractions.

  15. Factors Affecting Acceptance of Smartphone Application for Management of Obesity

    PubMed Central

    Jeon, Eunjoo

    2015-01-01

    Objectives The factors affecting the acceptance of mobile obesity-management applications (apps) by the public were analyzed using a mobile healthcare system (MHS) technology acceptance model (TAM). Methods The subjects who participated in this study were Android smartphone users who had an intent to manage their weight. They used the obesity-management app for two weeks, and then completed an 18-item survey designed to determine the factors influencing the acceptance of the app. Three questions were asked pertaining to each of the following six factors: compatibility, self-efficacy, technical support and training, perceived usefulness, perceived ease of use, and behavior regarding intention to use. Cronbach's alpha was used to assess the reliability of the scales. Pathway analysis was also performed to evaluate the MHS acceptance model. Results A total of 94 subjects participated in this study. The results indicate that compatibility, perceived usefulness, and perceived ease of use significantly affected the behavioral intention to use the mobile obesity-management app. Technical support and training also significantly affected the perceived ease of use; however, the hypotheses that self-efficacy affects perceived usefulness and perceived ease of use were not supported in this study. Conclusions This is the first attempt to analyze the factors influencing mobile obesity-management app acceptance using a TAM. Further studies should cover not only obesity but also other chronic diseases and should analyze the factors affecting the acceptance of apps among healthcare consumers in general. PMID:25995959

  16. Final Report: Performance Modeling Activities in PERC2

    SciTech Connect

    Allan Snavely

    2007-02-25

    Progress in Performance Modeling for PERC2 resulted in: • Automated modeling tools that are robust, able to characterize large applications running at scale while simultaneously simulating the memory hierarchies of multiple machines in parallel. • Porting of the requisite tracer tools to multiple platforms. • Improved performance models by using higher resolution memory models than ever before. • Adding control-flow and data dependency analysis to the tracers used in performance tools. • Exploring and developing several new modeling methodologies. • Using modeling tools to develop performance models for strategic codes. • Application of the modeling methodology to make a large number of “blind” performance predictions on certain mission partner applications, targeting most currently available system architectures. • Error analysis to correct some systematic biases encountered as part of the large-scale blind prediction exercises. • Addition of instrumentation capabilities for communication libraries other than MPI. • Dissemination of the tools and modeling methods to several mission partners, including DoD HPCMO and two DARPA HPCS vendors (Cray and IBM), as well as to the wider HPC community via a series of tutorials.

  17. Ku-Band rendezvous radar performance computer simulation model

    NASA Technical Reports Server (NTRS)

    Magnusson, H. G.; Goff, M. F.

    1984-01-01

    All work performed on the Ku-band rendezvous radar performance computer simulation model program since the release of the preliminary final report is summarized. Developments on the program fall into three distinct categories: (1) modifications to the existing Ku-band radar tracking performance computer model; (2) the addition of a highly accurate, nonrealtime search and acquisition performance computer model to the total software package developed on this program; and (3) development of radar cross section (RCS) computation models for three additional satellites. All changes in the tracking model involved improvements in the automatic gain control (AGC) and the radar signal strength (RSS) computer models. Although the search and acquisition computer models were developed under the auspices of the Hughes Aircraft Company Ku-Band Integrated Radar and Communications Subsystem program office, they have been supplied to NASA as part of the Ku-band radar performance computer model package. Their purpose is to predict Ku-band acquisition performance for specific satellite targets on specific missions. The RCS models were developed for three satellites: the Long Duration Exposure Facility (LDEF) spacecraft, the Solar Maximum Mission (SMM) spacecraft, and the Space Telescopes.

  18. COST AND PERFORMANCE MODELS FOR ELECTROSTATICALLY STIMULATED FABRIC FILTRATION

    EPA Science Inventory

    The report gives results of a survey of the literature on performance models for pulse-cleaned fabric filters. Each model is evaluated for its ability to predict average pressure drop from pilot plant data. The best model is chosen and used, in conjunction with pressure drop redu...

  19. Business Models for Training and Performance Improvement Departments

    ERIC Educational Resources Information Center

    Carliner, Saul

    2004-01-01

    Although typically applied to entire enterprises, the concept of business models applies to training and performance improvement groups. Business models are "the method by which firm[s] build and use [their] resources to offer ... value." Business models affect the types of projects, services offered, skills required, business processes, and type of…

  20. Public Education Resources and Pupil Performance Models: A Summary Report.

    ERIC Educational Resources Information Center

    Spottheim, David; And Others

    This is a summary of a report which presents three models quantifying the relationships between educational means (resources) and ends (pupil achievements) to analyze resource allocation problems within school districts: (1) the Pupil Performance Model; (2) the Goal Programming Model; and (3) the Operational Structure of a School and Pupil…

  1. Research on web performance optimization principles and models

    NASA Astrophysics Data System (ADS)

    Wang, Xin

    2013-03-01

    The rapid development of the Internet has made Web performance an increasingly prominent issue, so Web performance optimization has become inevitable. The first principle of Web performance optimization is to understand the trade-offs involved: every gain has a cost, and returns diminish; an optimization may even degrade performance, so optimization should start at the highest level, where the largest gains are obtained. Technical models for improving Web performance include cost sharing, high-speed caching, profiling, parallel processing, and simplified processing. Based on this study, key Web performance optimization recommendations are given; improving Web performance and accelerating the efficient use of the Internet is of significant importance.
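
    As a toy illustration of the caching principle listed above (not code from the paper), memoizing an expensive server-side computation trades a little memory for repeated-request latency; the page-render function below is hypothetical.

      # Toy illustration of caching: serve repeated requests from memory.
      from functools import lru_cache
      import time

      @lru_cache(maxsize=1024)
      def render_page(page_id: int) -> str:
          time.sleep(0.05)                 # stand-in for an expensive render / DB query
          return f"<html>page {page_id}</html>"

      start = time.perf_counter()
      render_page(7)                        # cold: pays the full cost
      render_page(7)                        # warm: served from the cache
      print(f"two requests took {time.perf_counter() - start:.3f} s")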

  2. Development of task network models of human performance in microgravity

    NASA Technical Reports Server (NTRS)

    Diaz, Manuel F.; Adam, Susan

    1992-01-01

    This paper discusses the utility of task-network modeling for quantifying human performance variability in microgravity. The data are gathered for: (1) improving current methodologies for assessing human performance and workload in the operational space environment; (2) developing tools for assessing alternative system designs; and (3) developing an integrated set of methodologies for the evaluation of performance degradation during extended duration spaceflight. The evaluation entailed an analysis of the Remote Manipulator System payload-grapple task performed on many shuttle missions. Task-network modeling can be used as a tool for assessing and enhancing human performance in man-machine systems, particularly for modeling long-duration manned spaceflight. Task-network modeling can be directed toward improving system efficiency by increasing the understanding of basic capabilities of the human component in the system and the factors that influence these capabilities.

  3. Undergraduate technical skills training guided by student tutors – Analysis of tutors' attitudes, tutees' acceptance and learning progress in an innovative teaching model

    PubMed Central

    Weyrich, Peter; Schrauth, Markus; Kraus, Bernd; Habermehl, Daniel; Netzhammer, Nicolai; Zipfel, Stephan; Jünger, Jana; Riessen, Reimer; Nikendei, Christoph

    2008-01-01

    Background Skills labs provide a sheltered learning environment. As close supervision and individual feedback were proven to be important in ensuring effective skills training, we implemented a cross-year peer tutor system in our skills lab of internal medicine that allowed intense training sessions with small learning groups (3–4 students) taught by one student tutor. Methods The expectations, experiences and criticisms of peer tutors regarding the tutor system for undergraduate skills lab training were investigated in the context of a focus group. In addition, tutees' acceptance of this learning model and of their student tutors was evaluated by means of a pre/post web-based survey. Results 14 voluntary senior students were intensely prepared by consultants for their peer tutor activity. 127 students participated in the project, 66.9% of which responded to the web-based survey (23 topics with help of 6-point Likert scale + free comments). Acceptance was very high (5.69 ± 0.07, mean ± SEM), and self-confidence ratings increased significantly after the intervention for each of the trained skills (average 1.96 ± 0.08, all p < 0.002). Tutors received high global ratings (5.50 ± 0.07) and very positive anonymous individual feedback from participants. 82% of tutees considered the peer teaching model to be sufficient, and a mere 1% expressed the wish for skills training to be provided by faculty staff only. Focus group analyses with tutors revealed 18 different topics, including profit in personal knowledge and personal satisfaction through teaching activities. The ratio of 1:4 tutor/tutees was regarded to be very beneficial for effective feedback, and the personalized online evaluation by tutees to be a strong motivator and helpful for further improvements. The tutors ascribed great importance to the continuous availability of a contact doctor in case of uncertainties. Conclusion This study demonstrates that peer teaching in undergraduate technical clinical

  4. Acceptance and commitment therapy (ACT): the foundation of the therapeutic model and an overview of its contribution to the treatment of patients with chronic physical diseases.

    PubMed

    Prevedini, Anna Bianca; Presti, Giovambattista; Rabitti, Elisa; Miselli, Giovanni; Moderato, Paolo

    2011-01-01

    Nowadays, treatment of chronic illnesses, such as stroke, cancer, chronic heart and respiratory diseases, osteoarthritis, diabetes, and so forth, accounts for the largest part of expenses in western countries' national health systems. Moreover, these diseases are by far the leading causes of mortality in the world, representing 60% of all deaths. Any treatment aimed at targeting them might engage an individual for a large portion of his/her life, so that personal and environmental factors can play a crucial role in modulating the person's quality of life and functioning, on top of any medical cure. Anxiety, depression, and distress, for example, are not rare in patients with chronic diseases. Therefore, Cognitive and Behavior Therapy research has largely contributed in the last decades to identifying and programming interventions on such aspects as real and perceived social and family support, coping abilities, locus of control, and self-efficacy that might help patients living with their chronic disease. More recently, third generation Cognitive-Behavior-Therapies, such as Dialectical Behavioral Therapy (DBT), Mindfulness Based Cognitive Therapy (MBCT), Functional Analytic Psychotherapy (FAP), and Acceptance and Commitment Therapy (ACT), focused their attention and research efforts on developing intervention models targeting the needs of patients with a chronic disease. This paper has three aims. The first is to briefly introduce ACT's epistemological (Functional Contextualism) and theoretical (Relational Frame Theory) foundations as a standpoint for understanding the peculiarity of ACT as a modern form of Clinical Behavior Analysis. The second aim is to introduce the ACT clinical model and its six core processes (acceptance, defusion, present moment, self as a context, values and committed action) as accountable, in their continuum, for both psychological flexibility and inflexibility. The third is to present a brief overview of studies and outcomes of ACT intervention protocols and

  5. ATLAS ACCEPTANCE TEST

    SciTech Connect

    J.C. COCHRANE; J.V. PARKER; ET AL

    2001-06-01

    The acceptance test program for Atlas, a 23 MJ pulsed power facility for use in the Los Alamos High Energy Density Hydrodynamics program, has been completed. Completion of this program officially releases Atlas from the construction phase and readies it for experiments. Details of the acceptance test program results and of machine capabilities for experiments will be presented.

  6. Performance model for grid-connected photovoltaic inverters.

    SciTech Connect

    Boyson, William Earl; Galbraith, Gary M.; King, David L.; Gonzalez, Sigifredo

    2007-09-01

    This document provides an empirically based performance model for grid-connected photovoltaic inverters used for system performance (energy) modeling and for continuous monitoring of inverter performance during system operation. The versatility and accuracy of the model were validated for a variety of both residential and commercial size inverters. Default parameters for the model can be obtained from manufacturers' specification sheets, and the accuracy of the model can be further refined using either well-instrumented field measurements in operational systems or detailed measurements from a recognized testing laboratory. An initial database of inverter performance parameters was developed based on measurements conducted at Sandia National Laboratories and at laboratories supporting the solar programs of the California Energy Commission.
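
    The report defines the full empirical form; as a simplified stand-in (not the Sandia inverter model itself), an AC-versus-DC power curve fitted to a few measured operating points already captures the basic idea of an empirically parameterized inverter model. The measured points below are synthetic.

      # Simplified stand-in for an empirical inverter model (NOT the Sandia model itself):
      # fit a quadratic AC-vs-DC power curve to a few measured operating points.
      import numpy as np

      p_dc = np.array([0.5, 1.0, 2.0, 3.0, 4.0, 5.0])        # kW DC, measured
      p_ac = np.array([0.44, 0.93, 1.90, 2.86, 3.80, 4.72])   # kW AC, measured

      coeffs = np.polyfit(p_dc, p_ac, deg=2)                  # empirical quadratic fit

      def inverter_ac_power(p_dc_kw: float) -> float:
          """Predict AC output (kW) for a given DC input using the fitted curve."""
          return float(np.polyval(coeffs, p_dc_kw))

      print(f"predicted AC at 2.5 kW DC: {inverter_ac_power(2.5):.2f} kW")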

  7. An Evaluation of Controller and Pilot Performance, Workload and Acceptability under a NextGen Concept for Dynamic Weather Adapted Arrival Routing

    NASA Technical Reports Server (NTRS)

    Johnson, Walter W.; Lachter, Joel; Brandt, Summer; Koteskey, Robert; Dao, Arik-Quang; Kraut, Josh; Ligda, Sarah; Battiste, Vernol

    2012-01-01

    In today's terminal operations, controller workload increases and throughput decreases when fixed standard terminal arrival routes (STARs) are impacted by storms. To circumvent this operational constraint, Prete, Krozel, Mitchell, Kim and Zou (2008) proposed to use automation to dynamically adapt arrival and departure routing based on weather predictions. The present study examined this proposal in the context of a NextGen trajectory-based operation concept, focusing on its acceptability and its effect on the controllers' ability to manage traffic flows. Six controllers and twelve transport pilots participated in a human-in-the-loop simulation of arrival operations into Louisville International Airport with interval management requirements. Three types of routing structures were used: Static STARs (similar to current routing, which require the trajectories of individual aircraft to be modified to avoid the weather), Dynamic routing (automated adaptive routing around weather), and Dynamic Adjusted routing (automated adaptive routing around weather with aircraft entry time adjusted to account for differences in route length). Spacing Responsibility, whether responsibility for interval management resided with the controllers (as today), or resided with the pilot (who used a flight deck based automated spacing algorithm), was also manipulated. Dynamic routing as a whole was rated superior to static routing, especially by pilots, both in terms of workload reduction and flight path safety. A downside of using dynamic routing was that the paths flown in the dynamic conditions tended to be somewhat longer than the paths flown in the static condition.

  8. SUMO, System performance assessment for a high-level nuclear waste repository: Mathematical models

    SciTech Connect

    Eslinger, P.W.; Miley, T.B.; Engel, D.W.; Chamberlain, P.J. II

    1992-09-01

    Following completion of the preliminary risk assessment of the potential Yucca Mountain Site by Pacific Northwest Laboratory (PNL) in 1988, the Office of Civilian Radioactive Waste Management (OCRWM) of the US Department of Energy (DOE) requested the Performance Assessment Scientific Support (PASS) Program at PNL to develop an integrated system model and computer code that provides performance and risk assessment analysis capabilities for a potential high-level nuclear waste repository. The system model that has been developed addresses the cumulative radionuclide release criteria established by the US Environmental Protection Agency (EPA) and estimates population risks in terms of dose to humans. The system model embodied in the SUMO (System Unsaturated Model) code will also allow benchmarking of other models being developed for the Yucca Mountain Project. The system model has three natural divisions: (1) source term, (2) far-field transport, and (3) dose to humans. This document gives a detailed description of the mathematics of each of these three divisions. Each of the governing equations employed is based on modeling assumptions that are widely accepted within the scientific community.

  9. An Approach for Improving Prediction in River System Models Using Bayesian Probabilities of Parameter Performance

    NASA Astrophysics Data System (ADS)

    Kim, S. S. H.; Hughes, J. D.; Chen, J.; Dutta, D.; Vaze, J.

    2014-12-01

    Achieving predictive success is a major challenge in hydrological modelling. Predictive metrics indicate whether models and parameters are appropriate for impact assessment, design, planning and management, forecasting and underpinning policy. It is often found that very different parameter sets and model structures are equally acceptable system representations (commonly described as equifinality). Furthermore, parameters that produce the best goodness of fit during a calibration period may often yield poor results outside of that period. A calibration method is presented that uses a recursive Bayesian filter to estimate the probability of consistent performance of parameter sets in different sub-periods. The result is a probability distribution for each specified performance interval. This generic method utilises more information within time-series data than is typically used for calibration, and could be adopted for different types of time-series modelling applications. Where conventional calibration methods implicitly identify the best-performing parameterisations on average, the new method looks at the consistency of performance during sub-periods. The proposed calibration method can therefore be used to avoid heavy weighting toward rare periods of good agreement. The method is trialled in the Australian Water Resources Assessments River (AWRA-R) model, a conceptual river system model, in the Murray-Darling Basin, Australia. The new method is tested via cross-validation and results are compared to a traditional split-sample calibration/validation to evaluate the new technique's ability to predict daily streamflow. The results showed that the new calibration method could produce parameterisations that performed better in validation periods than optimum calibration parameter sets. The method shows the ability to improve predictive performance and provide more realistic flux terms compared to traditional split-sample calibration methods.
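
    The essence of a recursive Bayesian filter over performance intervals can be sketched as follows. This is an illustrative simplification under assumed choices (Gaussian-shaped likelihood kernel, NSE bins), not the AWRA-R implementation: the "consistent performance interval" of a parameter set is treated as a hidden state whose probability is updated from the interval observed in each calibration sub-period.

        import numpy as np

        def update_interval_probs(prior, observed_interval, spread=1.0):
            """One recursive Bayes step over n performance intervals."""
            n = len(prior)
            idx = np.arange(n)
            # Assumed likelihood kernel: observing interval j is most likely when
            # the true consistent interval is j, falling off for distant intervals.
            likelihood = np.exp(-0.5 * ((idx - observed_interval) / spread) ** 2)
            posterior = prior * likelihood
            return posterior / posterior.sum()

        # Example: 5 NSE bins, uniform prior, observations from three sub-periods.
        probs = np.full(5, 0.2)
        for obs in [3, 4, 3]:
            probs = update_interval_probs(probs, obs)
        print(np.round(probs, 3))  # probability mass concentrates on intervals 3-4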

  10. Planetary Suit Hip Bearing Model for Predicting Design vs. Performance

    NASA Technical Reports Server (NTRS)

    Cowley, Matthew S.; Margerum, Sarah; Harvil, Lauren; Rajulu, Sudhakar

    2011-01-01

    Designing a planetary suit is very complex and often requires difficult trade-offs between performance, cost, mass, and system complexity. In order to verify that new suit designs meet requirements, full prototypes must eventually be built and tested with human subjects. Using computer models early in the design phase of new hardware development can be advantageous, allowing virtual prototyping to take place. Having easily modifiable models of the suit hard sections may reduce the time it takes to make changes to the hardware designs and then to understand their impact on suit and human performance. A virtual design environment gives designers the ability to think outside the box and exhaust design possibilities before building and testing physical prototypes with human subjects. Reductions in prototyping and testing may eventually reduce development costs. This study is an attempt to develop computer models of the hard components of the suit with known physical characteristics, supplemented with human subject performance data. Objectives: The primary objective was to develop an articulating solid model of the Mark III hip bearings to be used for evaluating suit design performance of the hip joint. Methods: Solid models of a planetary prototype (Mark III) suit's hip bearings and brief section were reverse-engineered from the prototype. The performance of the models was then compared by evaluating the mobility performance differences between the nominal hardware configuration and hardware modifications. This was accomplished by gathering data from specific suited tasks. Subjects performed maximum flexion and abduction tasks while in a nominal suit bearing configuration and in three off-nominal configurations. Performance data for the hip were recorded using state-of-the-art motion capture technology. Results: The results demonstrate that using solid models of planetary suit hard segments as a performance design tool is feasible. From a general trend perspective

  11. An analytic performance model of disk arrays and its application

    NASA Technical Reports Server (NTRS)

    Lee, Edward K.; Katz, Randy H.

    1991-01-01

    As disk arrays become widely used, tools for understanding and analyzing their performance become increasingly important. In particular, performance models can be invaluable in both configuring and designing disk arrays. Accurate analytic performance models are preferable to other types of models because they can be quickly evaluated, are applicable under a wide range of system and workload parameters, and can be manipulated by a range of mathematical techniques. Unfortunately, analytical performance models of disk arrays are difficult to formulate due to the presence of queuing and fork-join synchronization: a disk array request is broken up into independent disk requests, which must all complete to satisfy the original request. We develop, validate, and apply an analytic performance model for disk arrays. We derive simple equations for approximating their utilization, response time, and throughput. We then validate the analytic model via simulation and investigate the accuracy of each approximation used in deriving the analytical model. Finally, we apply the analytical model to derive an equation for the optimal unit of data striping in disk arrays.
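
    To make the fork-join difficulty concrete, here is a rough illustrative approximation (not the paper's derived equations): each of the N disks is modeled as an independent M/M/1 queue, and the fork-join response time of a full-stripe request is approximated as the expected maximum of N i.i.d. exponential per-disk response times, which is the harmonic number H_N times the per-disk mean.

        import math

        def disk_array_metrics(arrival_rate, service_time, n_disks):
            """Return (per-disk utilization, approx. array response time, throughput).

            arrival_rate : full-stripe requests per second (each fans out to all disks)
            service_time : mean per-disk service time in seconds
            """
            rho = arrival_rate * service_time          # per-disk utilization
            if rho >= 1.0:
                raise ValueError("unstable: per-disk utilization >= 1")
            r_disk = service_time / (1.0 - rho)        # M/M/1 mean response time
            h_n = sum(1.0 / k for k in range(1, n_disks + 1))  # harmonic number H_N
            r_array = h_n * r_disk                     # fork-join (max of N) approximation
            return rho, r_array, arrival_rate

        print(disk_array_metrics(arrival_rate=50.0, service_time=0.01, n_disks=8))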

  12. Simulation study for model performance of multiresponse semiparametric regression

    NASA Astrophysics Data System (ADS)

    Wibowo, Wahyu; Haryatmi, Sri; Budiantara, I. Nyoman

    2015-12-01

    The objective of this paper is to evaluate the performance of the multiresponse semiparametric regression model with respect to both function type and sample size. In general, a multiresponse semiparametric regression model consists of parametric and nonparametric functions. This paper focuses on linear and quadratic functions for the parametric components and a spline function for the nonparametric component. Moreover, this model could also be seen as a spline semiparametric seemingly unrelated regression model. A simulation study is conducted by evaluating three combinations of parametric and nonparametric components, i.e. linear-trigonometric, quadratic-exponential, and multiple linear-polynomial functions, respectively. Two criteria are used for assessing model performance, i.e. R-square and Mean Square Error (MSE). The results show that both function type and sample size significantly influence model performance. In addition, the multiresponse semiparametric regression model yields the best performance for small sample sizes with the combination of multiple linear and polynomial functions as the parametric and nonparametric components, respectively. Moreover, for large sample sizes, model performance tends to be similar for any combination of parametric and nonparametric components.
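
    The two assessment criteria named above are standard and easy to compute; the following generic sketch (not the authors' code, with synthetic example data) shows how MSE and R-square would be evaluated for one fitted response.

        import numpy as np

        def mse_and_r2(y_obs, y_fit):
            """Mean Square Error and R-square for one response."""
            y_obs, y_fit = np.asarray(y_obs, float), np.asarray(y_fit, float)
            resid = y_obs - y_fit
            mse = np.mean(resid ** 2)
            ss_res = np.sum(resid ** 2)
            ss_tot = np.sum((y_obs - y_obs.mean()) ** 2)
            r2 = 1.0 - ss_res / ss_tot
            return mse, r2

        # Example with a synthetic response and a fitted curve
        x = np.linspace(0, 1, 50)
        y = 2.0 * x + 0.3 * np.sin(6 * x) + np.random.default_rng(0).normal(0, 0.05, 50)
        y_hat = 2.0 * x + 0.3 * np.sin(6 * x)
        print(mse_and_r2(y, y_hat))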

  13. Models used to assess the performance of photovoltaic systems.

    SciTech Connect

    Stein, Joshua S.; Klise, Geoffrey T.

    2009-12-01

    This report documents the various photovoltaic (PV) performance models and software developed and utilized by researchers at Sandia National Laboratories (SNL) in support of the Photovoltaics and Grid Integration Department. In addition to PV performance models, hybrid system and battery storage models are discussed. A hybrid system using other distributed sources and energy storage can help reduce the variability inherent in PV generation, and due to the complexity of combining multiple generation sources and system loads, these models are invaluable for system design and optimization. Energy storage plays an important role in reducing PV intermittency, and battery storage models are used to understand the best configurations and technologies to store PV-generated electricity. Other researchers' models used by SNL are discussed, including some widely known models that incorporate algorithms developed at SNL. Other models included in the discussion are not used by or adopted from SNL research but may provide some benefit to researchers working on PV array performance, hybrid system models, and energy storage. The paper is organized into three sections to describe the different software models as applied to photovoltaic performance, hybrid systems, and battery storage. For each model, there is a description which includes where to find the model, whether it is currently maintained, and any references that may be available. Modeling improvements underway at SNL include quantifying the uncertainty of individual system components, the overall uncertainty in modeled vs. measured results, and modeling large PV systems. SNL is also conducting research into the overall reliability of PV systems.

  14. Thermal Model Predictions of Advanced Stirling Radioisotope Generator Performance

    NASA Technical Reports Server (NTRS)

    Wang, Xiao-Yen J.; Fabanich, William Anthony; Schmitz, Paul C.

    2014-01-01

    This presentation describes the capabilities of a three-dimensional thermal power model of the Advanced Stirling Radioisotope Generator (ASRG). The performance of the ASRG is presented for different scenarios, such as a Venus flyby with or without the auxiliary cooling system.

  15. Solid rocket booster performance evaluation model. Volume 4: Program listing

    NASA Technical Reports Server (NTRS)

    1974-01-01

    All subprograms or routines associated with the solid rocket booster performance evaluation model are indexed in this computer listing. An alphanumeric list of each routine in the index is provided in a table of contents.

  16. Integrated Main Propulsion System Performance Reconstruction Process/Models

    NASA Technical Reports Server (NTRS)

    Lopez, Eduardo; Elliott, Katie; Snell, Steven; Evans, Michael

    2013-01-01

    The Integrated Main Propulsion System (MPS) Performance Reconstruction process provides the MPS post-flight data files needed for postflight reporting to the project integration management and key customers to verify flight performance. This process/model was used as the baseline for the currently ongoing Space Launch System (SLS) work. The process utilizes several methodologies, including multiple software programs, to model integrated propulsion system performance through space shuttle ascent. It is used to evaluate integrated propulsion systems, including propellant tanks, feed systems, rocket engine, and pressurization systems performance throughout ascent based on flight pressure and temperature data. The latest revision incorporates new methods based on main engine power balance model updates to model higher mixture ratio operation at lower engine power levels.

  17. 48 CFR 452.246-70 - Inspection and Acceptance.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 4 2011-10-01 2011-10-01 false Inspection and Acceptance... Inspection and Acceptance. As prescribed in 446.370, insert the following clause: Inspection and Acceptance... acceptance will be performed at: ___.* (End of clause) * Contracting Officer shall insert...

  18. Custom component generation in the night vision integrated performance model

    NASA Astrophysics Data System (ADS)

    Teaney, Brian P.; Haefner, David P.; Burks, Stephen D.

    2015-05-01

    The latest version of the U.S. Army imager performance model, the Night Vision Integrated Performance Model (NV-IPM), is now contained within a single, system engineering oriented design environment. This new model interface allows sensor systems to be represented using modular, reusable components. A new feature, added in version 1.3 of the NV-IPM, allows users to create custom components which can be incorporated into modeled systems. The ability to modify existing component definitions and create entirely new components in the model greatly enhances the extensibility of the model architecture. In this paper we will discuss the structure of the custom component and parameter generators and provide several examples where this feature can be used to easily create new and unique component definitions within the model.

  19. A performance model of the OSI communication architecture

    NASA Astrophysics Data System (ADS)

    Kritzinger, P. S.

    1986-06-01

    An analytical model aimed at predicting the performance of software implementations built according to the OSI basic reference model is proposed. The model uses the peer protocol standard of a layer as the reference description of an implementation of that layer. The model is basically a closed multiclass multichain queueing network with a processor-sharing center, modeling process contention at the processor, and a delay center, modeling times spent waiting for responses from the corresponding peer processes. Each individual transition of the protocol constitutes a different class and each layer of the architecture forms a closed chain. Performance statistics include queue lengths and response times at the processor as a function of processor speed and the number of open connections. It is shown how to reduce the model should the protocol state space become very large. Numerical results based upon the derived formulas are given.
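
    As a rough illustration of how such a closed queueing network yields response time and throughput as functions of the number of open connections, the sketch below runs exact Mean Value Analysis for a single-chain simplification with one processor-sharing centre (service demand D) and one delay centre (think time Z). The paper's model is multiclass and multichain; this collapsed single-chain form and the example parameter values are assumptions for illustration only.

        def mva_single_chain(n_max, D, Z):
            """Exact MVA for one PS queueing centre plus one delay centre."""
            q = 0.0                      # mean queue length at the processor
            results = []
            for n in range(1, n_max + 1):
                r = D * (1.0 + q)        # response time at PS centre (arrival theorem)
                x = n / (Z + r)          # system throughput
                q = x * r                # Little's law at the processor
                results.append((n, r, x))
            return results

        for n, r, x in mva_single_chain(5, D=0.02, Z=0.5):
            print(f"connections={n}  response={r:.4f}s  throughput={x:.2f}/s")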

  20. Attitudinal and Intentional Acceptance of Domestic Robots by Younger and Older Adults

    PubMed Central

    Ezer, Neta; Fisk, Arthur D.; Rogers, Wendy A.

    2014-01-01

    A study was conducted to examine the expectations that younger and older individuals have about domestic robots and how these expectations relate to robot acceptance. In a questionnaire participants were asked to imagine a robot in their home and to indicate how much items representing technology, social partner, and teammate acceptance matched their robot. There were additional questions about how useful and easy to use they thought their robot would be. The dependent variables were attitudinal and intentional acceptance. The analysis of the responses of 117 older adults (aged 65–86) and 60 younger adults (aged 18–25) indicated that individuals thought of robots foremost as performance-directed machines, less so as social devices, and least as unproductive entities. The robustness of the Technology Acceptance Model to robot acceptance was supported. Technology experience accounted for the variance in robot acceptance due to age. PMID:25584365

  1. Visual performance modeling in the human operator simulator

    NASA Technical Reports Server (NTRS)

    Strieb, M. I.

    1979-01-01

    A brief description of the history of the development of the human operator simulator (HOS) model is presented. Features of the HOS micromodels that affect the acquisition of visual performance data are discussed, along with preliminary details of a HOS pilot model designed to predict the visual performance and workload data obtained through oculometer studies of pilots in real and simulated approaches and landings.

  2. Do bioclimate variables improve performance of climate envelope models?

    USGS Publications Warehouse

    Watling, James I.; Romañach, Stephanie S.; Bucklin, David N.; Speroterra, Carolina; Brandt, Laura A.; Pearlstine, Leonard G.; Mazzotti, Frank J.

    2012-01-01

    Climate envelope models are widely used to forecast potential effects of climate change on species distributions. A key issue in climate envelope modeling is the selection of predictor variables that most directly influence species. To determine whether model performance and spatial predictions were related to the selection of predictor variables, we compared models using bioclimate variables with models constructed from monthly climate data for twelve terrestrial vertebrate species in the southeastern USA using two different algorithms (random forests or generalized linear models), and two model selection techniques (using uncorrelated predictors or a subset of user-defined biologically relevant predictor variables). There were no differences in performance between models created with bioclimate or monthly variables, but one metric of model performance was significantly greater using the random forest algorithm compared with generalized linear models. Spatial predictions between maps using bioclimate and monthly variables were very consistent using the random forest algorithm with uncorrelated predictors, whereas we observed greater variability in predictions using generalized linear models.

  3. Performance modeling of feature-based classification in SAR imagery

    NASA Astrophysics Data System (ADS)

    Boshra, Michael; Bhanu, Bir

    1998-09-01

    We present a novel method for modeling the performance of a vote-based approach for target classification in SAR imagery. In this approach, the geometric locations of the scattering centers are used to represent 2D model views of a 3D target for a specific sensor under a given viewing condition (azimuth, depression and squint angles). Performance of such an approach is modeled in the presence of data uncertainty, occlusion, and clutter. The proposed method captures the structural similarity between model views, which plays an important role in determining the classification performance. In particular, performance would improve if the model views are dissimilar and vice versa. The method consists of the following steps. In the first step, given a bound on data uncertainty, model similarity is determined by finding feature correspondence in the space of relative translations between each pair of model views. In the second step, statistical analysis is carried out in the vote, occlusion and clutter space, in order to determine the probability of misclassifying each model view. In the third step, the misclassification probability is averaged for all model views to estimate the probability-of-correct-identification (PCI) plot as a function of occlusion and clutter rates. Validity of the method is demonstrated by comparing predicted PCI plots with ones that are obtained experimentally. Results are presented using both XPATCH and MSTAR SAR data.

  4. The Social Responsibility Performance Outcomes Model: Building Socially Responsible Companies through Performance Improvement Outcomes.

    ERIC Educational Resources Information Center

    Hatcher, Tim

    2000-01-01

    Considers the role of performance improvement professionals and human resources development professionals in helping organizations realize the ethical and financial power of corporate social responsibility. Explains the social responsibility performance outcomes model, which incorporates the concepts of societal needs and outcomes. (LRW)

  5. Hydrologic and water quality models: Performance measures and evaluation criteria

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Performance measures and corresponding criteria constitute an important aspect of calibration and validation of any hydrological and water quality (H/WQ) model. As new and improved methods and information are developed, it is essential that performance measures and criteria be updated. Therefore, th...

  6. A Model of Statistics Performance Based on Achievement Goal Theory.

    ERIC Educational Resources Information Center

    Bandalos, Deborah L.; Finney, Sara J.; Geske, Jenenne A.

    2003-01-01

    Tests a model of statistics performance based on achievement goal theory. Both learning and performance goals affected achievement indirectly through study strategies, self-efficacy, and test anxiety. Implications of these findings for teaching and learning statistics are discussed. (Contains 47 references, 3 tables, 3 figures, and 1 appendix.)…

  7. A Model Linking the Learning Organization and Performance Job Satisfaction

    ERIC Educational Resources Information Center

    Dirani, Khalil M.

    2006-01-01

    The underlying theories of learning and performance are quite complex. This paper proposes a model that links the learning organization theory as a process with job satisfaction as a performance theory outcome. The literature reviewed considered three process levels of learning within the learning organization and three outcome levels of job…

  8. Rehearsal and Hamilton's "Ingredients Model" of Theatrical Performance

    ERIC Educational Resources Information Center

    Davies, David

    2009-01-01

    One among the many virtues of James Hamilton's book, "The Art of Theater," is that it challenges the hegemony of the classical paradigm in the performing arts by questioning its applicability to theatrical performances. He argues instead for an "ingredients model" of the relationship between a literary script and a theatrical work. According to…

  9. A Composite Model for Employees' Performance Appraisal and Improvement

    ERIC Educational Resources Information Center

    Manoharan, T. R.; Muralidharan, C.; Deshmukh, S. G.

    2012-01-01

    Purpose: The purpose of this paper is to develop an innovative method of performance appraisal that will be useful for designing a structured training programme. Design/methodology/approach: Employees' performance appraisals are conducted using new approaches, namely data envelopment analysis and an integrated fuzzy model. Interpretive structural…

  10. Designing Electronic Performance Support Systems: Models and Instructional Strategies Employed

    ERIC Educational Resources Information Center

    Nekvinda, Christopher D.

    2011-01-01

    The purpose of this qualitative study was to determine whether instructional designers and performance technologists utilize instructional design models when designing and developing electronic performance support systems (EPSS). The study also explored if these same designers were utilizing instructional strategies within their EPSS to support…

  11. From models to performance assessment: the conceptualization problem.

    PubMed

    Bredehoeft, John D

    2003-01-01

    Today, models are ubiquitous tools for ground water analyses. The intent of this paper is to explore philosophically the role of the conceptual model in analysis. Selection of the appropriate conceptual model is an a priori decision by the analyst. Calibration is an integral part of the modeling process. Unfortunately a wrong or incomplete conceptual model can often be adequately calibrated; good calibration of a model does not ensure a correct conceptual model. Petroleum engineers have another term for calibration; they refer to it as history matching. A caveat to the idea of history matching is that we can make a prediction with some confidence equal to the period of the history match. In other words, if we have matched a 10-year history, we can predict for 10 years with reasonable confidence; beyond 10 years the confidence in the prediction diminishes rapidly. The same rule of thumb applies to ground water model analyses. Nuclear waste disposal poses a difficult problem because the time horizon, 1000 years or longer, is well beyond the possibility of the history match (or period of calibration) in the traditional analysis. Nonetheless, numerical models appear to be the tool of choice for analyzing the safety of waste facilities. Models have a well-recognized inherent uncertainty. Performance assessment, the technique for assessing the safety of nuclear waste facilities, involves an ensemble of cascading models. Performance assessment with its ensemble of models multiplies the inherent uncertainty of the single model. The closer we can approach the idea of a long history with which to match the models, even models of nuclear waste facilities, the more confidence we will have in the analysis (and the models, including performance assessment). This thesis argues for prolonged periods of observation (perhaps as long as 300 to 1000 years) before a nuclear waste facility is finally closed. PMID:13678111

  12. Newbery Medal Acceptance.

    ERIC Educational Resources Information Center

    Freedman, Russell

    1988-01-01

    Presents the Newbery Medal acceptance speech of Russell Freedman, writer of children's nonfiction. Discusses the place of nonfiction in the world of children's literature, the evolution of children's biographies, and the author's work on "Lincoln." (ARH)

  13. Newbery Medal Acceptance.

    ERIC Educational Resources Information Center

    Cleary, Beverly

    1984-01-01

    Reprints the text of Ms. Cleary's Newbery medal acceptance speech in which she gives personal history concerning her development as a writer and her response to the letters she receives from children. (CRH)

  14. Caldecott Medal Acceptance.

    ERIC Educational Resources Information Center

    Provensen, Alice; Provensen, Martin

    1984-01-01

    Reprints the text of the Provensens' Caldecott medal acceptance speech in which they describe their early interest in libraries and literature, the collaborative aspect of their work, and their current interest in aviation. (CRH)

  15. WRF model performance analysis for a suite of simulation design

    NASA Astrophysics Data System (ADS)

    Mohan, Manju; Sati, Ankur Prabhat

    2016-03-01

    At present, scientists successfully use Numerical Weather Prediction (NWP) models to achieve reliable forecasts. Nested domains with varying grid ratios are preferred by the modelling community and have wide applications. The impact of the nesting grid ratio (NGR) on model performance needs systematic analysis and is explored in the present study. WRF is mostly used as a mesoscale model for simulating either extreme events or events of short duration, with statistical model evaluation performed over correspondingly short periods; the influence of the simulation period on model performance has therefore been examined for key meteorological parameters. Several earlier studies of episodes involve model runs of longer duration, often performed as a single continuous simulation. This study therefore examines the influence on model performance of one single simulation versus several shorter simulations covering the same duration, essentially splitting the run time. In the present study, surface wind (i.e., wind at 10 meters) and temperature and relative humidity at 2 meters obtained from model simulations are compared with observations. Sensitivity to nesting grid ratio, continuous versus split simulations, and realistic simulation period is assessed. It is found that there is no statistically significant difference in the simulated results on changing the nesting grid ratio, while the shorter split schemes (2-day and 4-day schemes, in comparison with 8-day and 16-day continuous runs) improve the results significantly. The impact of an increasing number of observations from different sites on model performance is also scrutinised. Furthermore, a conceptual framework is provided for the optimum simulation period needed to have confidence in statistical model evaluation.

  16. Performance Improvement: Applying a Human Performance Model to Organizational Processes in a Military Training Environment

    ERIC Educational Resources Information Center

    Aaberg, Wayne; Thompson, Carla J.; West, Haywood V.; Swiergosz, Matthew J.

    2009-01-01

    This article provides a description and the results of a study that utilized the human performance (HP) model and methods to explore and analyze a training organization. The systemic and systematic practices of the HP model are applicable to military training organizations as well as civilian organizations. Implications of the study for future…

  17. Performance measures and criteria for hydrologic and water quality models

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Performance measures and criteria are essential for model calibration and validation. This presentation will include a summary of one of the papers that will be included in the 2014 Hydrologic and Water Quality Model Calibration & Validation Guidelines Special Collection of the ASABE Transactions. T...

  18. Terahertz imaging system performance model for concealed-weapon identification

    NASA Astrophysics Data System (ADS)

    Murrill, Steven R.; Jacobs, Eddie L.; Moyer, Steven K.; Halford, Carl E.; Griffin, Steven T.; De Lucia, Frank C.; Petkie, Douglas T.; Franck, Charmaine C.

    2008-03-01

    The U.S. Army Night Vision and Electronic Sensors Directorate (NVESD) and the U.S. Army Research Laboratory have developed a terahertz (THz)-band imaging system performance model for detection and identification of concealed weaponry. The MATLAB-based model accounts for the effects of all critical sensor and display components and for the effects of atmospheric attenuation, concealment material attenuation, and active illumination. The model is based on recent U.S. Army NVESD sensor performance modeling technology that couples system design parameters to observer-sensor field performance by using the acquire methodology for weapon identification performance predictions. This THz model has been developed in support of the Defense Advanced Research Projects Agency's Terahertz Imaging Focal-Plane Technology (TIFT) program and is currently being used to guide the design and development of a 0.650 THz active-passive imaging system. This paper will describe the THz model in detail, provide and discuss initial modeling results for a prototype THz imaging system, and outline plans to calibrate and validate the model through human perception testing.

  19. Terahertz imaging system performance model for concealed-weapon identification.

    PubMed

    Murrill, Steven R; Jacobs, Eddie L; Moyer, Steven K; Halford, Carl E; Griffin, Steven T; De Lucia, Frank C; Petkie, Douglas T; Franck, Charmaine C

    2008-03-20

    The U.S. Army Night Vision and Electronic Sensors Directorate (NVESD) and the U.S. Army Research Laboratory have developed a terahertz (THz)-band imaging system performance model for detection and identification of concealed weaponry. The MATLAB-based model accounts for the effects of all critical sensor and display components and for the effects of atmospheric attenuation, concealment material attenuation, and active illumination. The model is based on recent U.S. Army NVESD sensor performance modeling technology that couples system design parameters to observer-sensor field performance by using the acquire methodology for weapon identification performance predictions. This THz model has been developed in support of the Defense Advanced Research Projects Agency's Terahertz Imaging Focal-Plane Technology (TIFT) program and is currently being used to guide the design and development of a 0.650 THz active-passive imaging system. This paper will describe the THz model in detail, provide and discuss initial modeling results for a prototype THz imaging system, and outline plans to calibrate and validate the model through human perception testing. PMID:18709076

  20. Terahertz imaging system performance model for concealed weapon identification

    NASA Astrophysics Data System (ADS)

    Murrill, Steven R.; Jacobs, Eddie L.; Moyer, Steven K.; Halford, Carl E.; Griffin, Steven T.; De Lucia, Frank C.; Petkie, Douglas T.; Franck, Charmaine C.

    2005-11-01

    The U.S. Army Night Vision and Electronic Sensors Directorate and the U.S. Army Research Laboratory have developed a terahertz-band imaging system performance model for detection and identification of concealed weaponry. The MATLAB-based model accounts for the effects of all critical sensor and display components, and for the effects of atmospheric attenuation, concealment material attenuation, and active illumination. The model is based on recent U.S. Army NVESD sensor performance models that couple system design parameters to observer-sensor field performance using the acquire methodology for weapon identification performance predictions. This THz model has been developed in support of the Defense Advanced Research Projects Agency's Terahertz Imaging Focal-Plane-Array Technology (TIFT) program and is presently being used to guide the design and development of a 0.650 THz active/passive imaging system. This paper will describe the THz model in detail, provide and discuss initial modeling results for a prototype THz imaging system, and outline plans to validate and calibrate the model through human perception testing.

  1. Faculty Performance Evaluation: The CIPP-SAPS Model.

    ERIC Educational Resources Information Center

    Mitcham, Maralynne

    1981-01-01

    The issues of faculty performance evaluation for allied health professionals are addressed. Daniel Stufflebeam's CIPP (context-input-process-product) model is introduced and its development into a CIPP-SAPS (self-administrative-peer-student) model is pursued. (Author/CT)

  2. Solid rocket booster performance evaluation model. Volume 2: Users manual

    NASA Technical Reports Server (NTRS)

    1974-01-01

    This users manual for the solid rocket booster performance evaluation model (SRB-II) contains descriptions of the model, the program options, the required program inputs, the program output format and the program error messages. SRB-II is written in FORTRAN and is operational on both the IBM 370/155 and the MSFC UNIVAC 1108 computers.

  3. Null Objects in Second Language Acquisition: Grammatical vs. Performance Models

    ERIC Educational Resources Information Center

    Zyzik, Eve C.

    2008-01-01

    Null direct objects provide a favourable testing ground for grammatical and performance models of argument omission. This article examines both types of models in order to determine which gives a more plausible account of the second language data. The data were collected from second language (L2) learners of Spanish by means of four oral…

  4. Identification of human operator performance models utilizing time series analysis

    NASA Technical Reports Server (NTRS)

    Holden, F. M.; Shinners, S. M.

    1973-01-01

    The results of an effort performed by Sperry Systems Management Division for AMRL in applying time series analysis as a tool for modeling the human operator are presented. This technique is utilized for determining the variation of the human transfer function under various levels of stress. The human operator's model is determined based on actual input and output data from a tracking experiment.

  5. Mathematical Models of Elementary Mathematics Learning and Performance. Final Report.

    ERIC Educational Resources Information Center

    Suppes, Patrick

    This project was concerned with the development of mathematical models of elementary mathematics learning and performance. Probabilistic finite automata and register machines with a finite number of registers were developed as models and extensively tested with data arising from the elementary-mathematics strand curriculum developed by the…

  6. An analytical model of the HINT performance metric

    SciTech Connect

    Snell, Q.O.; Gustafson, J.L.

    1996-10-01

    The HINT benchmark was developed to provide a broad-spectrum metric for computers and to measure performance over the full range of memory sizes and time scales. We have extended our understanding of why HINT performance curves look the way they do and can now predict the curves using an analytical model based on simple hardware specifications as input parameters. Conversely, by fitting the experimental curves with the analytical model, hardware specifications such as memory performance can be inferred to provide insight into the nature of a given computer system.

  7. Performance Models for the Spike Banded Linear System Solver

    DOE PAGESBeta

    Manguoglu, Murat; Saied, Faisal; Sameh, Ahmed; Grama, Ananth

    2011-01-01

    With the availability of large-scale parallel platforms comprised of tens-of-thousands of processors and beyond, there is significant impetus for the development of scalable parallel sparse linear system solvers and preconditioners. An integral part of this design process is the development of performance models capable of predicting performance and providing accurate cost models for the solvers and preconditioners. There has been some work in the past on characterizing performance of the iterative solvers themselves. In this paper, we investigate the problem of characterizing performance and scalability of banded preconditioners. Recent work has demonstrated the superior convergence properties and robustness of banded preconditioners, compared to the state-of-the-art ILU family of preconditioners as well as algebraic multigrid preconditioners. Furthermore, when used in conjunction with efficient banded solvers, banded preconditioners are capable of significantly faster time-to-solution. Our banded solver, the Truncated Spike algorithm, is specifically designed for parallel performance and tolerance to deep memory hierarchies. Its regular structure is also highly amenable to accurate performance characterization. Using these characteristics, we derive the following results in this paper: (i) we develop parallel formulations of the Truncated Spike solver, (ii) we develop a highly accurate pseudo-analytical parallel performance model for our solver, (iii) we show excellent prediction capabilities of our model, based on which we argue the high scalability of our solver. Our pseudo-analytical performance model is based on analytical performance characterization of each phase of our solver. These analytical models are then parameterized using actual runtime information on target platforms. An important consequence of our performance models is that they reveal underlying performance bottlenecks in both serial and parallel formulations. All of our results are validated
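
    A generic flavor of such a pseudo-analytical performance model is sketched below (this is not the authors' model): solver time is expressed as a compute term scaling as n/p plus a communication/synchronization term growing with log2(p), and the two coefficients are fitted by least squares from a few measured runtimes. The measurement values are hypothetical.

        import numpy as np

        def fit_perf_model(n, procs, times):
            """Least-squares fit of T(p) = a*(n/p) + b*log2(p)."""
            procs = np.asarray(procs, float)
            A = np.column_stack([n / procs, np.log2(procs)])
            coeffs, *_ = np.linalg.lstsq(A, np.asarray(times, float), rcond=None)
            return coeffs  # (a, b)

        def predict(n, p, a, b):
            return a * (n / p) + b * np.log2(p)

        # Hypothetical measurements for a banded system of size n = 1e6
        a, b = fit_perf_model(1e6, [2, 4, 8, 16], [5.1, 2.7, 1.6, 1.1])
        print(round(predict(1e6, 32, a, b), 2))  # extrapolated time on 32 processes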

  8. KU-Band rendezvous radar performance computer simulation model

    NASA Technical Reports Server (NTRS)

    Griffin, J. W.

    1980-01-01

    The preparation of a real time computer simulation model of the KU band rendezvous radar to be integrated into the shuttle mission simulator (SMS), the shuttle engineering simulator (SES), and the shuttle avionics integration laboratory (SAIL) simulator is described. To meet crew training requirements, a radar tracking performance model and a target modeling method were developed. The parent simulation/radar simulation interface requirements and the method selected to model target scattering properties, including an application of this method to the SPAS spacecraft, are described. The radar search and acquisition mode performance model and the radar track mode signal processor model are examined and analyzed. The angle, angle rate, range, and range rate tracking loops are also discussed.

  9. Modeling observed animal performance using the Weibull distribution.

    PubMed

    Hagey, Travis J; Puthoff, Jonathan B; Crandell, Kristen E; Autumn, Kellar; Harmon, Luke J

    2016-06-01

    To understand how organisms adapt, researchers must link performance and microhabitat. However, measuring performance, especially maximum performance, can sometimes be difficult. Here, we describe an improvement over previous techniques that only consider the largest observed values as maxima. Instead, we model expected performance observations via the Weibull distribution, a statistical approach that reduces the impact of rare observations. After calculating group-level weighted averages and variances by treating individuals separately to reduce pseudoreplication, our approach resulted in high statistical power despite small sample sizes. We fitted lizard adhesive performance and bite force data to the Weibull distribution and found that it closely estimated maximum performance in both cases, illustrating the generality of our approach. Using the Weibull distribution to estimate observed performance greatly improves upon previous techniques by facilitating power analyses and error estimations around robustly estimated maximum values. PMID:26994180
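
    The general idea of fitting a Weibull distribution to repeated performance trials and reading a robust maximum off the fitted distribution can be sketched as follows. The data are synthetic and the 99th-percentile choice is an assumption for illustration; this is not the study's dataset or exact procedure.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        # e.g. repeated adhesion or bite-force trials for one group (arbitrary units)
        observations = rng.weibull(a=3.0, size=40) * 10.0

        # Fit a two-parameter Weibull (location fixed at zero)
        shape, loc, scale = stats.weibull_min.fit(observations, floc=0.0)
        # Use a high quantile of the fitted distribution as the performance estimate,
        # rather than the single largest observed value
        est_max = stats.weibull_min.ppf(0.99, shape, loc=loc, scale=scale)

        print(f"shape={shape:.2f}  scale={scale:.2f}  estimated max (99th pct)={est_max:.2f}")
        print(f"largest single observation: {observations.max():.2f}")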

  10. Facial Performance Transfer via Deformable Models and Parametric Correspondence.

    PubMed

    Asthana, Akshay; de la Hunty, Miles; Dhall, Abhinav; Goecke, Roland

    2012-09-01

    The issue of transferring facial performance from one person's face to another's has been an area of interest for the movie industry and the computer graphics community for quite some time. In recent years, deformable face models, such as the Active Appearance Model (AAM), have made it possible to track and synthesize faces in real time. Not surprisingly, deformable face model-based approaches for facial performance transfer have gained tremendous interest in the computer vision and graphics community. In this paper, we focus on the problem of real-time facial performance transfer using the AAM framework. We propose a novel approach of learning the mapping between the parameters of two completely independent AAMs, using them to facilitate the facial performance transfer in a more realistic manner than previous approaches. The main advantage of modeling this parametric correspondence is that it allows a "meaningful" transfer of both the nonrigid shape and texture across faces irrespective of the speakers' gender, shape, and size of the faces, and illumination conditions. We explore linear and nonlinear methods for modeling the parametric correspondence between the AAMs and show that the sparse linear regression method performs the best. Moreover, we show the utility of the proposed framework for a cross-language facial performance transfer that is an area of interest for the movie dubbing industry. PMID:21931176

  11. Modeling the effects of contrast enhancement on target acquisition performance

    NASA Astrophysics Data System (ADS)

    Du Bosq, Todd W.; Fanning, Jonathan D.

    2008-04-01

    Contrast enhancement and dynamic range compression are currently being used to improve the performance of infrared imagers by increasing the contrast between the target and the scene content, by better utilizing the available gray levels either globally or locally. This paper assesses the range-performance effects of various contrast enhancement algorithms for target identification with well contrasted vehicles. Human perception experiments were performed to determine field performance using contrast enhancement on the U.S. Army RDECOM CERDEC NVESD standard military eight target set using an un-cooled LWIR camera. The experiments compare the identification performance of observers viewing linearly scaled images and various contrast enhancement processed images. Contrast enhancement is modeled in the US Army thermal target acquisition model (NVThermIP) by changing the scene contrast temperature. The model predicts improved performance based on any improved target contrast, regardless of feature saturation or enhancement. To account for the equivalent blur associated with each contrast enhancement algorithm, an additional effective MTF was calculated and added to the model. The measured results are compared with the predicted performance based on the target task difficulty metric used in NVThermIP.

  12. A stirling engine computer model for performance calculations

    NASA Technical Reports Server (NTRS)

    Tew, R.; Jefferies, K.; Miao, D.

    1978-01-01

    To support the development of the Stirling engine as a possible alternative to the automobile spark-ignition engine, the thermodynamic characteristics of the Stirling engine were analyzed and modeled on a computer. The modeling techniques used are presented. The performance of an existing rhombic-drive Stirling engine was simulated by use of this computer program, and some typical results are presented. Engine tests are planned in order to evaluate this model.

  13. Modeling and optimum time performance for concurrent processing

    NASA Astrophysics Data System (ADS)

    Mielke, Roland R.; Stoughton, John W.; Som, Sukhamoy

    1988-08-01

    The development of a new graph theoretic model for describing the relation between a decomposed algorithm and its execution in a data flow environment is presented. Called ATAMM, the model consists of a set of Petri net marked graphs useful for representing decision-free algorithms having large-grained, computationally complex primitive operations. Performance time measures which determine computing speed and throughput capacity are defined, and the ATAMM model is used to develop lower bounds for these times. A concurrent processing operating strategy for achieving optimum time performance is presented and illustrated by example.

  14. Modeling and optimum time performance for concurrent processing

    NASA Technical Reports Server (NTRS)

    Mielke, Roland R.; Stoughton, John W.; Som, Sukhamoy

    1988-01-01

    The development of a new graph theoretic model for describing the relation between a decomposed algorithm and its execution in a data flow environment is presented. Called ATAMM, the model consists of a set of Petri net marked graphs useful for representing decision-free algorithms having large-grained, computationally complex primitive operations. Performance time measures which determine computing speed and throughput capacity are defined, and the ATAMM model is used to develop lower bounds for these times. A concurrent processing operating strategy for achieving optimum time performance is presented and illustrated by example.

  15. Multi-Scale Multi-Dimensional Ion Battery Performance Model

    2007-05-07

    The Multi-Scale Multi-Dimensional (MSMD) Lithium Ion Battery Model allows for computer prediction and engineering optimization of thermal, electrical, and electrochemical performance of lithium ion cells with realistic geometries. The model introduces separate simulation domains for different scale physics, achieving much higher computational efficiency compared to the single domain approach. It solves a one dimensional electrochemistry model in a micro sub-grid system, and captures the impacts of macro-scale battery design factors on cell performance and material usage by solving cell-level electron and heat transports in a macro grid system.

  16. A Perspective on Computational Human Performance Models as Design Tools

    NASA Technical Reports Server (NTRS)

    Jones, Patricia M.

    2010-01-01

    The design of interactive systems, including levels of automation, displays, and controls, is usually based on design guidelines and iterative empirical prototyping. A complementary approach is to use computational human performance models to evaluate designs. An integrated strategy of model-based and empirical test and evaluation activities is particularly attractive as a methodology for verification and validation of human-rated systems for commercial space. This talk will review several computational human performance modeling approaches and their applicability to design of display and control requirements.

  17. Tablet Personal Computer Integration in Higher Education: Applying the Unified Theory of Acceptance and Use Technology Model to Understand Supporting Factors

    ERIC Educational Resources Information Center

    Moran, Mark; Hawkes, Mark; El Gayar, Omar

    2010-01-01

    Many educational institutions have implemented ubiquitous or required laptop, notebook, or tablet personal computing programs for their students. Yet, limited evidence exists to validate integration and acceptance of the technology among student populations. This research examines student acceptance of mobile computing devices using a modification…

  18. Summary of the key features of seven biomathematical models of human fatigue and performance

    NASA Technical Reports Server (NTRS)

    Mallis, Melissa M.; Mejdal, Sig; Nguyen, Tammy T.; Dinges, David F.

    2004-01-01

    BACKGROUND: Biomathematical models that quantify the effects of circadian and sleep/wake processes on the regulation of alertness and performance have been developed in an effort to predict the magnitude and timing of fatigue-related responses in a variety of contexts (e.g., transmeridian travel, sustained operations, shift work). This paper summarizes key features of seven biomathematical models reviewed as part of the Fatigue and Performance Modeling Workshop held in Seattle, WA, on June 13-14, 2002. The Workshop was jointly sponsored by the National Aeronautics and Space Administration, U.S. Department of Defense, U.S. Army Medical Research and Materiel Command, Office of Naval Research, Air Force Office of Scientific Research, and U.S. Department of Transportation. METHODS: An invitation was sent to developers of seven biomathematical models that were commonly cited in scientific literature and/or supported by government funding. On acceptance of the invitation to attend the Workshop, developers were asked to complete a survey of the goals, capabilities, inputs, and outputs of their biomathematical models of alertness and performance. Data from the completed surveys were summarized and juxtaposed to provide a framework for comparing features of the seven models. RESULTS: Survey responses revealed that models varied greatly relative to their reported goals and capabilities. While all modelers reported that circadian factors were key components of their capabilities, they differed markedly with regard to the roles of sleep and work times as input factors for prediction: four of the seven models had work time as their sole input variable(s), while the other three models relied on various aspects of sleep timing for model input. Models also differed relative to outputs: five sought to predict results from laboratory experiments, field, and operational data, while two models were developed without regard to predicting laboratory experimental results. All modelers

  19. Disaggregation of Rainy Hours: Compared Performance of Various Models.

    NASA Astrophysics Data System (ADS)

    Ben Haha, M.; Hingray, B.; Musy, A.

    In the urban environment, the response times of catchments are usually short. To design or to diagnose waterworks in that context, it is necessary to describe rainfall events with a good time resolution: a 10mn time step is often necessary. Such information is not always available. Rainfall disaggregation models have thus to be applied to produce that short time resolution information from rough rainfall data. The communication will present the performance obtained with several rainfall disaggregation models that allow for the disaggregation of rainy hours into six 10mn rainfall amounts. The ability of the models to reproduce some statistical characteristics of rainfall (mean, variance, overall distribution of 10mn-rainfall amounts; extreme values of maximal rainfall amounts over different durations) is evaluated thanks to different graphical and numerical criteria. The performance of simple models presented in some scientific papers or developed in the Hydram laboratory as well as the performance of more sophisticated ones is compared with the performance of the basic constant disaggregation model. The compared models are either deterministic or stochastic; for some of them the disaggregation is based on scaling properties of rainfall. The compared models are, in increasing complexity order: constant model, linear model (Ben Haha, 2001), Ormsbee Deterministic model (Ormsbee, 1989), Artificial Neuronal Network based model (Burian et al. 2000), Hydram Stochastic 1 and Hydram Stochastic 2 (Ben Haha, 2001), Multiplicative Cascade based model (Olsson and Berndtsson, 1998), Ormsbee Stochastic model (Ormsbee, 1989). The 625 rainy hours used for that evaluation (with an hourly rainfall amount greater than 5mm) were extracted from the 21-year chronological rainfall series (10mn time step) observed at the Pully meteorological station, Switzerland. The models were also evaluated when applied to different rainfall classes depending on the season first and on the
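
    For orientation, the two simplest model families compared above can be sketched as follows (illustrative only, not the laboratory's implementations): the constant model splits each rainy hour into six equal 10mn amounts, while a basic stochastic model draws random proportions summing to one and scales them by the hourly total.

        import numpy as np

        def constant_disaggregation(hourly_mm):
            """Split an hourly depth into six equal 10-minute depths."""
            return np.full(6, hourly_mm / 6.0)

        def stochastic_disaggregation(hourly_mm, rng):
            """Split an hourly depth using random proportions that sum to one."""
            weights = rng.random(6)
            return hourly_mm * weights / weights.sum()

        rng = np.random.default_rng(42)
        print(constant_disaggregation(7.2))                       # six identical 1.2 mm values
        print(np.round(stochastic_disaggregation(7.2, rng), 2))   # six values summing to 7.2 mm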

  20. Human performance modeling for system of systems analytics: combat performance-shaping factors.

    SciTech Connect

    Lawton, Craig R.; Miller, Dwight Peter

    2006-01-01

    The US military has identified Human Performance Modeling (HPM) as a significant requirement and challenge of future systems modeling and analysis initiatives. To support this goal, Sandia National Laboratories (SNL) has undertaken a program of HPM as an integral augmentation to its system-of-system (SoS) analytics capabilities. The previous effort, reported in SAND2005-6569, evaluated the effects of soldier cognitive fatigue on SoS performance. The current effort began with a very broad survey of any performance-shaping factors (PSFs) that also might affect soldiers' performance in combat situations. The work included consideration of three different approaches to cognition modeling and how appropriate they would be for application to SoS analytics. The bulk of this report categorizes 47 PSFs into three groups (internal, external, and task-related) and provides brief descriptions of how each affects combat performance, according to the literature. The PSFs were then assembled into a matrix with 22 representative military tasks and assigned one of four levels of estimated negative impact on task performance, based on the literature. Blank versions of the matrix were then sent to two ex-military subject-matter experts to be filled out based on their personal experiences. Data analysis was performed to identify the consensus most influential PSFs. Results indicate that combat-related injury, cognitive fatigue, inadequate training, physical fatigue, thirst, stress, poor perceptual processing, and presence of chemical agents are among the PSFs with the most negative impact on combat performance.

  1. Charge-coupled-device X-ray detector performance model

    NASA Technical Reports Server (NTRS)

    Bautz, M. W.; Berman, G. E.; Doty, J. P.; Ricker, G. R.

    1987-01-01

    A model that predicts the performance characteristics of CCD detectors being developed for use in X-ray imaging is presented. The model accounts for the interactions of both X-rays and charged particles with the CCD and simulates the transport and loss of charge in the detector. Predicted performance parameters include detective and net quantum efficiencies, split-event probability, and a parameter characterizing the effective thickness presented by the detector to cosmic-ray protons. The predicted performance of two CCDs of different epitaxial layer thicknesses is compared. The model predicts that in each device incomplete recovery of the charge liberated by a photon of energy between 0.1 and 10 keV is very likely to be accompanied by charge splitting between adjacent pixels. The implications of the model predictions for CCD data processing algorithms are briefly discussed.

  2. Performance evaluation of hydrological models: Statistical significance for reducing subjectivity in goodness-of-fit assessments

    NASA Astrophysics Data System (ADS)

    Ritter, Axel; Muñoz-Carpena, Rafael

    2013-02-01

    Success in the use of computer models for simulating environmental variables and processes requires objective model calibration and verification procedures. Several methods for quantifying the goodness-of-fit of observations against model-calculated values have been proposed, but none of them is free of limitations, and they are often ambiguous. When a single indicator is used it may lead to incorrect verification of the model. Instead, a combination of graphical results, absolute value error statistics (i.e. root mean square error), and normalized goodness-of-fit statistics (i.e. Nash-Sutcliffe Efficiency coefficient, NSE) is currently recommended. Interpretation of NSE values is often subjective, and may be biased by the magnitude and number of data points, data outliers and repeated data. The statistical significance of the performance statistics is an aspect generally ignored that helps in reducing subjectivity in the proper interpretation of the model performance. In this work, approximated probability distributions for two common indicators (NSE and root mean square error) are derived with bootstrapping (block bootstrapping when dealing with time series), followed by bias corrected and accelerated calculation of confidence intervals. Hypothesis testing of the indicators exceeding threshold values is proposed in a unified framework for statistically accepting or rejecting the model performance. It is illustrated how model performance is not linearly related with NSE, which is critical for its proper interpretation. Additionally, the sensitivity of the indicators to model bias, outliers and repeated data is evaluated. The potential of the difference between root mean square error and mean absolute error for detecting outliers is explored, showing that this may be considered a necessary but not a sufficient condition of outlier presence. The usefulness of the approach for the evaluation of model performance is illustrated with case studies including those with
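
    A compact sketch of the core idea follows (simplified: the block length, number of resamples, and a plain percentile interval rather than the bias-corrected and accelerated interval are assumptions, and the series are synthetic): compute NSE, attach a block-bootstrap confidence interval, and accept the model only if the whole interval clears a chosen threshold.

        import numpy as np

        def nse(obs, sim):
            """Nash-Sutcliffe Efficiency."""
            obs, sim = np.asarray(obs, float), np.asarray(sim, float)
            return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

        def block_bootstrap_nse(obs, sim, block=30, n_boot=2000, seed=0):
            """95% percentile interval for NSE using a moving-block bootstrap."""
            rng = np.random.default_rng(seed)
            n = len(obs)
            stats = []
            for _ in range(n_boot):
                starts = rng.integers(0, n - block + 1, size=int(np.ceil(n / block)))
                idx = np.concatenate([np.arange(s, s + block) for s in starts])[:n]
                stats.append(nse(obs[idx], sim[idx]))
            return np.percentile(stats, [2.5, 97.5])

        # Example with a synthetic daily series and a model that misses the noise
        t = np.arange(365)
        obs = 5 + 3 * np.sin(2 * np.pi * t / 365) + np.random.default_rng(1).normal(0, 0.5, 365)
        sim = 5 + 3 * np.sin(2 * np.pi * t / 365)
        lo, hi = block_bootstrap_nse(obs, sim)
        print(f"NSE = {nse(obs, sim):.3f}, 95% CI = [{lo:.3f}, {hi:.3f}]")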

  3. Improving Acceptance of Automated Counseling Procedures.

    ERIC Educational Resources Information Center

    Johnson, James H.; And Others

    This paper discusses factors that may influence the acceptance of automated counseling procedures by the military. A consensual model of the change process is presented which structures organizational readiness, the change strategy, and acceptance as integrated variables to be considered in a successful installation. A basic introduction to the…

  4. Modeling gas-liquid head performance of electrical submersible pumps

    NASA Astrophysics Data System (ADS)

    Sun, Datong

    The objectives of this study are to develop a simple and accurate theoretical model and to implement the model into a computational tool to predict Electrical Submersible Pump (ESP) head performance under two-phase flow conditions. A new two-phase model including a set of one-dimensional mass and momentum balance equations was developed. The derived gas-liquid momentum equations along the pump channels improve upon Sachdeva's (1992, 1994) model in the petroleum industry and generalize Minemura's (1998) model in the nuclear industry. The resulting pressure ODE for frictionless incompressible single-phase flow is consistent with the pump Euler equation. In the two-phase momentum equations, new models for wall frictional losses for each phase have been proposed, using a gas-liquid stratified-flow assumption and existing correlations for the impeller rotating effect, channel curvature effect, and channel cross-section effect. New equations for the radius of curvature along ESP channels, used in the curvature effect calculation, have been derived. A new shock loss model incorporating rotational speeds has been developed. A new correlation for drag coefficient and interfacial characteristic length effects has been obtained by fitting the model results to experimental data. An algorithm to solve the model equations has been developed and implemented. The model predicts pressure and void fraction distributions along impellers and diffusers and can also be used to predict the pump head performance curve under different fluid properties, pump intake conditions, and rotational speeds. The new two-phase model is validated with air-water experimental data. Results show the model provides a very good prediction of pump head performance under different gas flow rates, liquid flow rates, and intake pressures. The new model is capable of predicting surging and gas lock conditions.
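
    For reference, the pump Euler equation mentioned above (quoted here in its standard turbomachinery form, not taken from the thesis itself) relates the ideal head to the blade speeds u and the tangential fluid velocities c_u at the impeller inlet (1) and outlet (2):

        H_{Euler} = \frac{u_2\,c_{u2} - u_1\,c_{u1}}{g}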

  5. The performance of different models of primary care provision in Southern Africa.

    PubMed

    Mills, Anne; Palmer, Natasha; Gilson, Lucy; McIntyre, Di; Schneider, Helen; Sinanovic, Edina; Wadee, Haroon

    2004-09-01

    Despite the emphasis placed during the last two decades on public delivery of comprehensive and equitable primary care (PC) to developing country populations, coverage remains far from universal and the quality often poor. Users frequently patronise private providers, ranging from informal drug sellers to trained professionals. Interest is increasing internationally in the potential for making better use of private providers, including contractual approaches. The research aim was to examine the performance of different models of PC provision, in order to identify their strengths and weaknesses from the perspective of a government wishing to develop an overall strategy for improving PC provision. Models evaluated were: (a) South African general practitioners (district surgeons) providing services under public contracts; (b) clinics provided in Lesotho under a sub-contract between a construction company and a South African health care company; (c) GP services provided through an Independent Practitioner Association to low income insured workers and families; (d) a private clinic chain serving low income insured and uninsured workers and their families; and (e) for comparative purposes, South African public clinics. Performance was analysed in terms of provider cost and quality (of infrastructure, treatment practices, acceptability to patients and communities), allowing for differences in services and case-mix. The diversity of the arrangements made direct comparisons difficult; however, clear differences were identified between the models and conclusions were drawn on their relative performance and the influences upon performance. The study findings demonstrate that contextual features strongly influence provider performance, and that a crude public/private comparison is not helpful. Key issues in contract design likely to influence performance are highlighted. Finally, the study argues that there is a need before contracting out service provision to consider how the

  6. Predictive performance of a model of anaesthetic uptake with desflurane.

    PubMed

    Kennedy, R

    2006-04-01

    We have previously shown that a model of anaesthetic uptake and distribution, developed for use as a teaching tool, is able to predict end-tidal isoflurane and sevoflurane concentrations at least as well as commonly used propofol models predict blood levels of propofol. Models with good predictive performance may be useful as part of real-time prediction systems. The aim of this study was to assess the performance of this model with desflurane. Twenty adult patients undergoing routine anaesthesia were studied. The total fresh gas flow and vaporizer settings were collected at 10-second intervals from the anaesthetic machine. These data were used as inputs to the model, which had been initialized for patient weight and desflurane. Output of the model is a predicted end-tidal value at each point in time. These values were compared with measured end-tidal desflurane using the standard statistical technique of Varvel and colleagues. Data were analysed from 19 patients. Median performance error was 78% (95% CI 8-147), median absolute performance error 77% (6-149), divergence 10.6%/h (-80-101) and wobble 8.9% (-6-24). The predictive performance of this model with desflurane was poor, with considerable variability between patients. The reasons for the difference between desflurane and our previous results with isoflurane and sevoflurane are not obvious, but may provide important clues to the necessary components for such models. The data collected in this study may assist in the development and evaluation of improved models. PMID:16617640
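
    A minimal sketch of the Varvel-style statistics quoted above, using hypothetical arrays of measured and model-predicted end-tidal concentrations: PE is the percentage performance error, MDPE and MDAPE its median and median absolute value, divergence the slope of |PE| against time, and wobble the median deviation of PE from MDPE.

        # Sketch of Varvel-style predictive performance statistics for one patient.
        # `measured` and `predicted` are hypothetical end-tidal concentration arrays
        # sampled at times `t_hours`.
        import numpy as np

        def varvel_metrics(measured, predicted, t_hours):
            pe = 100.0 * (measured - predicted) / predicted       # performance error, %
            mdpe = np.median(pe)                                  # bias
            mdape = np.median(np.abs(pe))                         # inaccuracy
            divergence = np.polyfit(t_hours, np.abs(pe), 1)[0]    # slope of |PE| vs time, %/h
            wobble = np.median(np.abs(pe - mdpe))                 # intra-patient variability
            return mdpe, mdape, divergence, wobble

        t = np.linspace(0, 2, 120)                                # 2 h of samples
        predicted = 5.0 + 0.5 * np.sin(t)
        measured = predicted * (1.0 + 0.1 * np.random.default_rng(0).standard_normal(t.size))
        print(varvel_metrics(measured, predicted, t))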

  7. Assessing the model performance of an integrated hydrological and biogeochemical model for discharge and nitrate load predictions

    NASA Astrophysics Data System (ADS)

    Pohlert, T.; Breuer, L.; Huisman, J. A.; Frede, H.-G.

    2007-03-01

    In this study, we evaluate the performance of the SWAT-N model, a modified version of the widely used SWAT model, for discharge and nitrate predictions at the mesoscale Dill catchment (Germany) for a 5-year period. The underlying question is whether the model efficiency is sufficient for scenario analysis of land-use changes on both water quantity and quality. The Shuffled Complex Evolution (SCE-UA) algorithm is used to calibrate the model for daily discharge at the catchment's outlet. Model performance is assessed with a split-sampling as well as a proxy-basin test using recorded hydrographs of four additional gauges located within the catchment. The efficiency regarding nitrate load simulation is assessed without further calibration on a daily, log-daily, weekly, and monthly basis as compared to observations derived from an intensive sampling campaign conducted at the catchment's outlet. A new approach is employed to test the spatial consistency of the model, where simulated longitudinal profiles of nitrate concentrations were compared with observed longitudinal profiles. It is concluded that the model efficiency of SWAT-N is sufficient for the assessment of scenarios for daily discharge predictions. SWAT-N can be employed without further calibration for nitrate load simulations on both a weekly and monthly basis with an acceptable degree of accuracy. However, the model efficiency for daily nitrate load is insufficient, which can be attributed to both data uncertainty (i.e. point-source effluents and actual farming practice) as well as structural errors. The simulated longitudinal profiles meet the observations reasonably well, which suggests that the model is spatially consistent.

  8. Assessing the model performance of an integrated hydrological and biogeochemical model for discharge and nitrate load predictions

    NASA Astrophysics Data System (ADS)

    Pohlert, T.; Breuer, L.; Huisman, J. A.; Frede, H.-G.

    2006-09-01

    In this study, we evaluate the performance of the SWAT-N model, a modified version of the widely used SWAT model, for discharge and nitrate predictions at the mesoscale Dill catchment for a 5-year period. The underlying question is whether the model efficiency is sufficient for scenario analysis of land-use changes on both water quantity and quality. The Shuffled Complex Evolution (SCE-UA) algorithm is used to calibrate the model for daily discharge at the catchment's outlet. Model performance is assessed with a split-sampling as well as a proxy-basin test using recorded hydrographs of four additional gauges located within the catchment. The efficiency regarding nitrate load simulation is assessed without further calibration on a daily, log-daily, weekly, and monthly basis as compared to observations derived from an intensive sampling campaign conducted at the catchment's outlet. A new approach is employed to test the spatial consistency of the model, where simulated longitudinal profiles of nitrate concentrations were compared with observed longitudinal profiles. It is concluded that the model efficiency of SWAT-N is sufficient for the assessment of scenarios for daily discharge predictions. SWAT-N can be employed without further calibration for nitrate load simulations on both a weekly and monthly basis with an acceptable degree of accuracy. However, the model efficiency for daily nitrate load is insufficient, which can be attributed to both data uncertainty (i.e. point-source effluents and actual farming practice) as well as structural errors. The simulated longitudinal profiles meet the observations reasonably well, which suggests that the model is spatially consistent.

  9. Predicting waste stabilization pond performance using an ecological simulation model

    SciTech Connect

    New, G.R.

    1987-01-01

    Waste stabilization ponds (lagoons) are often favored in small communities because of their low cost and ease of operation. Most models currently used to predict performance are empirical or fail to address the primary lagoon cell. Empirical methods for predicting lagoon performance have been found to be off by as much as 248 percent when used on a system other than the one they were developed for. Also, the present models developed for the primary cell lack the ability to predict parameters other than biochemical oxygen demand (BOD) and nitrogen. Oxygen consumption is usually estimated from BOD utilization. LAGOON is a Fortran program which models the biogeochemical processes characteristic of the primary cell of facultative lagoons. Model parameters can be measured from lagoons in the vicinity of a proposed lagoon or estimated from laboratory studies. The model was calibrated using a subset of the Corinne, Utah, lagoon data and then validated using a second subset of the same data.

  10. Performability modeling based on real data: A case study

    NASA Technical Reports Server (NTRS)

    Hsueh, M. C.; Iyer, R. K.; Trivedi, K. S.

    1987-01-01

    Described is a measurement-based performability model based on error and resource usage data collected on a multiprocessor system. A method for identifying the model structure is introduced and the resulting model is validated against real data. Model development from the collection of raw data to the estimation of the expected reward is described. Both normal and error behavior of the system are characterized. The measured data show that the holding times in key operational and error states are not simple exponentials and that a semi-Markov process is necessary to model the system behavior. A reward function, based on the service rate and the error rate in each state, is then defined in order to estimate the performability of the system and to depict the cost of different types of errors.
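
    A minimal sketch of the kind of reward calculation the abstract describes, using a hypothetical three-state embedded chain: stationary probabilities of the semi-Markov process are obtained from the embedded transition matrix and the mean holding times, then combined with per-state reward rates to give an expected reward rate.

        # Sketch: expected reward rate of a semi-Markov reward model.
        # P is a hypothetical embedded-chain transition matrix, h the mean holding
        # time (hours) in each state, r the reward rate earned while in each state.
        import numpy as np

        P = np.array([[0.0, 0.9, 0.1],     # state 0: normal operation
                      [0.8, 0.0, 0.2],     # state 1: degraded
                      [1.0, 0.0, 0.0]])    # state 2: error/recovery
        h = np.array([10.0, 2.0, 0.5])     # mean holding times
        r = np.array([1.0, 0.6, 0.0])      # reward rates

        # Stationary distribution pi of the embedded chain: pi P = pi, sum(pi) = 1
        w, v = np.linalg.eig(P.T)
        pi = np.real(v[:, np.argmin(np.abs(w - 1.0))])
        pi = pi / pi.sum()

        # Time-weighted state probabilities and the expected reward rate
        p_time = pi * h / np.dot(pi, h)
        print("expected reward rate:", np.dot(p_time, r))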

  11. Performance Models for Split-execution Computing Systems

    SciTech Connect

    Humble, Travis S; McCaskey, Alex; Schrock, Jonathan; Seddiqi, Hadayat; Britt, Keith A; Imam, Neena

    2016-01-01

    Split-execution computing leverages the capabilities of multiple computational models to solve problems, but splitting program execution across different computational models incurs costs associated with the translation between domains. We analyze the performance of a split-execution computing system developed from conventional and quantum processing units (QPUs) by using behavioral models that track resource usage. We focus on asymmetric processing models built using conventional CPUs and a family of special-purpose QPUs that employ quantum computing principles. Our performance models account for the translation of a classical optimization problem into the physical representation required by the quantum processor while also accounting for hardware limitations and conventional processor speed and memory. We conclude that the bottleneck in this split-execution computing system lies at the quantum-classical interface and that the primary time cost is independent of quantum processor behavior.

  12. An in-depth review of photovoltaic system performance models

    NASA Technical Reports Server (NTRS)

    Smith, J. H.; Reiter, L. R.

    1984-01-01

    The features, strong points and shortcomings of 10 numerical models commonly applied to assessing photovoltaic performance are discussed. The models range in capabilities from first-order approximations to full circuit level descriptions. Account is taken, at times, of the cell and module characteristics, the orientation and geometry, array-level factors, the power-conditioning equipment, the overall plant performance, O&M effects, and site-specific factors. Areas of improvement and/or necessary extensions are identified for several of the models. Although the simplicity of a model was found not necessarily to affect the accuracy of the data generated, the use of any one model was dependent on the application.

  13. COMPASS: A Framework for Automated Performance Modeling and Prediction

    SciTech Connect

    Lee, Seyong; Meredith, Jeremy S; Vetter, Jeffrey S

    2015-01-01

    Flexible, accurate performance predictions offer numerous benefits such as gaining insight into and optimizing applications and architectures. However, the development and evaluation of such performance predictions has been a major research challenge, due to the architectural complexities. To address this challenge, we have designed and implemented a prototype system, named COMPASS, for automated performance model generation and prediction. COMPASS generates a structured performance model from the target application's source code using automated static analysis, and then, it evaluates this model using various performance prediction techniques. As we demonstrate on several applications, the results of these predictions can be used for a variety of purposes, such as design space exploration, identifying performance tradeoffs for applications, and understanding sensitivities of important parameters. COMPASS can generate these predictions across several types of applications from traditional, sequential CPU applications to GPU-based, heterogeneous, parallel applications. Our empirical evaluation demonstrates a maximum overhead of 4%, flexibility to generate models for 9 applications, speed, ease of creation, and very low relative errors across a diverse set of architectures.

  14. Human performance modeling for system of systems analytics.

    SciTech Connect

    Dixon, Kevin R.; Lawton, Craig R.; Basilico, Justin Derrick; Longsine, Dennis E.; Forsythe, James Chris; Gauthier, John Henry; Le, Hai D.

    2008-10-01

    A Laboratory-Directed Research and Development project was initiated in 2005 to investigate Human Performance Modeling in a System of Systems analytic environment. SAND2006-6569 and SAND2006-7911 document interim results from this effort; this report documents the final results. The problem is difficult because of the number of humans involved in a System of Systems environment and the generally poorly defined nature of the tasks that each human must perform. A two-pronged strategy was followed: one prong was to develop human models using a probability-based method similar to that first developed for relatively well-understood probability-based performance modeling; another prong was to investigate more state-of-the-art human cognition models. The probability-based modeling resulted in a comprehensive addition of human-modeling capability to the existing SoSAT computer program. The cognitive modeling resulted in an increased understanding of what is necessary to incorporate cognition-based models into a System of Systems analytic environment.

  15. Modeling of video compression effects on target acquisition performance

    NASA Astrophysics Data System (ADS)

    Cha, Jae H.; Preece, Bradley; Espinola, Richard L.

    2009-05-01

    The effect of video compression on image quality was investigated from the perspective of target acquisition performance modeling. Human perception tests were conducted recently at the U.S. Army RDECOM CERDEC NVESD, measuring identification (ID) performance on simulated military vehicle targets at various ranges. These videos were compressed with different quality and/or quantization levels utilizing motion JPEG, motion JPEG2000, and MPEG-4 encoding. To model the degradation in task performance, the loss in image quality is fit to an equivalent Gaussian MTF scaled by the Structural Similarity Image Metric (SSIM). Residual compression artifacts are treated as 3-D spatio-temporal noise. This 3-D noise is found by taking the difference of the uncompressed frame, with the estimated equivalent blur applied, and the corresponding compressed frame. Results show good agreement between the experimental data and the model prediction. This method has led to a predictive performance model for video compression by correlating various compression levels to particular blur and noise input parameters for the NVESD target acquisition performance model suite.
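
    A minimal sketch of the residual-noise step described above, assuming hypothetical uncompressed and compressed frame stacks and an equivalent Gaussian blur already estimated from the SSIM fit: the blurred uncompressed frames are subtracted from the compressed frames to isolate 3-D spatio-temporal noise.

        # Sketch: isolate residual compression artifacts as 3-D spatio-temporal noise.
        # `uncompressed` and `compressed` are hypothetical (frames, rows, cols) stacks;
        # `sigma_blur` stands in for the SSIM-derived equivalent blur.
        import numpy as np
        from scipy.ndimage import gaussian_filter

        def residual_noise(uncompressed, compressed, sigma_blur):
            blurred = np.empty_like(uncompressed, dtype=float)
            for i, frame in enumerate(uncompressed):
                blurred[i] = gaussian_filter(frame.astype(float), sigma=sigma_blur)
            noise = compressed.astype(float) - blurred
            return noise, noise.std()          # 3-D noise cube and its RMS level

        rng = np.random.default_rng(0)
        clean = rng.uniform(0, 255, size=(8, 64, 64))
        coded = gaussian_filter(clean, sigma=(0, 1.2, 1.2)) + rng.normal(0, 2, clean.shape)
        print(residual_noise(clean, coded, sigma_blur=1.2)[1])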

  16. Modeling human performance with low light sparse color imagers

    NASA Astrophysics Data System (ADS)

    Haefner, David P.; Reynolds, Joseph P.; Cha, Jae; Hodgkin, Van

    2011-05-01

    Reflective band sensors are often signal-to-noise limited in low light conditions. Any additional filtering to obtain spectral information further reduces the signal-to-noise ratio, greatly affecting range performance. Modern sensors, such as the sparse color filter CCD, circumvent this additional degradation by reducing the number of pixels affected by filters and distributing the color information. As color sensors become more prevalent in the warfighter arsenal, the performance of the sensor-soldier system must be quantified. While field performance testing ultimately validates the success of a sensor, accurately modeling sensor performance greatly reduces the development time and cost, allowing the best technology to reach the soldier the fastest. Modeling of sensors requires accounting for how the signal is affected through the modulation transfer function (MTF) and noise of the system. For the modeling of these new sensors, the MTF and noise for each color band must be characterized, and the appropriate sampling and blur must be applied. We show how sparse array color filter sensors may be modeled and how a soldier's performance with such a sensor may be predicted. This general approach to modeling color sensors can be extended to incorporate all types of low light color sensors.

  17. Advanced flight design systems subsystem performance models. Sample model: Environmental analysis routine library

    NASA Technical Reports Server (NTRS)

    Parker, K. C.; Torian, J. G.

    1980-01-01

    A sample environmental control and life support model performance analysis using the environmental analysis routines library is presented. An example of a complete model set up and execution is provided. The particular model was synthesized to utilize all of the component performance routines and most of the program options.

  18. Human performance modeling for system of systems analytics: soldier fatigue.

    SciTech Connect

    Lawton, Craig R.; Campbell, James E.; Miller, Dwight Peter

    2005-10-01

    The military has identified Human Performance Modeling (HPM) as a significant requirement and challenge of future systems modeling and analysis initiatives, as can be seen in the Department of Defense's (DoD) Defense Modeling and Simulation Office's (DMSO) Master Plan (DoD 5000.59-P 1995). To this end, the military is currently spending millions of dollars on programs devoted to HPM in various military contexts. Examples include the Human Performance Modeling Integration (HPMI) program within the Air Force Research Laboratory, which focuses on integrating HPMs with constructive models of systems (e.g. cockpit simulations), and the Navy's Human Performance Center (HPC) established in September 2003. Nearly all of these initiatives focus on the interface between humans and a single system. This is insufficient in the era of highly complex network-centric SoS. This report presents research and development in the area of HPM in a system-of-systems (SoS). Specifically, this report addresses modeling soldier fatigue and the potential impacts soldier fatigue can have on SoS performance.

  19. An ambient agent model for analyzing managers' performance during stress

    NASA Astrophysics Data System (ADS)

    ChePa, Noraziah; Aziz, Azizi Ab; Gratim, Haned

    2016-08-01

    Stress at work has been reported everywhere. Work-related performance during stress is a pattern of reactions that occurs when managers are presented with work demands that are not matched to their knowledge, skills, or abilities, and which challenge their ability to cope. Although there are many prior findings that explain the development of manager performance during stress, less attention has been given to explaining the same concept through computational models. In this way, the descriptive nature of psychological theories about managers' performance during stress can be transformed into a causal-mechanistic account that explains the relationship between a series of observed phenomena. This paper proposes an ambient agent model for analyzing managers' performance during stress. A set of properties and variables is identified from past literature to construct the model. Differential equations have been used in formalizing the model. A set of equations reflecting the relations involved in the proposed model is presented. The proposed model is essential and can be encapsulated within an intelligent agent or robot that can be used to support managers during stress.

  20. Proficient brain for optimal performance: the MAP model perspective

    PubMed Central

    di Fronso, Selenia; Filho, Edson; Conforto, Silvia; Schmid, Maurizio; Bortoli, Laura; Comani, Silvia; Robazza, Claudio

    2016-01-01

    Background. The main goal of the present study was to explore theta and alpha event-related desynchronization/synchronization (ERD/ERS) activity during shooting performance. We adopted the idiosyncratic framework of the multi-action plan (MAP) model to investigate different processing modes underpinning four types of performance. In particular, we were interested in examining the neural activity associated with optimal-automated (Type 1) and optimal-controlled (Type 2) performances. Methods. Ten elite shooters (6 male and 4 female) with extensive international experience participated in the study. ERD/ERS analysis was used to investigate cortical dynamics during performance. A 4 × 3 (performance types × time) repeated measures analysis of variance was performed to test the differences among the four types of performance during the three seconds preceding the shots for theta, low alpha, and high alpha frequency bands. The dependent variables were the ERD/ERS percentages in each frequency band (i.e., theta, low alpha, high alpha) for each electrode site across the scalp. This analysis was conducted on 120 shots for each participant in three different frequency bands and the individual data were then averaged. Results. We found ERS to be mainly associated with optimal-automatic performance, in agreement with the “neural efficiency hypothesis.” We also observed more ERD as related to optimal-controlled performance in conditions of “neural adaptability” and proficient use of cortical resources. Discussion. These findings are congruent with the MAP conceptualization of four performance states, in which unique psychophysiological states underlie distinct performance-related experiences. From an applied point of view, our findings suggest that the MAP model can be used as a framework to develop performance enhancement strategies based on cognitive and neurofeedback techniques. PMID:27257557

  1. Proficient brain for optimal performance: the MAP model perspective.

    PubMed

    Bertollo, Maurizio; di Fronso, Selenia; Filho, Edson; Conforto, Silvia; Schmid, Maurizio; Bortoli, Laura; Comani, Silvia; Robazza, Claudio

    2016-01-01

    Background. The main goal of the present study was to explore theta and alpha event-related desynchronization/synchronization (ERD/ERS) activity during shooting performance. We adopted the idiosyncratic framework of the multi-action plan (MAP) model to investigate different processing modes underpinning four types of performance. In particular, we were interested in examining the neural activity associated with optimal-automated (Type 1) and optimal-controlled (Type 2) performances. Methods. Ten elite shooters (6 male and 4 female) with extensive international experience participated in the study. ERD/ERS analysis was used to investigate cortical dynamics during performance. A 4 × 3 (performance types × time) repeated measures analysis of variance was performed to test the differences among the four types of performance during the three seconds preceding the shots for theta, low alpha, and high alpha frequency bands. The dependent variables were the ERD/ERS percentages in each frequency band (i.e., theta, low alpha, high alpha) for each electrode site across the scalp. This analysis was conducted on 120 shots for each participant in three different frequency bands and the individual data were then averaged. Results. We found ERS to be mainly associated with optimal-automatic performance, in agreement with the "neural efficiency hypothesis." We also observed more ERD as related to optimal-controlled performance in conditions of "neural adaptability" and proficient use of cortical resources. Discussion. These findings are congruent with the MAP conceptualization of four performance states, in which unique psychophysiological states underlie distinct performance-related experiences. From an applied point of view, our findings suggest that the MAP model can be used as a framework to develop performance enhancement strategies based on cognitive and neurofeedback techniques. PMID:27257557
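
    A minimal sketch of the ERD/ERS percentage computation underlying the two records above, using hypothetical band-power values: band power in the pre-shot window is expressed as a percentage change relative to a reference window, so negative values indicate desynchronization (ERD) and positive values synchronization (ERS).

        # Sketch: ERD/ERS% = (task_power - baseline_power) / baseline_power * 100.
        # `baseline` and `pre_shot` are hypothetical per-shot band-power values
        # (e.g., theta power at one electrode).
        import numpy as np

        def erd_ers_percent(baseline_power, task_power):
            base = np.mean(baseline_power)
            return (np.asarray(task_power) - base) / base * 100.0

        rng = np.random.default_rng(2)
        baseline = rng.uniform(8, 12, size=120)           # 120 shots, reference window
        pre_shot = rng.uniform(6, 14, size=120)           # same shots, pre-shot window
        print(erd_ers_percent(baseline, pre_shot).mean()) # < 0 -> ERD, > 0 -> ERS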

  2. Performance and combustion modeling of heterogeneous charge engines

    SciTech Connect

    Primus, R.J.; Wong, V.W.

    1985-01-01

    This paper reviews the phenomenological modeling of the combustion processes for diesel and fuel-injected stratified charge engines. Distinctions are made between phenomenological and multi-dimensional finite-difference approaches. The modeling methodologies and the basic components in these models are described. These include characterization of the fuel spray, fuel-air mixing, ignition, burning and heat transfer processes. An attempt is made in the paper to highlight the similarities and contrasts of the various models and to relate them to their utility in addressing emission research and engine performance development objectives.

  3. A survey of university students' perceptions of learning management systems in a low-resource setting using a technology acceptance model.

    PubMed

    Chipps, Jennifer; Kerr, Jane; Brysiewicz, Petra; Walters, Fiona

    2015-02-01

    Learning management systems have been widely advocated for the support of distance learning. In low-resource settings, the uptake of these systems by students has been mixed. This study aimed to identify, through the use of the Technology Acceptance Model, the individual, organizational, and technological factors that could be influencing the use of learning management systems. A simple quantitative descriptive survey was conducted of nursing and health science students at a university in South Africa as part of their first exposure to a learning management system. A total of 274 respondents (56.7%) completed the survey questionnaire, made up of 213 nursing respondents (87.7%) and 61 health sciences respondents (25%). Overall, the respondents found the learning management system easy to use and useful for learning. There were significant differences between the two groups of respondents, with the respondents from health sciences being both younger and more computer literate. The nursing respondents, who received more support and orientations, reported finding the learning management system more useful. Recommendations are made for training and support to ensure uptake. PMID:25521789

  4. Aircraft Anomaly Detection Using Performance Models Trained on Fleet Data

    NASA Technical Reports Server (NTRS)

    Gorinevsky, Dimitry; Matthews, Bryan L.; Martin, Rodney

    2012-01-01

    This paper describes an application of data mining technology called Distributed Fleet Monitoring (DFM) to Flight Operational Quality Assurance (FOQA) data collected from a fleet of commercial aircraft. DFM transforms the data into aircraft performance models, flight-to-flight trends, and individual flight anomalies by fitting a multi-level regression model to the data. The model represents aircraft flight performance and takes into account fixed effects: flight-to-flight and vehicle-to-vehicle variability. The regression parameters include aerodynamic coefficients and other aircraft performance parameters that are usually identified by aircraft manufacturers in flight tests. Using DFM, the multi-terabyte FOQA data set with half a million flights was processed in a few hours. The anomalies found include wrong values of computed variables (e.g., aircraft weight), sensor failures and biases, and failures, biases, and trends in flight actuators. These anomalies were missed by the existing airline monitoring of FOQA data exceedances.

  5. The influence of conceptual model structure on model performance: a comparative study for 237 French catchments

    NASA Astrophysics Data System (ADS)

    van Esse, W. R.; Perrin, C.; Booij, M. J.; Augustijn, D. C. M.; Fenicia, F.; Kavetski, D.; Lobligeois, F.

    2013-10-01

    Models with a fixed structure are widely used in hydrological studies and operational applications. For various reasons, these models do not always perform well. As an alternative, flexible modelling approaches allow the identification and refinement of the model structure as part of the modelling process. In this study, twelve different conceptual model structures from the SUPERFLEX framework are compared with the fixed model structure GR4H, using a large set of 237 French catchments and discharge-based performance metrics. The results show that, in general, the flexible approach performs better than the fixed approach. However, the flexible approach has a higher chance of inconsistent results when calibrated on two different periods. When analysing the subset of 116 catchments where the two approaches produce consistent performance over multiple time periods, their average performance relative to each other is almost equivalent. From the point of view of developing a well-performing fixed model structure, the findings favour models with parallel reservoirs and a power function to describe the reservoir outflow. In general, conceptual hydrological models perform better on larger and/or wetter catchments than on smaller and/or drier catchments. The model structures performed poorly when there were large climatic differences between the calibration and validation periods, in catchments with flashy flows, and in catchments with unexplained variations in low flow measurements.

  6. Towards A Complete Model Of Photopic Visual Threshold Performance

    NASA Astrophysics Data System (ADS)

    Overington, I.

    1982-02-01

    Based on a wide variety of fragmentary evidence taken from psycho-physics, neurophysiology and electron microscopy, it has been possible to put together a very widely applicable conceptual model of photopic visual threshold performance. Such a model is so complex that a single comprehensive mathematical version is excessively cumbersome. It is, however, possible to set up a suite of related mathematical models, each of limited application but strictly known envelope of usage. Such models may be used for assessment of a variety of facets of visual performance when using display imagery, including effects and interactions of image quality, random and discrete display noise, viewing distance, image motion, etc., both for foveal interrogation tasks and for visual search tasks. The specific model may be selected from the suite according to the assessment task in hand. The paper discusses in some depth the major facets of preperceptual visual processing and their interaction with instrumental image quality and noise. It then highlights the statistical nature of visual performance before going on to consider a number of specific mathematical models of partial visual function. Where appropriate, these are compared with widely popular empirical models of visual function.

  7. Thermal performance modeling of NASA's scientific balloons

    NASA Astrophysics Data System (ADS)

    Franco, H.; Cathey, H.

    The flight performance of a scientific balloon is highly dependent on the interaction between the balloon and its environment. The balloon is a thermal vehicle. Modeling a scientific balloon's thermal performance has proven to be a difficult analytical task. Most previous thermal models have attempted these analyses by using either a bulk thermal model approach, or by simplified representations of the balloon. These approaches to date have provided reasonable, but not very accurate results. Improvements have been made in recent years using thermal analysis tools developed for the thermal modeling of spacecraft and other sophisticated heat transfer problems. These tools, which now allow for accurate modeling of highly transmissive materials, have been applied to the thermal analysis of NASA's scientific balloons. A research effort has been started that utilizes the "Thermal Desktop" addition to AutoCAD. This paper will discuss the development of thermal models for both conventional and Ultra Long Duration super-pressure balloons. This research effort has focused on incremental analysis stages of development to assess the accuracy of the tool and the required model resolution to produce usable data. The first stage balloon thermal analyses started with simple spherical balloon models with a limited number of nodes, and expanded the number of nodes to determine required model resolution. These models were then modified to include additional details such as load tapes. The second stage analyses looked at natural shaped Zero Pressure balloons. Load tapes were then added to these shapes, again with the goal of determining the required modeling accuracy by varying the number of gores. The third stage, following the same steps as the Zero Pressure balloon efforts, was directed at modeling super-pressure pumpkin shaped balloons. The results were then used to develop analysis guidelines and an approach for modeling balloons for both simple first order estimates and detailed

  8. Cassini Radar EQM Model: Instrument Description and Performance Status

    NASA Technical Reports Server (NTRS)

    Borgarelli, L.; Faustini, E. Zampolini; Im, E.; Johnson, W. T. K.

    1996-01-01

    The spacecraft of the Cassini Mission is planned to be launched towards Saturn in October 1997. The mission is designed to study the physical structure and chemical composition of Titan. The results of the tests performed on the Cassini radar engineering qualification model (EQM) are summarized. The approach followed in the verification and evaluation of the performance of the radio frequency subsystem EQM is presented. The results show that the instrument satisfies the relevant mission requirements.

  9. Development of Models To Relate Microbiological and Headspace Volatile Parameters in Stored Atlantic Salmon to Acceptance and Willingness To Prepare the Product by Senior Consumers.

    PubMed

    Erickson, Marilyn C; Ma, Li M; Doyle, Michael P

    2015-12-01

    Microbial spoilage of salmon occurs during extended refrigerated storage and is often accompanied by unpleasant aromas. When spoilage is detected, it is assumed that consumers will reject the product for consumption. Because sensory panels of trained individuals or consumers are expensive and labor intensive, identification of microbiological or chemical indicators to characterize the extent to which fish has spoiled is needed when experimental process and storage treatments are being evaluated. A consumer panel of 53 senior citizens (60 to 85 years of age) evaluated in duplicate raw salmon subjected to 10 storage conditions, and the fish quality was targeted to range from fresh to very spoiled. This population group was chosen because they would be expected to have a greater prevalence of olfactory impairments and higher odor thresholds than the general population; in turn, a shorter safety margin or time period between product rejection due to spoilage and the generation of Clostridium botulinum toxins would be likely. Low hedonic scores for aroma and overall acceptance (2 or 3 of 9), corresponding to "dislike very much" to "dislike moderately," did not equate with unwillingness to prepare the sample for consumption by up to seven panelists (13%) when the product was presumed to have already been purchased. Despite these outliers, significant models (P = 0.0000) were developed for the willingness of consumers to prepare the sample for consumption and the sample's aerobic and anaerobic microbiological populations and two volatile peaks with Kovats indices of 640 and 753. However, these models revealed that the levels of microbiological and chemical markers must be very high before some consumers would reject the sample; hence, spoilage detection by smell would likely not be an adequate safeguard against consuming salmon in which C. botulinum toxin had been generated. PMID:26613910
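
    A minimal sketch of the kind of model the abstract describes, relating spoilage indicators to the probability that a panelist is willing to prepare the sample; the predictor names and simulated data below are hypothetical placeholders, not the study's variables or coefficients.

        # Sketch: logistic model of willingness-to-prepare vs. spoilage indicators.
        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)
        n = 200
        X = np.column_stack([
            rng.uniform(2, 9, n),      # log10 aerobic plate count (hypothetical)
            rng.uniform(2, 9, n),      # log10 anaerobic plate count (hypothetical)
            rng.uniform(0, 1, n),      # scaled volatile peak area, Kovats index ~640
            rng.uniform(0, 1, n),      # scaled volatile peak area, Kovats index ~753
        ])
        # Simulated response: willingness drops as spoilage indicators rise
        logit = 6.0 - 0.8 * X[:, 0] - 2.0 * X[:, 2]
        y = (rng.uniform(size=n) < 1 / (1 + np.exp(-logit))).astype(int)

        model = LogisticRegression().fit(X, y)
        print(model.coef_, model.predict_proba(X[:3])[:, 1])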

  10. New performance evaluation models for character detection in images

    NASA Astrophysics Data System (ADS)

    Wang, YanWei; Ding, XiaoQing; Liu, ChangSong; Wang, Kongqiao

    2010-02-01

    Detection of character regions is a meaningful research task, both for highlighting regions of interest and for recognition in further information processing. Much research has been performed on character localization and extraction, and this leads to a great need for performance evaluation schemes to inspect detection algorithms. In this paper, two probability models are established to accomplish evaluation tasks for different applications respectively. For highlighting regions of interest, a Gaussian probability model, which simulates the property of a low-pass Gaussian filter of the human vision system (HVS), was constructed to allocate different weights to different character parts. It reveals the greatest potential to describe the performance of detectors, especially when the result detected is an incomplete character, where other methods cannot effectively work. For the recognition application, we also introduce a weighted probability model to give an appropriate description of the contribution of detection results to final recognition results. The validity of the performance evaluation models proposed in this paper is proved by experiments on web images and natural scene images. These models may also be applicable to evaluating algorithms for locating other objects, such as faces, although wider experiments are needed to examine this assumption.

  11. Optimization of wind farm performance using low-order models

    NASA Astrophysics Data System (ADS)

    Dabiri, John; Brownstein, Ian

    2015-11-01

    A low order model that captures the dominant flow behaviors in a vertical-axis wind turbine (VAWT) array is used to maximize the power output of wind farms utilizing VAWTs. The leaky Rankine body model (LRB) was shown by Araya et al. (JRSE 2014) to predict the ranking of individual turbine performances in an array to within measurement uncertainty as compared to field data collected from full-scale VAWTs. Further, this model is able to predict array performance with significantly less computational expense than higher fidelity numerical simulations of the flow, making it ideal for use in optimization of wind farm performance. This presentation will explore the ability of the LRB model to rank the relative power output of different wind turbine array configurations as well as the ranking of individual array performance over a variety of wind directions, using various complex configurations tested in the field and simpler configurations tested in a wind tunnel. Results will be presented in which the model is used to determine array fitness in an evolutionary algorithm seeking to find optimal array configurations given a number of turbines, area of available land, and site wind direction profile. Comparison with field measurements will be presented.

  12. Performance verification tests of JT-60SA CS model coil

    NASA Astrophysics Data System (ADS)

    Obana, Tetsuhiro; Murakami, Haruyuki; Takahata, Kazuya; Hamaguchi, Shinji; Chikaraishi, Hirotaka; Mito, Toshiyuki; Imagawa, Shinsaku; Kizu, Kaname; Natsume, Kyohei; Yoshida, Kiyoshi

    2015-11-01

    As a final check of the coil manufacturing method of the JT-60 Super Advanced (JT-60SA) central solenoid (CS), we verified the performance of a CS model coil. The model coil comprised a quad-pancake wound with a Nb3Sn cable-in-conduit conductor. Measurements of the critical current, joint resistance, pressure drop, and magnetic field were conducted in the verification tests. In the critical-current measurement, the critical current of the model coil coincided with the estimation derived from a strain of -0.62% for the Nb3Sn strands. As a result, critical-current degradation caused by the coil manufacturing process was not observed. The results of the performance verification tests indicate that the model coil met the design requirements. Consequently, the manufacturing process of the JT-60SA CS was established.

  13. Model-based approach for elevator performance estimation

    NASA Astrophysics Data System (ADS)

    Esteban, E.; Salgado, O.; Iturrospe, A.; Isasa, I.

    2016-02-01

    In this paper, a dynamic model for an elevator installation is presented in the state space domain. The model comprises both the mechanical and the electrical subsystems, including the electrical machine and a closed-loop field oriented control. The proposed model is employed for monitoring the condition of the elevator installation. The adopted model-based approach for monitoring employs the Kalman filter as an observer. A Kalman observer estimates the elevator car acceleration, which determines the elevator ride quality, based solely on the machine control signature and the encoder signal. Five elevator key performance indicators are then calculated based on the estimated car acceleration. The proposed procedure is experimentally evaluated by comparing the key performance indicators calculated from the estimated car acceleration with the values obtained from actual acceleration measurements in a test bench. Finally, the proposed procedure is compared with a sliding mode observer.
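
    A minimal sketch of the observer idea described above, using a hypothetical constant-acceleration state model and simulated encoder positions rather than the paper's full electromechanical model: a Kalman filter estimates car position, velocity, and acceleration from the encoder signal, and the acceleration estimate can then feed ride-quality indicators.

        # Sketch: Kalman observer estimating elevator car acceleration from encoder
        # position samples. State model and noise levels are assumed values.
        import numpy as np

        dt = 0.01                                   # 100 Hz encoder sampling
        F = np.array([[1, dt, 0.5 * dt**2],
                      [0, 1, dt],
                      [0, 0, 1]])                   # position, velocity, acceleration
        H = np.array([[1.0, 0.0, 0.0]])             # only position is measured
        Q = 1e-4 * np.eye(3)                        # process noise (assumed)
        R = np.array([[1e-6]])                      # encoder noise (assumed)

        x, P = np.zeros(3), np.eye(3)
        rng = np.random.default_rng(0)
        t = np.arange(0, 5, dt)
        true_acc = np.where(t < 2, 0.6, np.where(t < 4, 0.0, -0.6))
        true_pos = np.cumsum(np.cumsum(true_acc) * dt) * dt
        est_acc = []
        for z in true_pos + rng.normal(0, 1e-3, t.size):
            x = F @ x                               # predict
            P = F @ P @ F.T + Q
            y = z - H @ x                           # innovation from encoder sample
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)
            x = x + (K @ y).ravel()                 # update
            P = (np.eye(3) - K @ H) @ P
            est_acc.append(x[2])
        print(est_acc[-1])                          # estimated car acceleration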

  14. PORFLOW Modeling Supporting The H-Tank Farm Performance Assessment

    SciTech Connect

    Jordan, J. M.; Flach, G. P.; Westbrook, M. L.

    2012-08-31

    Numerical simulations of groundwater flow and contaminant transport in the vadose and saturated zones have been conducted using the PORFLOW code in support of an overall Performance Assessment (PA) of the H-Tank Farm. This report provides technical detail on selected aspects of PORFLOW model development and describes the structure of the associated electronic files. The PORFLOW models for the H-Tank Farm PA, Rev. 1 were updated with grout, solubility, and inventory changes. The aquifer model was refined. In addition, a set of flow sensitivity runs were performed to allow flow to be varied in the related probabilistic GoldSim models. The final PORFLOW concentration values are used as input into a GoldSim dose calculator.

  15. Design modeling of lithium-ion battery performance

    NASA Astrophysics Data System (ADS)

    Nelson, Paul; Bloom, Ira; Amine, Khalil; Henriksen, Gary

    A computer design modeling technique has been developed for lithium-ion batteries to assist in setting goals for cell components, assessing materials requirements, and evaluating thermal management strategies. In this study, the input data for the model included design criteria from Quallion, LLC for Gen-2 18650 cells, which were used to test the accuracy of the dimensional modeling. Performance measurements on these cells were done at the electrochemical analysis and diagnostics laboratory (EADL) at Argonne National Laboratory. The impedance and capacity related criteria were calculated from the EADL measurements. Five batteries were designed for which the number of windings around the cell core was increased for each succeeding battery to study the effect of this variable upon the dimensions, weight, and performance of the batteries. The lumped-parameter battery model values were calculated for these batteries from the laboratory results, with adjustments for the current collection resistance calculated for the individual batteries.

  16. Design modeling of lithium-ion battery performance.

    SciTech Connect

    Nelson, P. A.; Bloom, I.; Amine, K.; Henriksen, G.; Chemical Engineering

    2002-08-22

    A computer design modeling technique has been developed for lithium-ion batteries to assist in setting goals for cell components, assessing materials requirements, and evaluating thermal management strategies. In this study, the input data for the model included design criteria from Quallion, LLC for Gen-2 18650 cells, which were used to test the accuracy of the dimensional modeling. Performance measurements on these cells were done at the electrochemical analysis and diagnostics laboratory (EADL) at Argonne National Laboratory. The impedance and capacity related criteria were calculated from the EADL measurements. Five batteries were designed for which the number of windings around the cell core was increased for each succeeding battery to study the effect of this variable upon the dimensions, weight, and performance of the batteries. The lumped-parameter battery model values were calculated for these batteries from the laboratory results, with adjustments for the current collection resistance calculated for the individual batteries.
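
    A minimal sketch of a lumped-parameter cell voltage calculation of the general kind referred to in the two records above; the OCV curve, area-specific impedance, and current-collection resistance are hypothetical placeholders, not the Gen-2 18650 values from the study.

        # Sketch: lumped-parameter estimate of cell voltage and power under load.
        def terminal_voltage(soc, current_A, area_cm2=800.0, asi_ohm_cm2=35.0,
                             r_collector_ohm=0.002):
            ocv = 3.45 + 0.75 * soc                      # simple linear OCV vs. SOC
            v = ocv - current_A * asi_ohm_cm2 / area_cm2 - current_A * r_collector_ohm
            return v, v * current_A                      # terminal voltage (V), power (W)

        for soc in (0.9, 0.5, 0.2):
            print(soc, terminal_voltage(soc, current_A=10.0))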

  17. Cost and Performance Model for Redox Flow Batteries

    SciTech Connect

    Viswanathan, Vilayanur V.; Crawford, Aladsair J.; Stephenson, David E.; Kim, Soowhan; Wang, Wei; Li, Bin; Coffey, Greg W.; Thomsen, Edwin C.; Graff, Gordon L.; Balducci, Patrick J.; Kintner-Meyer, Michael CW; Sprenkle, Vincent L.

    2014-02-01

    A cost model was developed for all-vanadium and iron-vanadium redox flow batteries. Electrochemical performance modeling was done to estimate stack performance at various power densities as a function of state of charge. This was supplemented with a shunt current model and a pumping loss model to estimate actual system efficiency. Operating parameters such as power density and flow rates, and design parameters such as electrode aspect ratio and electrolyte flow channel dimensions, were adjusted to maximize efficiency and minimize capital costs. Detailed cost estimates were obtained from various vendors to calculate cost estimates for present, realistic, and optimistic scenarios. The main drivers for cost reduction for the various chemistries were identified as a function of the energy-to-power ratio of the storage system. Levelized cost analysis further guided the suitability of the various chemistries for different applications.
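
    A minimal sketch of how capital cost per usable kWh varies with the energy-to-power ratio, the dependence the abstract highlights; all unit costs and the system efficiency below are hypothetical placeholders, not the vendor estimates used in the report.

        # Sketch: capital cost per usable kWh for a flow battery vs. E/P ratio.
        def cost_per_kwh(power_kw, ep_ratio_h, stack_cost_per_kw=600.0,
                         electrolyte_cost_per_kwh=150.0, bop_cost_per_kw=300.0,
                         system_efficiency=0.75):
            energy_kwh = power_kw * ep_ratio_h
            capital = (stack_cost_per_kw * power_kw
                       + electrolyte_cost_per_kwh * energy_kwh
                       + bop_cost_per_kw * power_kw)
            return capital / (energy_kwh * system_efficiency)

        for ep in (0.25, 1, 4, 10):                # short- to long-duration storage
            print(ep, round(cost_per_kwh(1000.0, ep), 1))

    In a sketch of this form, the power-related stack and balance-of-plant costs dominate the cost per kWh at low E/P ratios, while the electrolyte cost becomes the main driver at high E/P ratios, which is the kind of trade-off a levelized cost analysis explores.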

  18. Human performance models for computer-aided engineering

    NASA Technical Reports Server (NTRS)

    Elkind, Jerome I. (Editor); Card, Stuart K. (Editor); Hochberg, Julian (Editor); Huey, Beverly Messick (Editor)

    1989-01-01

    This report discusses a topic important to the field of computational human factors: models of human performance and their use in computer-based engineering facilities for the design of complex systems. It focuses on a particular human factors design problem -- the design of cockpit systems for advanced helicopters -- and on a particular aspect of human performance -- vision and related cognitive functions. By focusing in this way, the authors were able to address the selected topics in some depth and develop findings and recommendations that they believe have application to many other aspects of human performance and to other design domains.

  19. Titan I propulsion system modeling and possible performance improvements

    NASA Astrophysics Data System (ADS)

    Giusti, Oreste

    This thesis features the Titan I propulsion systems and offers data-supported suggestions for improvements to increase performance. The original propulsion systems were modeled both graphically in CAD and via equations. Due to the limited availability of published information, it was necessary to create a more detailed, secondary set of models. Various engineering equations---pertinent to rocket engine design---were implemented in order to generate the desired extra detail. This study describes how these new models were then imported into the ESI CFD Suite. Various parameters are applied to these imported models as inputs that include, for example, bi-propellant combinations, pressure, temperatures, and mass flow rates. The results were then processed with ESI VIEW, which is visualization software. The output files were analyzed for forces in the nozzle, and various results were generated, including sea level thrust and ISP. Experimental data are provided to compare the original engine configuration models to the derivative suggested improvement models.

  20. Software life cycle dynamic simulation model: The organizational performance submodel

    NASA Technical Reports Server (NTRS)

    Tausworthe, Robert C.

    1985-01-01

    The submodel structure of a software life cycle dynamic simulation model is described. The software process is divided into seven phases, each with product, staff, and funding flows. The model is subdivided into an organizational response submodel, a management submodel, a management influence interface, and a model analyst interface. The concentration here is on the organizational response model, which simulates the performance characteristics of a software development subject to external and internal influences. These influences emanate from two sources: the model analyst interface, which configures the model to simulate the response of an implementing organization subject to its own internal influences, and the management submodel that exerts external dynamic control over the production process. A complete characterization is given of the organizational response submodel in the form of parameterized differential equations governing product, staffing, and funding levels. The parameter values and functions are allocated to the two interfaces.

  1. Performance assessment of engineered barriers using the vault model

    SciTech Connect

    Johnson, L.H.

    1993-12-31

    The Vault Model for assessing engineered barrier performance has been developed as part of the preparation of an Environmental Impact Statement to be presented to a Federal Environmental Assessment Review Panel reviewing the Canadian nuclear fuel waste disposal concept. The model describes the behavior of titanium containers, radionuclide release from used fuel, and migration of radionuclides through buffer and backfill materials and into the surrounding geosphere. Vault Model simulations have shown that the release of radionuclides from the engineered barrier system is dominated by the release from the fuel-sheath gap and grain boundaries in used fuel. Sensitivity and uncertainty analyses have illustrated how releases from the vault are affected by both the uncertainty in model parameters and the assumptions made in the development of the models. It is likely that the combined effects of a number of conservatisms in the model result in the releases from the engineered barrier system being overpredicted by several orders of magnitude.

  2. Modelling and performance analysis of four and eight element TCAS

    NASA Technical Reports Server (NTRS)

    Sampath, K. S.; Rojas, R. G.; Burnside, W. D.

    1990-01-01

    This semi-annual report describes the work performed during the period September 1989 through March 1990. The first section presents a description of the effect of the engines of the Boeing 737-200 on the performance of a bottom mounted eight-element traffic alert and collision avoidance system (TCAS). The second section deals exclusively with a four element TCAS antenna. The model obtained to simulate the four element TCAS and new algorithms developed for studying its performance are described. The effect of location on its performance when mounted on top of a Boeing 737-200 operating at 1060 MHz is discussed. It was found that the four element TCAS generally does not perform as well as the eight element TCAS III.

  3. Solid rocket booster performance evaluation model. Volume 1: Engineering description

    NASA Technical Reports Server (NTRS)

    1974-01-01

    The space shuttle solid rocket booster performance evaluation model (SRB-II) is made up of analytical and functional simulation techniques linked together so that a single pass through the model will predict the performance of the propulsion elements of a space shuttle solid rocket booster. The available options allow the user to predict static test performance, predict nominal and off nominal flight performance, and reconstruct actual flight and static test performance. Options selected by the user are dependent on the data available. These can include data derived from theoretical analysis, small scale motor test data, large motor test data and motor configuration data. The user has several options for output format that include print, cards, tape and plots. Output includes all major performance parameters (Isp, thrust, flowrate, mass accounting and operating pressures) as a function of time as well as calculated single point performance data. The engineering description of SRB-II discusses the engineering and programming fundamentals used, the function of each module, and the limitations of each module.
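
    As a minimal illustration of how the single-point performance parameters listed above relate to one another (illustrative numbers, not SRB data):

        # Sketch: specific impulse from thrust and propellant mass flow rate.
        G0 = 9.80665                                  # standard gravity, m/s^2

        def specific_impulse(thrust_N, mass_flow_kg_s):
            return thrust_N / (mass_flow_kg_s * G0)   # Isp in seconds

        print(specific_impulse(thrust_N=12.45e6, mass_flow_kg_s=5000.0))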

  4. Modeling and dosimetric performance evaluation of the RayStation treatment planning system.

    PubMed

    Mzenda, Bongile; Mugabe, Koki V; Sims, Rick; Godwin, Guy; Loria, Dayan

    2014-01-01

    The physics modeling, dose calculation accuracy and plan quality assessment of the RayStation (v3.5) treatment planning system (TPS) is presented in this study, with appropriate comparisons to the more established Pinnacle (v9.2) TPS. Modeling and validation for the Elekta MLCi and Agility beam models resulted in a good match to treatment machine-measured data based on tolerances of 3% for in-field and out-of-field regions, 10% for buildup and penumbral regions, and a gamma 2%/2 mm dose/distance acceptance criterion. TPS commissioning using a wide range of appropriately selected dosimetry equipment, and following published guidelines, established the MLC modeling and dose calculation accuracy to be within standard tolerances for all tests performed. In both homogeneous and heterogeneous mediums, central axis calculations agreed with measurements within 2% for open fields and 3% for wedged fields, and within 4% off-axis. Treatment plan comparisons for identical clinical goals were made to Pinnacle for the following complex clinical cases: hypofractionated non-small cell lung carcinoma, head and neck, stereotactic spine, as well as for several standard clinical cases comprising prostate, brain, and breast plans. DVHs, target, and critical organ doses, as well as measured point doses and gamma indices, applying both local and global (Van Dyk) normalization at 2%/2 mm and 3%/3 mm (10% lower threshold) acceptance criteria for these composite plans, were assessed. In addition, 3DVH was used to compare the perturbed dose distributions to the TPS 3D dose distributions. For all 32 cases, the patient QA checks showed > 95% of pixels passing a 3% global/3 mm gamma. PMID:25207563
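
    A minimal sketch of the gamma-index pass-rate calculation used in the plan comparisons above, reduced to 1-D profiles with global normalization; the profiles, grid spacing, and criteria values are hypothetical.

        # Sketch: 1-D global gamma index (e.g., 3% / 3 mm) between a reference and an
        # evaluated dose profile sampled on the same grid.
        import numpy as np

        def gamma_pass_rate(dose_ref, dose_eval, dx_mm, dose_pct=3.0, dta_mm=3.0,
                            low_threshold=0.10):
            d_max = dose_ref.max()
            x = np.arange(dose_ref.size) * dx_mm
            pass_flags = []
            for xi, d in zip(x, dose_ref):
                if d < low_threshold * d_max:
                    continue                              # ignore low-dose region
                gamma_sq = ((x - xi) / dta_mm) ** 2 + \
                           ((dose_eval - d) / (dose_pct / 100.0 * d_max)) ** 2
                pass_flags.append(gamma_sq.min() <= 1.0)
            return 100.0 * np.mean(pass_flags)

        x = np.linspace(0, 100, 201)                      # 0.5 mm grid
        ref = np.exp(-((x - 50) / 20) ** 2)
        test = 1.02 * np.exp(-((x - 50.5) / 20) ** 2)     # 2% scale, 0.5 mm shift
        print(gamma_pass_rate(ref, test, dx_mm=0.5))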

  5. Acceptance Test Report for 241-U compressed air system

    SciTech Connect

    Freeman, R.D.

    1994-10-20

    This Acceptance Test Report (ATR) documents the results of acceptance testing of a newly upgraded compressed air system at 241-U Farm. The system was installed and the test successfully performed under work package 2W-92-01027.

  6. Advanced terahertz imaging system performance model for concealed weapon identification

    NASA Astrophysics Data System (ADS)

    Murrill, Steven R.; Redman, Brian; Espinola, Richard L.; Franck, Charmaine C.; Petkie, Douglas T.; De Lucia, Frank C.; Jacobs, Eddie L.; Griffin, Steven T.; Halford, Carl E.; Reynolds, Joe

    2007-04-01

    The U.S. Army Night Vision and Electronic Sensors Directorate (NVESD) and the U.S. Army Research Laboratory (ARL) have developed a terahertz-band imaging system performance model for detection and identification of concealed weaponry. The details of this MATLAB-based model which accounts for the effects of all critical sensor and display components, and for the effects of atmospheric attenuation, concealment material attenuation, and active illumination, were reported on at the 2005 SPIE Europe Security and Defence Symposium. The focus of this paper is to report on recent advances to the base model which have been designed to more realistically account for the dramatic impact that target and background orientation can have on target observability as related to specular and Lambertian reflections captured by an active-illumination-based imaging system. The advanced terahertz-band imaging system performance model now also accounts for target and background thermal emission, and has been recast into a user-friendly, Windows-executable tool. This advanced THz model has been developed in support of the Defense Advanced Research Project Agency's (DARPA) Terahertz Imaging Focal-Plane Technology (TIFT) program. This paper will describe the advanced THz model and its new radiometric sub-model in detail, and provide modeling and experimental results on target observability as a function of target and background orientation.

  7. Thermal radiant exitance model performance: Soils and forests

    SciTech Connect

    Balick, L.K.; Smith, J.A.

    1995-12-31

    Models of surface temperatures of two land surface types based on their energy budgets were developed to simulate the effects of environmental factors on thermal radiant exitance. The performance of these models is examined in detail. One model solves the non-linear differential equation for heat diffusion in solids using a set of submodels for surface energy budget components. The model performance is examined under three desert conditions thought to be a strong test of the submodels. The accuracy of the temperature predictions and submodels is described. The accuracy of the model is generally good but some discrepancies between some of the submodels and measurements are noted. The sensitivity of the submodels is examined and is seen to be strongly controlled by interaction and feedback among energy components that are a function of surface temperature. The second model simulates vegetation canopies with detailed effects of surface geometry on radiant transfer in the canopy. Foliage solar absorption coefficients are calculated using a radiosity approach for a three layer canopy and long wave fluxes are modeled using a view factor matrix. Sensible and latent heat transfer through the canopy are also simulated using nearby meteorological data but heat storage in the canopy is not included. Simulations for a coniferous forest canopy are presented and the sensitivity of the model to environmental inputs is discussed.

  8. Thermal radiant exitance model performance: soils and forests

    NASA Astrophysics Data System (ADS)

    Balick, Lee K.; Smith, James A.

    1995-01-01

    Models of surface temperatures of two land surface types based on their energy budgets were developed to simulate the effects of environmental factors on thermal radiant exitance. The performance of these models is examined in detail. One model solves the non-linear differential equation for heat diffusion in solids using a set of submodels for surface energy budget components. The model performance is examined under three desert conditions thought to be a strong test of the submodels. The accuracy of the temperature predictions and submodels is described. The accuracy of the model is generally good but some discrepancies between some of the submodels and measurements are noted. The sensitivity of the submodels is examined and is seen to be strongly controlled by interaction and feedback among energy components that are a function of surface temperature. The second model simulates vegetation canopies with detailed effects of surface geometry on radiant transfer in the canopy. Foliage solar absorption coefficients are calculated using a radiosity approach for a three layer canopy and long wave fluxes are modeled using a view factor matrix. Sensible and latent heat transfer through the canopy are also simulated using nearby meteorological data but heat storage in the canopy is not included. Simulations for a coniferous forest canopy are presented and the sensitivity of the model to environmental inputs is discussed.

  9. Spatial frequency dependence of target signature for infrared performance modeling

    NASA Astrophysics Data System (ADS)

    Du Bosq, Todd; Olson, Jeffrey

    2011-05-01

    The standard model used to describe the performance of infrared imagers is the U.S. Army imaging system target acquisition model, based on the targeting task performance metric. The model is characterized by the resolution and sensitivity of the sensor as well as the contrast and task difficulty of the target set. The contrast of the target is defined as a spatial average contrast. The model treats the contrast of the target set as spatially white, or constant, over the bandlimit of the sensor. Previous experiments have shown that this assumption is valid under normal conditions and typical target sets. However, outside of these conditions, the treatment of target signature can become the limiting factor affecting model performance accuracy. This paper examines target signature more carefully. The spatial frequency dependence of the standard U.S. Army RDECOM CERDEC Night Vision 12 and 8 tracked vehicle target sets is described. The results of human perception experiments are modeled and evaluated using both frequency dependent and independent target signature definitions. Finally the function of task difficulty and its relationship to a target set is discussed.

  10. Performance model for large area solid oxide fuel cells

    NASA Astrophysics Data System (ADS)

    Klotz, Dino; Schmidt, Jan Philipp; Weber, André; Ivers-Tiffée, Ellen

    2014-08-01

    A parameter set obtained from a 1 cm2 size electrode cell is used to develop and calibrate a one-dimensional spatially resolved model. It is demonstrated that this performance model precalculates the evolving operating parameters along the gas channel of a large-sized cell. Input parameters are: (i) the number of discretization elements N, accounting for anodic gas conversion, (ii) the anodic gas flow rate and composition, and (iii) the operating voltage. The model calculations based on data from the 1 cm2 cell are scaled to be equivalent to a larger cell with 16 cm2 electrode size, which is used to validate the performance model. The current/voltage characteristics can be predicted very accurately, even when anodic gas flow rates vary by as much as a factor of four. The performance model presented herein simulates the total overvoltage and does so over a broad range of operating conditions. This is done with an accuracy of the simulated current better than 6.1% for UOP = 0.85 V, 3.8% for UOP = 0.8 V and 3.7% for UOP = 0.75 V. It is hoped that these equations will form the basis of a larger model, capable of predicting all the conditions found throughout any industrial stack.
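
    For illustration, the along-the-channel discretization described above can be sketched in a few lines: the anode channel is split into N elements, a local current is obtained from the Nernst voltage at the local gas composition and the chosen operating voltage, and the anodic composition is updated by the hydrogen converted in each element. All numerical values below (area-specific resistance, inlet flow, standard potential) are placeholders, not the calibrated parameter set of the paper.

        import math

        F = 96485.0   # Faraday constant, C/mol
        R = 8.314     # gas constant, J/(mol K)

        def cell_current(u_op, n_elem=50, temp=1073.0, asr=0.35,
                         n_fuel_in=8.0e-5, x_h2_in=0.97, area=16.0):
            """Total current (A) of an `area` cm2 cell split into n_elem elements
            along the anode gas channel, operated at cell voltage u_op (V)."""
            elem_area = area / n_elem
            n_h2 = n_fuel_in * x_h2_in            # mol H2/s entering the channel
            n_h2o = n_fuel_in * (1.0 - x_h2_in)   # mol H2O/s
            e0 = 0.95                             # illustrative standard potential, V
            i_total = 0.0
            for _ in range(n_elem):
                x_h2 = n_h2 / (n_h2 + n_h2o)
                x_h2o = 1.0 - x_h2
                if x_h2 <= 1e-6:                  # fuel exhausted downstream
                    break
                # local Nernst voltage; cathode pO2 assumed constant at 0.21 atm
                e_nernst = e0 + R * temp / (2.0 * F) * math.log(x_h2 * 0.21 ** 0.5 / x_h2o)
                j = max((e_nernst - u_op) / asr, 0.0)   # local current density, A/cm2
                i_elem = j * elem_area
                i_total += i_elem
                dn = i_elem / (2.0 * F)           # H2 converted in this element, mol/s
                n_h2 -= dn
                n_h2o += dn
            return i_total

        # current/voltage points at the three operating voltages quoted above
        for u in (0.85, 0.80, 0.75):
            print(u, round(cell_current(u), 2))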

  11. Evaluating performances of simplified physically based models for landslide susceptibility

    NASA Astrophysics Data System (ADS)

    Formetta, G.; Capparelli, G.; Versace, P.

    2015-12-01

    Rainfall-induced shallow landslides cause loss of life and significant damage to private and public property, transportation systems, etc. Predicting locations susceptible to shallow landslides is a complex task that involves many disciplines: hydrology, geotechnical science, geomorphology, and statistics. Two main approaches are usually used to accomplish this task: statistical or physically based models. Reliable model application involves automatic parameter calibration, objective quantification of the quality of susceptibility maps, and model sensitivity analysis. This paper presents a methodology to systematically and objectively calibrate, verify and compare different models and different model performance indicators in order to identify and select the models whose behavior is most reliable for a given case study. The procedure was implemented in a package of models for landslide susceptibility analysis integrated in the NewAge-JGrass hydrological model. The package includes three simplified physically based models for landslide susceptibility analysis (M1, M2, and M3) and a component for model verification. It computes eight goodness-of-fit indices by comparing model results and measured data pixel by pixel. Moreover, the package integration in NewAge-JGrass allows the use of other components, such as geographic information system tools to manage input-output processes and automatic calibration algorithms to estimate model parameters. The system was applied to a case study in Calabria (Italy) along the Salerno-Reggio Calabria highway, between Cosenza and the Altilia municipality. The analysis showed that, among all the optimized indices and all three models, optimization of the index distance to perfect classification in the receiver operating characteristic plane (D2PC) coupled with model M3 is the best modeling solution for this test case.
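
    For reference, the "distance to perfect classification" index mentioned above is the Euclidean distance, in the receiver operating characteristic plane, between the point (false-positive rate, true-positive rate) of the classified susceptibility map and the perfect classifier at (0, 1). A generic sketch (not the NewAge-JGrass implementation), with a toy pixel example:

        import math

        def roc_rates(pred, obs):
            """True-positive and false-positive rates from binary (0/1) pixel maps."""
            tp = sum(1 for p, o in zip(pred, obs) if p and o)
            fp = sum(1 for p, o in zip(pred, obs) if p and not o)
            fn = sum(1 for p, o in zip(pred, obs) if not p and o)
            tn = sum(1 for p, o in zip(pred, obs) if not p and not o)
            tpr = tp / (tp + fn) if tp + fn else 0.0
            fpr = fp / (fp + tn) if fp + tn else 0.0
            return tpr, fpr

        def d2pc(pred, obs):
            """Distance of the model from the perfect classifier (FPR=0, TPR=1)."""
            tpr, fpr = roc_rates(pred, obs)
            return math.sqrt((1.0 - tpr) ** 2 + fpr ** 2)

        # calibration minimises d2pc over the model parameters; toy example:
        print(d2pc([1, 1, 0, 0, 1], [1, 0, 0, 1, 1]))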

  12. Accept or divert?

    PubMed

    Angelucci, P A

    1999-09-01

    Stretching scarce resources is more than a managerial issue. Should you accept the patient to an understaffed ICU or divert him to another facility? The intense "medical utility" controversy focuses on a situation that critical care nurses now face every day. PMID:10614370

  13. Approaches to acceptable risk

    SciTech Connect

    Whipple, C.

    1997-04-30

    Several alternative approaches to address the question "How safe is safe enough?" are reviewed and an attempt is made to apply the reasoning behind these approaches to the issue of acceptability of radiation exposures received in space. The approaches to the issue of the acceptability of technological risk described here are primarily analytical, and are drawn from examples in the management of environmental health risks. These include risk-based approaches, in which specific quantitative risk targets determine the acceptability of an activity, and cost-benefit and decision analysis, which generally focus on the estimation and evaluation of risks, benefits and costs, in a framework that balances these factors against each other. These analytical methods tend by their quantitative nature to emphasize the magnitude of risks, costs and alternatives, and to downplay other factors, especially those that are not easily expressed in quantitative terms, that affect acceptance or rejection of risk. Such other factors include the issues of risk perceptions and how and by whom risk decisions are made.

  14. 1984 Newbery Acceptance Speech.

    ERIC Educational Resources Information Center

    Cleary, Beverly

    1984-01-01

    This acceptance speech for an award honoring "Dear Mr. Henshaw," a book about feelings of a lonely child of divorce intended for eight-, nine-, and ten-year-olds, highlights children's letters to author. Changes in society that affect children, the inception of "Dear Mr. Henshaw," and children's reactions to books are highlighted. (EJS)

  15. Why was Relativity Accepted?

    NASA Astrophysics Data System (ADS)

    Brush, S. G.

    Historians of science have published many studies of the reception of Einstein's special and general theories of relativity. Based on a review of these studies, and my own research on the role of the light-bending prediction in the reception of general relativity, I discuss the role of three kinds of reasons for accepting relativity: (1) empirical predictions and explanations; (2) social-psychological factors; and (3) aesthetic-mathematical factors. According to the historical studies, acceptance was a three-stage process. First, a few leading scientists adopted the special theory for aesthetic-mathematical reasons. In the second stage, their enthusiastic advocacy persuaded other scientists to work on the theory and apply it to problems currently of interest in atomic physics. The special theory was accepted by many German physicists by 1910 and had begun to attract some interest in other countries. In the third stage, the confirmation of Einstein's light-bending prediction attracted much public attention and forced all physicists to take the general theory of relativity seriously. In addition to light-bending, the explanation of the advance of Mercury's perihelion was considered strong evidence by theoretical physicists. The American astronomers who conducted successful tests of general relativity became defenders of the theory. There is little evidence that relativity was "socially constructed", but its initial acceptance was facilitated by the prestige and resources of its advocates.

  16. Modeling Ni-Cd performance. Planned alterations to the Goddard battery model

    NASA Technical Reports Server (NTRS)

    Jagielski, J. M.

    1986-01-01

    The Goddard Space Flight Center (GSFC) currently has a preliminary computer model to simulate Nickel Cadmium (Ni-Cd) battery performance. The basic methodology of the model was described in the paper entitled Fundamental Algorithms of the Goddard Battery Model. At present, the model is undergoing alterations to increase its efficiency, accuracy, and generality. A review of the present battery model is given, and the planned changes to the model are described.

  17. Atmospheric River Model Simulation Diagnostics and Performance Metrics

    NASA Astrophysics Data System (ADS)

    Waliser, D. E.; Guan, B.; Kim, J.; Leung, L. R.; Ralph, F. M.

    2014-12-01

    Atmospheric Rivers (ARs) are narrow, elongated, synoptic jets of water vapor. These systems account for over 90% of the poleward transport of water vapor in mid-latitudes and thus are a key mechanism in helping to establish the water and energy cycles of the planet. Many of the intense wintertime hydrological (flood and drought-ending precipitation) events in the US western states (as well as on other continents) occur in conjunction with land-falling AR events. Despite the important role of ARs in our climate and weather systems, there have been few broad characterizations of model performance of ARs for global weather and climate models (GCMs), in terms of their role in global climate or the impacts associated with extreme weather. Part of the challenge has been the lack of a comprehensive set of observation-based model simulation diagnostics and performance metrics. Based on the objectives of and support from three activities: 1) the CalWater 2 AR project, 2) the Year of Tropical Convection (YOTC) and GEWEX Atmospheric System Study (GASS) multi-model experiment on Vertical Structure and Physical Processes of Weather & Climate, and 3) a new NASA effort examining the value added by dynamic regional climate model (RCM) downscaling, we are working to develop a comprehensive set of AR simulation diagnostics and model performance metrics for RCMs and GCMs. Application of these diagnostics and metrics will afford: 1) a baseline characterization of model representations of synoptic features, impacts, and multi-scale interactions, 2) an ability to guide model development and assess proposed improvements, 3) quantification of the evolution in forecast skill, and 4) estimates of the predictability of AR characteristics and impacts. The purpose of this presentation is to initiate a more formal dialogue about this activity with the community, present a preliminary set of diagnostics/metrics, and illustrate their utility through application to the 27 GCMs that contributed simulations to the YOTC

  18. New Mechanical Model for the Transmutation Fuel Performance Code

    SciTech Connect

    Gregory K. Miller

    2008-04-01

    A new mechanical model has been developed for implementation into the TRU fuel performance code. The new model differs from the existing FRAPCON 3 model, which it is intended to replace, in that it will include structural deformations (elasticity, plasticity, and creep) of the fuel. Also, the plasticity algorithm is based on the “plastic strain–total strain” approach, which should allow for more rapid and assured convergence. The model treats three situations relative to interaction between the fuel and cladding: (1) an open gap between the fuel and cladding, such that there is no contact, (2) contact between the fuel and cladding where the contact pressure is below a threshold value, such that axial slippage occurs at the interface, and (3) contact between the fuel and cladding where the contact pressure is above a threshold value, such that axial slippage is prevented at the interface. The first stage of development of the model included only the fuel. In this stage, results obtained from the model were compared with those obtained from finite element analysis using ABAQUS on a problem involving elastic, plastic, and thermal strains. Results from the two analyses showed essentially exact agreement through both loading and unloading of the fuel. After the cladding and fuel/clad contact were added, the model demonstrated expected behavior through all potential phases of fuel/clad interaction, and convergence was achieved without difficulty in all plastic analysis performed. The code is currently in stand alone form. Prior to implementation into the TRU fuel performance code, creep strains will have to be added to the model. The model will also have to be verified against an ABAQUS analysis that involves contact between the fuel and cladding.
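
    The three fuel/cladding interaction situations enumerated above reduce to a state selection on gap size and contact pressure. A hypothetical sketch of that logic follows; the threshold value and names are illustrative, not taken from the code described.

        from enum import Enum

        class ContactState(Enum):
            OPEN_GAP = 1   # no fuel/cladding contact
            SLIPPING = 2   # contact, axial slippage allowed at the interface
            LOCKED = 3     # contact, axial slippage prevented

        def contact_state(gap, contact_pressure, p_lock=5.0e6):
            """Classify the fuel/cladding interface.

            gap              -- radial gap, m (<= 0 means closed)
            contact_pressure -- interface pressure, Pa
            p_lock           -- illustrative threshold above which slippage stops
            """
            if gap > 0.0:
                return ContactState.OPEN_GAP
            if contact_pressure < p_lock:
                return ContactState.SLIPPING
            return ContactState.LOCKED

        print(contact_state(2.0e-5, 0.0))    # open gap
        print(contact_state(0.0, 1.0e6))     # contact with axial slippage
        print(contact_state(0.0, 2.0e7))     # contact, slippage prevented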

  19. The influence of conceptual model structure on model performance: a comparative study for 237 French catchments

    NASA Astrophysics Data System (ADS)

    van Esse, W. R.; Perrin, C.; Booij, M. J.; Augustijn, D. C. M.; Fenicia, F.; Lobligeois, F.

    2013-04-01

    In hydrological studies, models with a fixed structure are commonly used. For various reasons, these models do not always perform well. As an alternative, a flexible modelling approach can be followed, in which the identification of the model structure is part of the model set-up procedure. In this study, the performance of twelve different conceptual model structures from the SUPERFLEX framework with varying complexity and the fixed model structure of GR4H were compared on a large set of 237 French catchments. The results showed that in general the flexible approach performs better than the fixed approach. However, the flexible approach has a higher chance of inconsistent results when applied to two different periods. The same holds for more complex model structures. When a fixed model structure is preferred for practical reasons, this study shows that models with parallel reservoirs and a power function to describe the reservoir outflow perform best. In general, conceptual hydrological models perform better on large or wet catchments than on small or dry catchments. The model structures performed poorly when there was a climatic difference between the calibration and validation periods, for catchments with flashy flows, or where there were disturbances in low-flow measurements.

  20. Metallic Rotor Sizing and Performance Model for Flywheel Systems

    NASA Technical Reports Server (NTRS)

    Moore, Camille J.; Kraft, Thomas G.

    2012-01-01

    The NASA Glenn Research Center (GRC) is developing flywheel system requirements and designs for terrestrial and spacecraft applications. Several generations of flywheels have been designed and tested at GRC using in-house expertise in motors, magnetic bearings, controls, materials and power electronics. The maturation of a flywheel system from the concept phase to the preliminary design phase is accompanied by maturation of the Integrated Systems Performance model, where estimating relationships are replaced by physics based analytical techniques. The modeling can incorporate results from engineering model testing and emerging detail from the design process.

  1. Performance and Cognitive Assessment in 3-D Modeling

    ERIC Educational Resources Information Center

    Fahrer, Nolan E.; Ernst, Jeremy V.; Branoff, Theodore J.; Clark, Aaron C.

    2011-01-01

    The purpose of this study was to investigate identifiable differences between performance and cognitive assessment scores in a 3-D modeling unit of an engineering drafting course curriculum. The study aimed to provide further investigation of the need of skill-based assessments in engineering/technical graphics courses to potentially increase…

  2. Range performance impact of noise for thermal system modeling

    NASA Astrophysics Data System (ADS)

    Fanning, Jonathan D.; Teaney, Brian P.; Reynolds, Joseph P.; Du Bosq, Todd W.

    2009-05-01

    This paper presents a comparison of the predictions of NVThermIP to human perception experiment results in the presence of large amounts of noise where the signal to noise ratio is around 1. First, the calculations used in the NVESD imager performance models that deal with sensor noise are described outlining a few errors that appear in the NVThermIP code. A perception experiment is designed to test the range performance predictions of NVThermIP with varying amounts of noise and varying frame rates. NVThermIP is found to overestimate the impact of noise, leading to pessimistic range performance predictions for noisy systems. The perception experiment results are used to find a best fit value of the constant α used to relate system noise to eye noise in the NVESD models. The perception results are also fit to an alternate eye model that handles frame rates below 30Hz and smoothly approaches an accurate prediction of the performance in the presence of static noise. The predictions using the fit data show significantly less error than the predictions from the current model.

  3. Cognitive, Affective, and Behavioral Determinants of Performance: A Process Model.

    ERIC Educational Resources Information Center

    Dorfman, Peter W.; Stephan, Walter G.

    Literature from organizational and social psychology has suggested that three types of factors influence performance, i.e., cognitive, affective and behavioral. A model was developed to test a set of propositions concerning the relationship between the three kinds of factors, and included attributions, expectancies, general emotional responses to…

  4. An Integrative Model of Factors Related to Computing Course Performance.

    ERIC Educational Resources Information Center

    Charlton, John P.; Birkett, Paul E.

    1999-01-01

    A path-modeling approach is adopted to examine interrelationships between factors influencing computing behavior and computer course performance. Factors considered are gender, personality, intellect and computer attitudes, ownership, and experience. Intrinsic motivation is suggested as a major factor which can explain many variables' relationship…

  5. A Model of Physical Performance for Occupational Tasks.

    ERIC Educational Resources Information Center

    Hogan, Joyce

    This report acknowledges the problems faced by industrial/organizational psychologists who must make personnel decisions involving physically demanding jobs. The scarcity of criterion-related validation studies and the difficulty of generalizing validity are considered, and a model of physical performance that builds on Fleishman's (1984)…

  6. Toward a Model of Strategies and Summary Writing Performance

    ERIC Educational Resources Information Center

    Yang, Hui-Chun

    2014-01-01

    This study explores the construct of a summarization test task by means of single-group and multigroup structural equation modeling (SEM). It examines the interrelationships between strategy use and performance, drawing on data from 298 Taiwanese undergraduates' summary essays and their self-reported strategy use. Single-group SEM analyses…

  7. Successfully Using HPT Internationally: An International Performance Model

    ERIC Educational Resources Information Center

    Maeso, Eileen D.

    2011-01-01

    This article introduces a new international model that focuses on culture while including familiar elements of human performance technology (HPT). HPT adaptation for cultural differences is an essential part of our profession. We must be sensitive and flexible to succeed in an ever-changing global environment. (Contains 1 figure.)

  8. Towards a Social Networks Model for Online Learning & Performance

    ERIC Educational Resources Information Center

    Chung, Kon Shing Kenneth; Paredes, Walter Christian

    2015-01-01

    In this study, we develop a theoretical model to investigate the association between social network properties, "content richness" (CR) in academic learning discourse, and performance. CR is the extent to which one contributes content that is meaningful, insightful and constructive to aid learning and by social network properties we…

  9. Stutter-Step Models of Performance in School

    ERIC Educational Resources Information Center

    Morgan, Stephen L.; Leenman, Theodore S.; Todd, Jennifer J.; Weeden, Kim A.

    2013-01-01

    To evaluate a stutter-step model of academic performance in high school, this article adopts a unique measure of the beliefs of 12,591 high school sophomores from the Education Longitudinal Study, 2002-2006. Verbatim responses to questions on occupational plans are coded to capture specific job titles, the listing of multiple jobs, and the listing…

  10. Item Response Theory Models for Performance Decline during Testing

    ERIC Educational Resources Information Center

    Jin, Kuan-Yu; Wang, Wen-Chung

    2014-01-01

    Sometimes, test-takers may not be able to attempt all items to the best of their ability (with full effort) due to personal factors (e.g., low motivation) or testing conditions (e.g., time limit), resulting in poor performances on certain items, especially those located toward the end of a test. Standard item response theory (IRT) models fail to…

  11. Temporal diagnostic analysis of the SWAT model to detect dominant periods of poor model performance

    NASA Astrophysics Data System (ADS)

    Guse, Björn; Reusser, Dominik E.; Fohrer, Nicola

    2013-04-01

    Hydrological models generally include thresholds and non-linearities, such as snow-rain-temperature thresholds, non-linear reservoirs, infiltration thresholds and the like. When relating observed variables to modelling results, formal methods often calculate performance metrics over long periods, reporting model performance with only a few numbers. Such approaches are not well suited to comparing the dominant processes between reality and the model, or to understanding when thresholds and non-linearities are driving model results. We present a combination of two temporally resolved model diagnostic tools to answer when a model is performing (not so) well and what the dominant processes are during these periods. We look at the temporal dynamics of parameter sensitivities and model performance to answer this question. For this, the eco-hydrological SWAT model is applied in the Treene lowland catchment in Northern Germany. As a first step, temporal dynamics of parameter sensitivities are analyzed using the Fourier Amplitude Sensitivity Test (FAST). The sensitivities of the eight model parameters investigated show strong temporal variations. High sensitivities were detected most of the time for two groundwater parameters (GW_DELAY, ALPHA_BF) and one evaporation parameter (ESCO). The periods of high parameter sensitivity can be related to different phases of the hydrograph, with the groundwater parameters dominating in the recession phases and ESCO in baseflow and resaturation periods. Surface runoff parameters show high sensitivities during precipitation events in combination with high soil water contents. The dominant parameters indicate the controlling processes in the catchment during a given period. The second step included the temporal analysis of model performance. For each time step, model performance was characterized with a "finger print" consisting of a large set of performance measures. These finger prints were clustered into
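
    FAST assigns each parameter a distinct sampling frequency and reads its variance contribution from the Fourier spectrum of the model output. The simpler binning estimator below conveys the same time-resolved idea, estimating Var(E[Y|X])/Var(Y) for one parameter at every time step of an ensemble of model runs; it is a generic sketch, not the FAST implementation used in the study, and the parameter and toy ensemble are hypothetical.

        import numpy as np

        def first_order_si(param, output, n_bins=20):
            """Crude first-order sensitivity index Var(E[Y|X]) / Var(Y) for one
            sampled parameter X and one model output Y (e.g. discharge at one step)."""
            var_y = np.var(output)
            if var_y == 0.0:
                return 0.0
            edges = np.quantile(param, np.linspace(0.0, 1.0, n_bins + 1))
            idx = np.clip(np.digitize(param, edges[1:-1]), 0, n_bins - 1)
            cond_means = np.array([output[idx == b].mean()
                                   for b in range(n_bins) if np.any(idx == b)])
            return float(np.var(cond_means) / var_y)

        # time-resolved sensitivity from an ensemble q[run, time] and samples x[run]
        rng = np.random.default_rng(0)
        x = rng.uniform(0.0, 1.0, 500)                       # one sampled parameter
        q = np.outer(x, np.sin(np.linspace(0.0, 6.0, 100)))  # toy "hydrograph" ensemble
        q += 0.1 * rng.normal(size=q.shape)
        si_t = np.array([first_order_si(x, q[:, t]) for t in range(q.shape[1])])
        print(si_t.round(2)[:10])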

  12. Prediction modeling of physiological responses and human performance in the heat with application to space operations

    NASA Technical Reports Server (NTRS)

    Pandolf, Kent B.; Stroschein, Leander A.; Gonzalez, Richard R.; Sawka, Michael N.

    1994-01-01

    This institute has developed a comprehensive USARIEM heat strain model for predicting physiological responses and soldier performance in the heat which has been programmed for use by hand-held calculators, personal computers, and incorporated into the development of a heat strain decision aid. This model deals directly with five major inputs: the clothing worn, the physical work intensity, the state of heat acclimation, the ambient environment (air temperature, relative humidity, wind speed, and solar load), and the accepted heat casualty level. In addition to predicting rectal temperature, heart rate, and sweat loss given the above inputs, our model predicts the expected physical work/rest cycle, the maximum safe physical work time, the estimated recovery time from maximal physical work, and the drinking water requirements associated with each of these situations. This model provides heat injury risk management guidance based on thermal strain predictions from the user specified environmental conditions, soldier characteristics, clothing worn, and the physical work intensity. If heat transfer values for space operations' clothing are known, NASA can use this prediction model to help avoid undue heat strain in astronauts during space flight.

  13. Performance and competence models for audiovisual data fusion

    NASA Astrophysics Data System (ADS)

    Kabre, Harouna

    1995-09-01

    We describe two Artificial Neural Network (ANN) Models for Audio-visual Data Fusion. For the first model, we start an ANN training with an a-priori chosen static architecture together with a set of weighting parameters for the visual and for the auditory paths. Those weighting parameters, called attentional parameters, are tuned to achieve best performance even if the acoustic environment changes. This model is called the Performance Model (PM). For the second model, we start without any unit in the hidden layer of the ANN. Then we incrementally add new units which are partially connected to either the visual path or to the auditory one, and we reiterate this procedure until the global error cannot be reduced anymore. This model is called the Competence Model (CM). CM and PM are trained and tested with acoustic data and their corresponding visual parameters (defined as the vertical and the horizontal lip widths and as the lip-opening area parameters) for the audio-visual speech recognition of the 10 French vowels in adverse conditions. In both cases, we note the recognition rate and analyze the complementarity between the visual and the auditory information in terms of number of hidden units (which are connected either to the visual or to the auditory inputs vs Signal To Noise Ratio (SNR)) and in terms of the tuning of the attentional parameters vs SNR.

  14. Reference Manual for the System Advisor Model's Wind Power Performance Model

    SciTech Connect

    Freeman, J.; Jorgenson, J.; Gilman, P.; Ferguson, T.

    2014-08-01

    This manual describes the National Renewable Energy Laboratory's System Advisor Model (SAM) wind power performance model. The model calculates the hourly electrical output of a single wind turbine or of a wind farm. The wind power performance model requires information about the wind resource, wind turbine specifications, wind farm layout (if applicable), and costs. In SAM, the performance model can be coupled to one of the financial models to calculate economic metrics for residential, commercial, or utility-scale wind projects. This manual describes the algorithms used by the wind power performance model, which is available in the SAM user interface and as part of the SAM Simulation Core (SSC) library, and is intended to supplement the user documentation that comes with the software.
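
    The core hourly calculation can be pictured as interpolation on the turbine power curve followed by summation over the farm. The sketch below is a minimal stand-in: SAM's actual model additionally handles hub-height wind shear, air-density correction, and wake losses, and the power curve and loss figures here are illustrative.

        import numpy as np

        def hourly_farm_output(wind_speed, curve_speeds, curve_power_kw,
                               n_turbines=1, losses=0.0):
            """Hourly electrical output (kWh) of a wind farm.

            wind_speed     -- array of hourly hub-height wind speeds, m/s
            curve_speeds   -- wind speeds of the turbine power curve, m/s (increasing)
            curve_power_kw -- turbine output at those speeds, kW
            losses         -- fractional farm losses (wake, availability, electrical)
            """
            turbine_kw = np.interp(wind_speed, curve_speeds, curve_power_kw,
                                   left=0.0, right=0.0)
            return turbine_kw * n_turbines * (1.0 - losses)   # kW over 1 h = kWh

        # illustrative 1.5 MW power curve (cut-out above 25 m/s) and three hours of wind
        speeds = [0.0, 3.0, 6.0, 9.0, 12.0, 25.0, 25.01]
        power = [0.0, 0.0, 300.0, 1000.0, 1500.0, 1500.0, 0.0]
        print(hourly_farm_output(np.array([4.5, 8.0, 13.0]), speeds, power,
                                 n_turbines=10, losses=0.15))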

  15. Modelling and design of high performance indium phosphide solar cells

    NASA Technical Reports Server (NTRS)

    Rhoads, Sandra L.; Barnett, Allen M.

    1989-01-01

    A first principles pn junction device model has predicted new designs for high voltage, high efficiency InP solar cells. Measured InP material properties were applied and device parameters (thicknesses and doping) were adjusted to obtain optimal performance designs. Results indicate that p/n InP designs will provide higher voltages and higher energy conversion efficiencies than n/p structures. Improvements to n/p structures for increased efficiency are predicted. These new designs exploit the high absorption capabilities, relatively long diffusion lengths, and modest surface recombination velocities characteristic of InP. Predictions of performance indicate achievable open-circuit voltage values as high as 943 mV for InP and a practical maximum AM0 efficiency of 22.5 percent at 1 sun and 27 C. The details of the model, the optimal InP structure and the effect of individual parameter variations on device performance are presented.
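
    The open-circuit voltages quoted above are consistent with the ideal-diode relation Voc = (nkT/q) ln(Jsc/J0 + 1). A worked sketch with illustrative InP-like current densities follows; the values are chosen only to land near the quoted 943 mV and are not the parameters of the modeled cell.

        import math

        K_B = 1.380649e-23    # Boltzmann constant, J/K
        Q = 1.602176634e-19   # elementary charge, C

        def open_circuit_voltage(j_sc, j_0, n=1.0, temp_c=27.0):
            """Ideal-diode open-circuit voltage (V) from short-circuit and
            saturation current densities (same units, e.g. A/cm2)."""
            vt = K_B * (temp_c + 273.15) / Q   # thermal voltage, ~25.9 mV at 27 C
            return n * vt * math.log(j_sc / j_0 + 1.0)

        # Jsc ~ 35 mA/cm2 and J0 ~ 5e-18 A/cm2 give Voc close to 0.94 V
        print(round(open_circuit_voltage(35e-3, 5e-18), 3))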

  16. Human task animation from performance models and natural language input

    NASA Technical Reports Server (NTRS)

    Esakov, Jeffrey; Badler, Norman I.; Jung, Moon

    1989-01-01

    Graphical manipulation of human figures is essential for certain types of human factors analyses such as reach, clearance, fit, and view. In many situations, however, the animation of simulated people performing various tasks may be based on more complicated functions involving multiple simultaneous reaches, critical timing, resource availability, and human performance capabilities. One rather effective means for creating such a simulation is through a natural language description of the tasks to be carried out. Given an anthropometrically-sized figure and a geometric workplace environment, various simple actions such as reach, turn, and view can be effectively controlled from language commands or standard NASA checklist procedures. The commands may also be generated by external simulation tools. Task timing is determined from actual performance models, if available, such as strength models or Fitts' Law. The resulting action specifications are animated on a Silicon Graphics Iris workstation in real time.
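
    Where Fitts' Law supplies the task timing, movement time follows from the reach distance and target width. A worked sketch using the common Shannon formulation is given below; the coefficients a and b are illustrative, not values from the system described.

        import math

        def fitts_movement_time(distance, width, a=0.1, b=0.15):
            """Predicted movement time (s) for a reach of `distance` to a target of
            `width` (same units), using MT = a + b * log2(distance / width + 1)."""
            index_of_difficulty = math.log2(distance / width + 1.0)
            return a + b * index_of_difficulty

        # e.g. a 0.6 m reach to a 0.05 m wide switch takes roughly 0.66 s
        print(round(fitts_movement_time(0.6, 0.05), 3))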

  17. A Programming Model Performance Study Using the NAS Parallel Benchmarks

    DOE PAGESBeta

    Shan, Hongzhang; Blagojević, Filip; Min, Seung-Jai; Hargrove, Paul; Jin, Haoqiang; Fuerlinger, Karl; Koniges, Alice; Wright, Nicholas J.

    2010-01-01

    Harnessing the power of multicore platforms is challenging due to the additional levels of parallelism present. In this paper we use the NAS Parallel Benchmarks to study three programming models, MPI, OpenMP and PGAS, to understand their performance and memory usage characteristics on current multicore architectures. To understand these characteristics we use the Integrated Performance Monitoring tool and other ways to measure communication versus computation time, as well as the fraction of the run time spent in OpenMP. The benchmarks are run on two different Cray XT5 systems and an Infiniband cluster. Our results show that in general the three programming models exhibit very similar performance characteristics. In a few cases, OpenMP is significantly faster because it explicitly avoids communication. For these particular cases, we were able to re-write the UPC versions and achieve equal performance to OpenMP. Using OpenMP was also the most advantageous in terms of memory usage. Also we compare performance differences between the two Cray systems, which have quad-core and hex-core processors. We show that at scale the performance is almost always slower on the hex-core system because of increased contention for network resources.

  18. Challenges in modeling the X-29 flight test performance

    NASA Technical Reports Server (NTRS)

    Hicks, John W.; Kania, Jan; Pearce, Robert; Mills, Glen

    1987-01-01

    Presented are methods, instrumentation, and difficulties associated with drag measurement of the X-29A aircraft. The initial performance objective of the X-29A program emphasized drag polar shapes rather than absolute drag levels. Priorities during the flight envelope expansion restricted the evaluation of aircraft performance. Changes in aircraft configuration, uncertainties in angle-of-attack calibration, and limitations in instrumentation complicated the analysis. Limited engine instrumentation with uncertainties in overall in-flight thrust accuracy made it difficult to obtain reliable values of coefficient of parasite drag. The aircraft was incapable of tracking the automatic camber control trim schedule for optimum wing flaperon deflection during typical dynamic performance maneuvers; this has also complicated the drag polar shape modeling. The X-29A was far enough off the schedule that the developed trim drag correction procedure has proven inadequate. However, good drag polar shapes have been developed throughout the flight envelope. Preliminary flight results have compared well with wind tunnel predictions. A more comprehensive analysis must be done to complete performance models. The detailed flight performance program with a calibrated engine will benefit from the experience gained during this preliminary performance phase.

  19. Challenges in modeling the X-29A flight test performance

    NASA Technical Reports Server (NTRS)

    Hicks, John W.; Kania, Jan; Pearce, Robert; Mills, Glen

    1987-01-01

    The paper presents the methods, instrumentation, and difficulties associated with drag measurement of the X-29A aircraft. The initial performance objective of the X-29A program emphasized drag polar shapes rather than absolute drag levels. Priorities during the flight envelope expansion restricted the evaluation of aircraft performance. Changes in aircraft configuration, uncertainties in angle-of-attack calibration, and limitations in instrumentation complicated the analysis. Limited engine instrumentation with uncertainties in overall in-flight thrust accuracy made it difficult to obtain reliable values of coefficient of parasite drag. The aircraft was incapable of tracking the automatic camber control trim schedule for optimum wing flaperon deflection during typical dynamic performance maneuvers; this has also complicated the drag polar shape modeling. The X-29A was far enough off the schedule that the developed trim drag correction procedure has proven inadequate. Despite these obstacles, good drag polar shapes have been developed throughout the flight envelope. Preliminary flight results have compared well with wind tunnel predictions. A more comprehensive analysis must be done to complete the performance models. The detailed flight performance program with a calibrated engine will benefit from the experience gained during this preliminary performance phase.

  20. Condition Assessment Model for Underground Water Mains Performance

    NASA Astrophysics Data System (ADS)

    Popawala, Reena; Shah, N. C.

    2014-12-01

    One of the greatest challenges faced by municipal engineers all over the world is condition assessment of underground water mains performance. Water mains are buried infrastructure, operated under pressure and, most importantly, inaccessible. In this situation, condition assessment of water mains is challenging, and it has become mandatory to employ management strategies for these assets. This work presents the results of a condition assessment of water mains performance using analytical hierarchy process modeling. The model was developed for data collected (2006-2011) from the south-west zone of Surat city, India. Three main factors (physical, operational, and environmental) and 10 sub-factors were selected to assess the performance of water mains. Pairwise comparison matrices were generated to derive the relative weights of each factor and its sub-factors, representing the relative importance of each factor among the others. Pipe age was found to be the highest relative contributing factor and pipe thickness the least. The developed model generates results in terms of pipe condition on a numeric scale, which is mapped to a linguistic scale classifying the pipe condition as very risky, risky, adequate, good, or excellent. The model was validated, showing an average validity percentage of 86.4%. Hence, the model is expected to help practitioners prioritize rehabilitation or replacement planning for water mains.
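
    The pairwise comparison step of the analytical hierarchy process derives the factor weights from the principal eigenvector of the comparison matrix, with Saaty's consistency ratio as a sanity check. A minimal sketch with an illustrative 3 x 3 matrix for the physical, operational, and environmental main factors follows; the judgment values are placeholders, not those elicited in the study.

        import numpy as np

        def ahp_weights(matrix, iters=100):
            """Relative weights from a pairwise comparison matrix (power iteration
            toward the principal eigenvector) and Saaty's consistency ratio."""
            a = np.asarray(matrix, dtype=float)
            n = a.shape[0]
            w = np.ones(n) / n
            for _ in range(iters):
                w = a @ w
                w /= w.sum()
            lam = float((a @ w / w).mean())              # principal eigenvalue estimate
            ci = (lam - n) / (n - 1) if n > 1 else 0.0   # consistency index
            ri = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32,
                  8: 1.41, 9: 1.45, 10: 1.49}.get(n, 0.0)
            cr = ci / ri if ri else 0.0
            return w, cr

        m = [[1.0,     3.0, 5.0],
             [1.0/3.0, 1.0, 2.0],
             [1.0/5.0, 0.5, 1.0]]
        weights, cr = ahp_weights(m)
        print(weights.round(3), round(cr, 3))   # weights sum to 1; CR < 0.1 is acceptable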

  1. Performing efficient NURBS modeling operations on the GPU.

    PubMed

    Krishnamurthy, Adarsh; Khardekar, Rahul; McMains, Sara; Haller, Kirk; Elber, Gershon

    2009-01-01

    We present algorithms for evaluating and performing modeling operations on NURBS surfaces using the programmable fragment processor on the Graphics Processing Unit (GPU). We extend our GPU-based NURBS evaluator that evaluates NURBS surfaces to compute exact normals for either standard or rational B-spline surfaces for use in rendering and geometric modeling. We build on these calculations in our new GPU algorithms to perform standard modeling operations such as inverse evaluations, ray intersections, and surface-surface intersections on the GPU. Our modeling algorithms run in real time, enabling the user to sketch on the actual surface to create new features. In addition, the designer can edit the surface by interactively trimming it without the need for retessellation. Our GPU-accelerated algorithm to perform surface-surface intersection operations with NURBS surfaces can output intersection curves in the model space as well as in the parametric spaces of both the intersecting surfaces at interactive rates. We also extend our surface-surface intersection algorithm to evaluate self-intersections in NURBS surfaces. PMID:19423879
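
    At the heart of such a pipeline is the rational weighting of B-spline basis functions. A compact CPU-side sketch of NURBS curve point evaluation is given below; the paper's GPU evaluator works per fragment on surfaces, which is not reproduced here, and the example arc data are illustrative.

        def bspline_basis(i, p, u, knots):
            """Cox-de Boor recursion for the degree-p B-spline basis N_{i,p}(u)."""
            if p == 0:
                return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
            val = 0.0
            d1 = knots[i + p] - knots[i]
            if d1 > 0.0:
                val += (u - knots[i]) / d1 * bspline_basis(i, p - 1, u, knots)
            d2 = knots[i + p + 1] - knots[i + 1]
            if d2 > 0.0:
                val += (knots[i + p + 1] - u) / d2 * bspline_basis(i + 1, p - 1, u, knots)
            return val

        def nurbs_curve_point(u, ctrl_pts, weights, knots, degree):
            """Point on a NURBS curve: weighted blend of control points divided by
            the sum of weighted basis functions."""
            num = [0.0] * len(ctrl_pts[0])
            den = 0.0
            for i, (pt, w) in enumerate(zip(ctrl_pts, weights)):
                b = bspline_basis(i, degree, u, knots) * w
                den += b
                num = [acc + b * c for acc, c in zip(num, pt)]
            return [acc / den for acc in num]

        # quadratic NURBS arc; evaluate for u in [0, 1) (u = 1 needs the usual
        # end-span special case, omitted here for brevity)
        ctrl = [(0.0, 0.0), (1.0, 1.0), (2.0, 0.0)]
        wts = [1.0, 0.7071, 1.0]
        kts = [0.0, 0.0, 0.0, 1.0, 1.0, 1.0]
        print(nurbs_curve_point(0.5, ctrl, wts, kts, 2))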

  2. Modeling-Error-Driven Performance-Seeking Direct Adaptive Control

    NASA Technical Reports Server (NTRS)

    Kulkarni, Nilesh V.; Kaneshige, John; Krishnakumar, Kalmanje; Burken, John

    2008-01-01

    This paper presents a stable discrete-time adaptive law that targets modeling errors in a direct adaptive control framework. The update law was developed in our previous work for the adaptive disturbance rejection application. The approach is based on the philosophy that without modeling errors, the original control design has been tuned to achieve the desired performance. The adaptive control should, therefore, work towards getting this performance even in the face of modeling uncertainties/errors. In this work, the baseline controller uses dynamic inversion with proportional-integral augmentation. Dynamic inversion is carried out using the assumed system model. On-line adaptation of this control law is achieved by providing a parameterized augmentation signal to the dynamic inversion block. The parameters of this augmentation signal are updated to achieve the nominal desired error dynamics. Contrary to the typical Lyapunov-based adaptive approaches that guarantee only stability, the current approach investigates conditions for stability as well as performance. A high-fidelity F-15 model is used to illustrate the overall approach.
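
    A toy discrete-time illustration of the idea: dynamic inversion is computed from the assumed model, and an augmentation term is updated from the one-step modeling error so that the closed loop recovers the nominal error dynamics. The scalar plant, gains, and step count are invented for the sketch; this is not the F-15 implementation described in the paper.

        def simulate(steps=60, r=1.0):
            # true plant: x+ = a*x + b*u + d, where d is an unknown modeling error
            a, b, d = 0.92, 0.25, 0.15
            # assumed model available to the dynamic-inversion controller
            a_hat, b_hat = 0.90, 0.25
            k_err = 0.4     # desired error dynamics: e+ = (1 - k_err) * e
            gamma = 0.5     # adaptation gain on the augmentation term
            x, aug = 0.0, 0.0
            for k in range(steps):
                x_des_next = x + k_err * (r - x)              # nominal error dynamics
                u = (x_des_next - a_hat * x - aug) / b_hat    # dynamic inversion
                x_pred = a_hat * x + b_hat * u + aug          # model prediction
                x = a * x + b * u + d                         # true plant update
                aug += gamma * (x - x_pred)                   # learn the modeling error
                if (k + 1) % 10 == 0:
                    print(f"step {k + 1:2d}  x = {x:.3f}  aug = {aug:.3f}")

        simulate()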

  3. Thermal Model Predictions of Advanced Stirling Radioisotope Generator Performance

    NASA Technical Reports Server (NTRS)

    Wang, Xiao-Yen J.; Fabanich, William Anthony; Schmitz, Paul C.

    2014-01-01

    This paper presents recent thermal model results of the Advanced Stirling Radioisotope Generator (ASRG). The three-dimensional (3D) ASRG thermal power model was built using the Thermal Desktop™ thermal analyzer. The model was correlated with ASRG engineering unit test data and ASRG flight unit predictions from Lockheed Martin's (LM's) I-deas™ TMG thermal model. The auxiliary cooling system (ACS) of the ASRG is also included in the ASRG thermal model. The ACS is designed to remove waste heat from the ASRG so that it can be used to heat spacecraft components. The performance of the ACS is reported under nominal conditions and during a Venus flyby scenario. The results for the nominal case are validated with data from Lockheed Martin. Transient thermal analysis results of ASRG for a Venus flyby with a representative trajectory are also presented. In addition, model results of an ASRG mounted on a Cassini-like spacecraft with a sunshade are presented to show a way to mitigate the high temperatures of a Venus flyby. It was predicted that the sunshade can lower the temperature of the ASRG alternator by 20 °C for the representative Venus flyby trajectory. The 3D model also was modified to predict generator performance after a single Advanced Stirling Convertor failure. The geometry of the Microtherm HT insulation block on the outboard side was modified to match deformation and shrinkage observed during testing of a prototypic ASRG test fixture by LM. Test conditions and test data were used to correlate the model by adjusting the thermal conductivity of the deformed insulation to match the post-heat-dump steady state temperatures. Results for these conditions showed that the performance of the still-functioning inboard ACS was unaffected.

  4. Cognition and procedure representational requirements for predictive human performance models

    NASA Technical Reports Server (NTRS)

    Corker, K.

    1992-01-01

    Models and modeling environments for human performance are becoming significant contributors to early system design and analysis procedures. Issues of levels of automation, physical environment, informational environment, and manning requirements are being addressed by such man/machine analysis systems. The research reported here investigates the close interaction between models of human cognition and models that describe procedural performance. We describe a methodology for the decomposition of aircrew procedures that supports interaction with models of cognition on the basis of procedures observed; that serves to identify cockpit/avionics information sources and crew information requirements; and that provides the structure to support methods for function allocation among crew and aiding systems. Our approach is to develop an object-oriented, modular, executable software representation of the aircrew, the aircraft, and the procedures necessary to satisfy flight-phase goals. We then encode, in a time-based language, taxonomies of the conceptual, relational, and procedural constraints among the cockpit avionics and control system and the aircrew. We have designed and implemented a goals/procedures hierarchic representation sufficient to describe procedural flow in the cockpit. We then execute the procedural representation in simulation software and calculate the values of the flight instruments, aircraft state variables and crew resources using the constraints available from the relationship taxonomies. The system provides a flexible, extensible, manipulable and executable representation of aircrew and procedures that is generally applicable to crew/procedure task-analysis. The representation supports developed methods of intent inference, and is extensible to include issues of information requirements and functional allocation. We are attempting to link the procedural representation to models of cognitive functions to establish several intent inference methods.

  5. Performance and Prediction: Bayesian Modelling of Fallible Choice in Chess

    NASA Astrophysics Data System (ADS)

    Haworth, Guy; Regan, Ken; di Fatta, Giuseppe

    Evaluating agents in decision-making applications requires assessing their skill and predicting their behaviour. Both are well developed in Poker-like situations, but less so in more complex game and model domains. This paper addresses both tasks by using Bayesian inference in a benchmark space of reference agents. The concepts are explained and demonstrated using the game of chess but the model applies generically to any domain with quantifiable options and fallible choice. Demonstration applications address questions frequently asked by the chess community regarding the stability of the rating scale, the comparison of players of different eras and/or leagues, and controversial incidents possibly involving fraud. The last include alleged under-performance, fabrication of tournament results, and clandestine use of computer advice during competition. Beyond the model world of games, the aim is to improve fallible human performance in complex, high-value tasks.
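
    The inference step can be pictured as a posterior over a small benchmark space of reference agents, each defined by its probability of choosing the engine's first-, second-, or third-ranked move. The agent profiles and observed game below are illustrative, not calibrated values from the paper.

        import numpy as np

        # columns: probability of playing the engine's 1st, 2nd, 3rd-ranked move
        reference_agents = {
            "1600-level": np.array([0.45, 0.30, 0.25]),
            "2200-level": np.array([0.65, 0.25, 0.10]),
            "2700-level": np.array([0.85, 0.12, 0.03]),
        }

        def posterior_over_agents(observed_ranks, prior=None):
            """Bayesian update of the belief about which reference agent best
            explains an observed sequence of move ranks (0 = engine's best move)."""
            names = list(reference_agents)
            if prior is None:
                prior = np.full(len(names), 1.0 / len(names))
            log_post = np.log(np.asarray(prior, dtype=float))
            for rank in observed_ranks:
                log_post += np.log([reference_agents[n][rank] for n in names])
            post = np.exp(log_post - log_post.max())
            post /= post.sum()
            return dict(zip(names, post.round(3)))

        # a player who chose the engine's best move in 16 of 20 positions
        print(posterior_over_agents([0] * 16 + [1] * 3 + [2]))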

  6. PHARAO laser source flight model: Design and performances

    NASA Astrophysics Data System (ADS)

    Lévèque, T.; Faure, B.; Esnault, F. X.; Delaroche, C.; Massonnet, D.; Grosjean, O.; Buffe, F.; Torresi, P.; Bomer, T.; Pichon, A.; Béraud, P.; Lelay, J. P.; Thomin, S.; Laurent, Ph.

    2015-03-01

    In this paper, we describe the design and the main performances of the PHARAO laser source flight model. PHARAO is a laser cooled cesium clock specially designed for operation in space and the laser source is one of the main sub-systems. The flight model presented in this work is the first remote-controlled laser system designed for spaceborne cold atom manipulation. The main challenges arise from mechanical compatibility with space constraints, which impose a high level of compactness, a low electric power consumption, a wide range of operating temperature, and a vacuum environment. We describe the main functions of the laser source and give an overview of the main technologies developed for this instrument. We present some results of the qualification process. The characteristics of the laser source flight model, and their impact on the clock performances, have been verified in operational conditions.

  7. A personality trait-based interactionist model of job performance.

    PubMed

    Tett, Robert P; Burnett, Dawn D

    2003-06-01

    Evidence for situational specificity of personality-job performance relations calls for better understanding of how personality is expressed as valued work behavior. On the basis of an interactionist principle of trait activation (R. P. Tett & H. A. Guterman, 2000), a model is proposed that distinguishes among 5 situational features relevant to trait expression (job demands, distracters, constraints, releasers, and facilitators), operating at task, social, and organizational levels. Trait-expressive work behavior is distinguished from (valued) job performance in clarifying the conditions favoring personality use in selection efforts. The model frames linkages between situational taxonomies (e.g., J. L. Holland's [1985] RIASEC model) and the Big Five and promotes useful discussion of critical issues, including situational specificity, personality-oriented job analysis, team building, and work motivation. PMID:12814298

  8. PHARAO laser source flight model: Design and performances

    SciTech Connect

    Lévèque, T. Faure, B.; Esnault, F. X.; Delaroche, C.; Massonnet, D.; Grosjean, O.; Buffe, F.; Torresi, P.; Bomer, T.; Pichon, A.; Béraud, P.; Lelay, J. P.; Thomin, S.; Laurent, Ph.

    2015-03-15

    In this paper, we describe the design and the main performances of the PHARAO laser source flight model. PHARAO is a laser cooled cesium clock specially designed for operation in space and the laser source is one of the main sub-systems. The flight model presented in this work is the first remote-controlled laser system designed for spaceborne cold atom manipulation. The main challenges arise from mechanical compatibility with space constraints, which impose a high level of compactness, a low electric power consumption, a wide range of operating temperature, and a vacuum environment. We describe the main functions of the laser source and give an overview of the main technologies developed for this instrument. We present some results of the qualification process. The characteristics of the laser source flight model, and their impact on the clock performances, have been verified in operational conditions.

  9. A measurement-based performability model for a multiprocessor system

    NASA Technical Reports Server (NTRS)

    Ilsueh, M. C.; Iyer, Ravi K.; Trivedi, K. S.

    1987-01-01

    A measurement-based performability model based on real error-data collected on a multiprocessor system is described. Model development from the raw error-data to the estimation of cumulative reward is described. Both normal and failure behavior of the system are characterized. The measured data show that the holding times in key operational and failure states are not simply exponential and that a semi-Markov process is necessary to model the system behavior. A reward function, based on the service rate and the error rate in each state, is then defined in order to estimate the performability of the system and to depict the cost of different failure types and recovery procedures.
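
    A minimal sketch of such a reward computation, assuming each observed state carries a reward rate derived from its service and error rates and that state occupancy is estimated from measured visit counts and mean holding times; the state statistics below are invented for illustration.

        def expected_reward_rate(states):
            """Expected reward rate from measured per-state statistics.

            Each state dict carries:
              visits       -- number of visits observed
              mean_hold    -- mean holding time per visit, s
              service_rate -- useful work delivered in the state (fraction of nominal)
              error_rate   -- errors per second while in the state
              error_cost   -- reward lost per error
            """
            total_time = sum(s["visits"] * s["mean_hold"] for s in states)
            reward = 0.0
            for s in states:
                occupancy = s["visits"] * s["mean_hold"] / total_time
                rate = s["service_rate"] - s["error_cost"] * s["error_rate"]
                reward += occupancy * rate
            return reward

        measured = [
            {"visits": 950, "mean_hold": 3600.0, "service_rate": 1.0,
             "error_rate": 0.0, "error_cost": 0.0},     # normal operation
            {"visits": 40, "mean_hold": 1200.0, "service_rate": 0.5,
             "error_rate": 1e-4, "error_cost": 100.0},  # degraded (one CPU offline)
            {"visits": 10, "mean_hold": 600.0, "service_rate": 0.0,
             "error_rate": 0.0, "error_cost": 0.0},     # recovery / reboot
        ]
        print(round(expected_reward_rate(measured), 4))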

  10. A New Model to Simulate Energy Performance of VRF Systems

    SciTech Connect

    Hong, Tianzhen; Pang, Xiufeng; Schetrit, Oren; Wang, Liping; Kasahara, Shinichi; Yura, Yoshinori; Hinokuma, Ryohei

    2014-03-30

    This paper presents a new model to simulate energy performance of variable refrigerant flow (VRF) systems in heat pump operation mode (either cooling or heating is provided but not simultaneously). The main improvement of the new model is the introduction of the evaporating and condensing temperature in the indoor and outdoor unit capacity modifier functions. The independent variables in the capacity modifier functions of the existing VRF model in EnergyPlus are mainly room wet-bulb temperature and outdoor dry-bulb temperature in cooling mode and room dry-bulb temperature and outdoor wet-bulb temperature in heating mode. The new approach allows compliance with different specifications of each indoor unit so that the modeling accuracy is improved. The new VRF model was implemented in a custom version of EnergyPlus 7.2. This paper first describes the algorithm for the new VRF model, which is then used to simulate the energy performance of a VRF system in a Prototype House in California that complies with the requirements of Title 24, the California Building Energy Efficiency Standards. The VRF system performance is then compared with three other types of HVAC systems: the Title 24-2005 Baseline system, the traditional High Efficiency system, and the EnergyStar Heat Pump system in three typical California climates: Sunnyvale, Pasadena and Fresno. Calculated energy savings from the VRF systems are significant. The HVAC site energy savings range from 51 to 85 percent, while the TDV (Time Dependent Valuation) energy savings range from 31 to 66 percent compared to the Title 24 Baseline Systems across the three climates. The largest energy savings are in the Fresno climate followed by Sunnyvale and Pasadena. The paper discusses various characteristics of the VRF systems contributing to the energy savings. It should be noted that these savings are calculated using the Title 24 prototype House D under standard operating conditions. Actual performance of the VRF systems for real
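
    The modeling change described above can be illustrated with the usual performance-curve form: a biquadratic capacity modifier whose second independent variable is the condensing (or evaporating) temperature rather than the outdoor air temperature. The coefficients below are placeholders chosen only so the modifier is near 1 at nominal rating conditions; they are not EnergyPlus or manufacturer values.

        def capacity_modifier(t_room_wb, t_cond, coeffs):
            """Biquadratic cooling-capacity modifier f(Twb_room, Tcond), both in deg C."""
            a, b, c, d, e, f = coeffs
            return (a + b * t_room_wb + c * t_room_wb ** 2
                      + d * t_cond + e * t_cond ** 2
                      + f * t_room_wb * t_cond)

        def indoor_unit_capacity(rated_kw, t_room_wb, t_cond, coeffs):
            """Available cooling capacity of one indoor unit at the current room
            wet-bulb and condensing temperatures."""
            return rated_kw * capacity_modifier(t_room_wb, t_cond, coeffs)

        # placeholder coefficients; roughly 1.0 near 19 C room wet-bulb / 45 C condensing
        placeholder = (0.60, 0.030, 0.0, -0.0030, 0.0, 0.0)
        print(round(indoor_unit_capacity(7.1, 19.0, 45.0, placeholder), 2))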

  11. Life Cycle Model for IT Performance Measurement: A Reference Model for Small and Medium Enterprises (SME)

    NASA Astrophysics Data System (ADS)

    Albayrak, Can Adam; Gadatsch, Andreas; Olufs, Dirk

    IT performance measurement is often associated by chief executive officers with IT cost cutting, although IT performance measurement protects business processes from increasing IT costs; cost cutting alone only endangers the company's efficiency. This view stigmatizes those who do IT performance measurement in companies as bean-counters. The present paper describes an integrated reference model for IT performance measurement based on a life cycle model and a performance-oriented framework. The model was created from a practical point of view. It is designed to be lean compared with other known concepts and is well suited to small and medium enterprises (SME).

  12. Performance Measurement, Visualization and Modeling of Parallel and Distributed Programs

    NASA Technical Reports Server (NTRS)

    Yan, Jerry C.; Sarukkai, Sekhar R.; Mehra, Pankaj; Lum, Henry, Jr. (Technical Monitor)

    1994-01-01

    This paper presents a methodology for debugging the performance of message-passing programs on both tightly coupled and loosely coupled distributed-memory machines. The AIMS (Automated Instrumentation and Monitoring System) toolkit, a suite of software tools for measurement and analysis of performance, is introduced and its application illustrated using several benchmark programs drawn from the field of computational fluid dynamics. AIMS includes (i) Xinstrument, a powerful source-code instrumentor, which supports both Fortran77 and C as well as a number of different message-passing libraries including Intel's NX, Thinking Machines' CMMD, and PVM; (ii) Monitor, a library of timestamping and trace-collection routines that run on supercomputers (such as Intel's iPSC/860, Delta, and Paragon and Thinking Machines' CM5) as well as on networks of workstations (including Convex Cluster and SparcStations connected by a LAN); (iii) Visualization Kernel, a trace-animation facility that supports source-code clickback, simultaneous visualization of computation and communication patterns, as well as analysis of data movements; (iv) Statistics Kernel, an advanced profiling facility that associates a variety of performance data with various syntactic components of a parallel program; (v) Index Kernel, a diagnostic tool that helps pinpoint performance bottlenecks through the use of abstract indices; (vi) Modeling Kernel, a facility for automated modeling of message-passing programs that supports both simulation-based and analytical approaches to performance prediction and scalability analysis; (vii) Intrusion Compensator, a utility for recovering true performance from observed performance by removing the overheads of monitoring and their effects on the communication pattern of the program; and (viii) Compatibility Tools, which convert AIMS-generated traces into formats used by other performance-visualization tools, such as ParaGraph, Pablo, and certain AVS/Explorer modules.
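
    The Intrusion Compensator component mentioned above recovers an approximation of the uninstrumented timeline by removing monitoring overhead from trace timestamps. The sketch below is a minimal, hypothetical illustration of that idea (a fixed per-probe cost subtracted cumulatively); it is not AIMS's actual compensation algorithm, which also accounts for effects on the program's communication pattern.

```python
from dataclasses import dataclass

@dataclass
class TraceEvent:
    timestamp: float   # seconds since program start
    kind: str          # e.g. "send", "recv", "compute"

# Illustrative per-probe instrumentation cost; a real tool would calibrate this
# on the target machine rather than hard-coding it.
PROBE_OVERHEAD_S = 2.5e-6

def compensate(events):
    """Shift each event earlier by the cumulative overhead of the probes that
    fired before it, approximating the timeline of the uninstrumented run."""
    return [TraceEvent(ev.timestamp - i * PROBE_OVERHEAD_S, ev.kind)
            for i, ev in enumerate(events)]

raw = [TraceEvent(0.001000, "send"), TraceEvent(0.001030, "recv"), TraceEvent(0.002100, "compute")]
for ev in compensate(raw):
    print(f"{ev.kind:8s} {ev.timestamp:.6f} s")
```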

  13. The acceptability of ending a patient's life

    PubMed Central

    Guedj, M; Gibert, M; Maudet, A; Munoz, S; Mullet, E; Sorum, P

    2005-01-01

    Objectives: To clarify how lay people and health professionals judge the acceptability of ending the life of a terminally ill patient. Design: Participants judged this acceptability in a set of 16 scenarios that combined four factors: the identity of the actor (patient or physician), the patient's statement or not of a desire to have his life ended, the nature of the action as relatively active (injecting a toxin) or passive (disconnecting life support), and the type of suffering (intractable physical pain, complete dependence, or severe psychiatric illness). Participants: 115 lay people and 72 health professionals (22 nurse's aides, 44 nurses, six physicians) in Toulouse, France. Main measurements: Mean acceptability ratings for each scenario for each group. Results: Life ending interventions are more acceptable to lay people than to the health professionals. For both, acceptability is highest for intractable physical suffering; is higher when patients end their own lives than when physicians do so; and, when physicians are the actors, is higher when patients have expressed a desire to die (voluntary euthanasia) than when they have not (involuntary euthanasia). In contrast, when patients perform the action, acceptability for the lay people and nurse's aides does not depend on whether the patient has expressed a desire to die, while for the nurses and physicians unassisted suicide is more acceptable than physician assisted suicide. Conclusions: Lay participants judge the acceptability of life ending actions in largely the same way as do healthcare professionals. PMID:15923476

  14. Modeling boost performance using a two dimensional implementation of the targeting task performance metric

    NASA Astrophysics Data System (ADS)

    Preece, Bradley L.; Haefner, David P.; Fanning, Jonathan D.

    2012-06-01

    Using post-processing filters to enhance image detail, a process commonly referred to as boost, can significantly affect the performance of an EO/IR system. The US Army's target acquisition models currently use the Targeting Task Performance (TTP) metric to quantify sensor performance. The TTP metric accounts for each element in the system, including blur and noise introduced by the imager, any additional post-processing steps, and the effects of the Human Visual System (HVS). The current implementation of the TTP metric assumes spatial separability, which can introduce significant errors when the TTP is applied to systems using non-separable filters. To accurately apply the TTP metric to systems incorporating boost, we have implemented a two-dimensional (2D) version of the TTP metric. The accuracy of the 2D TTP metric was verified through a series of perception experiments involving various levels of boost. The 2D TTP metric has been incorporated into the Night Vision Integrated Performance Model (NV-IPM), allowing accurate system modeling of non-separable image filters.
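
    To make the separability issue concrete, the following sketch evaluates a TTP-like image-quality integral directly on a 2D spatial-frequency grid, so that a non-separable boost filter can be folded into the system response. The MTF, boost, and contrast-threshold functions are placeholders chosen only for illustration; they are not the calibrated curves used in NV-IPM, and the exact form of the integrand here is an assumption.

```python
import numpy as np

def system_mtf(fx, fy):
    """Placeholder system response: Gaussian blur with a mild, non-separable boost."""
    boost = 1.0 + 0.5 * np.exp(-((np.hypot(fx, fy) - 4.0) ** 2) / 8.0)
    blur = np.exp(-(fx**2 / 50.0 + fy**2 / 40.0 + 0.1 * fx * fy / 45.0))
    return boost * blur

def contrast_threshold(fx, fy):
    """Placeholder observer threshold that rises with spatial frequency."""
    return 0.01 * (1.0 + 0.1 * np.hypot(fx, fy) ** 2)

def ttp_like_metric(target_contrast=0.2, f_max=15.0, n=512):
    """Integrate sqrt(perceived contrast / threshold) over the 2D frequency plane
    wherever the perceived contrast exceeds the threshold (no separability assumed)."""
    f = np.linspace(-f_max, f_max, n)
    fx, fy = np.meshgrid(f, f)
    perceived = target_contrast * system_mtf(fx, fy)
    thr = contrast_threshold(fx, fy)
    integrand = np.where(perceived > thr, np.sqrt(perceived / thr), 0.0)
    return integrand.sum() * (2 * f_max / n) ** 2

print(ttp_like_metric())
```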

  15. Determining Performance Acceptability of Electrochemical Oxygen Sensors

    NASA Technical Reports Server (NTRS)

    Gonzales, Daniel

    2012-01-01

    A method has been developed to screen commercial electrochemical oxygen sensors to reduce the failure rate. There are three aspects to the method: First, the sensitivity over time (several days) can be measured and the rate of change of the sensitivity can be used to predict sensor failure. Second, an improvement to this method would be to store the sensors in an oxygen-free (e.g., nitrogen) environment and intermittently measure the sensitivity over time (several days) to accomplish the same result while preserving the sensor lifetime by limiting consumption of the electrode. Third, the second time derivative of the sensor response over time can be used to determine the point in time at which the sensors are sufficiently stable for use.
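
    A minimal sketch of the screening logic described above: given a series of daily sensitivity measurements, the first time derivative flags excessive drift (the failure predictor), and the second derivative indicates when the sensor has settled enough for use. The readings and acceptance thresholds below are hypothetical, chosen only to illustrate the calculation.

```python
import numpy as np

# Hypothetical daily sensitivity readings (signal per % O2) for one sensor.
days = np.arange(10, dtype=float)
sensitivity = np.array([0.920, 0.900, 0.885, 0.876, 0.871, 0.868, 0.867, 0.866, 0.866, 0.865])

d1 = np.gradient(sensitivity, days)   # rate of change: used to predict failure
d2 = np.gradient(d1, days)            # second derivative: used to judge stability

DRIFT_LIMIT = -0.010      # illustrative limits, not actual acceptance criteria
STABILITY_LIMIT = 1.0e-3

predicted_failure = d1[-1] < DRIFT_LIMIT
stable_from_day = next((int(d) for d, a in zip(days, np.abs(d2)) if a < STABILITY_LIMIT), None)
print(f"predicted to fail: {predicted_failure}, usable from day: {stable_from_day}")
```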

  16. Numerical Modeling of Pulse Detonation Rocket Engine Gasdynamics And Performance

    NASA Technical Reports Server (NTRS)

    Morris, Christopher I.

    2004-01-01

    Pulse detonation rocket engines (PDREs) offer potential performance improvements over conventional designs, but represent a challenging modeling task. A quasi-1-D, finite-rate chemistry computational fluid dynamics model for PDREs is described and implemented. Four different PDRE geometries are evaluated in this work: a baseline detonation tube, a detonation tube with a straight extension, and a detonation tube with two types of converging-diverging (C-D) nozzles. The effect of extension length and C-D nozzle area ratio on the single-shot gasdynamics and performance of a PDRE is studied over a wide range of blowdown pressure ratios (1-1000). The results indicate that a C-D nozzle is generally more effective than a straight extension in improving PDRE performance, particularly at higher pressure ratios. Additionally, the results show that the blowdown process of the C-D nozzle systems could be beneficially cut off well before the pressure at the end-wall reaches the ambient value. The performance results are also compared to a steady-state rocket system using similar modeling assumptions.

  17. Integrated modeling tool for performance engineering of complex computer systems

    NASA Technical Reports Server (NTRS)

    Wright, Gary; Ball, Duane; Hoyt, Susan; Steele, Oscar

    1989-01-01

    This report summarizes Advanced System Technologies' accomplishments on the Phase 2 SBIR contract NAS7-995. The technical objectives of the report are: (1) to develop an evaluation version of a graphical, integrated modeling language according to the specification resulting from the Phase 2 research; and (2) to determine the degree to which the language meets its objectives by evaluating ease of use, utility of two sets of performance predictions, and the power of the language constructs. The technical approach followed to meet these objectives was to design, develop, and test an evaluation prototype of a graphical, performance prediction tool. The utility of the prototype was then evaluated by applying it to a variety of test cases found in the literature and in AST case histories. Numerous models were constructed and successfully tested. The major conclusion of this Phase 2 SBIR research and development effort is that complex, real-time computer systems can be specified in a non-procedural manner using combinations of icons, windows, menus, and dialogs. Such a specification technique provides an interface that system designers and architects find natural and easy to use. In addition, PEDESTAL's multiview approach provides system engineers with the capability to perform the trade-offs necessary to produce a design that meets timing performance requirements. Sample system designs analyzed during the development effort showed that models could be constructed in a fraction of the time required by non-visual system design capture tools.

  18. Acceptability of human risk.

    PubMed Central

    Kasperson, R E

    1983-01-01

    This paper has three objectives: to explore the nature of the problem implicit in the term "risk acceptability," to examine the possible contributions of scientific information to risk standard-setting, and to argue that societal response is best guided by considerations of process rather than formal methods of analysis. Most technological risks are not accepted but are imposed. There is also little reason to expect consensus among individuals on their tolerance of risk. Moreover, debates about risk levels are often at base debates over the adequacy of the institutions which manage the risks. Scientific information can contribute three broad types of analyses to risk-setting deliberations: contextual analysis, equity assessment, and public preference analysis. More effective risk-setting decisions will involve attention to the process used, particularly in regard to the requirements of procedural justice and democratic responsibility. PMID:6418541

  19. The Predictive Performance and Stability of Six Species Distribution Models

    PubMed Central

    Huang, Min-Yi; Fan, Wei-Yi; Wang, Zhi-Gao

    2014-01-01

    Background: Predicting species’ potential geographical range by species distribution models (SDMs) is central to understanding their ecological requirements. However, the effects of using different modeling techniques need further investigation. In order to improve the prediction effect, we need to assess the predictive performance and stability of different SDMs. Methodology: We collected the distribution data of five common tree species (Pinus massoniana, Betula platyphylla, Quercus wutaishanica, Quercus mongolica and Quercus variabilis) and simulated their potential distribution area using 13 environmental variables and six widely used SDMs: BIOCLIM, DOMAIN, MAHAL, RF, MAXENT, and SVM. Each model run was repeated 100 times (trials). We compared the predictive performance by testing the consistency between observations and simulated distributions and assessed the stability by the standard deviation, coefficient of variation, and the 99% confidence interval of Kappa and AUC values. Results: The mean values of AUC and Kappa from MAHAL, RF, MAXENT, and SVM trials were similar and significantly higher than those from BIOCLIM and DOMAIN trials (p<0.05), while the associated standard deviations and coefficients of variation were larger for BIOCLIM and DOMAIN trials (p<0.05), and the 99% confidence intervals for AUC and Kappa values were narrower for MAHAL, RF, MAXENT, and SVM. Compared to BIOCLIM and DOMAIN, the other SDMs (MAHAL, RF, MAXENT, and SVM) had higher prediction accuracy, smaller confidence intervals, and were more stable and less affected by the random variable (randomly selected pseudo-absence points). Conclusions: According to the prediction performance and stability of SDMs, we can divide these six SDMs into two categories: a high performance and stability group including MAHAL, RF, MAXENT, and SVM, and a low performance and stability group consisting of BIOCLIM and DOMAIN. We highlight that choosing appropriate SDMs to address a specific problem is an important
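
    The stability assessment described in the Methodology amounts to simple statistics over repeated trials. The sketch below computes the mean, standard deviation, coefficient of variation, and 99% confidence interval of AUC values for two hypothetical models; the simulated AUC distributions merely stand in for real trial results.

```python
import numpy as np
from scipy import stats

# Hypothetical AUC values from 100 repeated trials of two species distribution models.
rng = np.random.default_rng(1)
auc_by_model = {"MAXENT": rng.normal(0.90, 0.01, 100), "BIOCLIM": rng.normal(0.82, 0.04, 100)}

for model, vals in auc_by_model.items():
    mean, sd = vals.mean(), vals.std(ddof=1)
    cv = sd / mean
    # 99% confidence interval for the mean AUC across trials.
    lo, hi = stats.t.interval(0.99, len(vals) - 1, loc=mean, scale=stats.sem(vals))
    print(f"{model}: mean={mean:.3f} sd={sd:.3f} cv={cv:.3f} 99% CI=({lo:.3f}, {hi:.3f})")
```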

  20. Integrating Cache Performance Modeling and Tuning Support in Parallelization Tools

    NASA Technical Reports Server (NTRS)

    Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)

    1998-01-01

    With the resurgence of distributed shared memory (DSM) systems based on cache-coherent Non-Uniform Memory Access (ccNUMA) architectures and the increasing disparity between memory and processor speeds, data locality overheads are becoming the greatest bottleneck to realizing the potential high performance of these systems. While parallelization tools and compilers help users port their sequential applications to a DSM system, considerable time and effort are still needed to tune the memory performance of these applications to achieve reasonable speedup. In this paper, we show that integrating cache performance modeling and tuning support within a parallelization environment can alleviate this problem. The Cache Performance Modeling and Prediction Tool (CPMP) employs trace-driven simulation techniques without the overhead of generating and managing detailed address traces. CPMP predicts the cache performance impact of source-code-level "what-if" modifications in a program to assist a user in the tuning process. CPMP is built on top of a customized version of the Computer Aided Parallelization Tools (CAPTools) environment. Finally, we demonstrate how CPMP can be applied to tune a real Computational Fluid Dynamics (CFD) application.
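
    A toy version of the "what-if" comparison such a tool supports is sketched below: a tiny trace-driven, direct-mapped cache simulation compares the miss rate of a baseline row-major traversal with a hypothetical loop-interchanged variant. The cache parameters and access patterns are illustrative, and CPMP itself avoids generating full address traces, so this is only a conceptual stand-in.

```python
# Toy trace-driven, direct-mapped cache simulator (illustrative parameters).
LINE_BYTES, NUM_LINES = 64, 512   # a 32 KB direct-mapped cache

def miss_rate(addresses):
    """Replay an address trace and return the fraction of accesses that miss."""
    tags = [None] * NUM_LINES
    misses = 0
    for addr in addresses:
        line = addr // LINE_BYTES
        index, tag = line % NUM_LINES, line // NUM_LINES
        if tags[index] != tag:
            misses += 1
            tags[index] = tag
    return misses / len(addresses)

N = 256  # hypothetical N x N array of 8-byte doubles
row_major = [8 * (i * N + j) for i in range(N) for j in range(N)]   # baseline loop order
col_major = [8 * (i * N + j) for j in range(N) for i in range(N)]   # "what-if" loop interchange
print(f"baseline miss rate:     {miss_rate(row_major):.3f}")
print(f"interchanged miss rate: {miss_rate(col_major):.3f}")
```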