Sample records for validating theoretical models

  1. A new simple local muscle recovery model and its theoretical and experimental validation.

    PubMed

    Ma, Liang; Zhang, Wei; Wu, Su; Zhang, Zhanwu

    2015-01-01

    This study was conducted to provide theoretical and experimental validation of a local muscle recovery model. Muscle recovery has been modeled with different empirical and theoretical approaches to determine work-rest allowances for musculoskeletal disorder (MSD) prevention. However, time-related parameters and individual attributes have not been sufficiently considered in conventional approaches. A new muscle recovery model was proposed by integrating time-related task parameters and individual attributes. Theoretically, this muscle recovery model was compared mathematically to other theoretical models. Experimentally, a total of 20 subjects participated in the validation. Hand grip force recovery and shoulder joint strength recovery were measured after a fatiguing operation. The recovery profile was fitted using the recovery model, and individual recovery rates were calculated from the fits. Good fits (r² > .8) were found for all subjects. Significant differences in recovery rates were found among different muscle groups (p < .05). The theoretical muscle recovery model was primarily validated by characterization of the recovery process after a fatiguing operation. The determined recovery rate may be useful for representing individual recovery attributes.
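
    For illustration, a sketch of fitting a recovery profile and extracting an individual recovery rate, assuming a single-exponential recovery form; the measurements below are hypothetical, and the study's actual model form is given in the paper.

```python
# Sketch: fit an exponential recovery profile and report the recovery rate.
# A single-exponential approach to baseline strength is an assumption here.
import numpy as np
from scipy.optimize import curve_fit

def recovery(t, R):
    """Fraction of baseline strength recovered at time t (min), rate R (1/min)."""
    return 1.0 - np.exp(-R * t)

# hypothetical post-fatigue grip-strength measurements, normalized to baseline
t = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 16.0])   # minutes after fatiguing task
f = np.array([0.35, 0.52, 0.71, 0.86, 0.95, 0.99])

(R_hat,), _ = curve_fit(recovery, t, f, p0=[0.5])
resid = f - recovery(t, R_hat)
r2 = 1.0 - np.sum(resid**2) / np.sum((f - f.mean())**2)
print(f"individual recovery rate R = {R_hat:.2f} 1/min, r^2 = {r2:.3f}")
```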

  2. The Safety Culture Enactment Questionnaire (SCEQ): Theoretical model and empirical validation.

    PubMed

    de Castro, Borja López; Gracia, Francisco J; Tomás, Inés; Peiró, José M

    2017-06-01

    This paper presents the Safety Culture Enactment Questionnaire (SCEQ), designed to assess the degree to which safety is an enacted value in the day-to-day running of nuclear power plants (NPPs). The SCEQ is based on a theoretical safety culture model that is manifested in three fundamental components of the functioning and operation of any organization: strategic decisions, human resources practices, and daily activities and behaviors. The extent to which the importance of safety is enacted in each of these three components provides information about the pervasiveness of the safety culture in the NPP. To validate the SCEQ and the model on which it is based, two separate studies were carried out with data collection in 2008 and 2014, respectively. In Study 1, the SCEQ was administered to the employees of two Spanish NPPs (N=533) belonging to the same company. Participants in Study 2 included 598 employees from the same NPPs, who completed the SCEQ and other questionnaires measuring different safety outcomes (safety climate, safety satisfaction, job satisfaction and risky behaviors). Study 1 comprised item formulation and examination of the factorial structure and reliability of the SCEQ. Study 2 tested internal consistency and provided evidence of factorial validity, validity based on relationships with other variables, and discriminant validity between the SCEQ and safety climate. Exploratory Factor Analysis (EFA) carried out in Study 1 revealed a three-factor solution corresponding to the three components of the theoretical model. Reliability analyses showed strong internal consistency for the three scales of the SCEQ, and each of the 21 items on the questionnaire contributed to the homogeneity of its theoretically developed scale. Confirmatory Factor Analysis (CFA) carried out in Study 2 supported the internal structure of the SCEQ; internal consistency of the scales was also supported. Furthermore, the three scales of the SCEQ showed the expected correlation

  3. Establishment and validation for the theoretical model of the vehicle airbag

    NASA Astrophysics Data System (ADS)

    Zhang, Junyuan; Jin, Yang; Xie, Lizhe; Chen, Chao

    2015-05-01

    The current design and optimization of the occupant restraint system (ORS) are based on numerous physical tests and mathematical simulations. These two methods, though effective and accurate, are overly time-consuming and complex for the concept design phase of the ORS. A fast, direct design and optimization method is therefore needed for this phase. Since the airbag system is a crucial part of the ORS, this paper establishes a theoretical model for the vehicle airbag in order to clarify the interaction between occupants and airbags, and then develops a fast design and optimization method for airbags in the concept design phase based on the proposed model. First, a theoretical expression of the simplified mechanical relationship between the airbag's design parameters and the occupant response is developed from classical mechanics; the momentum theorem and the ideal gas state equation are then adopted to describe that relationship. Using MATLAB, an iterative algorithm with discrete variables is applied to solve the proposed theoretical model for random inputs within a given scope. Validations with MADYMO software confirm the validity and accuracy of the theoretical model for two principal design parameters, the inflated gas mass and the vent diameter, within a regular range. This research contributes a deeper understanding of the interaction between occupants and airbags and a fast design and optimization method for the airbag's principal parameters in the concept design phase, and it provides the range of the airbag's initial design parameters for subsequent CAE simulations and physical tests.
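
    A lumped-parameter sketch in the spirit of the abstract, stepping the ideal gas state equation, a simple orifice outflow law, and the momentum theorem in discrete time; the geometry, gas properties, and all numbers are illustrative assumptions rather than the paper's model.

```python
# Sketch: occupant ride-down against a vented airbag (lumped, isothermal).
import numpy as np

R_GAS, T = 287.0, 300.0        # J/(kg K); gas temperature (isothermal assumption)
V = 0.06                       # m^3 bag volume once inflated (held fixed here)
A_contact = 0.12               # m^2 occupant/bag contact area (assumed)
m_occ, v = 35.0, 10.0          # kg effective torso mass; m/s initial speed
m_gas = 0.10                   # kg of inflator gas in the bag (design parameter)
P_atm, Cd = 101325.0, 0.6      # Pa ambient; vent discharge coefficient
d_vent = 0.04                  # m vent diameter (the other design parameter)
A_vent = np.pi * (d_vent / 2.0) ** 2

dt, t, peak = 1e-4, 0.0, 0.0
while v > 0.0 and t < 0.08:                 # 80 ms event window
    P = m_gas * R_GAS * T / V               # ideal gas state equation
    dP = max(P - P_atm, 0.0)                # gauge pressure
    rho = P / (R_GAS * T)
    m_gas -= Cd * A_vent * np.sqrt(2.0 * rho * dP) * dt  # orifice outflow
    v -= dP * A_contact / m_occ * dt        # momentum theorem on the occupant
    peak = max(peak, dP)
    t += dt
print(f"after {t*1000:.0f} ms: v = {max(v, 0.0):.2f} m/s, "
      f"peak gauge pressure = {peak/1e3:.1f} kPa")
```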

  4. Theoretical modeling and experimental validation of a torsional piezoelectric vibration energy harvesting system

    NASA Astrophysics Data System (ADS)

    Qian, Feng; Zhou, Wanlu; Kaluvan, Suresh; Zhang, Haifeng; Zuo, Lei

    2018-04-01

    Vibration energy harvesting has been extensively studied in recent years as a continuous power source for sensor networks and low-power electronics. Torsional vibration is widespread in mechanical engineering; however, it has not yet been well exploited for energy harvesting. This paper presents a theoretical model and an experimental validation of a torsional vibration energy harvesting system comprised of a shaft and a shear-mode piezoelectric transducer. The position of the piezoelectric transducer on the surface of the shaft is parameterized by two variables that are optimized to obtain the maximum power output. Depending on its attachment angle, the piezoelectric transducer can work in the d15 mode (pure shear mode), the coupled mode of d31 and d33, or the coupled mode of d33, d31, and d15. Approximate expressions for voltage and power are derived from the theoretical model and give predictions in good agreement with analytical solutions. Physical interpretations of the implicit relationship between the power output and the position parameters of the piezoelectric transducer are given based on the derived approximate expressions. The optimal position and angle of the piezoelectric transducer are determined; in that case, the transducer works in the coupled mode of d15, d31, and d33.

  5. Vaporization dynamics of volatile perfluorocarbon droplets: A theoretical model and in vitro validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doinikov, Alexander A., E-mail: doinikov@bsu.by; Bouakaz, Ayache; Sheeran, Paul S.

    2014-10-15

    Purpose: Perfluorocarbon (PFC) microdroplets, called phase-change contrast agents (PCCAs), are a promising tool in ultrasound imaging and therapy. Interest in PCCAs is motivated by the fact that they can be triggered to transition from the liquid state to the gas state by an externally applied acoustic pulse. This property opens up new approaches to applications in ultrasound medicine. Insight into the physics of vaporization of PFC droplets is vital for effective use of PCCAs and for anticipating bioeffects. PCCAs composed of volatile PFCs (with low boiling point) exhibit complex dynamic behavior: after vaporization by a short acoustic pulse, a PFC droplet turns into a vapor bubble which undergoes overexpansion and damped radial oscillation until settling to a final diameter. This behavior has not been well described theoretically so far. The purpose of our study is to develop an improved theoretical model that describes the vaporization dynamics of volatile PFC droplets and to validate this model by comparison with in vitro experimental data. Methods: The derivation of the model is based on applying the mathematical methods of fluid dynamics and thermodynamics to the process of the acoustic vaporization of PFC droplets. The approach used corrects shortcomings of the existing models. The validation of the model is carried out by comparing simulated results with in vitro experimental data acquired by ultrahigh-speed video microscopy for octafluoropropane (OFP) and decafluorobutane (DFB) microdroplets of different sizes. Results: The developed theory allows one to simulate the growth of a vapor bubble inside a PFC droplet until the liquid PFC is completely converted into vapor, and the subsequent overexpansion and damped oscillations of the vapor bubble, including the influence of an externally applied acoustic pulse. To evaluate quantitatively the difference between simulated and experimental results, the L2-norm errors were calculated for all cases where
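
    For concreteness, a sketch of the relative L2-norm discrepancy of the sort used to compare simulated and measured radius-time curves; the sampled values below are hypothetical.

```python
# Sketch: relative L2-norm error between two sampled radius-time curves.
import numpy as np

def l2_error(r_sim, r_exp):
    """Relative L2-norm discrepancy between simulated and measured curves."""
    r_sim, r_exp = np.asarray(r_sim, float), np.asarray(r_exp, float)
    return np.linalg.norm(r_sim - r_exp) / np.linalg.norm(r_exp)

# hypothetical radius samples (micrometers): overexpansion, then settling
r_exp = [1.0, 4.8, 6.2, 5.1, 4.4, 4.6, 4.5]
r_sim = [1.0, 5.0, 6.0, 5.0, 4.5, 4.5, 4.5]
print(f"relative L2 error = {l2_error(r_sim, r_exp):.3%}")
```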

  6. Experimental validation of a theoretical model of dual wavelength photoacoustic (PA) excitation in fluorophores

    NASA Astrophysics Data System (ADS)

    Märk, Julia; Theiss, Christoph; Schmitt, Franz-Josef; Laufer, Jan

    2015-03-01

    Fluorophores, such as exogenous dyes and genetically expressed proteins, exhibit radiative relaxation with long excited state lifetimes. This can be exploited for PA detection based on dual wavelength excitation using pump and probe wavelengths that coincide with the absorption and emission spectra, respectively. While the pump pulse raises the fluorophore to a long-lived excited state, simultaneous illumination with the probe pulse reduces the excited state lifetime due to stimulated emission (SE). This leads to a change in thermalized energy, and hence PA signal amplitude, compared to single wavelength illumination. By introducing a time delay between pump and probe pulses, the change in PA amplitude can be modulated. Since the effect is not observed in endogenous chromophores, it provides a contrast mechanism for the detection of fluorophores via PA difference imaging. In this study, a theoretical model of the PA signal generation in fluorophores was developed and experimentally validated. The model is based on a system of coupled rate equations, which describe the spatial and temporal changes in the population of the molecular energy levels of a fluorophore as a function of pump-probe energy and concentration. This allows the prediction of the thermalized energy distribution, and hence the time-resolved PA signal amplitude. The model was validated by comparing its predictions to PA signals measured in solutions of rhodamine 6G, a well-known laser dye, and Atto680, a NIR fluorophore.
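
    A sketch of a coupled rate-equation model of this pump-probe scheme, reduced to two levels plus a running tally of non-radiatively thermalized decays; the cross sections, pulse fluences, and the neglect of the pump-probe photon-energy difference are simplifying assumptions.

```python
# Sketch: two-level rate equations with pump excitation and probe-induced
# stimulated emission (SE); y[2] accumulates non-radiative (thermalized) decays.
import numpy as np
from scipy.integrate import solve_ivp

k_r, k_nr = 2.0e8, 3.0e8          # 1/s radiative / non-radiative decay rates
sigma_p, sigma_se = 1e-20, 5e-21  # m^2 absorption and SE cross sections (assumed)

def pump_flux(t):                  # pump photon flux, photons m^-2 s^-1
    return 1e28 * np.exp(-((t - 5e-9) / 2e-9) ** 2)

def probe_flux(t, delay):          # probe photon flux, delayed by `delay`
    return 5e28 * np.exp(-((t - 5e-9 - delay) / 2e-9) ** 2)

def rhs(t, y, delay):
    n0, n1, _ = y                  # ground, excited, accumulated thermal decays
    up = sigma_p * pump_flux(t) * n0
    down = (k_r + k_nr) * n1 + sigma_se * probe_flux(t, delay) * n1
    return [down - up, up - down, k_nr * n1]

for delay in (0.0, 2e-9, 10e-9):   # SE competes with thermalization at small delay
    sol = solve_ivp(rhs, (0, 40e-9), [1.0, 0.0, 0.0],
                    args=(delay,), max_step=2e-10)
    print(f"probe delay {delay*1e9:4.1f} ns -> thermalized fraction {sol.y[2, -1]:.3f}")
```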

  7. Intrasubject Predictions of Vocational Preference: Convergent Validation via the Decision Theoretic Paradigm.

    ERIC Educational Resources Information Center

    Monahan, Carlyn J.; Muchinsky, Paul M.

    1985-01-01

    The degree of convergent validity among four methods of identifying vocational preferences is assessed via the decision theoretic paradigm. Vocational preferences identified by Holland's Vocational Preference Inventory (VPI), a rating procedure, and ranking were compared with preferences identified from a policy-capturing model developed from an…

  8. Hybrid rocket engine, theoretical model and experiment

    NASA Astrophysics Data System (ADS)

    Chelaru, Teodor-Viorel; Mingireanu, Florin

    2011-06-01

    The purpose of this paper is to build a theoretical model for the hybrid rocket engine/motor and to validate it using experimental results. The work addresses the main problems of the hybrid motor: scalability, stability/controllability of the operating parameters, and increasing the solid fuel regression rate. At first, we focus on theoretical models for the hybrid rocket motor and compare the results with experimental data already available from various research groups. A primary computation model is presented together with results from a numerical algorithm based on a computational model. We present theoretical predictions for several commercial hybrid rocket motors of different scales and compare them with experimental measurements of those motors. Next, the paper focuses on the tribrid rocket motor concept, which can improve thrust controllability through supplementary liquid fuel injection. A complementary computation model is also presented to estimate the regression rate increase of solid fuel doped with oxidizer. Finally, the stability of the hybrid rocket motor is investigated using Lyapunov theory. The stability coefficients obtained depend on the burning parameters, and the stability and command matrices are identified. The paper thoroughly presents the input data of the model, which ensures the reproducibility of the numerical results by independent researchers.

  9. A simple theoretical model for ⁶³Ni betavoltaic battery.

    PubMed

    Zuo, Guoping; Zhou, Jianliang; Ke, Guotu

    2013-12-01

    A numerical simulation of the energy deposition distribution in semiconductors is performed for ⁶³Ni beta particles. Results show that the energy deposition distribution exhibits an approximately exponential decay law. A simple theoretical model is developed for the ⁶³Ni betavoltaic battery based on these distribution characteristics. The correctness of the model is validated against two experiments from the literature. Results show that the theoretical short-circuit current agrees well with the experimental results, while the open-circuit voltage deviates from the experimental results owing to the influence of PN-junction defects and the simplification of the source. The theoretical model can be applied to ⁶³Ni and ¹⁴⁷Pm betavoltaic batteries.
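
    A minimal sketch in the spirit of the abstract: an approximately exponential energy-deposition profile integrated over the collection region to estimate short-circuit current; the activity, decay constant, and collection geometry are assumed values, not those of the paper.

```python
# Sketch: short-circuit current from an exponential energy-deposition profile.
import numpy as np

E_avg_eV = 17.4e3      # mean 63Ni beta energy, eV
phi = 3.7e7            # betas/s reaching the junction (~1 mCi, assumed)
alpha = 1.0 / 2.0e-6   # 1/m fitted deposition decay constant (assumed)
L, W = 3.0e-6, 0.5e-6  # m minority-carrier diffusion length, depletion width
E_ehp = 3.6            # eV per electron-hole pair in Si
q = 1.602e-19          # C

# fraction of deposited energy within the collection region (0 .. W + L)
collected = 1.0 - np.exp(-alpha * (W + L))
I_sc = q * phi * (E_avg_eV / E_ehp) * collected
print(f"theoretical short-circuit current ~ {I_sc*1e9:.1f} nA")
```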

  10. External Validity in the Study of Human Development: Theoretical and Methodological Issues

    ERIC Educational Resources Information Center

    Hultsch, David F.; Hickey, Tom

    1978-01-01

    An examination of the concept of external validity from two theoretical perspectives: a traditional mechanistic approach and a dialectical organismic approach. Examines the theoretical and methodological implications of these perspectives. (BD)

  11. Validation of the theoretical domains framework for use in behaviour change and implementation research.

    PubMed

    Cane, James; O'Connor, Denise; Michie, Susan

    2012-04-24

    An integrative theoretical framework, developed for cross-disciplinary implementation and other behaviour change research, has been applied across a wide range of clinical situations. This study tests the validity of this framework. Validity was investigated by behavioural experts sorting 112 unique theoretical constructs using closed and open sort tasks. The extent of replication was tested by Discriminant Content Validation and Fuzzy Cluster Analysis. There was good support for a refinement of the framework comprising 14 domains of theoretical constructs (average silhouette value 0.29): 'Knowledge', 'Skills', 'Social/Professional Role and Identity', 'Beliefs about Capabilities', 'Optimism', 'Beliefs about Consequences', 'Reinforcement', 'Intentions', 'Goals', 'Memory, Attention and Decision Processes', 'Environmental Context and Resources', 'Social Influences', 'Emotions', and 'Behavioural Regulation'. The refined Theoretical Domains Framework has a strengthened empirical base and provides a method for theoretically assessing implementation problems, as well as professional and other health-related behaviours as a basis for intervention development.

  12. Developing rural palliative care: validating a conceptual model.

    PubMed

    Kelley, Mary Lou; Williams, Allison; DeMiglio, Lily; Mettam, Hilary

    2011-01-01

    The purpose of this research was to validate a conceptual model for developing palliative care in rural communities. This model articulates how local rural healthcare providers develop palliative care services according to four sequential phases. The model has roots in concepts of community capacity development, evolves from collaborative, generalist rural practice, and utilizes existing health services infrastructure. It addresses how rural providers manage challenges, specifically those related to: lack of resources, minimal community understanding of palliative care, health professionals' resistance, the bureaucracy of the health system, and the obstacles of providing services in rural environments. Seven semi-structured focus groups were conducted with interdisciplinary health providers in 7 rural communities in two Canadian provinces. Using a constant comparative analysis approach, focus group data were analyzed by examining participants' statements in relation to the model and comparing emerging themes in the development of rural palliative care to the elements of the model. The data validated the conceptual model as the model was able to theoretically predict and explain the experiences of the 7 rural communities that participated in the study. New emerging themes from the data elaborated existing elements in the model and informed the requirement for minor revisions. The model was validated and slightly revised, as suggested by the data. The model was confirmed as being a useful theoretical tool for conceptualizing the development of rural palliative care that is applicable in diverse rural communities.

  13. Validation of the theoretical domains framework for use in behaviour change and implementation research

    PubMed Central

    2012-01-01

    Background An integrative theoretical framework, developed for cross-disciplinary implementation and other behaviour change research, has been applied across a wide range of clinical situations. This study tests the validity of this framework. Methods Validity was investigated by behavioural experts sorting 112 unique theoretical constructs using closed and open sort tasks. The extent of replication was tested by Discriminant Content Validation and Fuzzy Cluster Analysis. Results There was good support for a refinement of the framework comprising 14 domains of theoretical constructs (average silhouette value 0.29): ‘Knowledge’, ‘Skills’, ‘Social/Professional Role and Identity’, ‘Beliefs about Capabilities’, ‘Optimism’, ‘Beliefs about Consequences’, ‘Reinforcement’, ‘Intentions’, ‘Goals’, ‘Memory, Attention and Decision Processes’, ‘Environmental Context and Resources’, ‘Social Influences’, ‘Emotions’, and ‘Behavioural Regulation’. Conclusions The refined Theoretical Domains Framework has a strengthened empirical base and provides a method for theoretically assessing implementation problems, as well as professional and other health-related behaviours as a basis for intervention development. PMID:22530986

  14. Theoretical modeling of indoor radon concentration and its validation through measurements in South-East Haryana, India.

    PubMed

    Singh, Prabhjot; Sahoo, B K; Bajwa, B S

    2016-04-15

    A three dimensional semi-empirical model deduced from the existing 1-D model has been used to predict indoor radon concentration with theoretical calculations. The major contribution to indoor radon originates from the building materials used in the construction of walls and floors, which are mostly derived from soil. In this study, different building materials have therefore been analyzed for radon exhalation and diffusion length, which, along with the physical dimensions of the observation area, are used to calculate indoor radon concentration. The calculated values have also been validated against experimental measurements. The study has been carried out in mud, brick, and cement houses constructed from locally available materials in the South-East region of Haryana. This region is also known for its protruding land structure, consisting of volcanic, felsite, and granitic rocks in the plains. Further, the ratio of wall to floor exhalation rates (Jw) has been plotted for each selected village dwelling to identify the dominant radon-emanating source (building material) in the study region. These measured factors may be useful in developing building construction codes and in selecting construction materials.
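
    As a simplified companion to the 3-D model above, the sketch below applies a steady-state room-average mass balance relating measured wall and floor exhalation rates to indoor radon concentration; all input values, and the reduction to a single well-mixed room, are assumptions.

```python
# Sketch: steady-state well-mixed-room radon balance, C = sum(E*S) / (V*(lv+lr)).
E_wall, E_floor = 0.8, 1.2   # Bq m^-2 h^-1 exhalation rates (assumed samples)
S_wall, S_floor = 48.0, 16.0 # m^2 emitting surface areas (assumed room)
V = 40.0                     # m^3 room volume
lam_v = 0.1                  # 1/h air-exchange rate (poorly ventilated, assumed)
lam_rn = 7.56e-3             # 1/h radon decay constant

C = (E_wall * S_wall + E_floor * S_floor) / (V * (lam_v + lam_rn))
print(f"predicted indoor radon concentration ~ {C:.1f} Bq/m^3")
```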

  15. Exact Analysis of Squared Cross-Validity Coefficient in Predictive Regression Models

    ERIC Educational Resources Information Center

    Shieh, Gwowen

    2009-01-01

    In regression analysis, the notion of population validity is of theoretical interest for describing the usefulness of the underlying regression model, whereas the presumably more important concept of population cross-validity represents the predictive effectiveness for the regression equation in future research. It appears that the inference…

  16. Introduction to Theoretical Modelling

    NASA Astrophysics Data System (ADS)

    Davis, Matthew J.; Gardiner, Simon A.; Hanna, Thomas M.; Nygaard, Nicolai; Proukakis, Nick P.; Szymańska, Marzena H.

    2013-02-01

    We briefly overview commonly encountered theoretical notions arising in the modelling of quantum gases, intended to provide a unified background to the `language' and diverse theoretical models presented elsewhere in this book, and aimed particularly at researchers from outside the quantum gases community.

  17. NMR relaxation induced by iron oxide particles: testing theoretical models.

    PubMed

    Gossuin, Y; Orlando, T; Basini, M; Henrard, D; Lascialfari, A; Mattea, C; Stapf, S; Vuong, Q L

    2016-04-15

    Superparamagnetic iron oxide particles find their main application as contrast agents for cellular and molecular magnetic resonance imaging. The contrast they bring is due to the shortening of the transverse relaxation time T2 of water protons. In order to understand their influence on proton relaxation, different theoretical relaxation models have been developed, each of them presenting a certain validity domain, which depends on the particle characteristics and proton dynamics. The validation of these models is crucial since they allow for predicting the ideal particle characteristics for obtaining the best contrast, but also because the fitting of T1 experimental data by the theory constitutes an interesting tool for the characterization of the nanoparticles. In this work, T2 of suspensions of iron oxide particles in different solvents and at different temperatures, corresponding to different proton diffusion properties, was measured and compared to the three main theoretical models (the motional averaging regime, the static dephasing regime, and the partial refocusing model) with good qualitative agreement. However, a real quantitative agreement was not observed, probably because of the complexity of these nanoparticulate systems. The Roch theory, developed in the motional averaging regime (MAR), was also successfully used to fit T1 nuclear magnetic relaxation dispersion (NMRD) profiles, even outside the MAR validity range, and provided a good estimate of the particle size. On the other hand, the simultaneous fitting of T1 and T2 NMRD profiles by the theory was impossible, which constitutes a clear limitation of the Roch model. Finally, the theory was shown to satisfactorily fit the deuterium T1 NMRD profile of superparamagnetic particle suspensions in heavy water.

  18. Advanced Reactors-Intermediate Heat Exchanger (IHX) Coupling: Theoretical Modeling and Experimental Validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Utgikar, Vivek; Sun, Xiaodong; Christensen, Richard

    2016-12-29

    The overall goal of the research project was to model the behavior of the advanced reactor-intermediate heat exchange system and to develop advanced control techniques for off-normal conditions. The specific objectives defined for the project were: 1. To develop the steady-state thermal hydraulic design of the intermediate heat exchanger (IHX); 2. To develop mathematical models to describe the advanced nuclear reactor-IHX-chemical process/power generation coupling during normal and off-normal operations, and to simulate the models using multiphysics software; 3. To develop control strategies using genetic algorithm or neural network techniques and couple these techniques with the multiphysics software; 4. To validate the models experimentally. The project objectives were accomplished by defining and executing four tasks corresponding to these specific objectives. The first task involved selecting IHX candidates and developing steady-state designs for them. The second task involved modeling the transient and off-normal operation of the reactor-IHX system. The subsequent task dealt with the development of control strategies and involved algorithm development and simulation. The last task involved experimental validation of the thermal hydraulic performance of the two prototype heat exchangers designed and fabricated for the project, at steady-state and transient conditions, to simulate the coupling of the reactor-IHX-process plant system. The experimental work utilized two test facilities at The Ohio State University (OSU): the existing High-Temperature Helium Test Facility (HTHF) and a newly developed high-temperature molten salt facility.

  19. Differential Validation of a Path Analytic Model of University Dropout.

    ERIC Educational Resources Information Center

    Winteler, Adolf

    Tinto's conceptual schema of college dropout forms the theoretical framework for the development of a model of university student dropout intention. This study validated Tinto's model in two different departments within a single university. Analyses were conducted on a sample of 684 college freshmen in the Education and Economics Department. A…

  20. Validation of theoretical models of intrinsic torque in DIII-D

    NASA Astrophysics Data System (ADS)

    Grierson, B. A.; Wang, W. X.; Battaglia, D. J.; Chrystal, C.; Solomon, W. M.; Degrassie, J. S.; Staebler, G. M.; Boedo, J. A.

    2016-10-01

    Plasma rotation experiments in DIII-D are validating models of main-ion intrinsic rotation by testing Reynolds-stress-induced toroidal flow in the plasma core and intrinsic rotation induced by ion orbit losses in the plasma edge. In the core of dominantly electron-heated plasmas with Te = Ti, the main-ion intrinsic toroidal rotation undergoes a reversal that correlates with the critical gradient for ITG turbulence. Zonal-flow ExB shear and the turbulence intensity gradient produce residual stress and a counter-current intrinsic torque, which is balanced by momentum diffusion, creating the hollow rotation profile. Quantitative agreement is obtained for the first time between the measured main-ion toroidal rotation and the rotation profile predicted by nonlinear GTS gyrokinetic simulations. At the plasma boundary, new main-ion CER measurements show a co-current rotation layer, and this is tested against ion orbit loss models as the source of bulk plasma rotation. Work supported by the US Department of Energy under DE-AC02-09CH11466 and DE-FC02-04ER54698.

  1. A theoretical model of the application of RF energy to the airway wall and its experimental validation.

    PubMed

    Jarrard, Jerry; Wizeman, Bill; Brown, Robert H; Mitzner, Wayne

    2010-11-27

    Bronchial thermoplasty is a novel technique designed to reduce an airway's ability to contract by reducing the amount of airway smooth muscle through controlled heating of the airway wall. This method has been examined in animal models and as a treatment for asthma in human subjects. At the present time, there has been little research published about how radiofrequency (RF) energy and heat are transferred to the airways of the lung during bronchial thermoplasty procedures. In this manuscript we describe a computational, theoretical model of the delivery of RF energy to the airway wall. An electro-thermal finite-element-analysis model was designed to simulate the delivery of temperature-controlled RF energy to airway walls of the in vivo lung. The model includes predictions of heat generation due to RF joule heating and transfer of heat within an airway wall due to thermal conduction. To implement the model, we use known physical characteristics and dimensions of the airway and lung tissues. The model predictions were tested with measurements of temperature, impedance, energy, and power in an experimental canine model. Model predictions of electrode temperature, voltage, and current, along with tissue impedance and delivered energy, were compared to experimental measurements and were within ± 5% of experimental averages taken over 157 sample activations. The experimental results show remarkable agreement with the model predictions, and thus validate the use of this model to predict the heat generation and transfer within the airway wall following bronchial thermoplasty. The model also demonstrated the importance of evaporation as a loss term that affected both electrical measurements and heat distribution. The model predictions showed excellent agreement with the empirical results, and thus support using the model to develop the next generation of devices for bronchial thermoplasty. Our results suggest that comparing model results to RF generator electrical measurements
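
    As a toy illustration of the electro-thermal coupling described above (RF joule heating plus thermal conduction), the sketch below solves a 1-D explicit finite-difference analogue with a crude temperature-controlled source; the geometry, tissue properties, field strength, and control law are all assumptions, and the evaporation losses noted by the authors are ignored.

```python
# Sketch: 1-D Joule heating + conduction into tissue, bang-bang controlled.
import numpy as np

nx, dx, dt = 60, 0.1e-3, 1e-3        # 6 mm of tissue, 1 ms time steps
k, rho, cp = 0.5, 1060.0, 3600.0     # W/mK, kg/m^3, J/kgK soft-tissue values
alpha = k / (rho * cp)               # thermal diffusivity (stable: a*dt/dx^2 << 0.5)
T = np.full(nx, 37.0)                # initial body temperature, deg C
T_set, sigma, E = 65.0, 0.3, 2e4     # set point; S/m conductivity; V/m near electrode

for step in range(int(3.0 / dt)):    # 3 s activation
    # RF source decaying away from the electrode (assumed 0.5 mm length scale)
    q = sigma * E**2 * np.exp(-np.arange(nx) * dx / 0.5e-3)
    if T[0] >= T_set:                # crude temperature-controlled generator
        q[:] = 0.0
    lap = np.zeros(nx)
    lap[1:-1] = (T[2:] - 2.0 * T[1:-1] + T[:-2]) / dx**2
    T += dt * (alpha * lap + q / (rho * cp))
    T[-1] = 37.0                     # far boundary held at body temperature
print(f"peak tissue temperature after 3 s: {T.max():.1f} deg C")
```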

  2. Theoretical Development, Factorial Validity, and Reliability of the Online Graduate Mentoring Scale

    ERIC Educational Resources Information Center

    Crawford, Linda M.; Randolph, Justus J.; Yob, Iris M.

    2014-01-01

    In this study, we sought to confirm the theoretical framework underlying an Online Graduate Mentoring Scale by establishing the scale's factorial validity and reliability. Analysis of data received from doctoral students and alumni/ae of the College of Education of one large, online, accredited university reduced the initial theoretical…

  3. Testing the validity of the International Atomic Energy Agency (IAEA) safety culture model.

    PubMed

    López de Castro, Borja; Gracia, Francisco J; Peiró, José M; Pietrantoni, Luca; Hernández, Ana

    2013-11-01

    This paper takes the first steps to empirically validate the widely used model of safety culture of the International Atomic Energy Agency (IAEA), composed of five dimensions, further specified by 37 attributes. To do so, three independent and complementary studies are presented. First, 290 students serve to collect evidence about the face validity of the model. Second, 48 experts in organizational behavior judge its content validity. And third, 468 workers in a Spanish nuclear power plant help to reveal how closely the theoretical five-dimensional model can be replicated. Our findings suggest that several attributes of the model may not be related to their corresponding dimensions. According to our results, a one-dimensional structure fits the data better than the five dimensions proposed by the IAEA. Moreover, the IAEA model, as it stands, seems to have rather moderate content validity and low face validity. Practical implications for researchers and practitioners are included.

  4. Groundwater Model Validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ahmed E. Hassan

    2006-01-24

    Models have an inherent uncertainty. The difficulty in fully characterizing the subsurface environment makes uncertainty an integral component of groundwater flow and transport models, which dictates the need for continuous monitoring and improvement. Building and sustaining confidence in closure decisions and monitoring networks based on models of subsurface conditions require developing confidence in the models through an iterative process. The definition of model validation is postulated as a confidence building and long-term iterative process (Hassan, 2004a). Model validation should be viewed as a process not an end result. Following Hassan (2004b), an approach is proposed for the validation process of stochastic groundwater models. The approach is briefly summarized herein and detailed analyses of acceptance criteria for stochastic realizations and of using validation data to reduce input parameter uncertainty are presented and applied to two case studies. During the validation process for stochastic models, a question arises as to the sufficiency of the number of acceptable model realizations (in terms of conformity with validation data). Using a hierarchical approach to make this determination is proposed. This approach is based on computing five measures or metrics and following a decision tree to determine if a sufficient number of realizations attain satisfactory scores regarding how they represent the field data used for calibration (old) and used for validation (new). The first two of these measures are applied to hypothetical scenarios using the first case study and assuming field data consistent with the model or significantly different from the model results. In both cases it is shown how the two measures would lead to the appropriate decision about the model performance. Standard statistical tests are used to evaluate these measures with the results indicating they are appropriate measures for evaluating model realizations. The use of

  5. Experimental Validation of a Thermoelastic Model for SMA Hybrid Composites

    NASA Technical Reports Server (NTRS)

    Turner, Travis L.

    2001-01-01

    This study presents results from experimental validation of a recently developed model for predicting the thermomechanical behavior of shape memory alloy hybrid composite (SMAHC) structures, composite structures with an embedded SMA constituent. The model captures the material nonlinearity of the material system with temperature and is capable of modeling constrained, restrained, or free recovery behavior from experimental measurement of fundamental engineering properties. A brief description of the model and analysis procedures is given, followed by an overview of a parallel effort to fabricate and characterize the material system of SMAHC specimens. Static and dynamic experimental configurations for the SMAHC specimens are described and experimental results for thermal post-buckling and random response are presented. Excellent agreement is achieved between the measured and predicted results, fully validating the theoretical model for constrained recovery behavior of SMAHC structures.

  6. A comparative study of theoretical graph models for characterizing structural networks of human brain.

    PubMed

    Li, Xiaojin; Hu, Xintao; Jin, Changfeng; Han, Junwei; Liu, Tianming; Guo, Lei; Hao, Wei; Li, Lingjiang

    2013-01-01

    Previous studies have investigated both structural and functional brain networks via graph-theoretical methods. However, there is an important issue that has not been adequately discussed before: what is the optimal theoretical graph model for describing the structural networks of the human brain? In this paper, we perform a comparative study to address this problem. First, large-scale cortical regions of interest (ROIs) are localized by a recently developed and validated brain reference system named Dense Individualized Common Connectivity-based Cortical Landmarks (DICCCOL), addressing the limitations in the identification of brain network ROIs in previous studies. Then, we construct structural brain networks based on diffusion tensor imaging (DTI) data. Afterwards, the global and local graph properties of the constructed structural brain networks are measured using state-of-the-art graph analysis algorithms and tools and are further compared with seven popular theoretical graph models. In addition, we compare the topological properties between the two graph models, namely, the stickiness-index-based model (STICKY) and the scale-free gene-duplication model (SF-GD), that have higher similarity with the real structural brain networks in terms of global and local graph properties. Our experimental results suggest that among the seven theoretical graph models compared in this study, the STICKY and SF-GD models perform better in characterizing the structural human brain network.
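
    To illustrate the comparison workflow, the sketch below contrasts global graph properties of a stand-in "measured" network against two generative models available in networkx; the paper's STICKY and SF-GD generators are not standard library models, so Erdos-Renyi and Barabasi-Albert graphs are used here purely as placeholders.

```python
# Sketch: compare global graph properties of a network against candidate models.
import networkx as nx

def global_props(G):
    """Average clustering and characteristic path length (largest component)."""
    C = G.subgraph(max(nx.connected_components(G), key=len))
    return nx.average_clustering(C), nx.average_shortest_path_length(C)

n = 358                                      # DICCCOL-scale ROI count (assumption)
G_dti = nx.connected_watts_strogatz_graph(n, 12, 0.2, seed=1)  # stand-in DTI network

candidates = {
    "erdos_renyi": nx.gnm_random_graph(n, G_dti.number_of_edges(), seed=1),
    "barabasi_albert": nx.barabasi_albert_graph(n, 6, seed=1),
}
c0, l0 = global_props(G_dti)
for name, G in candidates.items():           # smaller deltas = better model fit
    c, l = global_props(G)
    print(f"{name:16s} |dC| = {abs(c - c0):.3f}  |dL| = {abs(l - l0):.3f}")
```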

  7. Validating a Model of Effective Teaching Behaviour of Pre-Service Teachers

    ERIC Educational Resources Information Center

    Maulana, Ridwan; Helms-Lorenz, Michelle; Van de Grift, Wim

    2017-01-01

    Although effective teaching behaviour is central for pupil outcomes, the extent to which pre-service teachers behave effectively in the classroom and how their behaviour relates to pupils' engagement remain unanswered. The present study aims to validate a theoretical model linking effective pre-service teaching behaviour and pupil's engagement,…

  8. Model Validation Status Review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    E.L. Hardin

    The primary objective for the Model Validation Status Review was to perform a one-time evaluation of model validation associated with the analysis/model reports (AMRs) containing model input to total-system performance assessment (TSPA) for the Yucca Mountain site recommendation (SR). This review was performed in response to Corrective Action Request BSC-01-C-01 (Clark 2001, Krisha 2001) pursuant to Quality Assurance review findings of an adverse trend in model validation deficiency. The review findings in this report provide the following information which defines the extent of model validation deficiency and the corrective action needed: (1) AMRs that contain or support models are identified, and conversely, for each model the supporting documentation is identified. (2) The use for each model is determined based on whether the output is used directly for TSPA-SR, or for screening (exclusion) of features, events, and processes (FEPs), and the nature of the model output. (3) Two approaches are used to evaluate the extent to which the validation for each model is compliant with AP-3.10Q (Analyses and Models). The approaches differ in regard to whether model validation is achieved within individual AMRs as originally intended, or whether model validation could be readily achieved by incorporating information from other sources. (4) Recommendations are presented for changes to the AMRs, and additional model development activities or data collection, that will remedy model validation review findings, in support of licensing activities. The Model Validation Status Review emphasized those AMRs that support TSPA-SR (CRWMS M&O 2000bl and 2000bm). A series of workshops and teleconferences was held to discuss and integrate the review findings. The review encompassed 125 AMRs (Table 1) plus certain other supporting documents and data needed to assess model validity. The AMRs were grouped in 21 model areas representing the modeling of processes affecting the natural

  9. Use of a Computer-Mediated Delphi Process to Validate a Mass Casualty Conceptual Model

    PubMed Central

    Culley, Joan M.

    2012-01-01

    Since the original work on the Delphi technique, multiple versions have been developed and used in research and industry; however, very little empirical research has been conducted that evaluates the efficacy of using online computer, Internet, and e-mail applications to facilitate a Delphi method that can be used to validate theoretical models. The purpose of this research was to develop computer, Internet, and e-mail applications to facilitate a modified Delphi technique through which experts provide validation for a proposed conceptual model that describes the information needs for a mass-casualty continuum of care. Extant literature and existing theoretical models provided the basis for model development. Two rounds of the Delphi process were needed to satisfy the criteria for consensus and/or stability related to the constructs, relationships, and indicators in the model. The majority of experts rated the online processes favorably (mean of 6.1 on a seven-point scale). Using online Internet and computer applications to facilitate a modified Delphi process offers much promise for future research involving model building or validation. The online Delphi process provided an effective methodology for identifying and describing the complex series of events and contextual factors that influence the way we respond to disasters. PMID:21076283

  10. Use of a computer-mediated Delphi process to validate a mass casualty conceptual model.

    PubMed

    Culley, Joan M

    2011-05-01

    Since the original work on the Delphi technique, multiple versions have been developed and used in research and industry; however, very little empirical research has been conducted that evaluates the efficacy of using online computer, Internet, and e-mail applications to facilitate a Delphi method that can be used to validate theoretical models. The purpose of this research was to develop computer, Internet, and e-mail applications to facilitate a modified Delphi technique through which experts provide validation for a proposed conceptual model that describes the information needs for a mass-casualty continuum of care. Extant literature and existing theoretical models provided the basis for model development. Two rounds of the Delphi process were needed to satisfy the criteria for consensus and/or stability related to the constructs, relationships, and indicators in the model. The majority of experts rated the online processes favorably (mean of 6.1 on a seven-point scale). Using online Internet and computer applications to facilitate a modified Delphi process offers much promise for future research involving model building or validation. The online Delphi process provided an effective methodology for identifying and describing the complex series of events and contextual factors that influence the way we respond to disasters.

  11. Challenges of forest landscape modeling - simulating large landscapes and validating results

    Treesearch

    Hong S. He; Jian Yang; Stephen R. Shifley; Frank R. Thompson

    2011-01-01

    Over the last 20 years, we have seen a rapid development in the field of forest landscape models (FLMs), fueled by both technological and theoretical advances. Two fundamental challenges have persisted since the inception of FLMs: (1) balancing realistic simulation of ecological processes at broad spatial and temporal scales with computing capacity, and (2) validating...

  12. Development and validation of a mass casualty conceptual model.

    PubMed

    Culley, Joan M; Effken, Judith A

    2010-03-01

    To develop and validate a conceptual model that provides a framework for the development and evaluation of information systems for mass casualty events. The model was designed based on extant literature and existing theoretical models. A purposeful sample of 18 experts validated the model. Open-ended questions, as well as a 7-point Likert scale, were used to measure expert consensus on the importance of each construct and its relationship in the model and the usefulness of the model to future research. Computer-mediated applications were used to facilitate a modified Delphi technique through which a panel of experts provided validation for the conceptual model. Rounds of questions continued until consensus was reached, as measured by an interquartile range (no more than 1 scale point for each item); stability (change in the distribution of responses less than 15% between rounds); and percent agreement (70% or greater) for indicator questions. Two rounds of the Delphi process were needed to satisfy the criteria for consensus or stability related to the constructs, relationships, and indicators in the model. The panel reached consensus or sufficient stability to retain all 10 constructs, 9 relationships, and 39 of 44 indicators. Experts viewed the model as useful (mean of 5.3 on a 7-point scale). Validation of the model provides the first step in understanding the context in which mass casualty events take place and identifying variables that impact outcomes of care. This study provides a foundation for understanding the complexity of mass casualty care, the roles that nurses play in mass casualty events, and factors that must be considered in designing and evaluating information-communication systems to support effective triage under these conditions.
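
    A minimal sketch of consensus screening of the kind described above, applying the interquartile-range, percent-agreement, and stability criteria to one indicator's 7-point ratings across two rounds; the ratings, and the operationalization of "agreement" and "stability" used here, are assumptions for illustration.

```python
# Sketch: Delphi retention test -- IQR <= 1, agreement >= 70%, stability < 15%.
import numpy as np

def retain_indicator(round1, round2):
    r1, r2 = np.asarray(round1), np.asarray(round2)
    iqr = np.subtract(*np.percentile(r2, [75, 25]))  # spread <= 1 scale point
    agree = np.mean(r2 >= 6) * 100                   # % rating 6-7 (assumed cut)
    shift = np.mean(r1 != r2) * 100                  # % responses that changed
    return iqr <= 1 and agree >= 70 and shift < 15

# hypothetical 7-point ratings from 18 experts across two rounds
round1 = [6, 7, 5, 6, 7, 6, 4, 7, 6, 6, 7, 6, 6, 7, 6, 6, 7, 6]
round2 = [6, 7, 6, 6, 7, 6, 5, 7, 6, 6, 7, 6, 6, 7, 6, 6, 7, 6]
print("retain indicator:", retain_indicator(round1, round2))
```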

  13. A theoretical model of co-worker responses to work reintegration processes.

    PubMed

    Dunstan, Debra A; MacEachen, Ellen

    2014-06-01

    Emerging research has shown that co-workers have a significant influence on the return-to-work outcomes of partially fit ill or injured employees. By drawing on theoretical findings from the human resource and wider behavioral sciences literatures, our goal was to formulate a theoretical model of the influences on and outcomes of co-worker responses within work reintegration. From a search of 15 databases covering the social sciences, business, and medicine, we identified articles containing models of the factors that influence co-workers' responses to disability accommodations, and of the nature and impact of co-workers' behaviors on employee outcomes. To meet our goal, we combined the identified models to form a comprehensive model of the relevant factors and relationships. Internal consistency and external validity were assessed. The combined model illustrates four key findings: (1) co-workers' behaviors towards an accommodated employee are influenced by attributes of that employee, the illness or injury, the co-workers themselves, and the work environment; (2) the influences-behavior relationship is mediated by perceptions of the fairness of the accommodation; (3) co-workers' behaviors affect all work reintegration outcomes; and (4) co-workers' behaviors can vary from support to antagonism and are moderated by the type of support required, the social intensity of the job, and the level of antagonism. Theoretical models from the wider literature are useful for understanding the impact of co-workers on the work reintegration process. To achieve optimal outcomes, co-workers need to perceive the arrangements as fair. Perceptions of fairness might be supported by co-workers' collaborative engagement in the planning, monitoring, and review of work reintegration activities.

  14. Validation of design procedure and performance modeling of a heat and fluid transport field experiment in the unsaturated zone

    NASA Astrophysics Data System (ADS)

    Nir, A.; Doughty, C.; Tsang, C. F.

    Validation methods that were developed in the context of the deterministic concepts of past generations often cannot be directly applied to environmental problems, which may be characterized by limited reproducibility of results and highly complex models. Instead, validation is interpreted here as a series of activities, including both theoretical and experimental tests, designed to enhance our confidence in the capability of a proposed model to describe some aspect of reality. We examine the validation process applied to a project concerned with heat and fluid transport in porous media, in which mathematical modeling, simulation, and results of field experiments are evaluated in order to determine the feasibility of a system for seasonal thermal energy storage in shallow unsaturated soils. Technical details of the field experiments are not included, but appear in previous publications. Validation activities are divided into three stages. The first stage, carried out prior to the field experiments, is concerned with modeling the relevant physical processes, optimization of the heat-exchanger configuration and the shape of the storage volume, and multi-year simulation. Subjects requiring further theoretical and experimental study are identified at this stage. The second stage encompasses the planning and evaluation of the initial field experiment. Simulations are made to determine the experimental time scale and optimal sensor locations. Soil thermal parameters and temperature boundary conditions are estimated using an inverse method. Then results of the experiment are compared with model predictions using different parameter values and modeling approximations. In the third stage, results of an experiment performed under different boundary conditions are compared to predictions made by the models developed in the second stage. Various aspects of this theoretical and experimental field study are described as examples of the verification and validation procedure. There is no

  15. Subjective evaluation and electroacoustic theoretical validation of a new approach to audio upmixing

    NASA Astrophysics Data System (ADS)

    Usher, John S.

    Audio signal processing systems for converting two-channel (stereo) recordings to four or five channels are increasingly relevant. These audio upmixers can be used with conventional stereo sound recordings and reproduced with multichannel home theatre or automotive loudspeaker audio systems to create a more engaging and natural-sounding listening experience. This dissertation discusses existing approaches to audio upmixing for recordings of musical performances and presents specific design criteria for a system to enhance spatial sound quality. A new upmixing system is proposed and evaluated according to these criteria, and a theoretical model of its behavior is validated using empirical measurements. The new system removes short-term correlated components from two electronic audio signals using a pair of adaptive filters, updated according to a frequency-domain implementation of the normalized least-mean-square algorithm. The major difference between the new system and all extant audio upmixers is that unsupervised time-alignment of the input signals (typically, by up to +/-10 ms) as a function of frequency (typically, using a 1024-band equalizer) is accomplished by the non-minimum-phase adaptive filter. Two new signals are created from the weighted difference of the inputs and are then radiated with two loudspeakers behind the listener. According to the consensus in the literature on the effect of interaural correlation on auditory image formation, the self-orthogonalizing properties of the algorithm ensure minimal distortion of the frontal source imagery and natural-sounding, enveloping reverberance (ambiance) imagery. Performance evaluation of the new upmix system was accomplished in two ways: firstly, using empirical electroacoustic measurements, which validate a theoretical model of the system; and secondly, with formal listening tests, which investigated auditory spatial imagery with a graphical mapping tool and a preference experiment. Both electroacoustic
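
    A sketch of the core adaptive-cancellation idea, shown in the simpler time-domain NLMS form rather than the dissertation's frequency-domain, 1024-band implementation; the signals, filter length, and step size are illustrative assumptions.

```python
# Sketch: NLMS adaptive filter removing the correlated (common) stereo component.
import numpy as np

def nlms(x, d, n_taps=64, mu=0.5, eps=1e-8):
    """Adapt w so that w*x predicts d; returns the residual e = d - w*x."""
    w = np.zeros(n_taps)
    e = np.zeros(len(d))
    for n in range(n_taps, len(d)):
        x_vec = x[n - n_taps:n][::-1]               # most recent samples first
        e[n] = d[n] - w @ x_vec                     # uncorrelated "ambience" part
        w += mu * e[n] * x_vec / (x_vec @ x_vec + eps)  # normalized step
    return e

rng = np.random.default_rng(0)
common = rng.standard_normal(48000)                 # shared (frontal) content
left = common + 0.1 * rng.standard_normal(48000)
right = np.roll(common, 5) + 0.1 * rng.standard_normal(48000)  # 5-sample skew
ambience = nlms(left, right)                        # aligns and cancels `common`
print("residual/input power ratio:", np.var(ambience) / np.var(right))
```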

  16. Theoretical models of parental HIV disclosure: a critical review.

    PubMed

    Qiao, Shan; Li, Xiaoming; Stanton, Bonita

    2013-01-01

    This study critically examined three major theoretical models related to parental HIV disclosure (i.e., the Four-Phase Model [FPM], the Disclosure Decision Making Model [DDMM], and the Disclosure Process Model [DPM]), and the existing studies that could provide empirical support for these models or their components. For each model, we briefly reviewed its theoretical background, described its components and/or mechanisms, and discussed its strengths and limitations. The existing empirical studies supported most theoretical components in these models. However, hypotheses related to the mechanisms proposed in the models have not yet been tested due to a lack of empirical evidence. This study also synthesized alternative theoretical perspectives and new issues in disclosure research and clinical practice that may challenge the existing models. The current study underscores the importance of including components related to social and cultural contexts in theoretical frameworks, and calls for more adequately designed empirical studies in order to test and refine existing theories and to develop new ones.

  17. Validating Experimental and Theoretical Langmuir Probe Analyses

    NASA Astrophysics Data System (ADS)

    Pilling, Lawrence Stuart; Carnegie, Dale

    2004-11-01

    Analysis of Langmuir probe characteristics contains a paradox in that it is unknown a priori which theory is applicable before it is applied. Often theories are assumed to be correct when certain criteria are met although they may not validate the approach used. We have analysed the Langmuir probe data from cylindrical double and single probes acquired from a DC discharge plasma over a wide variety of conditions. This discharge contains a dual temperature distribution and hence fitting a theoretically generated curve is impractical. To determine the densities an examination of the current theories was necessary. For the conditions where the probe radius is the same order of magnitude as the Debye length, the gradient expected for orbital motion limited (OML) is approximately the same as the radial motion gradients. An analysis of the gradients from the radial motion theory was able to resolve the differences from the OML gradient value of two. The method was also able to determine whether radial or OML theories applied without knowledge of the electron temperature. Only the position of the space charge potential is necessary to determine the applicable theory.

  18. Validating experimental and theoretical Langmuir probe analyses

    NASA Astrophysics Data System (ADS)

    Pilling, L. S.; Carnegie, D. A.

    2007-08-01

    Analysis of Langmuir probe characteristics contains a paradox in that it is unknown a priori which theory is applicable before it is applied. Often theories are assumed to be correct when certain criteria are met although they may not validate the approach used. We have analysed the Langmuir probe data from cylindrical double and single probes acquired from a dc discharge plasma over a wide variety of conditions. This discharge contains a dual-temperature distribution and hence fitting a theoretically generated curve is impractical. To determine the densities, an examination of the current theories was necessary. For the conditions where the probe radius is the same order of magnitude as the Debye length, the gradient expected for orbital-motion limited (OML) is approximately the same as the radial-motion gradients. An analysis of the 'gradients' from the radial-motion theory was able to resolve the differences from the OML gradient value of two. The method was also able to determine whether radial or OML theories applied without knowledge of the electron temperature, or separation of the ion and electron contributions. Only the value of the space potential is necessary to determine the applicable theory.
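
    A sketch of the kind of gradient analysis both records describe, under the common OML result that ion current grows as the square root of bias below the space potential, so that ln(V_s - V) versus ln(I) has a gradient of two; the synthetic data, and this reading of the "gradient value of two", are assumptions.

```python
# Sketch: estimate the characteristic's gradient to discriminate probe theories.
import numpy as np

V_s = 2.0                                 # space potential (known, per the abstract)
V = np.linspace(-30.0, -5.0, 60)          # bias sweep in the ion-saturation region
I_i = 1e-6 * np.sqrt(V_s - V)             # synthetic OML-like ion current, I ~ sqrt(V_s - V)

slope = np.polyfit(np.log(I_i), np.log(V_s - V), 1)[0]
print(f"d ln(V_s - V) / d ln I = {slope:.2f}  (OML expects 2)")
```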

  19. Finite Element Vibration Modeling and Experimental Validation for an Aircraft Engine Casing

    NASA Astrophysics Data System (ADS)

    Rabbitt, Christopher

    This thesis presents a procedure for the development and validation of a theoretical vibration model, applies this procedure to a pair of aircraft engine casings, and compares select parameters from experimental testing of those casings to those from a theoretical model using the Modal Assurance Criterion (MAC) and linear regression coefficients. A novel method of determining the optimal MAC between axisymmetric results is developed and employed. It is concluded that the dynamic finite element models developed as part of this research are fully capable of modelling the modal parameters within the frequency range of interest. Confidence intervals calculated in this research for correlation coefficients provide important information regarding the reliability of predictions, and it is recommended that these intervals be calculated for all comparable coefficients. The procedure outlined for aligning mode shapes around an axis of symmetry proved useful, and the results are promising for the development of further optimization techniques.
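
    For reference, a minimal sketch of the Modal Assurance Criterion used in the thesis; the mode-shape matrices below are random stand-ins, not data from the engine casings, and the confidence-interval machinery discussed above is not reproduced.

```python
# Sketch: MAC(i,j) = |phi_i^H phi_j|^2 / ((phi_i^H phi_i)(phi_j^H phi_j)).
import numpy as np

def mac(phi_a, phi_b):
    """MAC matrix for mode-shape sets phi_a, phi_b of shape (n_dof, n_modes)."""
    num = np.abs(phi_a.conj().T @ phi_b) ** 2
    den = np.outer(np.einsum('ij,ij->j', phi_a.conj(), phi_a).real,
                   np.einsum('ij,ij->j', phi_b.conj(), phi_b).real)
    return num / den

rng = np.random.default_rng(0)
phi_fem = rng.standard_normal((120, 5))                     # FE-predicted shapes
phi_test = phi_fem + 0.05 * rng.standard_normal((120, 5))   # "measured" shapes
M = mac(phi_fem, phi_test)
print(np.round(np.diag(M), 3))   # values near 1 indicate well-correlated modes
```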

  20. A preliminary theoretical line-blanketed model solar photosphere

    NASA Technical Reports Server (NTRS)

    Kurucz, R. L.

    1974-01-01

    In the theoretical approach to model-atmosphere construction, all opacities are computed theoretically and the temperature-pressure structure is determined by conservation of energy. Until recently, this has not been a very useful method for later type stars, because the line opacity was both poorly known and difficult to calculate. However, methods have now been developed that are capable of representing the line opacity well enough for construction of realistic models. A preliminary theoretical solar model is presented that produces closer agreement with observation than has been heretofore possible. The qualitative advantages and shortcomings of this model are discussued and projected improvements are outlined.

  1. Characterizing the In-Phase Reflection Bandwidth Theoretical Limit of Artificial Magnetic Conductors With a Transmission Line Model

    NASA Technical Reports Server (NTRS)

    Xie, Yunsong; Fan, Xin; Chen, Yunpeng; Wilson, Jeffrey D.; Simons, Rainee N.; Xiao, John Q.

    2013-01-01

    We validate through simulation and experiment that artificial magnetic conductors (AMCs) can be well characterized by a transmission line model. The theoretical bandwidth limit of the in-phase reflection can be expressed in terms of the effective RLC parameters from the surface patch and the properties of the substrate. It is found that the existence of effective inductive components will reduce the in-phase reflection bandwidth of the AMC. Furthermore, we propose design strategies to optimize AMC structures with an in-phase reflection bandwidth closer to the theoretical limit.

  2. Power Plant Model Validation Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    The PPMV is used to validate generator models using disturbance recordings. The PPMV tool contains a collection of power plant models and model validation studies, as well as disturbance recordings from a number of historic grid events. The user can import data from a new disturbance into the database, which converts PMU and SCADA data into GE PSLF format, and then run the tool to validate (or invalidate) the model for a specific power plant against its actual performance. The PNNL PPMV tool automates the process of power plant model validation using disturbance recordings. The tool uses PMU and SCADA measurements as input information, automatically adjusts all required EPCL scripts, and interacts with GE PSLF in batch mode. The main tool features include: interaction with GE PSLF; use of the GE PSLF Play-In function for generator model validation; a database of projects (model validation studies); a database of historic events; a database of power plants; advanced visualization capabilities; and automatic report generation.
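
    The core of such a validation run is the comparison between the simulated playback response and the recorded measurement. A minimal sketch of that comparison step in Python, assuming a simple normalized-RMS acceptance criterion (the tolerance, signal choice, and function names are illustrative, not part of the PPMV tool):

        import numpy as np

        def playback_error(measured, simulated, tol_pct=5.0):
            """Compare a simulated playback trace against a PMU measurement.

            measured, simulated: arrays on the same time base (e.g. MW).
            Returns (rmse, passed), where 'passed' flags whether the RMS
            error normalized by the signal range is inside the tolerance.
            """
            measured = np.asarray(measured, dtype=float)
            simulated = np.asarray(simulated, dtype=float)
            rmse = np.sqrt(np.mean((measured - simulated) ** 2))
            span = measured.max() - measured.min()
            nrmse_pct = 100.0 * rmse / span if span > 0 else np.inf
            return rmse, nrmse_pct <= tol_pct

        # Illustrative data: a 2 s post-disturbance oscillation at 30 samples/s.
        t = np.linspace(0.0, 2.0, 61)
        measured = 500 + 20 * np.exp(-t) * np.sin(2 * np.pi * 1.2 * t)
        simulated = 500 + 19 * np.exp(-1.1 * t) * np.sin(2 * np.pi * 1.2 * t)
        print(playback_error(measured, simulated))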

  3. Validation of a theoretically motivated approach to measuring childhood socioeconomic circumstances in the Health and Retirement Study.

    PubMed

    Vable, Anusha M; Gilsanz, Paola; Nguyen, Thu T; Kawachi, Ichiro; Glymour, M Maria

    2017-01-01

    Childhood socioeconomic status (cSES) is a powerful predictor of adult health, but its operationalization and measurement vary across studies. Using Health and Retirement Study data (HRS, which is nationally representative of community-residing United States adults aged 50+ years), we specified theoretically motivated cSES measures, evaluated their reliability and validity, and compared their performance to other cSES indices. HRS respondent data (N = 31,169, interviewed 1992-2010) were used to construct a cSES index reflecting childhood social capital (cSC), childhood financial capital (cFC), and childhood human capital (cHC), using retrospective reports from when the respondent was <16 years (at least 34 years prior). We assessed internal consistency reliability (Cronbach's alpha) for the scales (cSC and cFC), as well as construct validity and predictive validity for all measures. Validity was assessed with hypothesized correlates of cSES (educational attainment, measured adult height, self-reported childhood health, childhood learning problems, childhood drug and alcohol problems). We then compared the performance of our validated measures with other indices used in HRS in predicting self-rated health and number of depressive symptoms, measured in 2010. Internal consistency reliability was acceptable (cSC = 0.63, cFC = 0.61). Most measures were associated with hypothesized correlates (for example, the association between educational attainment and cSC was 0.01, p < 0.0001), with the exceptions that measured height was not associated with cFC (p = 0.19), and that childhood drug and alcohol problems (p = 0.41) and childhood learning problems (p = 0.12) were not associated with cHC. Our measures explained slightly more variability in self-rated health (adjusted R2 = 0.07 vs. <0.06) and number of depressive symptoms (adjusted R2 > 0.05 vs. <0.04) than alternative indices. Our cSES measures use latent variable models to handle item-missingness, thereby increasing the sample size.
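
    The scale reliabilities quoted above are Cronbach's alphas; for reference, a minimal sketch of the standard computation on simulated item responses (data illustrative):

        import numpy as np

        def cronbach_alpha(items):
            """Cronbach's alpha for an (n_respondents x n_items) array:
            alpha = k/(k-1) * (1 - sum(item variances) / var(total score))."""
            items = np.asarray(items, dtype=float)
            k = items.shape[1]
            item_vars = items.var(axis=0, ddof=1)
            total_var = items.sum(axis=1).var(ddof=1)
            return k / (k - 1) * (1.0 - item_vars.sum() / total_var)

        # Illustrative: 6 binary childhood-social-capital items, 8 respondents.
        rng = np.random.default_rng(0)
        latent = rng.normal(size=(8, 1))
        responses = (latent + rng.normal(scale=1.0, size=(8, 6)) > 0).astype(float)
        print(round(cronbach_alpha(responses), 2))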

  4. Experimental and theoretical study of magnetohydrodynamic ship models.

    PubMed

    Cébron, David; Viroulet, Sylvain; Vidal, Jérémie; Masson, Jean-Paul; Viroulet, Philippe

    2017-01-01

    Magnetohydrodynamic (MHD) ships represent a clear demonstration of the Lorentz force in fluids, which explains the number of student practicals and exercises described on the web. However, the related literature is rather specific and no complete comparison between theory and typical small scale experiments is currently available. This work provides, in a self-consistent framework, a detailed presentation of the relevant theoretical equations for small MHD ships and experimental measurements for future benchmarks. Theoretical results from the literature are adapted to these simple battery/magnet-powered ships moving on salt water. Comparisons between theory and experiments are performed to validate each theoretical step, such as the Tafel and Kohlrausch laws and the predicted ship speed. A successful agreement is obtained without any adjustable parameter. Finally, based on these results, an optimal design is deduced from the theory. Therefore this work provides a solid theoretical and experimental ground for small scale MHD ships, by presenting in detail several approximations and how they affect the boat efficiency. Moreover, the theory is general enough to be adapted to other contexts, such as large scale ships or industrial flow measurement techniques.

  5. Experimental and theoretical study of magnetohydrodynamic ship models

    PubMed Central

    Viroulet, Sylvain; Vidal, Jérémie; Masson, Jean-Paul; Viroulet, Philippe

    2017-01-01

    Magnetohydrodynamic (MHD) ships represent a clear demonstration of the Lorentz force in fluids, which explains the number of student practicals and exercises described on the web. However, the related literature is rather specific and no complete comparison between theory and typical small scale experiments is currently available. This work provides, in a self-consistent framework, a detailed presentation of the relevant theoretical equations for small MHD ships and experimental measurements for future benchmarks. Theoretical results from the literature are adapted to these simple battery/magnet-powered ships moving on salt water. Comparisons between theory and experiments are performed to validate each theoretical step, such as the Tafel and Kohlrausch laws and the predicted ship speed. A successful agreement is obtained without any adjustable parameter. Finally, based on these results, an optimal design is deduced from the theory. Therefore this work provides a solid theoretical and experimental ground for small scale MHD ships, by presenting in detail several approximations and how they affect the boat efficiency. Moreover, the theory is general enough to be adapted to other contexts, such as large scale ships or industrial flow measurement techniques. PMID:28665941

  6. Verification and Validation of a Three-Dimensional Generalized Composite Material Model

    NASA Technical Reports Server (NTRS)

    Hoffarth, Canio; Harrington, Joseph; Rajan, Subramaniam D.; Goldberg, Robert K.; Carney, Kelly S.; DuBois, Paul; Blankenhorn, Gunther

    2014-01-01

    A general purpose orthotropic elasto-plastic computational constitutive material model has been developed to improve predictions of the response of composites subjected to high velocity impact. The three-dimensional orthotropic elasto-plastic composite material model is being implemented initially for solid elements in LS-DYNA as MAT213. In order to accurately represent the response of a composite, experimental stress-strain curves are utilized as input, allowing for a more general material model that can be used on a variety of composite applications. The theoretical details are discussed in a companion paper. This paper documents the implementation, verification and qualitative validation of the material model using the T800-F3900 fiber/resin composite material.

  7. Verification and Validation of a Three-Dimensional Generalized Composite Material Model

    NASA Technical Reports Server (NTRS)

    Hoffarth, Canio; Harrington, Joseph; Rajan, Subramaniam D.; Goldberg, Robert K.; Carney, Kelly S.; DuBois, Paul; Blankenhorn, Gunther

    2015-01-01

    A general purpose orthotropic elasto-plastic computational constitutive material model has been developed to improve predictions of the response of composites subjected to high velocity impact. The three-dimensional orthotropic elasto-plastic composite material model is being implemented initially for solid elements in LS-DYNA as MAT213. In order to accurately represent the response of a composite, experimental stress-strain curves are utilized as input, allowing for a more general material model that can be used on a variety of composite applications. The theoretical details are discussed in a companion paper. This paper documents the implementation, verification and qualitative validation of the material model using the T800-F3900 fiber/resin composite material.

  8. Theoretical Model of Professional Competence Development in Dual-Specialty Students (On the Example of the "History, Religious Studies" Specialty)

    ERIC Educational Resources Information Center

    Karimova, A. E.; Amanova, A. S.; Sadykova, A. M.; Kuzembaev, N. E.; Makisheva, A. T.; Kurmangazina, G. Zh.; Sakenov, Janat

    2016-01-01

    The article explores the significant problem of developing a theoretical model of professional competence development in dual-specialty students (on the example of the "History, Religious studies" specialty). In order to validate the specifics of the professional competence development in dual-specialty students (on the example of the…

  9. Thermal conductivity of molten salt mixtures: Theoretical model supported by equilibrium molecular dynamics simulations.

    PubMed

    Gheribi, Aïmen E; Chartrand, Patrice

    2016-02-28

    A theoretical model for the description of thermal conductivity of molten salt mixtures as a function of composition and temperature is presented. The model is derived by considering the classical kinetic theory and requires, for its parametrization, only information on the thermal conductivity of the pure compounds. In this sense, the model is predictive. For most molten salt mixtures, no experimental data on thermal conductivity are available in the literature. This is a hindrance for many industrial applications (in particular for thermal energy storage technologies) as well as an obvious barrier for the validation of the theoretical model. To alleviate this lack of data, a series of equilibrium molecular dynamics (EMD) simulations has been performed on several molten chloride systems in order to determine their thermal conductivity in the entire range of composition at two different temperatures: 1200 K and 1300 K. The EMD simulations are of the first-principles type, as the potentials used to describe the interactions have been parametrized on the basis of first-principles electronic structure calculations. In addition to the molten chloride systems, the model predictions are also compared to a recent similar EMD study on molten fluorides and to the few reliable experimental data available in the literature. The accuracy of the proposed model is within the reported numerical and/or experimental errors.

  10. A Comprehensive Validation Methodology for Sparse Experimental Data

    NASA Technical Reports Server (NTRS)

    Norman, Ryan B.; Blattnig, Steve R.

    2010-01-01

    A comprehensive program of verification and validation has been undertaken to assess the applicability of models to space radiation shielding applications and to track progress as models are developed over time. The models are placed under configuration control, and automated validation tests are used so that comparisons can readily be made as models are improved. Though direct comparisons between theoretical results and experimental data are desired for validation purposes, such comparisons are not always possible due to lack of data. In this work, two uncertainty metrics are introduced that are suitable for validating theoretical models against sparse experimental databases. The nuclear physics models, NUCFRG2 and QMSFRG, are compared to an experimental database consisting of over 3600 experimental cross sections to demonstrate the applicability of the metrics. A cumulative uncertainty metric is applied to the question of overall model accuracy, while a metric based on the median uncertainty is used to analyze the models from the perspective of model development by analyzing subsets of the model parameter space.
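
    As a sketch of how such database-wide metrics can be realized, assuming a plain relative-difference definition of per-point uncertainty (the paper's exact metric definitions may differ):

        import numpy as np

        def uncertainty_metrics(model, experiment):
            """Relative uncertainties of model cross sections vs. experiment.

            Uses |model - experiment| / experiment per data point; returns the
            cumulative (mean) and median metrics over the whole database.
            """
            model = np.asarray(model, dtype=float)
            experiment = np.asarray(experiment, dtype=float)
            rel = np.abs(model - experiment) / experiment
            return rel.mean(), np.median(rel)

        # Illustrative cross sections in millibarns.
        exp_xs = np.array([120.0, 45.0, 300.0, 78.0, 15.0])
        mod_xs = np.array([110.0, 50.0, 290.0, 90.0, 13.0])
        print(uncertainty_metrics(mod_xs, exp_xs))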

  11. Does the U.S. exercise contagion on Italy? A theoretical model and empirical evidence

    NASA Astrophysics Data System (ADS)

    Cerqueti, Roy; Fenga, Livio; Ventura, Marco

    2018-06-01

    This paper deals with the theme of contagion in financial markets. To this end, we develop a model based on Mixed Poisson Processes to describe the abnormal returns of the financial markets of two countries. In so doing, the article defines the theoretical conditions to be satisfied in order to state that one of them - the so-called leader - exercises contagion on the others - the followers. Specifically, we employ a probabilistic invariance result stating that a suitable transformation of a Mixed Poisson Process is still a Mixed Poisson Process. The theoretical claim is validated by implementing an extensive simulation analysis grounded on empirical data. The countries considered are the U.S. (as the leader) and Italy (as the follower), and the period under scrutiny is long, ranging from 1970 to 2014.
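
    A gamma-mixed Poisson process is easy to simulate, which helps make the leader/follower construction concrete; the sketch below couples two markets through a shared mixing variable (an illustrative construction, not the authors' exact specification):

        import numpy as np

        rng = np.random.default_rng(1)

        # Mixed Poisson counts: the Poisson intensity is itself random.
        # A mixing variable shared by the two markets induces co-movement
        # between their 'abnormal return' event counts.
        periods = 10_000
        shared = rng.gamma(shape=2.0, scale=0.5, size=periods)  # random scale
        leader = rng.poisson(3.0 * shared)     # leader market event counts
        follower = rng.poisson(1.5 * shared)   # follower market event counts
        print(np.corrcoef(leader, follower)[0, 1])  # clearly positive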

  12. Validation of the Activities of Community Transportation model for individuals with cognitive impairments.

    PubMed

    Sohlberg, McKay Moore; Fickas, Stephen; Lemoncello, Rik; Hung, Pei-Fang

    2009-01-01

    To develop a theoretical, functional model of community navigation for individuals with cognitive impairments: the Activities of Community Transportation (ACTs). Iterative design using qualitative methods (i.e. document review, focus groups and observations). Four agencies providing travel training to adults with cognitive impairments in the USA participated in the validation study. A thorough document review and series of focus groups led to the development of a comprehensive model (ACTs Wheels) delineating the requisite steps and skills for community navigation. The model was validated and updated based on observations of 395 actual trips by travellers with navigational challenges from the four participating agencies. Results revealed that the 'ACTs Wheel' models were complete and comprehensive. The 'ACTs Wheels' represent a comprehensive model of the steps needed to navigate to destinations using paratransit and fixed-route public transportation systems for travellers with cognitive impairments. Suggestions are made for future investigations of community transportation for this population.

  13. Decompression models: review, relevance and validation capabilities.

    PubMed

    Hugon, J

    2014-01-01

    For more than a century, several types of mathematical models have been proposed to describe tissue desaturation mechanisms in order to limit decompression sickness. These models are statistically assessed by DCS cases, and, over time, have gradually included bubble formation biophysics. This paper proposes to review this evolution and discuss its limitations. This review is organized around the comparison of decompression model biophysical criteria and theoretical foundations. Then, the DCS-predictive capability was analyzed to assess whether it could be improved by combining different approaches. Most of the operational decompression models have a neo-Haldanian form. Nevertheless, bubble modeling has been gaining popularity, and the circulating bubble amount has become a major output. By merging both views, it seems possible to build a relevant global decompression model that intends to simulate bubble production while predicting DCS risks for all types of exposures and decompression profiles. A statistical approach combining both DCS and bubble detection databases has to be developed to calibrate a global decompression model. Doppler ultrasound and DCS data are essential: i. to make correlation and validation phases reliable; ii. to adjust biophysical criteria to fit at best the observed bubble kinetics; and iii. to build a relevant risk function.

  14. Refining and validating a conceptual model of Clinical Nurse Leader integrated care delivery.

    PubMed

    Bender, Miriam; Williams, Marjory; Su, Wei; Hites, Lisle

    2017-02-01

    To empirically validate a conceptual model of Clinical Nurse Leader integrated care delivery. There is limited evidence of frontline care delivery models that consistently achieve quality patient outcomes. Clinical Nurse Leader integrated care delivery is a promising nursing model with a growing record of success. However, theoretical clarity is necessary to generate causal evidence of effectiveness. Sequential mixed methods. A preliminary Clinical Nurse Leader practice model was refined and survey items developed to correspond with model domains, using focus groups and a Delphi process with a multi-professional expert panel. The survey was administered in 2015 to clinicians and administrators involved in Clinical Nurse Leader initiatives. Confirmatory factor analysis and structural equation modelling were used to validate the measurement and model structure. Final sample n = 518. The model incorporates 13 components organized into five conceptual domains: 'Readiness for Clinical Nurse Leader integrated care delivery'; 'Structuring Clinical Nurse Leader integrated care delivery'; 'Clinical Nurse Leader Practice: Continuous Clinical Leadership'; 'Outcomes of Clinical Nurse Leader integrated care delivery'; and 'Value'. Sample data had good fit with specified model and two-level measurement structure. All hypothesized pathways were significant, with strong coefficients suggesting good fit between theorized and observed path relationships. The validated model articulates an explanatory pathway of Clinical Nurse Leader integrated care delivery, including Clinical Nurse Leader practices that result in improved care dynamics and patient outcomes. The validated model provides a basis for testing in practice to generate evidence that can be deployed across the healthcare spectrum. © 2016 John Wiley & Sons Ltd.

  15. A Detection-Theoretic Model of Echo Inhibition

    ERIC Educational Resources Information Center

    Saberi, Kourosh; Petrosyan, Agavni

    2004-01-01

    A detection-theoretic analysis of the auditory localization of dual-impulse stimuli is described, and a model for the processing of spatial cues in the echo pulse is developed. Although for over 50 years "echo suppression" has been the topic of intense theoretical and empirical study within the hearing sciences, only a rudimentary understanding of…

  16. Theoretical Modeling and Electromagnetic Response of Complex Metamaterials

    DTIC Science & Technology

    2017-03-06

    Final report AFRL-AFOSR-VA-TR-2017-0042, Andrea Alu, University of Texas at Austin, Nov 2016. The work produced results based on parity-time symmetric metasurfaces and various advances in electromagnetic and acoustic theory and applications.

  17. Base Flow Model Validation

    NASA Technical Reports Server (NTRS)

    Sinha, Neeraj; Brinckman, Kevin; Jansen, Bernard; Seiner, John

    2011-01-01

    A method was developed for obtaining propulsive base flow data in both hot and cold jet environments, at Mach numbers and altitudes of relevance to NASA launcher designs. The base flow data were used to perform computational fluid dynamics (CFD) turbulence model assessments of base flow predictive capabilities in order to provide increased confidence in base thermal and pressure load predictions obtained from computational modeling efforts. Predictive CFD analyses were used in the design of the experiments, available propulsive models were used to reduce program costs and increase the likelihood of success, and a wind tunnel facility was used. The data obtained allowed assessment of CFD/turbulence models in a complex flow environment, working within a building-block procedure to validation, where cold, non-reacting test data were first used for validation, followed by more complex reacting base flow validation.

  18. Theoretical validation for changing magnetic fields of systems of permanent magnets of drum separators

    NASA Astrophysics Data System (ADS)

    Lozovaya, S. Y.; Lozovoy, N. M.; Okunev, A. N.

    2018-03-01

    This article is devoted to the theoretical validation of the change in magnetic fields created by the permanent magnet systems of the drum separators. In the article, using the example of a magnetic separator for enrichment of highly magnetic ores, the method of analytical calculation of the magnetic fields of systems of permanent magnets based on the Biot-Savart-Laplace law, the equivalent solenoid method, and the superposition principle of fields is considered.

  19. Noninvasive Determination of Bone Mechanical Properties using Vibration Response: A Refined Model and Validation in vivo

    NASA Technical Reports Server (NTRS)

    Roberts, S. G.; Hutchinson, T. M.; Arnaud, S. B.; Steele, C. R.; Kiratli, B. J.; Martin, R. B.

    1996-01-01

    Accurate non-invasive mechanical measurement of long bones is made difficult by the masking effect of surrounding soft tissues. Mechanical Response Tissue Analysis (MRTA) offers a method for separating the effects of the soft tissue and bone; however, a direct validation has been lacking. A theoretical analysis of wave propagation through the compressed tissue revealed a strong mass effect dependent on the relative accelerations of the probe and bone. The previous mathematical model of the bone and overlying tissue system was reconfigured to incorporate the theoretical finding. This newer model (six-parameter) was used to interpret results using MRTA to determine bone cross-sectional bending stiffness, EI_MRTA. The relationship between EI_MRTA and theoretical EI values for padded aluminum rods was R² = 0.999. A biological validation followed using monkey tibias. Each bone was tested in vivo with the MRTA instrument. Postmortem, the same tibias were excised and tested to failure in three-point bending to determine EI_3-PT and maximum load. Diaphyseal Bone Mineral Density (BMD) measurements were also made. The relationship between EI_3-PT and in vivo EI_MRTA using the six-parameter model is strong (R² = 0.947) and better than that using the older model (R² = 0.645). EI_MRTA and BMD are also highly correlated (R² = 0.853). MRTA measurements in vivo and BMD ex vivo are both good predictors of scaled maximum strength (R² = 0.915 and R² = 0.894, respectively). This is the first biological validation of a non-invasive mechanical measurement of bone by comparison to actual values. The MRTA technique has potential clinical value for assessing long-bone mechanical properties.
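
    For reference, the bench value EI_3-PT follows from standard beam theory for center-loaded three-point bending of a simply supported span L (a background relation, not a result of the paper):

        % Center-loaded three-point bending of a simply supported span L:
        % midspan deflection delta under load F yields the bending stiffness.
        \delta = \frac{F L^{3}}{48\,EI}
        \quad\Longrightarrow\quad
        EI_{3\text{-}PT} = \frac{L^{3}}{48}\,\frac{\Delta F}{\Delta \delta}.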

  20. Noninvasive Determination of Bone Mechanical Properties Using Vibration Response: A Refined Model and Validation in vivo

    NASA Technical Reports Server (NTRS)

    Roberts, S. G.; Hutchinson, T. M.; Arnaud, S. B.; Kiratli, B. J; Steele, C. R.

    1996-01-01

    Accurate non-invasive mechanical measurement of long bones is made difficult by the masking effect of surrounding soft tissues. Mechanical response tissue analysis (MRTA) offers a method for separating the effects of the soft tissue and bone; however, a direct validation has been lacking. A theoretical analysis of wave propagation through the compressed tissue revealed a strong mass effect dependent on the relative accelerations of the probe and bone. The previous mathematical model of the bone and overlying tissue system was reconfigured to incorporate the theoretical finding. This newer model (six-parameter) was used to interpret results using MRTA to determine bone cross-sectional bending stiffness, EI_MRTA. The relationship between EI_MRTA and theoretical EI values for padded aluminum rods was R² = 0.999. A biological validation followed using monkey tibias. Each bone was tested in vivo with the MRTA instrument. Postmortem, the same tibias were excised and tested to failure in three-point bending to determine EI_3-PT and maximum load. Diaphyseal bone mineral density (BMD) measurements were also made. The relationship between EI_3-PT and in vivo EI_MRTA using the six-parameter model is strong (R² = 0.947) and better than that using the older model (R² = 0.645). EI_MRTA and BMD are also highly correlated (R² = 0.853). MRTA measurements in vivo and BMD ex vivo are both good predictors of scaled maximum strength (R² = 0.915 and R² = 0.894, respectively). This is the first biological validation of a non-invasive mechanical measurement of bone by comparison to actual values. The MRTA technique has potential clinical value for assessing long-bone mechanical properties.

  1. Theoretical models of helicopter rotor noise

    NASA Technical Reports Server (NTRS)

    Hawkings, D. L.

    1978-01-01

    For low speed rotors, it is shown that unsteady load models are only partially successful in predicting experimental levels. A theoretical model is presented which leads to the concept of unsteady thickness noise. This gives better agreement with test results. For high speed rotors, it is argued that present models are incomplete and that other mechanisms are at work. Some possibilities are briefly discussed.

  2. A Method of Q-Matrix Validation for the Linear Logistic Test Model

    PubMed Central

    Baghaei, Purya; Hohensinn, Christine

    2017-01-01

    The linear logistic test model (LLTM) is a well-recognized psychometric model for examining the components of difficulty in cognitive tests and validating construct theories. The plausibility of the construct model, summarized in a matrix of weights known as the Q-matrix or weight matrix, is tested by (1) comparing the fit of the LLTM with the fit of the Rasch model (RM) using the likelihood ratio (LR) test and (2) examining the correlation between the Rasch model item parameters and the LLTM-reconstructed item parameters. The problem with the LR test is that it is almost always significant and, consequently, the LLTM is rejected. The drawback of examining the correlation coefficient is that there is no cut-off value or lower bound for the magnitude of the correlation coefficient. In this article we suggest a simulation method to set a minimum benchmark for the correlation between item parameters from the Rasch model and those reconstructed by the LLTM. If the cognitive model is valid, then the correlation coefficient between the RM-based item parameters and the LLTM-reconstructed item parameters derived from the theoretical weight matrix should be greater than those derived from the simulated matrices. PMID:28611721
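
    A minimal sketch of the proposed benchmark, assuming least-squares estimation of the basic parameters and random integer weight matrices of the same shape as the theoretical Q-matrix (all values illustrative):

        import numpy as np

        rng = np.random.default_rng(42)

        def lltm_reconstruction_r(b, Q):
            """Correlation between Rasch item difficulties b and the LLTM
            reconstruction Q @ eta, with eta estimated by least squares."""
            eta, *_ = np.linalg.lstsq(Q, b, rcond=None)
            return np.corrcoef(b, Q @ eta)[0, 1]

        # Illustrative setup: 20 items, 4 cognitive operations, weights 0-3.
        Q_theory = rng.integers(0, 4, size=(20, 4)).astype(float)
        true_eta = np.array([0.8, -0.3, 0.5, 0.2])
        b = Q_theory @ true_eta + rng.normal(scale=0.15, size=20)  # "RM" values

        r_obs = lltm_reconstruction_r(b, Q_theory)
        # Benchmark: correlations achieved by random Q-matrices of same shape.
        r_sim = [lltm_reconstruction_r(b,
                                       rng.integers(0, 4, size=(20, 4)).astype(float))
                 for _ in range(1000)]
        print(round(r_obs, 3), round(float(np.percentile(r_sim, 95)), 3))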

  3. Theoretical model for plasmonic photothermal response of gold nanostructures solutions

    NASA Astrophysics Data System (ADS)

    Phan, Anh D.; Nga, Do T.; Viet, Nguyen A.

    2018-03-01

    Photothermal effects of gold core-shell nanoparticles and nanorods dispersed in water are theoretically investigated using the transient bioheat equation and the extended Mie theory. Properly calculating the absorption cross section is crucial for determining the temperature rise of the solution. The nanostructures are assumed to be randomly and uniformly distributed in the solution. Compared to previous experiments, the theoretical temperature increase during laser illumination shows reasonable qualitative and quantitative agreement across a variety of systems. This approach can be a highly reliable tool to predict photothermal effects in experimentally unexplored structures. We also validate our approach and discuss its limitations.
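
    A lumped-parameter caricature of the heating balance shows the shape of the temperature rise; this is a single well-mixed volume with Newtonian cooling, not the authors' full transient bioheat treatment, and all values are illustrative:

        import numpy as np

        # Lumped energy balance for an illuminated nanoparticle solution:
        #   C * dT/dt = P_abs - hA * (T - T_amb),   C = rho * c * V
        # where P_abs = N * sigma_abs * I (particle count x absorption cross
        # section x laser intensity). All numbers below are illustrative.
        C = 4186.0 * 1e-3     # J/K, heat capacity of ~1 mL (1 g) of water
        hA = 0.005            # W/K, effective cooling conductance
        P_abs = 0.05          # W, total absorbed laser power

        tau = C / hA                  # thermal time constant, s
        dT_max = P_abs / hA           # steady-state temperature rise, K
        t = np.linspace(0.0, 5 * tau, 6)
        dT = dT_max * (1.0 - np.exp(-t / tau))
        print(f"tau = {tau:.0f} s, rise = {dT_max:.1f} K, profile = {np.round(dT, 1)}")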

  4. Experimentally validated quantitative linear model for the device physics of elastomeric microfluidic valves

    NASA Astrophysics Data System (ADS)

    Kartalov, Emil P.; Scherer, Axel; Quake, Stephen R.; Taylor, Clive R.; Anderson, W. French

    2007-03-01

    A systematic experimental study and theoretical modeling of the device physics of polydimethylsiloxane "pushdown" microfluidic valves are presented. The phase space is charted by 1587 dimension combinations and encompasses 45-295 μm lateral dimensions, 16-39 μm membrane thickness, and 1-28 psi closing pressure. Three linear models are developed and tested against the empirical data, and then combined into a fourth-power-polynomial superposition. The experimentally validated final model offers a useful quantitative prediction for a valve's properties as a function of its dimensions. Typical valves (80-150 μm width) are shown to behave like thin springs.

  5. Theoretical Models, Assessment Frameworks and Test Construction.

    ERIC Educational Resources Information Center

    Chalhoub-Deville, Micheline

    1997-01-01

    Reviews the usefulness of proficiency models influencing second language testing. Findings indicate that several factors contribute to the lack of congruence between models and test construction and make a case for distinguishing between theoretical models. Underscores the significance of an empirical, contextualized and structured approach to the…

  6. Validation of Groundwater Models: Meaningful or Meaningless?

    NASA Astrophysics Data System (ADS)

    Konikow, L. F.

    2003-12-01

    Although numerical simulation models are valuable tools for analyzing groundwater systems, their predictive accuracy is limited. People who apply groundwater flow or solute-transport models, as well as those who make decisions based on model results, naturally want assurance that a model is "valid." To many people, model validation implies some authentication of the truth or accuracy of the model. History matching is often presented as the basis for model validation. Although such model calibration is a necessary modeling step, it is simply insufficient for model validation. Because of parameter uncertainty and solution non-uniqueness, declarations of validation (or verification) of a model are not meaningful. Post-audits represent a useful means to assess the predictive accuracy of a site-specific model, but they require the existence of long-term monitoring data. Model testing may yield invalidation, but that is an opportunity to learn and to improve the conceptual and numerical models. Examples of post-audits and of the application of a solute-transport model to a radioactive waste disposal site illustrate deficiencies in model calibration, prediction, and validation.

  7. Developing a theoretical framework for complex community-based interventions.

    PubMed

    Angeles, Ricardo N; Dolovich, Lisa; Kaczorowski, Janusz; Thabane, Lehana

    2014-01-01

    Applying existing theories to research, in the form of a theoretical framework, is necessary to advance knowledge from what is already known toward the next steps to be taken. This article proposes a guide on how to develop a theoretical framework for complex community-based interventions, using the Cardiovascular Health Awareness Program as an example. Developing a theoretical framework starts with identifying the intervention's essential elements. Subsequent steps include the following: (a) identifying and defining the different variables (independent, dependent, mediating/intervening, moderating, and control); (b) postulating mechanisms by which the independent variables lead to the dependent variables; (c) identifying existing theoretical models supporting the theoretical framework under development; (d) scripting the theoretical framework into a figure or sets of statements as a series of hypotheses, if/then logic statements, or a visual model; (e) content and face validation of the theoretical framework; and (f) revising the theoretical framework. In our example, we combined the "diffusion of innovation theory" and the "health belief model" to develop our framework. Using the Cardiovascular Health Awareness Program as the model, we demonstrated a stepwise process of developing a theoretical framework. The challenges encountered are described, and an overview of the strategies employed to overcome them is presented.

  8. Testing and validating environmental models

    USGS Publications Warehouse

    Kirchner, J.W.; Hooper, R.P.; Kendall, C.; Neal, C.; Leavesley, G.

    1996-01-01

    Generally accepted standards for testing and validating ecosystem models would benefit both modellers and model users. Universally applicable test procedures are difficult to prescribe, given the diversity of modelling approaches and the many uses for models. However, the generally accepted scientific principles of documentation and disclosure provide a useful framework for devising general standards for model evaluation. Adequately documenting model tests requires explicit performance criteria, and explicit benchmarks against which model performance is compared. A model's validity, reliability, and accuracy can be most meaningfully judged by explicit comparison against the available alternatives. In contrast, current practice is often characterized by vague, subjective claims that model predictions show 'acceptable' agreement with data; such claims provide little basis for choosing among alternative models. Strict model tests (those that invalid models are unlikely to pass) are the only ones capable of convincing rational skeptics that a model is probably valid. However, 'false positive' rates as low as 10% can substantially erode the power of validation tests, making them insufficiently strict to convince rational skeptics. Validation tests are often undermined by excessive parameter calibration and overuse of ad hoc model features. Tests are often also divorced from the conditions under which a model will be used, particularly when it is designed to forecast beyond the range of historical experience. In such situations, data from laboratory and field manipulation experiments can provide particularly effective tests, because one can create experimental conditions quite different from historical data, and because experimental data can provide a more precisely defined 'target' for the model to hit. We present a simple demonstration showing that the two most common methods for comparing model predictions to environmental time series (plotting model time series
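
    One concrete way to make the "explicit comparison against the available alternatives" advocated here is a skill score that measures a model against a stated benchmark; a minimal sketch with illustrative data:

        import numpy as np

        def skill_score(obs, model, baseline):
            """Skill relative to an explicit alternative:
            1 - MSE(model)/MSE(baseline). Positive values mean the model
            beats the benchmark; zero or negative values mean a bare
            'acceptable agreement' claim has no comparative support."""
            obs, model, baseline = map(np.asarray, (obs, model, baseline))
            mse = lambda pred: np.mean((obs - pred) ** 2)
            return 1.0 - mse(model) / mse(baseline)

        # Illustrative: benchmark = climatology (mean) of the observed series.
        obs = np.array([3.1, 2.8, 4.0, 3.6, 2.9, 3.3])
        model = np.array([3.0, 2.9, 3.8, 3.7, 3.0, 3.2])
        climatology = np.full_like(obs, obs.mean())
        print(round(skill_score(obs, model, climatology), 3))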

  9. Allostatic load: A theoretical model for understanding the relationship between maternal posttraumatic stress disorder and adverse birth outcomes.

    PubMed

    Li, Yang; Rosemberg, Marie-Anne Sanon; Seng, Julia S

    2018-07-01

    Adverse birth outcomes such as preterm birth and low birth weight are significant public health concerns and contribute to neonatal morbidity and mortality. Studies have increasingly been exploring the predictive effects of maternal posttraumatic stress disorder (PTSD) on adverse birth outcomes. However, the biological mechanisms by which maternal PTSD affects birth outcomes are not well understood. Allostatic load refers to the cumulative dysregulation of multiple physiological systems in response to chronic stress at multiple social-ecological levels. Allostatic load has been well documented in relation to both chronic stress and adverse health outcomes in non-pregnant populations. However, the mediating role of allostatic load is less understood when it comes to maternal PTSD and adverse birth outcomes. To propose a theoretical model that depicts how allostatic load could mediate the impact of maternal PTSD on birth outcomes. We followed the theory synthesis procedure described by Walker and Avant (2011), including specifying focal concepts, identifying related factors and relationships, and constructing an integrated representation. We first present a theoretical overview of the allostatic load theory and the other four relevant theoretical models. Then we provide a brief narrative review of the literature that empirically supports the propositions of the integrated model. Finally, we describe our theoretical model. The synthesized theoretical model has the potential to advance perinatal research by delineating multiple biomarkers to be used in future studies. After it is well validated, it could be utilized as the theoretical basis for health care professionals to identify high-risk women by evaluating their experiences of psychosocial and traumatic stress, and to develop and evaluate service delivery and clinical interventions that might modify maternal perceptions or experiences of stress and eliminate their impacts on adverse birth outcomes.

  10. From control to causation: Validating a 'complex systems model' of running-related injury development and prevention.

    PubMed

    Hulme, A; Salmon, P M; Nielsen, R O; Read, G J M; Finch, C F

    2017-11-01

    There is a need for an ecological and complex systems approach for better understanding the development and prevention of running-related injury (RRI). In a previous article, we proposed a prototype model of the Australian recreational distance running system based on the Systems Theoretic Accident Model and Processes (STAMP) method. That model included the influence of political, organisational, managerial, and sociocultural determinants alongside individual-level factors in relation to RRI development. The purpose of this study was to validate that prototype model by drawing on the expertise of both systems thinking and distance running experts. This study used a modified Delphi technique involving a series of online surveys (December 2016-March 2017). The initial survey was divided into four sections containing a total of seven questions pertaining to different features of the prototype model. Consensus about the validity of the prototype model was reached when the number of experts who agreed or disagreed with a survey statement was ≥75% of the total number of respondents. A total of two Delphi rounds were needed to validate the prototype model. Out of a total of 51 experts who were initially contacted, 50.9% (n = 26) completed the first round of the Delphi, and 92.3% (n = 24) of those in the first round participated in the second. Most of the 24 full participants considered themselves to be a running expert (66.7%), and approximately a third indicated their expertise as a systems thinker (33.3%). After the second round, 91.7% of the experts agreed that the prototype model was a valid description of the Australian distance running system. This is the first study to formally examine the development and prevention of RRI from an ecological and complex systems perspective. The validated model of the Australian distance running system facilitates theoretical advancement in terms of identifying practical system

  11. [Cognitive Reserve Scale: testing the theoretical model and norms].

    PubMed

    Leon-Estrada, I; Garcia-Garcia, J; Roldan-Tapia, L

    2017-01-01

    The cognitive reserve theory may help explain differences in cognitive performance among individuals with similar cognitive decline, as well as among healthy ones. However, more psychometric analyses are needed to guarantee the usage of tests for assessing cognitive reserve. To study validity evidence in relation to the structure of the Cognitive Reserve Scale (CRS) and to create reference norms to interpret the scores. A total of 172 participants completed the scale and were classified into two age groups: 36-64 years (n = 110) and 65-88 years (n = 62). The exploratory factor analysis using ESEM revealed that the data fitted the proposed model. Overall, the discriminative indices were acceptable (between 0.21 and 0.50), and congruence was observed in the periods of young adulthood, adulthood, and late adulthood in both age groups. Besides, the reliability index (Cronbach's alpha: 0.80) and the standard error of measurement (mean: 51.40 ± 11.11) showed adequate values for this type of instrument. The CRS appears to conform to the hypothesized theoretical model, and the scores may be interpreted using the norms presented. This study provides guarantees for the usage of the CRS in research.

  12. Empathy and child neglect: a theoretical model.

    PubMed

    De Paul, Joaquín; Guibert, María

    2008-11-01

    To present an explanatory theory-based model of child neglect. This model does not address neglectful behaviors of parents with mental retardation, alcohol or drug abuse, or severe mental health problems. In this model, parental behavior aimed at satisfying a child's need is considered a helping behavior and, as a consequence, child neglect is considered a specific type of non-helping behavior. The central hypothesis of the theoretical model presented here suggests that neglectful parents cannot develop the helping response set to care for their children, either because the observation of a child's signal of need does not lead to the experience of emotions that motivate helping, or because the parents experience these emotions but specific cognitions modify the motivation to help. The present theoretical model suggests that different typologies of neglectful parents could be developed based on the different reasons why parents might not experience emotions that motivate helping behaviors. The model can be helpful in promoting new empirical studies on the etiology of different groups of neglectful families.

  13. Simplified and quick electrical modeling for dye sensitized solar cells: An experimental and theoretical investigation

    NASA Astrophysics Data System (ADS)

    de Andrade, Rocelito Lopes; de Oliveira, Matheus Costa; Kohlrausch, Emerson Cristofer; Santos, Marcos José Leite

    2018-05-01

    This work presents a new and simple method for determining IPH (current source dependent on luminance), I0 (reverse saturation current), n (ideality factor), and RP and RS (parallel and series resistances) to build an electrical model for dye sensitized solar cells (DSSCs). The electrical circuit parameters used in the simulation and to generate theoretical curves for the single-diode electrical model were extracted from I-V curves of assembled DSSCs. Model validation was performed by assembling five different types of DSSCs and evaluating the following parameters: the effect of a TiO2 blocking/adhesive layer, the thickness of the TiO2 layer, and the presence of a light scattering layer. In addition, irradiance, temperature, series and parallel resistance, ideality factor, and reverse saturation current were simulated.
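
    For reference, the single-diode model underlying the method can be evaluated by solving its implicit current equation; the sketch below uses Brent's method, with illustrative parameter values rather than the paper's fitted ones:

        import numpy as np
        from scipy.optimize import brentq

        def diode_current(V, IPH, I0, n, Rs, Rp, T=298.15):
            """Terminal current of the single-diode model, solving
            I = IPH - I0*(exp((V + I*Rs)/(n*Vt)) - 1) - (V + I*Rs)/Rp for I."""
            Vt = 1.380649e-23 * T / 1.602176634e-19   # thermal voltage kT/q
            f = lambda I: (IPH - I0 * (np.exp((V + I * Rs) / (n * Vt)) - 1.0)
                           - (V + I * Rs) / Rp - I)
            # For 0 <= V <= Voc the root lies between 0 and the photocurrent.
            return brentq(f, 0.0, IPH)

        # Illustrative parameters for a small DSSC (not fitted values).
        params = dict(IPH=0.012, I0=1e-9, n=2.0, Rs=20.0, Rp=3000.0)
        for V in (0.0, 0.3, 0.5, 0.65):
            print(f"V = {V:.2f} V, I = {diode_current(V, **params) * 1e3:.3f} mA")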

  14. Fundamental relationship between the noise properties of grating-based differential phase contrast CT and absorption CT: theoretical framework using a cascaded system model and experimental validation.

    PubMed

    Li, Ke; Bevins, Nicholas; Zambelli, Joseph; Chen, Guang-Hong

    2013-02-01

    Using a grating interferometer, a conventional x-ray cone beam computed tomography (CT) data acquisition system can be used to simultaneously generate both conventional absorption CT (ACT) and differential phase contrast CT (DPC-CT) images from a single data acquisition. Since the two CT images were extracted from the same set of x-ray projections, it is expected that intrinsic relationships exist between the noise properties of the two contrast mechanisms. The purpose of this paper is to investigate these relationships. First, a theoretical framework was developed using a cascaded system model analysis to investigate the relationship between the noise power spectra (NPS) of DPC-CT and ACT. Based on the derived analytical expressions of the NPS, the relationship between the spatial-frequency-dependent noise equivalent quanta (NEQ) of DPC-CT and ACT was derived. From these fundamental relationships, the NPS and NEQ of the DPC-CT system can be derived from the corresponding ACT system or vice versa. To validate these theoretical relationships, a benchtop cone beam DPC-CT/ACT system was used to experimentally measure the modulation transfer function (MTF) and NPS of both DPC-CT and ACT. The measured three-dimensional (3D) MTF and NPS were then combined to generate the corresponding 3D NEQ. Two fundamental relationships have been theoretically derived and experimentally validated for the NPS and NEQ of DPC-CT and ACT: (1) the 3D NPS of DPC-CT is quantitatively related to the corresponding 3D NPS of ACT by an in-plane-only spatial-frequency-dependent factor 1/f², the ratio of window functions applied to DPC-CT and ACT, and a numerical factor C_g determined by the geometry and efficiency of the grating interferometer. Note that the frequency-dependent factor is independent of the frequency component f_z perpendicular to the axial plane. (2) The 3D NEQ of DPC-CT is related to the corresponding 3D NEQ of ACT by an f² scaling factor and numerical factors that
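
    Written out schematically in LaTeX, with the in-plane frequency f, axial frequency f_z, window functions W, and interferometer factor C_g named above (the exact numerical factors are given in the paper):

        % Stated NPS and NEQ relationships between DPC-CT and ACT:
        \mathrm{NPS}_{\mathrm{DPC}}(f, f_z)
          = C_g\,\frac{W_{\mathrm{DPC}}}{W_{\mathrm{ACT}}}\,
            \frac{1}{f^{2}}\,\mathrm{NPS}_{\mathrm{ACT}}(f, f_z),
        \qquad
        \mathrm{NEQ}_{\mathrm{DPC}}(f, f_z)
          \propto f^{2}\,\mathrm{NEQ}_{\mathrm{ACT}}(f, f_z).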

  15. Validation, Optimization and Simulation of a Solar Thermoelectric Generator Model

    NASA Astrophysics Data System (ADS)

    Madkhali, Hadi Ali; Hamil, Ali; Lee, HoSung

    2017-12-01

    This study explores thermoelectrics as a viable option for small-scale solar thermal applications. Thermoelectric technology is based on the Seebeck effect, which states that a voltage is induced when a temperature gradient is applied to the junctions of two differing materials. This research proposes to analyze, validate, simulate, and optimize a prototype solar thermoelectric generator (STEG) model in order to increase efficiency. The intent is to further develop STEGs as a viable and productive energy source that limits pollution and reduces the cost of energy production. An empirical study (Kraemer et al. in Nat Mater 10:532, 2011) on the solar thermoelectric generator reported a high efficiency performance of 4.6%. The system had a vacuum glass enclosure, a flat panel (absorber), a thermoelectric generator, and water circulation for the cold side. The theoretical and numerical approach of the current study validated the experimental results from Kraemer's study to a high degree. The numerical simulation process utilizes a two-stage approach in ANSYS software for Fluent and Thermal-Electric Systems. The solar load model technique uses solar radiation under AM 1.5G conditions in Fluent. This analytical model applies Dr. Ho Sung Lee's theory of optimal design to improve the performance of the STEG system by using dimensionless parameters. Applying this theory, using two cover glasses and radiation shields, the STEG model can achieve a maximum efficiency of 7%.
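
    For background, the textbook ceiling on thermoelectric conversion efficiency combines the Carnot factor with the device figure of merit ZT; this standard relation (not a result of the paper) is:

        % Maximum conversion efficiency of a thermoelectric generator between
        % hot-side T_h and cold-side T_c with mean figure of merit Z\bar{T}:
        \eta_{\max}
          = \frac{T_h - T_c}{T_h}\,
            \frac{\sqrt{1 + Z\bar{T}} - 1}{\sqrt{1 + Z\bar{T}} + T_c / T_h},
        \qquad
        \bar{T} = \frac{T_h + T_c}{2}.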

  16. Development and Validation of a Mathematical Model for Olive Oil Oxidation

    NASA Astrophysics Data System (ADS)

    Rahmouni, K.; Bouhafa, H.; Hamdi, S.

    2009-03-01

    A mathematical model describing the stability, or susceptibility to oxidation, of extra virgin olive oil has been developed. The model was solved by an iterative finite-difference method. It was validated with experimental data on extra virgin olive oil (EVOO) oxidation. EVOO stability was tested using a Rancimat at four different temperatures (60, 70, 80 and 90 °C) until peroxide accumulation reached 20 meq/kg. Peroxide formation is relatively slow and fits a zero-order reaction, with linear regression coefficients varying from 0.98 to 0.99. The mathematical model was used to predict the shelf life of bulk-conditioned olive oil. This model describes peroxide accumulation inside a container in excess of oxygen as a function of time at various positions from the air/oil interface. Good correlations were obtained between theoretical and experimental values.
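
    The shelf-life arithmetic implied by zero-order accumulation with an Arrhenius temperature dependence is short; in the sketch below the rate-law form follows the abstract, while A and Ea are assumed illustrative values, not the fitted ones:

        import numpy as np

        # Zero-order peroxide accumulation: PV(t) = PV0 + k(T) * t, with an
        # Arrhenius rate constant k(T) = A * exp(-Ea / (R * T)).
        R = 8.314          # J/(mol K)
        A = 2.0e8          # meq/(kg h), pre-exponential factor (assumed)
        Ea = 60_000.0      # J/mol, activation energy (assumed)

        def shelf_life_hours(T_celsius, pv0=1.0, pv_limit=20.0):
            """Time for the peroxide value to climb from pv0 to pv_limit."""
            k = A * np.exp(-Ea / (R * (T_celsius + 273.15)))
            return (pv_limit - pv0) / k

        for T in (20, 60, 70, 80, 90):
            print(f"{T} degC: {shelf_life_hours(T):.0f} h")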

  17. Testing a Theoretical Model of Immigration Transition and Physical Activity.

    PubMed

    Chang, Sun Ju; Im, Eun-Ok

    2015-01-01

    The purposes of the study were to develop a theoretical model to explain the relationships between immigration transition and midlife women's physical activity and to test the relationships among the major variables of the model. A theoretical model, developed based on transitions theory and the midlife women's attitudes toward physical activity theory, consists of four major variables: length of stay in the United States, country of birth, level of acculturation, and midlife women's physical activity. To test the theoretical model, a secondary analysis was conducted with data from 127 Hispanic women and 123 non-Hispanic (NH) Asian women in a national Internet study. Among the major variables of the model, length of stay in the United States was negatively associated with physical activity in Hispanic women. Level of acculturation in NH Asian women was positively correlated with women's physical activity. Country of birth and level of acculturation were significant factors that influenced physical activity in both Hispanic and NH Asian women. The findings support the theoretical model that was developed to examine relationships between immigration transition and physical activity; it shows that immigration transition can play an essential role in influencing health behaviors of immigrant populations in the United States. The theoretical model can be widely used in nursing practice and research that focus on immigrant women and their health behaviors. Health care providers need to consider the influences of immigration transition to promote immigrant women's physical activity.

  18. Theoretical relationship between vibration transmissibility and driving-point response functions of the human body.

    PubMed

    Dong, Ren G; Welcome, Daniel E; McDowell, Thomas W; Wu, John Z

    2013-11-25

    The relationship between the vibration transmissibility and driving-point response functions (DPRFs) of the human body is important for understanding vibration exposures of the system and for developing valid models. This study identified their theoretical relationship and demonstrated that the sum of the DPRFs can be expressed as a linear combination of the transmissibility functions of the individual mass elements distributed throughout the system. The relationship is verified using several human vibration models. This study also clarified the requirements for reliably quantifying transmissibility values used as references for calibrating the system models. As an example application, this study used the developed theory to perform a preliminary analysis of the method for calibrating models using both vibration transmissibility and DPRFs. The results of the analysis show that the combined method can theoretically result in a unique and valid solution of the model parameters, at least for linear systems. However, the validation of the method itself does not guarantee the validation of the calibrated model, because the validation of the calibration also depends on the model structure and the reliability and appropriate representation of the reference functions. The basic theory developed in this study is also applicable to the vibration analyses of other structures.
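
    Assuming the DPRF is expressed as an apparent mass at the driving point, the identified relationship can be written schematically as:

        % Driving-point apparent mass as a mass-weighted sum of the element
        % transmissibilities (a_0: driving-point acceleration; a_i: element i):
        M(\omega) = \frac{F(\omega)}{a_0(\omega)}
                  = \sum_{i=1}^{N} m_i\,T_i(\omega),
        \qquad
        T_i(\omega) = \frac{a_i(\omega)}{a_0(\omega)}.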

  19. Expanding Panjabi's stability model to express movement: a theoretical model.

    PubMed

    Hoffman, J; Gabel, P

    2013-06-01

    Novel theoretical models of movement have historically inspired the creation of new methods for the application of human movement. The landmark theoretical model of spinal stability by Panjabi in 1992 led to the creation of an exercise approach to spinal stability. This approach however was later challenged, most significantly due to a lack of favourable clinical effect. The concepts explored in this paper address and consider the deficiencies of Panjabi's model then propose an evolution and expansion from a special model of stability to a general one of movement. It is proposed that two body-wide symbiotic elements are present within all movement systems, stability and mobility. The justification for this is derived from the observable clinical environment. It is clinically recognised that these two elements are present and identifiable throughout the body in different joints and muscles, and the neural conduction system. In order to generalise the Panjabi model of stability to include and illustrate movement, a matching parallel mobility system with the same subsystems was conceptually created. In this expanded theoretical model, the new mobility system is placed beside the existing stability system and subsystems. The ability of both stability and mobility systems to work in harmony will subsequently determine the quality of movement. Conversely, malfunction of either system, or their subsystems, will deleteriously affect all other subsystems and consequently overall movement quality. For this reason, in the rehabilitation exercise environment, focus should be placed on the simultaneous involvement of both the stability and mobility systems. It is suggested that the individual's relevant functional harmonious movements should be challenged at the highest possible level without pain or discomfort. It is anticipated that this conceptual expansion of the theoretical model of stability to one with the symbiotic inclusion of mobility, will provide new understandings

  20. Three-dimensional localized coherent structures of surface turbulence: Model validation with experiments and further computations.

    PubMed

    Demekhin, E A; Kalaidin, E N; Kalliadasis, S; Vlaskin, S Yu

    2010-09-01

    We validate experimentally the Kapitsa-Shkadov model utilized in the theoretical studies by Demekhin [Phys. Fluids 19, 114103 (2007), doi:10.1063/1.2793148; Phys. Fluids 19, 114104 (2007), doi:10.1063/1.2793149] of surface turbulence on a thin liquid film flowing down a vertical planar wall. For water at 15 °C, surface turbulence typically occurs at an inlet Reynolds number of ≃40. Of particular interest is to assess experimentally the predictions of the model for three-dimensional nonlinear localized coherent structures, which represent elementary processes of surface turbulence. For this purpose we devise simple experiments to investigate the instabilities and transitions leading to such structures. Our experimental results are in good agreement with the theoretical predictions of the model. We also perform time-dependent computations for the formation of coherent structures and their interaction with localized structures of smaller amplitude on the surface of the film.

  1. Ground-water models: Validate or invalidate

    USGS Publications Warehouse

    Bredehoeft, J.D.; Konikow, Leonard F.

    1993-01-01

    The word validation has a clear meaning to both the scientific community and the general public. Within the scientific community the validation of scientific theory has been the subject of philosophical debate. The philosopher of science, Karl Popper, argued that scientific theory cannot be validated, only invalidated. Popper’s view is not the only opinion in this debate; however, many scientists today agree with Popper (including the authors). To the general public, proclaiming that a ground-water model is validated carries with it an aura of correctness that we do not believe many of us who model would claim. We can place all the caveats we wish, but the public has its own understanding of what the word implies. Using the word valid with respect to models misleads the public; verification carries with it similar connotations as far as the public is concerned. Our point is this: using the terms validation and verification are misleading, at best. These terms should be abandoned by the ground-water community.

  2. Empirical agreement in model validation.

    PubMed

    Jebeile, Julie; Barberousse, Anouk

    2016-04-01

    Empirical agreement is often used as an important criterion when assessing the validity of scientific models. However, it is by no means a sufficient criterion as a model can be so adjusted as to fit available data even though it is based on hypotheses whose plausibility is known to be questionable. Our aim in this paper is to investigate into the uses of empirical agreement within the process of model validation. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. Theoretical foundation, methods, and criteria for calibrating human vibration models using frequency response functions

    PubMed Central

    Dong, Ren G.; Welcome, Daniel E.; McDowell, Thomas W.; Wu, John Z.

    2015-01-01

    While simulations of the measured biodynamic responses of the whole human body or body segments to vibration are conventionally interpreted as summaries of biodynamic measurements, and the resulting models are considered quantitative, this study looked at these simulations from a different angle: model calibration. The specific aims of this study are to review and clarify the theoretical basis for model calibration, to help formulate the criteria for calibration validation, and to help appropriately select and apply calibration methods. In addition to established vibration theory, a novel theorem of mechanical vibration is also used to enhance the understanding of the mathematical and physical principles of the calibration. Based on this enhanced understanding, a set of criteria was proposed and used to systematically examine the calibration methods. Besides theoretical analyses, a numerical testing method is also used in the examination. This study identified the basic requirements for each calibration method to obtain a unique calibration solution. This study also confirmed that the solution becomes more robust if more than sufficient calibration references are provided. Practically, however, as more references are used, more inconsistencies can arise among the measured data for representing the biodynamic properties. To help account for the relative reliabilities of the references, a baseline weighting scheme is proposed. The analyses suggest that the best choice of calibration method depends on the modeling purpose, the model structure, and the availability and reliability of representative reference data. PMID:26740726
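    For readers who want to experiment with the idea, the sketch below shows FRF-based calibration in its simplest form: fitting the three parameters of a single-degree-of-freedom mass-spring-damper model to a synthetic frequency response function by weighted least squares. Everything here (the model, the data, and the uniform baseline weights) is an illustrative assumption, not the authors' implementation.

```python
import numpy as np
from scipy.optimize import least_squares

def frf(omega, m, c, k):
    """Complex receptance FRF H(w) = 1 / (k - m*w^2 + i*c*w)."""
    return 1.0 / (k - m * omega**2 + 1j * c * omega)

# Synthetic "measured" reference data standing in for biodynamic measurements.
rng = np.random.default_rng(0)
omega = np.linspace(1.0, 100.0, 200)          # rad/s
measured = frf(omega, m=1.0, c=4.0, k=900.0)
measured *= 1.0 + 0.02 * rng.standard_normal(omega.size)

# Uniform baseline weights; a real scheme would reflect the relative
# reliability of each calibration reference.
w = np.ones_like(omega)

def residuals(p):
    m, c, k = p
    r = frf(omega, m, c, k) - measured
    return np.concatenate([w * r.real, w * r.imag])

fit = least_squares(residuals, x0=[0.5, 1.0, 500.0], bounds=(1e-6, np.inf))
print("calibrated m, c, k:", fit.x)
```

    With more than the minimum number of well-chosen reference points the solution is overdetermined, which mirrors the observation above that additional references make the calibration more robust but can also introduce inconsistencies among the measured data.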

  4. Dynamics in Higher Education Politics: A Theoretical Model

    ERIC Educational Resources Information Center

    Kauko, Jaakko

    2013-01-01

    This article presents a model for analysing dynamics in higher education politics (DHEP). Theoretically the model draws on the conceptual history of political contingency, agenda-setting theories and previous research on higher education dynamics. According to the model, socio-historical complexity can best be analysed along two dimensions: the…

  5. The limits of crop productivity: validating theoretical estimates and determining the factors that limit crop yields in optimal environments

    NASA Technical Reports Server (NTRS)

    Bugbee, B.; Monje, O.

    1992-01-01

    Plant scientists have sought to maximize the yield of food crops since the beginning of agriculture. There are numerous reports of record food and biomass yields (per unit area) in all major crop plants, but many of the record yield reports are in error because they exceed the maximal theoretical rates of the component processes. In this article, we review the component processes that govern yield limits and describe how each process can be individually measured. This procedure has helped us validate theoretical estimates and determine what factors limit yields in optimal environments.

  6. A non-traditional fluid problem: transition between theoretical models from Stokes’ to turbulent flow

    NASA Astrophysics Data System (ADS)

    Salomone, Horacio D.; Olivieri, Néstor A.; Véliz, Maximiliano E.; Raviola, Lisandro A.

    2018-05-01

    In the context of fluid mechanics courses, it is customary to consider the problem of a sphere falling under the action of gravity inside a viscous fluid. Under suitable assumptions, this phenomenon can be modelled using Stokes’ law and is routinely reproduced in teaching laboratories to determine terminal velocities and fluid viscosities. In many cases, however, the measured physical quantities show important deviations with respect to the predictions deduced from the simple Stokes’ model, and the causes of these apparent ‘anomalies’ (for example, whether the flow is laminar or turbulent) are seldom discussed in the classroom. On the other hand, there are various variable-mass problems that students tackle during elementary mechanics courses and which are discussed in many textbooks. In this work, we combine both kinds of problems and analyse—both theoretically and experimentally—the evolution of a system composed of a sphere pulled by a chain of variable length inside a tube filled with water. We investigate the effects of different forces acting on the system such as weight, buoyancy, viscous friction and drag force. By means of a sequence of mathematical models of increasing complexity, we obtain a progressive fit that accounts for the experimental data. The contrast between the various models exposes the strengths and weaknesses of each one. The proposed exercise can be useful for integrating concepts of elementary mechanics and fluids, and is suitable as a laboratory practice, stressing the importance of the experimental validation of theoretical models and showing the model-building processes in a didactic framework.
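    The model hierarchy described above can be illustrated numerically. The sketch below integrates the equation of motion of a rigid sphere falling in water under weight, buoyancy and drag, once with the Stokes (linear) closure and once with a quadratic drag closure; all parameter values are assumed for illustration, and the variable-mass chain of the actual experiment is omitted.

```python
import numpy as np
from scipy.integrate import solve_ivp

rho_s, rho_f = 7800.0, 1000.0        # steel sphere in water [kg/m^3]
r, mu, g = 2e-3, 1e-3, 9.81          # radius [m], viscosity [Pa s], gravity
vol = (4.0 / 3.0) * np.pi * r**3
m = rho_s * vol                      # sphere mass [kg]
buoy = rho_f * vol * g               # buoyant force [N]
A, Cd = np.pi * r**2, 0.44           # frontal area, quadratic drag coeff.

def rhs(t, y, closure):
    v = y[0]
    if closure == "stokes":
        drag = 6.0 * np.pi * mu * r * v                 # Stokes' law
    else:
        drag = 0.5 * rho_f * Cd * A * v * abs(v)        # quadratic drag
    return [(m * g - buoy - drag) / m]

for closure in ("stokes", "quadratic"):
    sol = solve_ivp(rhs, (0.0, 60.0), [0.0], args=(closure,), rtol=1e-8)
    print(f"{closure:9s} velocity after 60 s: {sol.y[0, -1]:8.3f} m/s")
```

    For a millimetre-scale steel sphere the Stokes branch predicts an absurdly large terminal velocity, a signal that the laminar assumption has broken down (Re >> 1), which is precisely the kind of 'anomaly' discussed above.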

  7. Theoretical Foundation for Weld Modeling

    NASA Technical Reports Server (NTRS)

    Traugott, S.

    1986-01-01

    Differential equations describe physics of tungsten/inert-gas and plasma-arc welding in aluminum. Report collects and describes necessary theoretical foundation upon which numerical welding model is constructed for tungsten/inert-gas or plasma-arc welding in aluminum without keyhole. Governing partial differential equations for flow of heat, metal, and current given, together with boundary conditions relevant to welding process. Numerical estimates for relative importance of various phenomena and required properties of 2219 aluminum included.

  8. Hybrid quantum teleportation: A theoretical model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Takeda, Shuntaro; Mizuta, Takahiro; Fuwa, Maria

    2014-12-04

    Hybrid quantum teleportation – continuous-variable teleportation of qubits – is a promising approach for deterministically teleporting photonic qubits. We propose how to implement it with current technology. Our theoretical model shows that faithful qubit transfer can be achieved for this teleportation by choosing an optimal gain for the teleporter’s classical channel.

  9. Stakeholder validation of a model of readiness for transition to adult care.

    PubMed

    Schwartz, Lisa A; Brumley, Lauren D; Tuchman, Lisa K; Barakat, Lamia P; Hobbie, Wendy L; Ginsberg, Jill P; Daniel, Lauren C; Kazak, Anne E; Bevans, Katherine; Deatrick, Janet A

    2013-10-01

    That too few youth with special health care needs make the transition to adult-oriented health care successfully may be due, in part, to lack of readiness to transfer care. There is a lack of theoretical models to guide development and implementation of evidence-based guidelines, assessments, and interventions to improve transition readiness. To further validate the Social-ecological Model of Adolescent and Young Adult Readiness to Transition (SMART) via feedback from stakeholders (patients, parents, and providers) from a medically diverse population in need of life-long follow-up care, survivors of childhood cancer. Mixed-methods participatory research design. A large Mid-Atlantic children's hospital. Adolescent and young adult survivors of childhood cancer (n = 14), parents (n = 18), and pediatric providers (n = 10). Patients and parents participated in focus groups; providers participated in individual semi-structured interviews. Validity of SMART was assessed 3 ways: (1) ratings on importance of SMART components for transition readiness using a 5-point scale (0-4; ratings >2 support validity), (2) nominations of 3 "most important" components, and (3) directed content analysis of focus group/interview transcripts. Qualitative data supported the validity of SMART, with minor modifications to definitions of components. Quantitative ratings met criteria for validity; stakeholders endorsed all components of SMART as important for transition. No additional SMART variables were suggested by stakeholders and the "most important" components varied by stakeholders, thus supporting the comprehensiveness of SMART and need to involve multiple perspectives. SMART represents a comprehensive and empirically validated framework for transition research and program planning, supported by survivors of childhood cancer, parents, and pediatric providers. Future research should validate SMART among other populations with special health care needs.

  10. Tube Bulge Process : Theoretical Analysis and Finite Element Simulations

    NASA Astrophysics Data System (ADS)

    Velasco, Raphael; Boudeau, Nathalie

    2007-05-01

    This paper is focused on the determination of mechanical characteristics of tubular materials using the tube bulge process. A comparative study is made between two different models: a theoretical model and a finite element analysis. The theoretical model is fully developed, based first on a geometrical analysis of the tube profile during bulging, which is assumed to deform in arcs of circles. Strain and stress analyses complete the theoretical model, which allows evaluation of the tube thickness and the state of stress at any point of the free bulge region. Free bulging of a 304L stainless steel is simulated using LS-DYNA 970. To validate the FE simulation approach, a comparison between the theoretical and finite element models is carried out on several parameters, such as: thickness variation at the free bulge region pole with bulge height, tube thickness variation with the z axial coordinate, and von Mises stress variation with plastic strain. Finally, the influence of deviations in geometrical parameters on the flow stress curve is examined using the analytical model: deviations of the tube outer diameter, its initial thickness and the bulge height measurement are taken into account to obtain the resulting error on plastic strain and von Mises stress.

  11. Towards policy relevant environmental modeling: contextual validity and pragmatic models

    USGS Publications Warehouse

    Miles, Scott B.

    2000-01-01

    "What makes for a good model?" In various forms, this question is a question that, undoubtedly, many people, businesses, and institutions ponder with regards to their particular domain of modeling. One particular domain that is wrestling with this question is the multidisciplinary field of environmental modeling. Examples of environmental models range from models of contaminated ground water flow to the economic impact of natural disasters, such as earthquakes. One of the distinguishing claims of the field is the relevancy of environmental modeling to policy and environment-related decision-making in general. A pervasive view by both scientists and decision-makers is that a "good" model is one that is an accurate predictor. Thus, determining whether a model is "accurate" or "correct" is done by comparing model output to empirical observations. The expected outcome of this process, usually referred to as "validation" or "ground truthing," is a stamp on the model in question of "valid" or "not valid" that serves to indicate whether or not the model will be reliable before it is put into service in a decision-making context. In this paper, I begin by elaborating on the prevailing view of model validation and why this view must change. Drawing from concepts coming out of the studies of science and technology, I go on to propose a contextual view of validity that can overcome the problems associated with "ground truthing" models as an indicator of model goodness. The problem of how we talk about and determine model validity has much to do about how we perceive the utility of environmental models. In the remainder of the paper, I argue that we should adopt ideas of pragmatism in judging what makes for a good model and, in turn, developing good models. From such a perspective of model goodness, good environmental models should facilitate communication, convey—not bury or "eliminate"—uncertainties, and, thus, afford the active building of consensus decisions, instead

  12. Theoretical Calculation and Validation of the Water Vapor Continuum Absorption

    NASA Technical Reports Server (NTRS)

    Ma, Qiancheng; Tipping, Richard H.

    1998-01-01

    The primary objective of this investigation is the development of an improved parameterization of the water vapor continuum absorption through the refinement and validation of our existing theoretical formalism. The chief advantage of our approach is the self-consistent, first-principles basis of the formalism, which allows us to predict the frequency, temperature and pressure dependence of the continuum absorption as well as provide insights into the physical mechanisms responsible for the continuum absorption. Moreover, our approach is such that the calculated continuum absorption can be easily incorporated into satellite retrieval algorithms and climate models. Accurate determination of the water vapor continuum is essential for the next generation of retrieval algorithms which propose to use the combined constraints of multispectral measurements such as those under development for EOS data analysis (e.g., retrieval algorithms based on MODIS and AIRS measurements); current Pathfinder activities which seek to use the combined constraints of infrared and microwave (e.g., HIRS and MSU) measurements to improve temperature and water profile retrievals, and field campaigns which seek to reconcile spectrally-resolved and broad-band measurements such as those obtained as part of FIRE. Current widely used continuum treatments have been shown to produce spectrally dependent errors, with the magnitude of the error dependent on temperature and abundance, which produces errors with a seasonal and latitude dependence. Translated into flux, current water vapor continuum parameterizations produce flux errors of order 10 W/m², which compared to the 4 W/m² magnitude of the greenhouse gas forcing and the 1-2 W/m² estimated aerosol forcing is certainly climatologically significant and unacceptably large. While it is possible to tune the empirical formalisms, the paucity of laboratory measurements, especially at temperatures of interest for atmospheric applications, precludes tuning.

  14. Teaching "Instant Experience" with Graphical Model Validation Techniques

    ERIC Educational Resources Information Center

    Ekstrøm, Claus Thorn

    2014-01-01

    Graphical model validation techniques for linear normal models are often used to check the assumptions underlying a statistical model. We describe an approach to provide "instant experience" in looking at a graphical model validation plot, so that it becomes easier to judge whether any of the underlying assumptions are violated.
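    A minimal example of such a plot, assuming ordinary least squares and simulated data (the article describes a teaching approach, not this code):

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
x = np.linspace(0, 10, 100)
y = 2.0 + 1.5 * x + rng.normal(scale=1.0, size=x.size)

# Ordinary least-squares fit and residuals.
slope, intercept = np.polyfit(x, y, 1)
fitted = intercept + slope * x
resid = y - fitted

# Residual-vs-fitted plot: a horizontal, structureless band supports the
# linearity and constant-variance assumptions of the linear normal model.
plt.scatter(fitted, resid, s=10)
plt.axhline(0.0, color="k", lw=1)
plt.xlabel("Fitted values")
plt.ylabel("Residuals")
plt.title("Graphical check of linear-model assumptions")
plt.show()
```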

  15. Network-Theoretic Modeling of Fluid Flow

    DTIC Science & Technology

    2015-07-29

    Final Report, STIR: Network-Theoretic Modeling of Fluid Flow, ARO Grant W911NF-14-1-0386. Program manager: Dr. Samuel Stanton (August 1, 2014–April 30... Cited in part: Morzyński, M., and Comte, P., "A finite-time thermodynamics of unsteady fluid flows," Journal of Non-Equilibrium Thermodynamics, Vol. 33, No. 2

  16. A theoretical model for analysing gender bias in medicine.

    PubMed

    Risberg, Gunilla; Johansson, Eva E; Hamberg, Katarina

    2009-08-03

    During the last decades research has reported unmotivated differences in the treatment of women and men in various areas of clinical and academic medicine. There is an ongoing discussion on how to avoid such gender bias. We developed a three-step theoretical model to understand how gender bias in medicine can occur and be understood. In this paper we present the model and discuss its usefulness in the efforts to avoid gender bias. In the model gender bias is analysed in relation to assumptions concerning difference/sameness and equity/inequity between women and men. Our model illustrates that gender bias in medicine can arise from assuming sameness and/or equity between women and men when there are genuine differences to consider in biology and disease, as well as in life conditions and experiences. However, gender bias can also arise from assuming differences when there are none, when and if dichotomous stereotypes about women and men are understood as valid. This conceptual thinking can be useful for discussing and avoiding gender bias in clinical work, medical education, career opportunities and documents such as research programs and health care policies. To meet the various forms of gender bias, different facts and measures are needed. Knowledge about biological differences between women and men will not reduce bias caused by gendered stereotypes or by unawareness of health problems and discrimination associated with gender inequity. Such bias reflects unawareness of gendered attitudes and will not change by facts only. We suggest consciousness-raising activities and continuous reflections on gender attitudes among students, teachers, researchers and decision-makers.

  17. Statistical validation of normal tissue complication probability models.

    PubMed

    Xu, Cheng-Jian; van der Schaaf, Arjen; Van't Veld, Aart A; Langendijk, Johannes A; Schilstra, Cornelis

    2012-09-01

    To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. A penalized regression method, LASSO (least absolute shrinkage and selection operator), was used to build NTCP models for xerostomia after radiation therapy treatment of head-and-neck cancer. Model assessment was based on the likelihood function and the area under the receiver operating characteristic curve. Repeated double cross-validation showed the uncertainty and instability of the NTCP models and indicated that the statistical significance of model performance can be obtained by permutation testing. Repeated double cross-validation and permutation tests are recommended to validate NTCP models before clinical use. Copyright © 2012 Elsevier Inc. All rights reserved.
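    A compact sketch of this validation pattern, using scikit-learn on synthetic data (an L1-penalized logistic model stands in for the LASSO NTCP model; nothing here reproduces the authors' clinical pipeline):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, permutation_test_score

# Synthetic stand-in for a toxicity dataset: 120 patients, 40 candidate
# predictors, of which only a few are informative.
X, y = make_classification(n_samples=120, n_features=40, n_informative=5,
                           random_state=0)

model = LogisticRegression(penalty="l1", solver="liblinear", C=0.5)
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)

# Repeatedly permuting the labels gives the null distribution of the AUC,
# from which a p-value for the observed cross-validated AUC follows.
score, perm_scores, pvalue = permutation_test_score(
    model, X, y, scoring="roc_auc", cv=cv, n_permutations=200, random_state=0)
print(f"CV AUC = {score:.3f}, permutation p-value = {pvalue:.3f}")
```

    The permutation p-value indicates whether the cross-validated performance could plausibly have arisen from a model fit to unrelated labels, which is the safeguard recommended above before clinical use.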

  18. Model-theoretic framework for sensor data fusion

    NASA Astrophysics Data System (ADS)

    Zavoleas, Kyriakos P.; Kokar, Mieczyslaw M.

    1993-09-01

    The main goal of our research in sensory data fusion (SDF) is the development of a systematic approach (a methodology) to designing systems for interpreting sensory information and for reasoning about the situation based upon this information and upon available data bases and knowledge bases. To achieve such a goal, two kinds of subgoals have been set: (1) develop a theoretical framework in which rational design/implementation decisions can be made, and (2) design a prototype SDF system along the lines of the framework. Our initial design of the framework has been described in our previous papers. In this paper we concentrate on the model-theoretic aspects of this framework. We postulate that data are embedded in data models, and information processing mechanisms are embedded in model operators. The paper is devoted to analyzing the classes of model operators and their significance in SDF. We investigate transformation, abstraction and fusion operators. A prototype SDF system, fusing data from range and intensity sensors, is presented, exemplifying the structures introduced. Our framework is justified by the fact that it provides modularity, traceability of information flow, and a basis for a specification language for SDF.

  19. Optimal control model predictions of system performance and attention allocation and their experimental validation in a display design study

    NASA Technical Reports Server (NTRS)

    Johannsen, G.; Govindaraj, T.

    1980-01-01

    The influence of different types of predictor displays in a longitudinal vertical takeoff and landing (VTOL) hover task is analyzed in a theoretical study. Several cases with differing amounts of predictive and rate information are compared. The optimal control model of the human operator is used to estimate human and system performance in terms of root-mean-square (rms) values and to compute optimized attention allocation. The only part of the model which is varied to predict these data is the observation matrix. Typical cases are selected for a subsequent experimental validation. The rms values as well as eye-movement data are recorded. The results agree favorably with those of the theoretical study in terms of relative differences. Better matching is achieved by revised model input data.

  20. Turbine Engine Mathematical Model Validation

    DTIC Science & Technology

    1976-12-01

    AEDC-TR-76-90. Turbine Engine Mathematical Model Validation. Engine Test Facility, Arnold Engineering Development Center, Air Force... Keywords: YJ101-GE-100 engine; turbine engines; mathematical models; computations. The report presents and discusses the results of an investigation to develop a rationale and technique for the validation of turbine engine steady-state

  1. Dependence of tropical cyclone development on Coriolis parameter: A theoretical model

    NASA Astrophysics Data System (ADS)

    Deng, Liyuan; Li, Tim; Bi, Mingyu; Liu, Jia; Peng, Melinda

    2018-03-01

    A simple theoretical model was formulated to investigate how tropical cyclone (TC) intensification depends on the Coriolis parameter. The theoretical framework includes a two-layer free atmosphere and an Ekman boundary layer at the bottom. The linkage between the free atmosphere and the boundary layer is through the Ekman pumping vertical velocity in proportion to the vorticity at the top of the boundary layer. The closure of this linear system assumes a simple relationship between the free atmosphere diabatic heating and the boundary layer moisture convergence. Under a set of realistic atmospheric parameter values, the model suggests that the most preferred latitude for TC development is around 5° without considering other factors. The theoretical result is confirmed by high-resolution WRF model simulations in a zero-mean flow and a constant SST environment on an f-plane with different Coriolis parameters. Given an initially balanced weak vortex, the TC-like vortex intensifies most rapidly at the reference latitude of 5°. Thus, the WRF model simulations confirm the f-dependent characteristics of TC intensification rate as suggested by the theoretical model.

  2. From model conception to verification and validation, a global approach to multiphase Navier-Stokes models with an emphasis on volcanic explosive phenomenology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dartevelle, Sebastian

    2007-10-01

    Large-scale volcanic eruptions are hazardous events that cannot be described by detailed and accurate in situ measurements: hence, little to no real-time data exists to rigorously validate current computer models of these events. In addition, such phenomenology involves highly complex, nonlinear, and unsteady physical behaviors upon many spatial and time scales. As a result, volcanic explosive phenomenology is poorly understood in terms of its physics, and inadequately constrained in terms of initial, boundary, and inflow conditions. Nevertheless, code verification and validation become even more critical because more and more volcanologists use numerical data for assessment and mitigation of volcanic hazards. In this report, we evaluate the process of model and code development in the context of geophysical multiphase flows. We describe: (1) the conception of a theoretical, multiphase, Navier-Stokes model, (2) its implementation into a numerical code, (3) the verification of the code, and (4) the validation of such a model within the context of turbulent and underexpanded jet physics. Within the validation framework, we suggest focusing on the key physics that control the volcanic clouds—namely, momentum-driven supersonic jet and buoyancy-driven turbulent plume. For instance, we propose to compare numerical results against a set of simple and well-constrained analog experiments, which uniquely and unambiguously represent each of the key phenomenologies.

  3. A Generalized Information Theoretical Model for Quantum Secret Sharing

    NASA Astrophysics Data System (ADS)

    Bai, Chen-Ming; Li, Zhi-Hui; Xu, Ting-Ting; Li, Yong-Ming

    2016-11-01

    An information theoretical model for quantum secret sharing was introduced by H. Imai et al. (Quantum Inf. Comput. 5(1), 69-80, 2005) and analyzed by means of quantum information theory. In this paper, we analyze this information theoretical model using the properties of the quantum access structure. Based on this analysis, we propose a generalized model definition for quantum secret sharing schemes. In our model, more quantum access structures can be realized by the generalized quantum secret sharing schemes than by the previous ones. In addition, we analyze two kinds of important quantum access structures to illustrate the existence and rationality of the generalized quantum secret sharing schemes, and consider the security of the schemes through simple examples.

  4. Experimental validation of ultrasonic guided modes in electrical cables by optical interferometry.

    PubMed

    Mateo, Carlos; de Espinosa, Francisco Montero; Gómez-Ullate, Yago; Talavera, Juan A

    2008-03-01

    In this work, the dispersion curves of elastic waves propagating in electrical cables and in bare copper wires are obtained theoretically and validated experimentally. The theoretical model, based on Gazis equations formulated according to the global matrix methodology, is resolved numerically. Viscoelasticity and attenuation are modeled theoretically using the Kelvin-Voigt model. Experimental tests are carried out using interferometry. There is good agreement between the simulations and the experiments despite the peculiarities of electrical cables.

  5. Graph theoretical modeling of baby brain networks.

    PubMed

    Zhao, Tengda; Xu, Yuehua; He, Yong

    2018-06-12

    The human brain undergoes explosive growth during the prenatal period and the first few postnatal years, establishing an early infrastructure for the later development of behaviors and cognitions. Revealing the developmental rules during this early phase is essential in understanding the emergence of brain function and the origin of developmental disorders. Graph-theoretical network modeling in combination with multiple neuroimaging probes provides an important research framework to explore the early development of the topological wiring and organizational paradigms of the brain. Here, we reviewed studies which employed neuroimaging and graph-theoretical modeling to investigate brain network development from approximately 20 gestational weeks to 2 years of age. Specifically, the structural and functional brain networks have evolved to highly efficient topological architectures in the early stage, where the structural network remains ahead and paves the way for the development of the functional network. The brain network develops in a heterogeneous order, from primary to higher-order systems, and from a tendency of network segregation to network integration in the prenatal and postnatal periods. The early brain network topologies show abilities in predicting certain cognitive and behavioral performance in later life, and their impairments are likely to continue into childhood and even adulthood. These macroscopic topological changes are found to be associated with possible microstructural maturations, such as axonal growth and myelination. Collectively, this review provides a detailed delineation of the early changes of the baby brains in the graph-theoretical modeling framework, which opens up a new avenue to understand the developmental principles of the connectome. Copyright © 2018. Published by Elsevier Inc.

  6. Global precipitation measurements for validating climate models

    NASA Astrophysics Data System (ADS)

    Tapiador, F. J.; Navarro, A.; Levizzani, V.; García-Ortega, E.; Huffman, G. J.; Kidd, C.; Kucera, P. A.; Kummerow, C. D.; Masunaga, H.; Petersen, W. A.; Roca, R.; Sánchez, J.-L.; Tao, W.-K.; Turk, F. J.

    2017-11-01

    The advent of global precipitation data sets with increasing temporal span has made it possible to use them for validating climate models. In order to fulfill the requirement of global coverage, existing products integrate satellite-derived retrievals from many sensors with direct ground observations (gauges, disdrometers, radars), which are used as reference for the satellites. While the resulting product can be deemed the best-available source of quality validation data, awareness of the limitations of such data sets is important to avoid extracting wrong or unsubstantiated conclusions when assessing climate model abilities. This paper provides guidance on the use of precipitation data sets for climate research, including model validation and verification for improving physical parameterizations. The strengths and limitations of the data sets for climate modeling applications are presented, and a protocol for quality assurance of both observational databases and models is discussed. The paper helps elaborate on the recent IPCC AR5 acknowledgment of large observational uncertainties in precipitation observations for climate model validation.

  7. Beware of external validation! - A Comparative Study of Several Validation Techniques used in QSAR Modelling.

    PubMed

    Majumdar, Subhabrata; Basak, Subhash C

    2018-04-26

    Proper validation is an important aspect of QSAR modelling. External validation is one of the widely used validation methods in QSAR, where the model is built on a subset of the data and validated on the rest of the samples. However, its effectiveness for datasets with a small number of samples but a large number of predictors remains suspect. Calculating hundreds or thousands of molecular descriptors using currently available software has become the norm in QSAR research, owing to computational advances in the past few decades. Thus, for n chemical compounds and p descriptors calculated for each molecule, the typical chemometric dataset today has a high value of p but small n (i.e. n < p). Motivated by the evidence of inadequacies of external validation in estimating the true predictive capability of a statistical model in recent literature, this paper performs an extensive comparative study of this method with several other validation techniques. We compared four validation methods: leave-one-out, K-fold, external and multi-split validation, using statistical models built using the LASSO regression, which simultaneously performs variable selection and modelling. We used 300 simulated datasets and one real dataset of 95 congeneric amine mutagens for this evaluation. External validation metrics have high variation among different random splits of the data and hence are not recommended for predictive QSAR models. LOO has the overall best performance among all validation methods applied in our scenario. Results from external validation are too unstable for the datasets we analyzed. Based on our findings, we recommend using the LOO procedure for validating QSAR predictive models built on high-dimensional small-sample data. Copyright © Bentham Science Publishers; for any queries, please email epub@benthamscience.org.
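    The instability described above is easy to reproduce. The sketch below, on simulated small-n, large-p data (not the mutagen dataset), contrasts a pooled leave-one-out statistic with the spread of external-split metrics:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import LassoCV
from sklearn.metrics import r2_score
from sklearn.model_selection import (LeaveOneOut, cross_val_predict,
                                     train_test_split)

# Simulated n < p setting: 60 compounds, 300 descriptors.
X, y = make_regression(n_samples=60, n_features=300, n_informative=10,
                       noise=5.0, random_state=0)
model = LassoCV(cv=5, random_state=0)

# Leave-one-out: one pooled q2-style statistic over all held-out predictions.
pred = cross_val_predict(model, X, y, cv=LeaveOneOut())
print("LOO q2:", round(r2_score(y, pred), 3))

# External validation: the metric swings widely from split to split.
ext = []
for seed in range(10):
    Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.25,
                                          random_state=seed)
    ext.append(r2_score(yte, model.fit(Xtr, ytr).predict(Xte)))
print("external R2 over 10 splits: mean %.3f, sd %.3f"
      % (np.mean(ext), np.std(ext)))
```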

  8. A review of game-theoretic models of road user behaviour.

    PubMed

    Elvik, Rune

    2014-01-01

    This paper reviews game-theoretic models that have been developed to explain road user behaviour in situations where road users interact with each other. The paper includes the following game-theoretic models: (1) a general model of the interaction between road users and their possible reaction to measures improving safety (behavioural adaptation); (2) choice of vehicle size as a Prisoners’ dilemma game; (3) speed choice as a co-ordination game; (4) speed compliance as a game between drivers and the police; (5) merging into traffic from an acceleration lane as a mixed-strategy game; (6) choice of level of attention in following situations as an evolutionary game; (7) choice of departure time to avoid congestion as a variant of a Prisoners’ dilemma game; (8) interaction between cyclists crossing the road and car drivers; (9) dipping headlights at night well ahead of the point when glare becomes noticeable; and (10) choice of evasive action in a situation when cars are on collision course. The models reviewed are different in many respects, but a common feature of the models is that they can explain how informal norms of behaviour can develop among road users and be sustained even if these informal norms violate the formal regulations of the traffic code. Game-theoretic models are not applicable to every conceivable interaction between road users or to situations in which road users choose behaviour without interacting with other road users. Nevertheless, it is likely that game-theoretic models can be applied more widely than they have been until now. Copyright © 2013 Elsevier Ltd. All rights reserved.
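    To make one entry of this list concrete, the sketch below solves model (5), merging as a 2x2 mixed-strategy game, by the usual indifference conditions. The payoff numbers are invented for illustration and are not taken from the review:

```python
import numpy as np

# Row player: merging driver (Go, Wait); column player: through driver
# (Yield, Don't). A[i, j] / B[i, j] are row/column payoffs (invented values;
# the large negative entries represent a collision).
A = np.array([[ 2.0, -10.0],     # Go   vs Yield / Don't
              [ 0.0,  -1.0]])    # Wait vs Yield / Don't
B = np.array([[-1.0, -10.0],
              [-1.0,   1.0]])

# Interior mixed equilibrium: each player mixes so the other is indifferent.
q = (A[1, 1] - A[0, 1]) / (A[0, 0] - A[0, 1] - A[1, 0] + A[1, 1])  # P(Yield)
p = (B[1, 1] - B[1, 0]) / (B[0, 0] - B[0, 1] - B[1, 0] + B[1, 1])  # P(Go)
print(f"P(merging driver goes)   = {p:.2f}")
print(f"P(through driver yields) = {q:.2f}")
```

    At the mixed equilibrium each driver randomizes just enough to leave the other indifferent, which is one way informal merging norms can stabilize without any formal rule.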

  9. A new theoretical framework for modeling respiratory protection based on the beta distribution.

    PubMed

    Klausner, Ziv; Fattal, Eyal

    2014-08-01

    The problem of modeling respiratory protection is well known and has been dealt with extensively in the literature. Often the efficiency of respiratory protection is quantified in terms of penetration, defined as the proportion of an ambient contaminant concentration that penetrates the respiratory protection equipment. Typically, the penetration modeling framework in the literature is based on the assumption that penetration measurements follow the lognormal distribution. However, the analysis in this study leads to the conclusion that the lognormal assumption is not always valid, making it less adequate for analyzing respiratory protection measurements. This work presents a formulation of the problem from first principles, leading to a stochastic differential equation whose solution is the probability density function of the beta distribution. The data of respiratory protection experiments were reexamined, and indeed the beta distribution was found to provide a better fit to the data than the lognormal. We conclude with a suggestion for a new theoretical framework for modeling respiratory protection. © The Author 2014. Published by Oxford University Press on behalf of the British Occupational Hygiene Society.
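    A minimal sketch of the implied model comparison, fitting both families to simulated penetration fractions with scipy (real respirator data would replace the synthetic sample):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
pen = rng.beta(2.0, 50.0, size=200)      # synthetic penetration fractions

# Maximum-likelihood fits; loc/scale are pinned so each family keeps
# two free parameters on (0, 1).
a, b, _, _ = stats.beta.fit(pen, floc=0.0, fscale=1.0)
s, _, scale_ln = stats.lognorm.fit(pen, floc=0.0)

ll_beta = np.sum(stats.beta.logpdf(pen, a, b))
ll_logn = np.sum(stats.lognorm.logpdf(pen, s, scale=scale_ln))
print(f"log-likelihood  beta: {ll_beta:.1f}   lognormal: {ll_logn:.1f}")
# With equal parameter counts, the higher log-likelihood (equivalently,
# the lower AIC) indicates the better-fitting family.
```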

  10. Information-Theoretic Benchmarking of Land Surface Models

    NASA Astrophysics Data System (ADS)

    Nearing, Grey; Mocko, David; Kumar, Sujay; Peters-Lidard, Christa; Xia, Youlong

    2016-04-01

    Benchmarking is a type of model evaluation that compares model performance against a baseline metric that is derived, typically, from a different existing model. Statistical benchmarking was used to qualitatively show that land surface models do not fully utilize information in boundary conditions [1] several years before Gong et al [2] discovered the particular type of benchmark that makes it possible to *quantify* the amount of information lost by an incorrect or imperfect model structure. This theoretical development laid the foundation for a formal theory of model benchmarking [3]. We here extend that theory to separate uncertainty contributions from the three major components of dynamical systems models [4]: model structures, model parameters, and boundary conditions, the last of which describe the time-dependent details of each prediction scenario. The key to this new development is the use of large-sample [5] data sets that span multiple soil types, climates, and biomes, which allows us to segregate uncertainty due to parameters from the two other sources. The benefit of this approach for uncertainty quantification and segregation is that it does not rely on Bayesian priors (although it is strictly coherent with Bayes' theorem and with probability theory), and therefore the partitioning of uncertainty into different components is *not* dependent on any a priori assumptions. We apply this methodology to assess the information use efficiency of the four land surface models that comprise the North American Land Data Assimilation System (Noah, Mosaic, SAC-SMA, and VIC). Specifically, we looked at the ability of these models to estimate soil moisture and latent heat fluxes. We found that in the case of soil moisture, about 25% of net information loss was from boundary conditions, around 45% was from model parameters, and 30-40% was from the model structures. In the case of latent heat flux, boundary conditions contributed about 50% of net uncertainty, and model structures contributed
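    As a toy version of the idea, the sketch below estimates the mutual information between mock observations and imperfect model output from histograms and expresses it as a fraction of the observable information; it is purely illustrative and is not the benchmarking system described above:

```python
import numpy as np
from sklearn.metrics import mutual_info_score

rng = np.random.default_rng(4)
obs = rng.normal(size=5000)                        # observed flux (mock)
mod = 0.8 * obs + 0.6 * rng.normal(size=5000)      # imperfect model output

# Histogram-based estimate of I(obs; mod), in nats.
o = np.digitize(obs, np.histogram_bin_edges(obs, bins=20))
m = np.digitize(mod, np.histogram_bin_edges(mod, bins=20))
mi = mutual_info_score(o, m)

# Entropy of the discretized observations bounds the extractable information.
p = np.bincount(o) / o.size
h = -np.sum(p[p > 0] * np.log(p[p > 0]))
print(f"I(obs; model) = {mi:.2f} nats out of H(obs) = {h:.2f} nats "
      f"(~{100.0 * mi / h:.0f}% of the available information used)")
```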

  11. Uncertainties and understanding of experimental and theoretical results regarding reactions forming heavy and superheavy nuclei

    NASA Astrophysics Data System (ADS)

    Giardina, G.; Mandaglio, G.; Nasirov, A. K.; Anastasi, A.; Curciarello, F.; Fazio, G.

    2018-02-01

    Experimental and theoretical results for the P_CN fusion probability of reactants in the entrance channel and the W_sur survival probability against fission at deexcitation of the compound nucleus formed in heavy-ion collisions are discussed. The theoretical results for a set of nuclear reactions leading to formation of compound nuclei (CNs) with the charge number Z = 102-122 reveal a strong sensitivity of P_CN to the characteristics of colliding nuclei in the entrance channel, the dynamics of the reaction mechanism, and the excitation energy of the system. We discuss the validity of assumptions and procedures for the analysis of experimental data, and also the limits of validity of theoretical results obtained by the use of phenomenological models. The comparison of results obtained in many investigated reactions reveals serious limits of validity of the data analysis and calculation procedures.

  12. A combined theoretical and in vitro modeling approach for predicting the magnetic capture and retention of magnetic nanoparticles in vivo

    PubMed Central

    David, Allan E.; Cole, Adam J.; Chertok, Beata; Park, Yoon Shin; Yang, Victor C.

    2011-01-01

    Magnetic nanoparticles (MNP) continue to draw considerable attention as potential diagnostic and therapeutic tools in the fight against cancer. Although many interacting forces present themselves during magnetic targeting of MNP to tumors, most theoretical considerations of this process ignore all except for the magnetic and drag forces. Our validation of a simple in vitro model against in vivo data, and subsequent reproduction of the in vitro results with a theoretical model indicated that these two forces do indeed dominate the magnetic capture of MNP. However, because nanoparticles can be subject to aggregation, and large MNP experience an increased magnetic force, the effects of surface forces on MNP stability cannot be ignored. We accounted for the aggregating surface forces simply by measuring the size of MNP retained from flow by magnetic fields, and utilized this size in the mathematical model. This presumably accounted for all particle-particle interactions, including those between magnetic dipoles. Thus, our “corrected” mathematical model provided a reasonable estimate of not only fractional MNP retention, but also predicted the regions of accumulation in a simulated capillary. Furthermore, the model was also utilized to calculate the effects of MNP size and spatial location, relative to the magnet, on targeting of MNPs to tumors. This combination of an in vitro model with a theoretical model could potentially assist with parametric evaluations of magnetic targeting, and enable rapid enhancement and optimization of magnetic targeting methodologies. PMID:21295085
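    The dominance of the magnetic and drag forces can be checked with a back-of-envelope balance. The sketch below compares the magnetophoretic force on a retained aggregate with the Stokes drag in a capillary-like flow; every value is an assumed illustration, not a parameter from the study:

```python
import numpy as np

mu0 = 4e-7 * np.pi          # vacuum permeability [T m/A]
chi = 20.0                  # effective volume susceptibility (assumed)
d = 500e-9                  # retained-aggregate diameter [m]; the measured
                            # retained size would be used, as in the
                            # "corrected" model above
V = np.pi * d**3 / 6.0      # aggregate volume
B, gradB = 0.5, 50.0        # field [T] and field gradient [T/m] (assumed)

eta = 1e-3                  # plasma-like viscosity [Pa s] (assumed)
v = 1e-3                    # capillary flow speed [m/s] (assumed)

F_mag = chi * V * B * gradB / mu0      # magnetophoretic force (linear regime)
F_drag = 3 * np.pi * eta * d * v       # Stokes drag at the flow speed
print(f"F_mag = {F_mag:.2e} N, F_drag = {F_drag:.2e} N, "
      f"ratio = {F_mag / F_drag:.2f}")
# Capture is plausible when the magnetic force is comparable to or larger
# than the drag; aggregation (larger d) raises F_mag ~ d^3 much faster than
# F_drag ~ d, which is why the retained-aggregate size matters so much.
```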

  13. SDG and qualitative trend based model multiple scale validation

    NASA Astrophysics Data System (ADS)

    Gao, Dong; Xu, Xin; Yin, Jianjin; Zhang, Hongyu; Zhang, Beike

    2017-09-01

    Verification, Validation and Accreditation (VV&A) is a key technology of simulation and modelling. Traditional model validation methods suffer from weak completeness, operate at a single scale, and depend on human experience. An SDG (signed directed graph) and qualitative-trend-based multiple-scale validation method is therefore proposed. First, the SDG model is built and qualitative trends are added to it. Complete testing scenarios are then produced by positive inference. The multiple-scale validation is carried out by comparing the testing scenarios with the outputs of the simulation model at different scales. Finally, the effectiveness of the method is demonstrated by validating a reactor model.

  14. Theoretical model for optical properties of porphyrin

    NASA Astrophysics Data System (ADS)

    Phan, Anh D.; Nga, Do T.; Phan, The-Long; Thanh, Le T. M.; Anh, Chu T.; Bernad, Sophie; Viet, N. A.

    2014-12-01

    We propose a simple model to interpret the optical absorption spectra of porphyrin in different solvents. Our model successfully explains the decrease in the intensity of the absorption maxima with increasing wavelength. We also demonstrate the dependence of the intensity and peak positions in the absorption spectra on the environment. The nature of the Soret band is proposed to derive from the π plasmon. Our theoretical calculations are consistent with previous experimental studies.

  16. Propagation studies using a theoretical ionosphere model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, M.K.

    1973-03-01

    The mid-latitude ionospheric and neutral atmospheric models are coupled with an advanced three-dimensional ray-tracing program to see what success would be obtained in predicting the wave propagation conditions and to study to what extent the use of theoretical ionospheric models is practical. The Penn State MK 1 ionospheric model, the Mitra-Rowe D-region model, and the Groves' neutral atmospheric model are used throughout this work to represent the real electron densities and collision frequencies. The Faraday rotation and differential Doppler velocities from satellites, the propagation modes for long-distance high-frequency propagation, the group delays for each mode, the ionospheric absorption, and the spatial loss are all predicted.

  17. A computational model-based validation of Guyton's analysis of cardiac output and venous return curves

    NASA Technical Reports Server (NTRS)

    Mukkamala, R.; Cohen, R. J.; Mark, R. G.

    2002-01-01

    Guyton developed a popular approach for understanding the factors responsible for cardiac output (CO) regulation in which 1) the heart-lung unit and systemic circulation are independently characterized via CO and venous return (VR) curves, and 2) average CO and right atrial pressure (RAP) of the intact circulation are predicted by graphically intersecting the curves. However, this approach is virtually impossible to verify experimentally. We theoretically evaluated the approach with respect to a nonlinear, computational model of the pulsatile heart and circulation. We developed two sets of open circulation models to generate CO and VR curves, differing by the manner in which average RAP was varied. One set applied constant RAPs, while the other set applied pulsatile RAPs. Accurate prediction of intact, average CO and RAP was achieved only by intersecting the CO and VR curves generated with pulsatile RAPs because of the pulsatility and nonlinearity (e.g., systemic venous collapse) of the intact model. The CO and VR curves generated with pulsatile RAPs were also practically independent. This theoretical study therefore supports the validity of Guyton's graphical analysis.
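    The graphical analysis is easy to reproduce numerically with textbook-style curves. In the sketch below the operating point is found as the root of CO(RAP) - VR(RAP); the curve shapes and constants are illustrative, not taken from the authors' pulsatile model:

```python
import numpy as np
from scipy.optimize import brentq

def co_curve(rap):
    """Cardiac output rises with RAP and saturates (Frank-Starling)."""
    return 10.0 / (1.0 + np.exp(-(rap - 2.0)))          # [L/min]

def vr_curve(rap, msfp=8.0, rvr=1.0):
    """Venous return falls with RAP below mean systemic filling pressure."""
    return max((msfp - rap) / rvr, 0.0)                 # [L/min]

# Intersect the two curves to predict the intact operating point.
rap_star = brentq(lambda p: co_curve(p) - vr_curve(p), -2.0, 8.0)
print(f"operating point: RAP = {rap_star:.2f} mmHg, "
      f"CO = {co_curve(rap_star):.2f} L/min")
```

    The study above shows that, in a pulsatile and nonlinear circulation, such curves must be generated with pulsatile RAPs for the intersection to predict the intact state accurately.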

  18. Validation of 2D flood models with insurance claims

    NASA Astrophysics Data System (ADS)

    Zischg, Andreas Paul; Mosimann, Markus; Bernet, Daniel Benjamin; Röthlisberger, Veronika

    2018-02-01

    Flood impact modelling requires reliable models for the simulation of flood processes. In recent years, flood inundation models have been remarkably improved and widely used for flood hazard simulation, flood exposure and loss analyses. In this study, we validate a 2D inundation model for the purpose of flood exposure analysis at the river reach scale. We validate the BASEMENT simulation model against insurance claims using conventional validation metrics. The flood model is established on the basis of available topographic data at a high spatial resolution for four test cases. The validation metrics were calculated with two different datasets: a dataset of event documentation reporting flooded areas and a dataset of insurance claims. In three out of four test cases, the model fit relating to insurance claims is slightly lower than the model fit computed on the basis of the observed inundation areas. This comparison between two independent validation datasets suggests that validation metrics using insurance claims are comparable to conventional validation data, such as the flooded area. However, a validation on the basis of insurance claims might be more conservative in cases where model errors are more pronounced in areas with a high density of values at risk.
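    The conventional metrics mentioned above reduce to a contingency table between modelled and reference wet cells. A minimal sketch with mock grids (real runs would use the inundation-model output and geocoded claims):

```python
import numpy as np

rng = np.random.default_rng(3)
sim = rng.random((100, 100)) < 0.30    # modelled wet cells (mock)
obs = rng.random((100, 100)) < 0.28    # reference wet cells (mock)

hits = np.sum(sim & obs)
misses = np.sum(~sim & obs)
false_alarms = np.sum(sim & ~obs)

pod = hits / (hits + misses)                   # probability of detection
far = false_alarms / (hits + false_alarms)     # false alarm ratio
csi = hits / (hits + misses + false_alarms)    # critical success index
print(f"POD = {pod:.2f}, FAR = {far:.2f}, CSI = {csi:.2f}")
```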

  19. Validating EHR clinical models using ontology patterns.

    PubMed

    Martínez-Costa, Catalina; Schulz, Stefan

    2017-12-01

    Clinical models are artefacts that specify how information is structured in electronic health records (EHRs). However, the makeup of clinical models is not guided by any formal constraint beyond a semantically vague information model. We address this gap by advocating ontology design patterns as a mechanism that makes the semantics of clinical models explicit. This paper demonstrates how ontology design patterns can validate existing clinical models using SHACL. Based on the Clinical Information Modelling Initiative (CIMI), we show how ontology patterns detect both modeling and terminology binding errors in CIMI models. SHACL, a W3C constraint language for the validation of RDF graphs, builds on the concept of "Shape", a description of data in terms of expected cardinalities, datatypes and other restrictions. SHACL, as opposed to OWL, subscribes to the Closed World Assumption (CWA) and is therefore more suitable for the validation of clinical models. We have demonstrated the feasibility of the approach by manually describing the correspondences between six CIMI clinical models represented in RDF and two SHACL ontology design patterns. Using a Java-based SHACL implementation, we found at least eleven modeling and binding errors within these CIMI models. This demonstrates the usefulness of ontology design patterns not only as a modeling tool but also as a tool for validation. Copyright © 2017 Elsevier Inc. All rights reserved.
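    A toy version of this validation step, assuming the rdflib and pyshacl packages: a SHACL shape requiring every Observation to carry exactly one decimal value flags an instance that omits it. The shape and data are invented stand-ins, not CIMI models:

```python
from rdflib import Graph
from pyshacl import validate

shapes_ttl = """
@prefix sh:  <http://www.w3.org/ns/shacl#> .
@prefix xsd: <http://www.w3.org/2001/XMLSchema#> .
@prefix ex:  <http://example.org/> .

ex:ObservationShape a sh:NodeShape ;
    sh:targetClass ex:Observation ;
    sh:property [
        sh:path ex:hasValue ;
        sh:datatype xsd:decimal ;
        sh:minCount 1 ; sh:maxCount 1 ;
    ] .
"""

data_ttl = """
@prefix ex: <http://example.org/> .
ex:obs1 a ex:Observation .    # missing ex:hasValue, so validation fails
"""

conforms, _, report = validate(Graph().parse(data=data_ttl, format="turtle"),
                               shacl_graph=Graph().parse(data=shapes_ttl,
                                                         format="turtle"))
print("conforms:", conforms)
print(report)
```

    Because SHACL validates under the Closed World Assumption, the missing property is reported as a violation, which is exactly the behaviour the paper argues makes it more suitable than OWL for this task.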

  20. Validation of a Parametric Approach for 3d Fortification Modelling: Application to Scale Models

    NASA Astrophysics Data System (ADS)

    Jacquot, K.; Chevrier, C.; Halin, G.

    2013-02-01

    The parametric modelling approach applied to the virtual representation of cultural heritage is a field of research explored for years, since it can address many limitations of digitising tools. For example, essential historical sources for virtual reconstructions of fortifications, such as plans-reliefs, have several shortcomings when they are scanned. To overcome these problems, knowledge-based modelling can be used: knowledge models based on the analysis of the theoretical literature of a specific domain, such as bastioned fortification treatises, can be the cornerstone of the creation of a parametric library of fortification components. Implemented in Grasshopper, these components are manually adjusted to the available data (i.e. 3D surveys of plans-reliefs or scanned maps). Most of the fortification area is now modelled, and the question of accuracy assessment is raised. A specific method is used to evaluate the accuracy of the parametric components. The results of the assessment process will allow us to validate the parametric approach. The automation of the adjustment process can then be planned. The virtual model of the fortification is part of a larger project aimed at valorising and diffusing a very unique cultural heritage item: the collection of plans-reliefs. As such, knowledge models are precious assets when automation and semantic enhancements are considered.

  1. Validation of urban freeway models.

    DOT National Transportation Integrated Search

    2015-01-01

    This report describes the methodology, data, conclusions, and enhanced models regarding the validation of two sets of models developed in the Strategic Highway Research Program 2 (SHRP 2) Reliability Project L03, Analytical Procedures for Determining...

  2. Turbulence Modeling Verification and Validation

    NASA Technical Reports Server (NTRS)

    Rumsey, Christopher L.

    2014-01-01

    Computational fluid dynamics (CFD) software that solves the Reynolds-averaged Navier-Stokes (RANS) equations has been in routine use for more than a quarter of a century. It is currently employed not only for basic research in fluid dynamics, but also for the analysis and design processes in many industries worldwide, including aerospace, automotive, power generation, chemical manufacturing, polymer processing, and petroleum exploration. A key feature of RANS CFD is the turbulence model. Because the RANS equations are unclosed, a model is necessary to describe the effects of the turbulence on the mean flow, through the Reynolds stress terms. The turbulence model is one of the largest sources of uncertainty in RANS CFD, and most models are known to be flawed in one way or another. Alternative methods such as direct numerical simulations (DNS) and large eddy simulations (LES) rely less on modeling and hence include more physics than RANS. In DNS all turbulent scales are resolved, and in LES the large scales are resolved and the effects of the smallest turbulence scales are modeled. However, both DNS and LES are too expensive for most routine industrial usage on today's computers. Hybrid RANS-LES, which blends RANS near walls with LES away from walls, helps to moderate the cost while still retaining some of the scale-resolving capability of LES, but for some applications it can still be too expensive. Even considering its associated uncertainties, RANS turbulence modeling has proved to be very useful for a wide variety of applications. For example, in the aerospace field, many RANS models are considered to be reliable for computing attached flows. However, existing turbulence models are known to be inaccurate for many flows involving separation. Research has been ongoing for decades in an attempt to improve turbulence models for separated and other nonequilibrium flows. When developing or improving turbulence models, both verification and validation are important

  3. Pain assessment in children: theoretical and empirical validity.

    PubMed

    Villarruel, A M; Denyes, M J

    1991-12-01

    Valid assessment of pain in children is foundational for both the nursing practice and research domains, yet few validated methods of pain measurement are currently available for young children. This article describes an innovative research approach used in the development of photographic instruments to measure pain intensity in young African-American and Hispanic children. The instruments were designed to enable children to participate actively in their own care and to do so in ways that are congruent with their developmental and cultural heritage. Conceptualization of the instruments, methodological development, and validation processes grounded in Orem's Self-Care Deficit Theory of Nursing are described. The authors discuss the ways in which the gaps between nursing theory, research, and practice are narrowed when development of instruments to measure clinical nursing phenomena are grounded in nursing theory, validated through research and utilized in practice settings.

  4. Smooth particle hydrodynamic modeling and validation for impact bird substitution

    NASA Astrophysics Data System (ADS)

    Babu, Arun; Prasad, Ganesh

    2018-04-01

    Bird strike events occur incidentally and can at times be fatal for airframe structures. Federal Aviation Regulations (FAR) and similar standards mandate that aircraft be designed to withstand various levels of bird-hit damage. The subject matter of this paper is the numerical modelling of a soft-body geometry for realistically substituting an actual bird when simulating bird hits on target structures. The evolution of such a numerical code to reproduce actual bird behaviour through impact is much desired for making use of state-of-the-art computational facilities in simulating bird strike events. The validity of simulations depicting bird hits is largely dependent on the correctness of the bird model. In an impact, a set of complex and coupled dynamic interactions exists between the target and the impactor. To simplify this problem, the impactor response needs to be decoupled from that of the target, which can be done by assuming and modelling the target as non-compliant. The bird is assumed to behave as a fluid during impact, since the stresses generated in the bird body are significantly greater than its yield stresses; hydrodynamic theory is therefore most suitable for describing this problem. The impactor flows nearly steadily over the target for most of the event. The impact starts with an initial shock and falls into a radial release-shock regime; subsequently a steady flow is established in the bird body, and this phase continues until the whole length of the bird body has been turned around. The initial shock pressure and the steady-state pressure are ideal variables for comparing and validating the bird model. Spatial discretization of the bird is done using the Smooth Particle Hydrodynamics (SPH) approach. This discrete element model (DEM) offers significant advantages over other contemporary approaches. Thermodynamic state-variable relations are established using a polynomial equation of state (EOS). ANSYS AUTODYN is used to perform the explicit dynamic simulation of the impact event. Validation of the shock and steady
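    The two validation variables named above can be estimated from classic hydrodynamic (Wilbeck-style) bird-strike theory. The sketch below computes the initial water-hammer shock pressure and the steady stagnation pressure for a water-like substitute bird; the numbers are illustrative assumptions:

```python
rho0 = 950.0      # substitute-bird density [kg/m^3] (assumed, near water)
c0 = 1480.0       # sound speed in the material [m/s] (water-like)
v = 100.0         # impact velocity [m/s]

p_shock = rho0 * c0 * v          # initial Hugoniot/water-hammer pressure
p_steady = 0.5 * rho0 * v**2     # stagnation pressure of the steady flow

print(f"initial shock pressure ~ {p_shock / 1e6:.0f} MPa")
print(f"steady-flow pressure   ~ {p_steady / 1e6:.2f} MPa")
# The large shock-to-steady ratio (~2*c0/v here) is one signature an SPH
# bird model should reproduce before it is trusted in airframe simulations.
```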

  5. Propagation studies using a theoretical ionosphere model

    NASA Technical Reports Server (NTRS)

    Lee, M.

    1973-01-01

    The mid-latitude ionospheric and neutral atmospheric models are coupled with an advanced three dimensional ray tracing program to see what success would be obtained in predicting the wave propagation conditions and to study to what extent the use of theoretical ionospheric models is practical. The Penn State MK 1 ionospheric model, the Mitra-Rowe D region model, and the Groves' neutral atmospheric model are used throughout this work to represent the real electron densities and collision frequencies. The Faraday rotation and differential Doppler velocities from satellites, the propagation modes for long distance high frequency propagation, the group delays for each mode, the ionospheric absorption, and the spatial loss are all predicted.

  6. Theoretical research and experimental validation of quasi-static load spectra on bogie frame structures of high-speed trains

    NASA Astrophysics Data System (ADS)

    Zhu, Ning; Sun, Shou-Guang; Li, Qiang; Zou, Hua

    2014-12-01

    One of the major problems in structural fatigue life analysis is establishing structural load spectra under actual operating conditions. This study conducts theoretical research and experimental validation of quasi-static load spectra on bogie frame structures of high-speed trains. The quasi-static load series that correspond to quasi-static deformation modes are identified according to the structural form and bearing conditions of high-speed train bogie frames. Moreover, a force-measuring frame is designed and manufactured based on the quasi-static load series. The load decoupling model of the quasi-static load series is then established via calibration tests. Quasi-static load-time histories, together with online tests and decoupling analysis, are obtained for the intermediate range of the Beijing-Shanghai dedicated passenger line. The damage consistency calibration of the quasi-static discrete load spectra is performed according to a damage consistency criterion and a genetic algorithm. The calibrated damage that corresponds to the quasi-static discrete load spectra satisfies the safety requirements of bogie frames.
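
    The damage-consistency idea in this abstract can be illustrated with a linear-damage (Palmgren-Miner) sketch; the S-N constants and spectrum blocks below are invented for illustration:

    ```python
    import numpy as np

    # Palmgren-Miner damage for a discrete load spectrum: D = sum(n_i / N_i),
    # where N_i = (S_D / S_i)**m * N_D follows a Basquin-type S-N curve.
    m, S_D, N_D = 5.0, 100.0, 2.0e6   # assumed S-N exponent and knee point

    def miner_damage(stress_levels, cycle_counts):
        """Accumulated damage of a discrete load spectrum."""
        N_allow = N_D * (S_D / np.asarray(stress_levels)) ** m
        return float(np.sum(np.asarray(cycle_counts) / N_allow))

    # A calibrated discrete spectrum is damage-consistent if its Miner sum
    # matches the damage of the measured load-time history.
    spectrum_S = [140.0, 110.0, 80.0]     # MPa, assumed block stress levels
    spectrum_n = [1.0e4, 8.0e4, 5.0e5]    # cycles per block, assumed
    print(f"Miner damage sum: {miner_damage(spectrum_S, spectrum_n):.3f}")
    ```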

  7. Critical validity assessment of theoretical models: charge-exchange at intermediate and high energies

    NASA Astrophysics Data System (ADS)

    Belkić, Dževad

    1999-06-01

    Exact comprehensive computations are carried out by means of four leading second-order approximations yielding differential cross sections dQ/dΩ for the basic charge exchange process H+ + H(1s) → H(1s) + H+ at intermediate and high energies. The obtained extensive set of results is thoroughly tested against all the existing experimental data with the purpose of critically assessing the validity of the boundary-corrected second Born (CB2), continuum distorted wave (CDW), impulse approximation (IA) and the reformulated impulse approximation (RIA). The conclusion which emerges from this comparative study clearly indicates that the RIA agrees most favorably with the measurements available over a large energy range, 25 keV-5 MeV. Such a finding reaffirms the few-particle quantum scattering theory which imposes several strict conditions on adequate second-order methods. The requirements satisfied by the RIA are: (i) normalisation of all the scattering wave functions, (ii) correct boundary conditions in both entrance and exit channels, (iii) introduction of a mathematically justified two-center continuum state for the sum of an attractive and a repulsive Coulomb potential with the same interaction strength, (iv) inclusion of the multiple scattering effects neglected in the IA, (v) a proper description of the Thomas double scattering in good agreement with the experiments and without any unobserved peak splittings. Nevertheless, the performed comparative analysis of the above four approximations indicates that none of the methods is free from some basic shortcomings. Despite its success, the RIA remains essentially a high-energy model, like the other three methods under study. More importantly, their perturbative character leaves virtually no room for further systematic improvements, since the neglected higher-order terms are prohibitively tedious for practical purposes and have never been computed exactly. To bridge this gap, we presently introduce the variational Padé approximation.

  8. Microstructural Characterization of Metal Foams: An Examination of the Applicability of the Theoretical Models for Modeling Foams

    NASA Technical Reports Server (NTRS)

    Raj, S. V.

    2010-01-01

    Establishing the geometry of foam cells is useful in developing microstructure-based acoustic and structural models. Since experimental data on the geometry of foam cells are limited, most modeling efforts use the three-dimensional, space-filling Kelvin tetrakaidecahedron. The validity of this assumption is investigated in the present paper. Several FeCrAlY foams with relative densities varying between 3 and 15 percent and cell densities varying between 0.2 and 3.9 cells per mm (c.p.mm.) were microstructurally evaluated. The number of edges per face for each foam specimen was counted by approximating the cell faces by regular polygons, where the number of cell faces measured varied between 207 and 745. The present observations revealed that 50 to 57 percent of the cell faces were pentagonal, while 24 to 28 percent were quadrilateral and 15 to 22 percent were hexagonal. The present measurements are shown to be in excellent agreement with literature data. It is demonstrated that the Kelvin model, as well as other proposed theoretical models, cannot accurately describe the FeCrAlY foam cell structure. Instead, it is suggested that the ideal foam cell geometry consists of 11 faces, with 3 quadrilateral, 6 pentagonal, and 2 hexagonal faces, consistent with the 3-6-2 cell.
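
    A quick arithmetic check of the proposed 3-6-2 cell against the Kelvin tetrakaidecahedron, using only the face counts quoted in this abstract:

    ```python
    # Face mix of two candidate cell geometries: counts of
    # (quadrilateral, pentagonal, hexagonal) faces.
    cells = {
        "Kelvin tetrakaidecahedron": {4: 6, 5: 0, 6: 8},   # 14 faces
        "proposed 3-6-2 cell":       {4: 3, 5: 6, 6: 2},   # 11 faces
    }

    for name, faces in cells.items():
        total = sum(faces.values())
        frac = {k: v / total for k, v in faces.items()}
        avg_edges = sum(k * v for k, v in faces.items()) / total
        print(f"{name}: {total} faces, quad/pent/hex fractions = "
              f"{frac[4]:.2f}/{frac[5]:.2f}/{frac[6]:.2f}, "
              f"mean edges per face = {avg_edges:.2f}")

    # The 3-6-2 mix (27% quadrilateral, 55% pentagonal, 18% hexagonal) falls
    # inside the measured 24-28% / 50-57% / 15-22% ranges; the Kelvin cell
    # (43% quadrilateral, 0% pentagonal, 57% hexagonal) does not.
    ```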

  9. Defense of Cyber Infrastructures Against Cyber-Physical Attacks Using Game-Theoretic Models

    DOE PAGES

    Rao, Nageswara S. V.; Poole, Stephen W.; Ma, Chris Y. T.; ...

    2015-04-06

    The operation of cyber infrastructures relies on both cyber and physical components, which are subject to incidental and intentional degradations of different kinds. Within the context of network and computing infrastructures, we study the strategic interactions between an attacker and a defender using game-theoretic models that take into account both cyber and physical components. The attacker and defender optimize their individual utilities, expressed as sums of cost and system terms. First, we consider a Boolean attack-defense model, wherein the cyber and physical sub-infrastructures may be attacked and reinforced as individual units. Second, we consider a component attack-defense model wherein their components may be attacked and defended, and the infrastructure requires minimum numbers of both to function. We show that the Nash equilibrium under uniform costs in both cases is computable in polynomial time, and it provides high-level deterministic conditions for the infrastructure survival. When probabilities of successful attack and defense, and of incidental failures, are incorporated into the models, the results favor the attacker but otherwise remain qualitatively similar. This approach has been motivated and validated by our experiences with the UltraScience Net infrastructure, which was built to support high-performance network experiments. The analytical results, however, are more general, and we apply them to simplified models of cloud and high-performance computing infrastructures.

  10. Defense of Cyber Infrastructures Against Cyber-Physical Attacks Using Game-Theoretic Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rao, Nageswara S. V.; Poole, Stephen W.; Ma, Chris Y. T.

    The operation of cyber infrastructures relies on both cyber and physical components, which are subject to incidental and intentional degradations of different kinds. Within the context of network and computing infrastructures, we study the strategic interactions between an attacker and a defender using game-theoretic models that take into account both cyber and physical components. The attacker and defender optimize their individual utilities, expressed as sums of cost and system terms. First, we consider a Boolean attack-defense model, wherein the cyber and physical sub-infrastructures may be attacked and reinforced as individual units. Second, we consider a component attack-defense model wherein their components may be attacked and defended, and the infrastructure requires minimum numbers of both to function. We show that the Nash equilibrium under uniform costs in both cases is computable in polynomial time, and it provides high-level deterministic conditions for the infrastructure survival. When probabilities of successful attack and defense, and of incidental failures, are incorporated into the models, the results favor the attacker but otherwise remain qualitatively similar. This approach has been motivated and validated by our experiences with the UltraScience Net infrastructure, which was built to support high-performance network experiments. The analytical results, however, are more general, and we apply them to simplified models of cloud and high-performance computing infrastructures.

  11. Defense of Cyber Infrastructures Against Cyber-Physical Attacks Using Game-Theoretic Models.

    PubMed

    Rao, Nageswara S V; Poole, Stephen W; Ma, Chris Y T; He, Fei; Zhuang, Jun; Yau, David K Y

    2016-04-01

    The operation of cyber infrastructures relies on both cyber and physical components, which are subject to incidental and intentional degradations of different kinds. Within the context of network and computing infrastructures, we study the strategic interactions between an attacker and a defender using game-theoretic models that take into account both cyber and physical components. The attacker and defender optimize their individual utilities, expressed as sums of cost and system terms. First, we consider a Boolean attack-defense model, wherein the cyber and physical subinfrastructures may be attacked and reinforced as individual units. Second, we consider a component attack-defense model wherein their components may be attacked and defended, and the infrastructure requires minimum numbers of both to function. We show that the Nash equilibrium under uniform costs in both cases is computable in polynomial time, and it provides high-level deterministic conditions for the infrastructure survival. When probabilities of successful attack and defense, and of incidental failures, are incorporated into the models, the results favor the attacker but otherwise remain qualitatively similar. This approach has been motivated and validated by our experiences with UltraScience Net infrastructure, which was built to support high-performance network experiments. The analytical results, however, are more general, and we apply them to simplified models of cloud and high-performance computing infrastructures. © 2015 Society for Risk Analysis.
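
    A toy sketch of the Boolean attack-defense model recurring in the three records above: enumerate pure strategies, compute cost-minus-system utilities, and search for pure Nash equilibria by best-response checks. All costs and values are invented, and the papers' actual models and equilibrium computations are richer:

    ```python
    import itertools

    # Pure strategies: attack/defend any subset of the two sub-infrastructures.
    UNITS = ("cyber", "physical")
    STRATEGIES = [frozenset(c) for r in range(len(UNITS) + 1)
                  for c in itertools.combinations(UNITS, r)]

    ATTACK_COST = {"cyber": 12.0, "physical": 14.0}  # assumed per-unit costs
    DEFEND_COST = {"cyber": 1.5, "physical": 2.0}
    SYSTEM_VALUE = 10.0                              # assumed value of a working system

    def survives(attack, defense):
        """Boolean model: the system needs both units; an attacked unit
        survives only if it was reinforced."""
        return all(u in defense or u not in attack for u in UNITS)

    def utilities(attack, defense):
        up = survives(attack, defense)
        u_att = (0.0 if up else SYSTEM_VALUE) - sum(ATTACK_COST[u] for u in attack)
        u_def = (SYSTEM_VALUE if up else 0.0) - sum(DEFEND_COST[u] for u in defense)
        return u_att, u_def

    def is_pure_nash(attack, defense):
        u_a, u_d = utilities(attack, defense)
        return (all(utilities(a, defense)[0] <= u_a for a in STRATEGIES)
                and all(utilities(attack, d)[1] <= u_d for d in STRATEGIES))

    def fmt(s):
        return ", ".join(sorted(s)) if s else "none"

    for a, d in itertools.product(STRATEGIES, repeat=2):
        if is_pure_nash(a, d):
            print("pure Nash equilibrium: attack =", fmt(a), "/ defend =", fmt(d))
    # With these assumed costs, attacking is unprofitable and (none, none) is
    # the unique pure equilibrium; cheaper attacks generally leave only mixed
    # equilibria, which this brute-force check does not enumerate.
    ```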

  12. Testing a theoretical model of clinical nurses' intent to stay.

    PubMed

    Cowden, Tracy L; Cummings, Greta G

    2015-01-01

    Published theoretical models of nurses' intent to stay (ITS) report inconsistent outcomes, and not all hypothesized models have been adequately tested. Research has focused on cognitive rather than emotional determinants of nurses' ITS. The aim of this study was to empirically verify a complex theoretical model of nurses' ITS that includes both affective and cognitive determinants and to explore the influence of relational leadership on staff nurses' ITS. The study was a correlational, mixed-method, nonexperimental design. A subsample of the Quality Work Environment Study survey data 2009 (n = 415 nurses) was used to test our theoretical model of clinical nurses' ITS as a structural equation model. The model explained 63% of variance in ITS. Organizational commitment, empowerment, and desire to stay were the model concepts with the strongest effects on nurses' ITS. Leadership practices indirectly influenced ITS. How nurses evaluate and respond to their work environment is both an emotional and rational process. Health care organizations need to be cognizant of the influence that nurses' feelings and views of their work setting have on their intention decisions and integrate that knowledge into the development of retention strategies. Leadership practices play an important role in staff nurses' perceptions of the workplace. Identifying the mechanisms by which leadership influences staff nurses' intentions to stay presents additional focus areas for developing retention strategies.

  13. Producibility improvements suggested by a validated process model of seeded CdZnTe vertical Bridgman growth

    NASA Astrophysics Data System (ADS)

    Larson, David J., Jr.; Casagrande, Louis G.; Di Marzio, Don; Levy, Alan; Carlson, Frederick M.; Lee, Taipao; Black, David R.; Wu, Jun; Dudley, Michael

    1994-07-01

    We have successfully validated theoretical models of seeded vertical Bridgman-Stockbarger CdZnTe crystal growth and post-solidification processing, using in-situ thermal monitoring and innovative material characterization techniques. The models predict the thermal gradients, interface shape, fluid flow and solute redistribution during solidification, as well as the distributions of accumulated excess stress that causes defect generation and redistribution. Data from the furnace and ampoule wall have validated predictions from the thermal model. Results are compared to predictions of the thermal and thermo-solutal models. We explain the measured initial, change-of-rate, and terminal compositional transients as well as the macrosegregation. Macro- and micro-defect distributions have been imaged on CdZnTe wafers from 40 mm diameter boules. Superposition of topographic defect images and predicted excess stress patterns suggests that some frequently encountered defects, particularly on a macro scale, originate from the applied and accumulated stress fields and the anisotropic nature of the CdZnTe crystal. Implications of these findings with respect to producibility are discussed.

  14. The Johns Hopkins model of psychological first aid (RAPID-PFA): curriculum development and content validation.

    PubMed

    Everly, George S; Barnett, Daniel J; Links, Jonathan M

    2012-01-01

    There appears to be virtually universal endorsement of the need for and value of acute "psychological first aid" (PFA) in the wake of trauma and disasters. In this paper, we describe the development of the curriculum for The Johns Hopkins RAPID-PFA model of psychological first aid. We employed an adaptation of the basic framework for the development of a clinical science as recommended by Millon, which entailed historical review, theoretical development, and content validation. The process of content validation of the RAPID-PFA curriculum entailed the assessment of attitudes (confidence in the application of PFA interventions, preparedness in the application of PFA); knowledge related to the application of immediate mental health interventions; and behavior (the ability to recognize clinical markers in the field as assessed via a videotape recognition exercise). Results of the content validation phase suggest that the six-hour RAPID-PFA curriculum, initially based upon structural modeling analysis, can improve confidence in the application of PFA interventions, preparedness in the application of PFA, knowledge related to the application of immediate mental health interventions, and the ability to recognize clinical markers in the field as assessed via a videotape recognition exercise.

  15. [Validation of a triage scale: first step in patient admission and in emergency service models].

    PubMed

    Legrand, A; Thys, F; Vermeiren, E; Touwaide, M; D'Hoore, W; Hubin, V; Reynaert, M S

    2003-03-01

    At present, most emergency services handle a multitude of varied demands in a single location and with a single team of nurses and aides, with direct consequences for waiting times and for the handling of problems of varying importance. Our service is examining alternative organizational models based on triage by time and by orientation. In a prospective study of 679 patients, we validated a triage tool inspired by the ICEM model (International Cooperation of Emergency Medicine) that allows patients to receive, while they wait, information and training, based on the resources provided, to deal with their particular medical problem. The validation of this tool was carried out in terms of its usability as well as its reliability. It appears that, with the type of triage offered, there is a theoretical reserve of waiting time among patients whose urgency is relative, which could be better used in the handling of more vital cases.

  16. The problem of fouling in submerged membrane bioreactors - Model validation and experimental evidence

    NASA Astrophysics Data System (ADS)

    Tsibranska, Irene; Vlaev, Serafim; Tylkowski, Bartosz

    2018-01-01

    Integrating biological treatment with membrane separation has found a broad area of applications and industrial attention. Submerged membrane bioreactors (SMBRs), based on membrane modules immersed in the bioreactor, or side-stream modules connected in a recycle loop, have been employed in different biotechnological processes for the separation of thermally unstable products. Fouling is one of the most important challenges in integrated SMBRs, and a number of works are devoted to fouling analysis and its treatment, especially exploring opportunities for enhanced fouling control in SMBRs. The main goal of this review is to provide a comprehensive yet concise overview of the modeling of fouling in SMBRs, with particular attention to the problem of model validation, whether by real-system measurements at different scales or by analysis of the obtained theoretical results. The review focuses on the current state of research applying computational fluid dynamics (CFD) modeling techniques.

  17. Theoretical and methodological issues with testing the SCCT and RIASEC models: Comment on Lent, Sheu, and Brown (2010) and Lubinski (2010).

    PubMed

    Armstrong, Patrick Ian; Vogel, David L

    2010-04-01

    The current article replies to comments made by Lent, Sheu, and Brown (2010) and Lubinski (2010) regarding the study "Interpreting the Interest-Efficacy Association From a RIASEC Perspective" (Armstrong & Vogel, 2009). The comments made by Lent et al. and Lubinski highlight a number of important theoretical and methodological issues, including the process of defining and differentiating between constructs, the assumptions underlying Holland's (1959, 1997) RIASEC (Realistic, Investigative, Artistic, Social, Enterprising, and Conventional types) model and interrelations among constructs specified in social cognitive career theory (SCCT), the importance of incremental validity for evaluating constructs, and methodological considerations when quantifying interest-efficacy correlations and for comparing models using multivariate statistical methods. On the basis of these comments and previous research on the SCCT and Holland models, we highlight the importance of considering multiple theoretical perspectives in vocational research and practice. Alternative structural models are outlined for examining the role of interests, self-efficacy, learning experiences, outcome expectations, personality, and cognitive abilities in the career choice and development process. PsycINFO Database Record (c) 2010 APA, all rights reserved.

  18. Chemometric Methods and Theoretical Molecular Descriptors in Predictive QSAR Modeling of the Environmental Behavior of Organic Pollutants

    NASA Astrophysics Data System (ADS)

    Gramatica, Paola

    This chapter surveys the QSAR modeling approaches (developed by the author's research group) for the validated prediction of environmental properties of organic pollutants. Various chemometric methods, based on different theoretical molecular descriptors, have been applied: explorative techniques (such as PCA for ranking and SOM for similarity analysis), modeling approaches by multiple linear regression (MLR, in particular OLS), and classification methods (mainly k-NN, CART, CP-ANN). The focus of this review is on the main topics of environmental chemistry and ecotoxicology related to the physico-chemical properties, reactivity, and biological activity of chemicals of high environmental concern. Thus, the review deals with atmospheric degradation reactions of VOCs by tropospheric oxidants, persistence and long-range transport of POPs, sorption behavior of pesticides (Koc and leaching), bioconcentration, toxicity (acute aquatic toxicity, mutagenicity of PAHs, estrogen-binding activity of endocrine-disrupting compounds (EDCs)), and finally persistent, bioaccumulative and toxic (PBT) behavior for the screening and prioritization of organic pollutants. Common to all the proposed models is the attention paid to validating models for predictive ability (not only internal, but also external, for chemicals not participating in the model development) and to checking the chemical domain of applicability. Adherence to such a policy, requested also by the OECD principles, ensures the production of reliable predicted data, useful also under the new European regulation of chemicals, REACH.
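
    Two of the recurring ingredients here, external validation of an OLS model and a leverage-based applicability domain, can be sketched as follows (synthetic data; the descriptors and property are stand-ins, not values from the chapter):

    ```python
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    # Toy data standing in for theoretical molecular descriptors (X) and an
    # environmental property such as log Koc (y); purely synthetic.
    X = rng.normal(size=(120, 4))
    y = X @ np.array([0.8, -0.5, 0.3, 0.1]) + rng.normal(scale=0.3, size=120)

    # External validation: hold out chemicals that take no part in model building.
    X_tr, X_ext, y_tr, y_ext = train_test_split(X, y, test_size=0.3, random_state=1)
    ols = LinearRegression().fit(X_tr, y_tr)

    # External predictive ability (Q2_ext), referenced to the training mean.
    press = np.sum((y_ext - ols.predict(X_ext)) ** 2)
    ss_tot = np.sum((y_ext - y_tr.mean()) ** 2)
    print(f"Q2_ext = {1 - press / ss_tot:.3f}")

    # Leverage-based applicability domain: flag predictions beyond h* = 3p'/n.
    Xc = np.hstack([np.ones((len(X_tr), 1)), X_tr])
    H_core = np.linalg.inv(Xc.T @ Xc)
    h_star = 3 * Xc.shape[1] / len(X_tr)
    Xe = np.hstack([np.ones((len(X_ext), 1)), X_ext])
    leverage = np.einsum("ij,jk,ik->i", Xe, H_core, Xe)
    print(f"{np.sum(leverage > h_star)} external chemicals outside the domain")
    ```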

  19. Effects of human running cadence and experimental validation of the bouncing ball model

    NASA Astrophysics Data System (ADS)

    Bencsik, László; Zelei, Ambrus

    2017-05-01

    The biomechanical analysis of human running is a complex problem because of the large number of parameters and degrees of freedom. However, simplified models can be constructed, which are usually characterized by some fundamental parameters, such as step length, foot strike pattern and cadence. The bouncing ball model of human running is analysed theoretically and experimentally in this work. It is a minimally complex dynamic model when the aim is to estimate the energy cost of running and the tendency of ground-foot impact intensity as a function of cadence. The model shows that cadence has a direct effect on the energy efficiency of running and on ground-foot impact intensity; furthermore, it shows that higher cadence implies lower risk of injury and better energy efficiency. An experimental data collection of 121 amateur runners is presented. The experimental results validate the model and provide information about the walk-to-run transition speed and the typical development of cadence and grounded phase ratio in different running speed ranges.
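
    A minimal bouncing-ball-style sketch (my own simplification, not the authors' implementation): treating the runner as a point mass in ballistic flight between contacts, both the landing velocity and the vertical bounce energy per metre fall as cadence rises; the grounded ratio and body mass below are assumptions:

    ```python
    # Point-mass "bouncing ball" runner: ballistic flight between contacts.
    # Higher cadence -> shorter flights -> lower landing velocity and lower
    # vertical bounce energy per unit distance.
    G = 9.81  # m/s^2

    def step_metrics(speed, cadence_hz, grounded_ratio=0.35, mass=70.0):
        """Return landing vertical velocity [m/s] and vertical bounce energy
        per metre [J/m]. grounded_ratio and mass are assumed values."""
        t_step = 1.0 / cadence_hz               # time between foot strikes
        t_flight = (1.0 - grounded_ratio) * t_step
        v_land = 0.5 * G * t_flight             # ballistic landing velocity
        e_step = 0.5 * mass * v_land ** 2       # vertical energy lost per strike
        return v_land, e_step * cadence_hz / speed

    if __name__ == "__main__":
        for spm in (150, 170, 190):             # steps per minute
            v_land, cost = step_metrics(speed=3.5, cadence_hz=spm / 60.0)
            print(f"{spm} spm: landing v = {v_land:.2f} m/s, "
                  f"vertical cost = {cost:.1f} J/m")
    ```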

  20. Cultural Geography Model Validation

    DTIC Science & Technology

    2010-03-01

    the Cultural Geography Model (CGM), a government-owned, open-source multi-agent system utilizing Bayesian networks, queuing systems, the Theory of... referent determined either from theory or SME opinion. 4. CGM Overview: The CGM is a government-owned, open-source, data-driven multi-agent social... HSCB, validation, social network analysis. ABSTRACT: In the current warfighting environment, the military needs robust modeling and simulation (M&S

  1. An analytic solution for numerical modeling validation in electromagnetics: the resistive sphere

    NASA Astrophysics Data System (ADS)

    Swidinsky, Andrei; Liu, Lifei

    2017-11-01

    We derive the electromagnetic response of a resistive sphere to an electric dipole source buried in a conductive whole space. The solution consists of an infinite series of spherical Bessel functions and associated Legendre polynomials, and follows the well-studied problem of a conductive sphere buried in a resistive whole space in the presence of a magnetic dipole. Our result is particularly useful for controlled-source electromagnetic problems using a grounded electric dipole transmitter and can be used to check numerical methods of calculating the response of resistive targets (such as finite difference, finite volume, finite element and integral equation). While we elect to focus on the resistive sphere in our examples, the expressions in this paper are completely general and allow for arbitrary source frequency, sphere radius, transmitter position, receiver position and sphere/host conductivity contrast so that conductive target responses can also be checked. Commonly used mesh validation techniques consist of comparisons against other numerical codes, but such solutions may not always be reliable or readily available. Alternatively, the response of simple 1-D models can be tested against well-known whole space, half-space and layered earth solutions, but such an approach is inadequate for validating models with curved surfaces. We demonstrate that our theoretical results can be used as a complementary validation tool by comparing analytic electric fields to those calculated through a finite-element analysis; the software implementation of this infinite series solution is made available for direct and immediate application.
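
    How such an analytic series serves as a mesh-validation yardstick can be sketched generically: sample both solutions at matched receiver positions and report a relative misfit. The arrays below are synthetic stand-ins for the series solution and finite-element output:

    ```python
    import numpy as np

    def relative_l2_misfit(e_analytic: np.ndarray, e_numeric: np.ndarray) -> float:
        """Relative L2 misfit between analytic and numerically computed
        complex field samples at a set of receiver positions."""
        return float(np.linalg.norm(e_numeric - e_analytic)
                     / np.linalg.norm(e_analytic))

    # Synthetic demo: a real test would use fields from the series solution
    # and a finite-element code evaluated at the same receivers.
    rng = np.random.default_rng(3)
    e_true = rng.normal(size=50) + 1j * rng.normal(size=50)
    e_fem = e_true * (1 + 0.01 * rng.normal(size=50))  # ~1% numerical error
    print(f"relative misfit: {relative_l2_misfit(e_true, e_fem):.3%}")
    ```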

  2. A clinical reasoning model focused on clients' behaviour change with reference to physiotherapists: its multiphase development and validation.

    PubMed

    Elvén, Maria; Hochwälder, Jacek; Dean, Elizabeth; Söderlund, Anne

    2015-05-01

    A biopsychosocial approach and behaviour change strategies have long been proposed to serve as a basis for addressing current multifaceted health problems. This emphasis has implications for clinical reasoning of health professionals. This study's aim was to develop and validate a conceptual model to guide physiotherapists' clinical reasoning focused on clients' behaviour change. Phase 1 consisted of the exploration of existing research and the research team's experiences and knowledge. Phases 2a and 2b consisted of validation and refinement of the model based on input from physiotherapy students in two focus groups (n = 5 per group) and from experts in behavioural medicine (n = 9). Phase 1 generated theoretical and evidence bases for the first version of a model. Phases 2a and 2b established the validity and value of the model. The final model described clinical reasoning focused on clients' behaviour change as a cognitive, reflective, collaborative and iterative process with multiple interrelated levels that included input from the client and physiotherapist, a functional behavioural analysis of the activity-related target behaviour and the selection of strategies for behaviour change. This unique model, theory- and evidence-informed, has been developed to help physiotherapists to apply clinical reasoning systematically in the process of behaviour change with their clients.

  3. Theoretical models for coronary vascular biomechanics: Progress & challenges

    PubMed Central

    Waters, Sarah L.; Alastruey, Jordi; Beard, Daniel A.; Bovendeerd, Peter H.M.; Davies, Peter F.; Jayaraman, Girija; Jensen, Oliver E.; Lee, Jack; Parker, Kim H.; Popel, Aleksander S.; Secomb, Timothy W.; Siebes, Maria; Sherwin, Spencer J.; Shipley, Rebecca J.; Smith, Nicolas P.; van de Vosse, Frans N.

    2013-01-01

    A key aim of the cardiac Physiome Project is to develop theoretical models to simulate the functional behaviour of the heart under physiological and pathophysiological conditions. Heart function is critically dependent on the delivery of an adequate blood supply to the myocardium via the coronary vasculature. Key to this critical function of the coronary vasculature is system dynamics that emerge via the interactions of the numerous constituent components at a range of spatial and temporal scales. Here, we focus on several components for which theoretical approaches can be applied, including vascular structure and mechanics, blood flow and mass transport, flow regulation, angiogenesis and vascular remodelling, and vascular cellular mechanics. For each component, we summarise the current state of the art in model development, and discuss areas requiring further research. We highlight the major challenges associated with integrating the component models to develop a computational tool that can ultimately be used to simulate the responses of the coronary vascular system to changing demands and to diseases and therapies. PMID:21040741

  4. Theoretical Models of Protostellar Binary and Multiple Systems with AMR Simulations

    NASA Astrophysics Data System (ADS)

    Matsumoto, Tomoaki; Tokuda, Kazuki; Onishi, Toshikazu; Inutsuka, Shu-ichiro; Saigo, Kazuya; Takakuwa, Shigehisa

    2017-05-01

    We present theoretical models for protostellar binary and multiple systems based on high-resolution numerical simulations with an adaptive mesh refinement (AMR) code, SFUMATO. Recent ALMA observations have revealed the early phases of binary and multiple star formation at high spatial resolution, and these observations should be compared with theoretical models of comparable resolution. We present two theoretical models, for (1) a high-density molecular cloud core, MC27/L1521F, and (2) a protobinary system, L1551 NE. For the MC27 model, we performed numerical simulations of the gravitational collapse of a turbulent cloud core. The cloud core exhibits fragmentation during the collapse, and dynamical interaction between the fragments produces an arc-like structure, which is one of the prominent structures observed by ALMA. For the L1551 NE model, we performed numerical simulations of gas accretion onto the protobinary. The simulations exhibit asymmetry of the circumbinary disk; such asymmetry has also been observed by ALMA in the circumbinary disk of L1551 NE.

  5. Experimental validation of a linear model for data reduction in chirp-pulse microwave CT.

    PubMed

    Miyakawa, M; Orikasa, K; Bertero, M; Boccacci, P; Conte, F; Piana, M

    2002-04-01

    Chirp-pulse microwave computerized tomography (CP-MCT) is an imaging modality developed at the Department of Biocybernetics, University of Niigata (Niigata, Japan), which intends to reduce the microwave-tomography problem to an X-ray-like situation. We have recently shown that data acquisition in CP-MCT can be described in terms of a linear model derived from scattering theory. In this paper, we validate this model by showing that the theoretically computed response function is in good agreement with the one obtained from a regularized multiple deconvolution of three data sets measured with the prototype of CP-MCT. Furthermore, the reliability of the model as far as image restoration is concerned is tested, in the case of space-invariant conditions, by considering the reconstruction of simple on-axis cylindrical phantoms.
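
    The regularized deconvolution step can be illustrated in one dimension with a Tikhonov-regularized inverse filter in the Fourier domain (a generic sketch, not the CP-MCT code; the kernel and regularization weight are assumptions):

    ```python
    import numpy as np

    def tikhonov_deconvolve(measured, kernel, alpha=1e-2):
        """Recover a response function from measured = kernel * response
        (circular convolution) using Tikhonov-regularized inversion."""
        K = np.fft.fft(kernel, n=len(measured))
        Y = np.fft.fft(measured)
        X = np.conj(K) * Y / (np.abs(K) ** 2 + alpha)  # regularized inverse filter
        return np.real(np.fft.ifft(X))

    # Synthetic demo: blur a boxcar response with a Gaussian kernel, add noise,
    # then recover it.
    n = 256
    x = np.zeros(n); x[100:140] = 1.0
    t = np.arange(n)
    kernel = np.exp(-0.5 * ((t - n / 2) / 4.0) ** 2)
    kernel /= kernel.sum()
    kernel = np.roll(kernel, -n // 2)       # centre the kernel at index 0
    y = np.real(np.fft.ifft(np.fft.fft(kernel) * np.fft.fft(x)))
    y += 0.01 * np.random.default_rng(0).normal(size=n)
    x_hat = tikhonov_deconvolve(y, kernel, alpha=1e-3)
    print(f"max reconstruction error: {np.max(np.abs(x_hat - x)):.3f}")
    ```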

  6. Validating neural-network refinements of nuclear mass models

    NASA Astrophysics Data System (ADS)

    Utama, R.; Piekarewicz, J.

    2018-01-01

    Background: Nuclear astrophysics centers on the role of nuclear physics in the cosmos. In particular, nuclear masses at the limits of stability are critical in the development of stellar structure and the origin of the elements. Purpose: We aim to test and validate the predictions of recently refined nuclear mass models against the newly published AME2016 compilation. Methods: The basic paradigm underlying the recently refined nuclear mass models is based on existing state-of-the-art models that are subsequently refined through the training of an artificial neural network. Bayesian inference is used to determine the parameters of the neural network so that statistical uncertainties are provided for all model predictions. Results: We observe a significant improvement in the Bayesian neural network (BNN) predictions relative to the corresponding "bare" models when compared to the nearly 50 new masses reported in the AME2016 compilation. Further, AME2016 estimates for the handful of impactful isotopes in the determination of r-process abundances are found to be in fairly good agreement with our theoretical predictions. Indeed, the BNN-improved Duflo-Zuker model predicts a root-mean-square deviation relative to experiment of σ_rms ≃ 400 keV. Conclusions: Given the excellent performance of the BNN refinement in confronting the recently published AME2016 compilation, we are confident of its critical role in our quest for mass models of the highest quality. Moreover, as uncertainty quantification is at the core of the BNN approach, the improved mass models are in a unique position to identify those nuclei that will have the strongest impact in resolving some of the outstanding questions in nuclear astrophysics.
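
    The refinement strategy described here, learning the residual between experiment and a "bare" mass model, can be sketched with a plain (non-Bayesian) network, so the uncertainty quantification of the actual BNN is not reproduced; all data are synthetic:

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(7)

    # Synthetic stand-ins: (Z, N) inputs, "experimental" masses, and a "bare"
    # mass-model prediction with a smooth systematic deficiency.
    ZN = rng.integers(20, 120, size=(500, 2)).astype(float)
    m_exp = 8.0 * ZN.sum(axis=1) + 0.02 * (ZN[:, 0] - ZN[:, 1]) ** 2
    m_bare = m_exp - 5.0 * np.sin(ZN[:, 0] / 15.0) + rng.normal(scale=0.3, size=500)

    # Refinement: learn the residual m_exp - m_bare as a function of (Z, N);
    # the refined prediction is the bare model plus the learned correction.
    net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000, random_state=0)
    net.fit(ZN, m_exp - m_bare)
    m_refined = m_bare + net.predict(ZN)

    def rms(residual):
        return np.sqrt(np.mean(residual ** 2))

    # A careful study would score held-out nuclei (as AME2016 does here).
    print(f"bare rms = {rms(m_exp - m_bare):.3f}, "
          f"refined rms = {rms(m_exp - m_refined):.3f}")
    ```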

  7. Validation of a model for the cast-film process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chambon, F.; Ohlsson, S.; Silagy, D.

    1996-12-31

    We have developed a model of the cast-film process and compared theoretical predictions against experiments on a pilot line. Three polyethylenes with markedly different levels of melt elasticity were used in this evaluation; namely, a high-pressure low-density polyethylene, LDPE, and two linear low-density polyethylenes, LLDPE-1 and LLDPE-2. The final film dimensions of the LDPE were found to be in good agreement with 1-D viscoelastic stationary predictions. Flow-field visualization experiments indicate, however, a 2-D velocity field in the airgap between the extrusion die and the chill roll. Taking this observation into account, evolutions of the free surface of the web along the airgap were recorded with LLDPE-2, our least elastic melt. An excellent agreement is found between these measurements and predictions of neck-in and edge bead with 2-D Newtonian stationary simulations. The time-dependent solution, which is based on a linear stability analysis, allows us to identify a zone of draw resonance within the working space of the process, defined by the draw ratio, the Deborah number, and the web aspect ratio. It is predicted that increasing this latter parameter stabilizes the process until an optimum value is reached. Experiments with LLDPE-1 are shown to validate this unique theoretical result, thus allowing the draw ratio to be increased by about 75%.

  8. The role of global cloud climatologies in validating numerical models

    NASA Technical Reports Server (NTRS)

    HARSHVARDHAN

    1991-01-01

    Reliable estimates of the components of the surface radiation budget are important in studies of ocean-atmosphere interaction, land-atmosphere interaction, ocean circulation and in the validation of radiation schemes used in climate models. The methods currently under consideration must necessarily make certain assumptions regarding both the presence of clouds and their vertical extent. Because of the uncertainties in assumed cloudiness, all these methods involve perhaps unacceptable uncertainties. Here, a theoretical framework that avoids the explicit computation of cloud fraction and the location of cloud base in estimating the surface longwave radiation is presented. Estimates of the global surface downward fluxes and the oceanic surface net upward fluxes were made for four months (April, July, October and January) in 1985 to 1986. These estimates are based on a relationship between cloud radiative forcing at the top of the atmosphere and at the surface obtained from a general circulation model. The radiation code is the version used in the UCLA/GLA general circulation model (GCM). The longwave cloud radiative forcing at the top of the atmosphere as obtained from Earth Radiation Budget Experiment (ERBE) measurements is used to compute the forcing at the surface by means of the GCM-derived relationship. This, along with clear-sky fluxes from the computations, yields maps of the downward longwave fluxes and net upward longwave fluxes at the surface. The calculated results are discussed and analyzed. The results are consistent with current meteorological knowledge and explainable on the basis of previous theoretical and observational works; therefore, it can be concluded that this method is applicable as one of the ways to obtain the surface longwave radiation fields from currently available satellite data.
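
    The core computation, mapping TOA cloud radiative forcing to the surface through a GCM-derived relationship and adding clear-sky fluxes, reduces to a simple composition; the linear form and coefficients below are hypothetical placeholders for that relationship:

    ```python
    # Surface downward longwave flux estimated from clear-sky computations plus
    # a GCM-derived mapping of TOA cloud radiative forcing to the surface.
    def surface_downward_lw(f_clear_sfc, crf_toa, a=0.9, b=2.0):
        """f_clear_sfc: clear-sky downward LW at the surface [W/m^2];
        crf_toa: TOA longwave cloud radiative forcing, e.g. from ERBE [W/m^2];
        a, b: hypothetical coefficients of the GCM-derived linear relation."""
        crf_sfc = a * crf_toa + b          # GCM-derived link (assumed linear)
        return f_clear_sfc + crf_sfc

    print(surface_downward_lw(f_clear_sfc=320.0, crf_toa=30.0))  # -> 349.0
    ```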

  9. New Theoretical Model of Nerve Conduction in Unmyelinated Nerves

    PubMed Central

    Akaishi, Tetsuya

    2017-01-01

    Nerve conduction in unmyelinated fibers has long been described based on the equivalent circuit model and cable theory. However, without the change in ionic concentration gradient across the membrane, there would be no generation or propagation of the action potential. Based on this concept, we employ a new conductive model focusing on the distribution of voltage-gated sodium ion channels and the Coulomb force between electrolytes. Based on this new model, propagation of the nerve conduction was suggested to take place well before the generation of the action potential at each channel. We theoretically showed that the propagation of the action potential, which is enabled by the increasing Coulomb force produced by inflowing sodium ions, from one sodium channel to the next would be inversely proportional to the density of sodium channels on the axon membrane. Because the longitudinal number of sodium channels is proportional to the square root of channel density, the conduction velocity of unmyelinated nerves is theoretically shown to be proportional to the square root of channel density. Also, from the viewpoint of the equilibrium between channel importation and degeneration, channel density was suggested to be proportional to axonal diameter. On this simple basis, the conduction velocity of unmyelinated nerves was theoretically shown to be proportional to the square root of the axonal diameter. This new model should also give us a more accurate and understandable picture of the phenomena in unmyelinated nerves, in addition to the conventional equivalent circuit model and cable theory. PMID:29081751
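
    The scaling argument of this abstract can be condensed into two proportionalities and their consequence:

    ```latex
    % Conduction velocity vs. channel density and axon diameter.
    % \rho: areal density of Na+ channels; d: axonal diameter.
    v \;\propto\; \sqrt{\rho}
    \quad\text{(longitudinal channel count per unit length scales as }\sqrt{\rho}\text{)}

    \rho \;\propto\; d
    \quad\text{(equilibrium between channel importation and degeneration)}

    \Rightarrow\quad v \;\propto\; \sqrt{\rho} \;\propto\; \sqrt{d}
    ```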

  10. Modelling human skull growth: a validated computational model

    PubMed Central

    Marghoub, Arsalan; Johnson, David; Khonsari, Roman H.; Fagan, Michael J.; Moazen, Mehran

    2017-01-01

    During the first year of life, the brain grows rapidly and the neurocranium increases to about 65% of its adult size. Our understanding of the relationship between the biomechanical forces, especially from the growing brain, the craniofacial soft tissue structures and the individual bone plates of the skull vault is still limited. This basic knowledge could help in the future planning of craniofacial surgical operations. The aim of this study was to develop a validated computational model of skull growth, based on the finite-element (FE) method, to help understand the biomechanics of skull growth. To do this, a two-step validation study was carried out. First, an in vitro physical three-dimensional printed model and an in silico FE model were created from the same micro-CT scan of an infant skull and loaded with forces from the growing brain from zero to two months of age. The results from the in vitro model validated the FE model before it was further developed to expand from 0 to 12 months of age. This second FE model was compared directly with in vivo clinical CT scans of infants without craniofacial conditions (n = 56). The various models were compared in terms of predicted skull width, length and circumference, while the overall shape was quantified using three-dimensional distance plots. Statistical analysis yielded no significant differences between the male skull models. All size measurements from the FE model versus the in vitro physical model were within 5%, with one exception showing a 7.6% difference. The FE model and in vivo data also correlated well, with the largest percentage difference in size being 8.3%. Overall, the FE model results matched well with both the in vitro and in vivo data. With further development and model refinement, this modelling method could be used to assist in preoperative planning of craniofacial surgery procedures and could help to reduce reoperation rates. PMID:28566514

  11. Modelling human skull growth: a validated computational model.

    PubMed

    Libby, Joseph; Marghoub, Arsalan; Johnson, David; Khonsari, Roman H; Fagan, Michael J; Moazen, Mehran

    2017-05-01

    During the first year of life, the brain grows rapidly and the neurocranium increases to about 65% of its adult size. Our understanding of the relationship between the biomechanical forces, especially from the growing brain, the craniofacial soft tissue structures and the individual bone plates of the skull vault is still limited. This basic knowledge could help in the future planning of craniofacial surgical operations. The aim of this study was to develop a validated computational model of skull growth, based on the finite-element (FE) method, to help understand the biomechanics of skull growth. To do this, a two-step validation study was carried out. First, an in vitro physical three-dimensional printed model and an in silico FE model were created from the same micro-CT scan of an infant skull and loaded with forces from the growing brain from zero to two months of age. The results from the in vitro model validated the FE model before it was further developed to expand from 0 to 12 months of age. This second FE model was compared directly with in vivo clinical CT scans of infants without craniofacial conditions ( n = 56). The various models were compared in terms of predicted skull width, length and circumference, while the overall shape was quantified using three-dimensional distance plots. Statistical analysis yielded no significant differences between the male skull models. All size measurements from the FE model versus the in vitro physical model were within 5%, with one exception showing a 7.6% difference. The FE model and in vivo data also correlated well, with the largest percentage difference in size being 8.3%. Overall, the FE model results matched well with both the in vitro and in vivo data. With further development and model refinement, this modelling method could be used to assist in preoperative planning of craniofacial surgery procedures and could help to reduce reoperation rates. © 2017 The Author(s).

  12. Technical Note: Procedure for the calibration and validation of kilo-voltage cone-beam CT models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vilches-Freixas, Gloria; Létang, Jean Michel; Rit,

    2016-09-15

    Purpose: The aim of this work is to propose a general and simple procedure for the calibration and validation of kilo-voltage cone-beam CT (kV CBCT) models against experimental data. Methods: The calibration and validation of the CT model is a two-step procedure: first the source model, then the detector model. The source is described by the direction-dependent photon energy spectrum at each voltage, while the detector is described by the pixel intensity value as a function of the direction and the energy of incident photons. The measurements for the source consist of a series of dose measurements in air performed at each voltage with varying filter thicknesses and materials in front of the x-ray tube. The measurements for the detector are acquisitions of projection images using the same filters and several tube voltages. The proposed procedure has been applied to calibrate and assess the accuracy of simple models of the source and the detector of three commercial kV CBCT units. If the CBCT system models had been calibrated differently, the current procedure would have been exclusively used to validate the models. Several high-purity attenuation filters of aluminum, copper, and silver combined with a dosimeter which is sensitive to the range of voltages of interest were used. A sensitivity analysis of the model has also been conducted for each parameter of the source and the detector models. Results: Average deviations between experimental and theoretical dose values are below 1.5% after calibration for the three x-ray sources. The predicted energy deposited in the detector agrees with experimental data within 4% for all imaging systems. Conclusions: The authors developed and applied an experimental procedure to calibrate and validate any model of the source and the detector of a CBCT unit. The present protocol has been successfully applied to three x-ray imaging systems. The minimum requirements in terms of material and equipment would make its implementation
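
    The source-calibration step, inferring spectral weights from dose measurements behind filters of varying thickness, can be posed as a non-negative least-squares problem under the Beer-Lambert law (a generic sketch, not the authors' procedure; all attenuation values are invented):

    ```python
    import numpy as np
    from scipy.optimize import nnls

    # Energy bins of the unknown spectrum and assumed attenuation coefficients
    # mu(E) [1/mm] for an aluminum filter (illustrative numbers only).
    mu = np.array([0.60, 0.30, 0.15, 0.08])        # one value per energy bin
    thick = np.array([0.0, 1.0, 2.0, 4.0, 8.0])    # filter thicknesses [mm]

    # Forward model: measured signal_j = sum_i w_i * exp(-mu_i * t_j).
    A = np.exp(-np.outer(thick, mu))

    # Synthetic "measurements" from a known spectrum, plus noise.
    w_true = np.array([0.10, 0.40, 0.35, 0.15])
    y = A @ w_true + 1e-3 * np.random.default_rng(1).normal(size=len(thick))

    # Recover non-negative spectral weights by least squares.
    w_fit, _ = nnls(A, y)
    print("fitted spectral weights:", np.round(w_fit, 3))
    ```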

  13. External validation of preexisting first trimester preeclampsia prediction models.

    PubMed

    Allen, Rebecca E; Zamora, Javier; Arroyo-Manzano, David; Velauthar, Luxmilar; Allotey, John; Thangaratinam, Shakila; Aquilina, Joseph

    2017-10-01

    To validate the increasing number of prognostic models being developed for preeclampsia using our own prospective study. A systematic review of the literature that assessed biomarkers, uterine artery Doppler and maternal characteristics in the first trimester for the prediction of preeclampsia was performed, and models were selected based on predefined criteria. Validation was performed by applying the regression coefficients that were published in the different derivation studies to our cohort. We assessed the models' discrimination ability and calibration. Twenty models were identified for validation. The discrimination ability observed in the derivation studies (area under the curve, AUC) ranged from 0.70 to 0.96; when these models were validated against the validation cohort, the AUCs varied considerably, ranging from 0.504 to 0.833. Comparing the AUCs obtained in the derivation studies to those in the validation cohort, we found statistically significant differences for several studies. There is currently no definitive prediction model with adequate discrimination ability for preeclampsia that performs as well when applied to a different population and can differentiate well between the highest- and lowest-risk groups within the tested population. The large number of pre-existing models limits the value of further model development; future research should be focused on further attempts to validate existing models and on assessing whether their implementation improves patient care. Crown Copyright © 2017. Published by Elsevier B.V. All rights reserved.
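
    Mechanically, this kind of external validation applies the published coefficients to the new cohort without refitting and recomputes discrimination; a minimal sketch with invented coefficients and synthetic data:

    ```python
    import numpy as np
    from sklearn.metrics import roc_auc_score

    # Published (hypothetical) logistic-regression coefficients:
    # intercept, maternal age, mean arterial pressure, log(biomarker MoM).
    beta = np.array([-7.0, 0.04, 0.05, -0.8])

    rng = np.random.default_rng(42)
    n = 1000
    X = np.column_stack([
        np.ones(n),                      # intercept
        rng.normal(30, 5, n),            # age [years]
        rng.normal(85, 10, n),           # MAP [mmHg]
        rng.normal(0.0, 0.3, n),         # log biomarker MoM
    ])

    # Risk in the validation cohort using the *published* coefficients
    # (no refitting), then discrimination against observed outcomes.
    risk = 1.0 / (1.0 + np.exp(-X @ beta))
    outcome = rng.binomial(1, risk)      # stand-in for observed preeclampsia
    print(f"validation AUC: {roc_auc_score(outcome, risk):.3f}")
    ```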

  14. A Theoretical Math Model for Projecting AIS3+ Thoracic Injury for Belted Occupants in Frontal Impact.

    PubMed

    Laituri, Tony R; Sullivan, Donald; Sullivan, Kaye; Prasad, Priya

    2004-11-01

    A theoretical math model was created to assess the net effect of aging populations versus evolving system designs from the standpoint of thoracic injury potential. The model was used to project the next twenty-five years of thoracic injuries in Canada. The choice of Canada was topical because rulemaking for CMVSS 208 has been proposed recently. The study was limited to properly-belted, front-outboard, adult occupants in 11-1 o'clock frontal crashes. Moreover, only AIS3+ thoracic injury potential was considered. The research consisted of four steps. First, sub-models were developed and integrated. The sub-models were made for numerous real-world effects including population growth, crash involvement, fleet penetration of various systems (via system introduction, vehicle production, and vehicle attrition), and attendant injury risk estimation. Second, existing NASS data were used to estimate the number of AIS3+ chest-injured drivers in Canada in 2001. This served as data for model validation. Third, the projection model was correlated favorably with the 2001 field estimate. Finally, for the scenario that 2004-2030 model-year systems would perform like 2000-2003 model-year systems, a projection was made to estimate the long-term effect of eliminating designs that would not comply with the proposed CMVSS 208. The 2006-2030-projection result for this scenario: 764 occupants would benefit from the proposed regulation. This projection was considered to be conservative because future innovation was not considered, and, to date, the fleet's average chest deflections have been decreasing. The model also predicted that, through 2016, the effect of improving system performance would be more influential than the population-aging effect; thereafter, the population-aging effect would somewhat counteract the effect of improving system performance. This theoretical math model can provide insights for both designers and rule makers.

  15. Validation of the generalized model of two-phase thermosyphon loop based on experimental measurements of volumetric flow rate

    NASA Astrophysics Data System (ADS)

    Bieliński, Henryk

    2016-09-01

    The current paper presents the experimental validation of the generalized model of the two-phase thermosyphon loop. The generalized model is based on mass, momentum, and energy balances in the evaporators, rising tube, condensers and the falling tube. The theoretical analysis and the experimental data have been obtained for a newly designed variant, which refers to a thermosyphon loop with both minichannels and conventional tubes. The thermosyphon loop consists of an evaporator on the lower vertical section and a condenser on the upper vertical section. The one-dimensional homogeneous and separated two-phase flow models were used in the calculations, and the latest minichannel heat transfer correlations available in the literature were applied. A numerical analysis of the volumetric flow rate in the steady state has been performed. The experiment was conducted on a specially designed test apparatus, with ultrapure water used as the working fluid. The results show that the theoretical predictions are in good agreement with the measured volumetric flow rate at steady state.

  16. Calibration and validation of rockfall models

    NASA Astrophysics Data System (ADS)

    Frattini, Paolo; Valagussa, Andrea; Zenoni, Stefania; Crosta, Giovanni B.

    2013-04-01

    Calibrating and validating landslide models is extremely difficult due to the particular characteristics of landslides: limited recurrence in time, relatively low frequency of the events, short durability of post-event traces, and poor availability of continuous monitoring data, especially for small landslides and rockfalls. For this reason, most of the rockfall models presented in the literature completely lack calibration and validation of the results. In this contribution, we explore different strategies for rockfall model calibration and validation, starting from both a historical event and a full-scale field test. The event occurred in 2012 in Courmayeur (Western Alps, Italy) and caused serious damage to quarrying facilities. This event was studied soon after its occurrence through a field campaign aimed at mapping the blocks arrested along the slope, the shape and location of the detachment area, and the traces of scars associated with impacts of blocks on the slope. The full-scale field test was performed by Geovert Ltd in the Christchurch area (New Zealand) after the 2011 earthquake. During the test, a number of large blocks were mobilized from the upper part of the slope and filmed with high-velocity cameras from different viewpoints. The movies of each released block were analysed to identify the block shape, the propagation path, the location of impacts, the height of the trajectory and the velocity of the block along the path. Both calibration and validation of rockfall models should be based on optimizing the agreement between the actual trajectories or locations of arrested blocks and the simulated ones. A measure that describes this agreement is therefore needed. For calibration purposes, this measure should be simple enough to allow trial-and-error repetitions of the model for parameter optimization. In this contribution we explore different calibration/validation measures: (1) the percentage of simulated blocks arresting within a buffer of the

  17. Theoretical modeling of critical temperature increase in metamaterial superconductors

    NASA Astrophysics Data System (ADS)

    Smolyaninov, Igor I.; Smolyaninova, Vera N.

    2016-05-01

    Recent experiments have demonstrated that the metamaterial approach is capable of a drastic increase of the critical temperature Tc of epsilon-near-zero (ENZ) metamaterial superconductors. For example, tripling of the critical temperature has been observed in Al-Al2O3 ENZ core-shell metamaterials. Here, we perform theoretical modeling of the Tc increase in metamaterial superconductors based on the Maxwell-Garnett approximation of their dielectric response function. Good agreement is demonstrated between theoretical modeling and experimental results in both aluminum- and tin-based metamaterials. Taking advantage of the demonstrated success of this model, the critical temperature of hypothetical niobium-, MgB2-, and H2S-based metamaterial superconductors is evaluated. The MgB2-based metamaterial superconductors are projected to reach the liquid nitrogen temperature range. In the case of an H2S-based metamaterial, Tc appears to reach ~250 K.
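
    The Maxwell-Garnett step can be written directly from the standard mixing formula; the permittivities below are placeholders, not the paper's values:

    ```python
    def maxwell_garnett(eps_incl: complex, eps_host: complex, f: float) -> complex:
        """Effective permittivity of spherical inclusions (volume fraction f,
        permittivity eps_incl) embedded in a host of permittivity eps_host."""
        beta = (eps_incl - eps_host) / (eps_incl + 2.0 * eps_host)
        return eps_host * (1.0 + 2.0 * f * beta) / (1.0 - f * beta)

    # Example: metallic (negative-permittivity) cores in a dielectric host,
    # tuned toward the epsilon-near-zero (ENZ) regime. Placeholder values.
    eps_metal = -12.0 + 0.5j     # e.g., a metal at some frequency (assumed)
    eps_diel = 3.1 + 0.0j        # e.g., an oxide shell material (assumed)
    for f in (0.2, 0.35, 0.5):
        print(f"f = {f:.2f}: eps_eff = {maxwell_garnett(eps_metal, eps_diel, f):.3f}")
    ```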

  18. External validation of a Cox prognostic model: principles and methods

    PubMed Central

    2013-01-01

    Background A prognostic model should not enter clinical practice unless it has been demonstrated that it performs a useful role. External validation denotes evaluation of model performance in a sample independent of that used to develop the model. Unlike for logistic regression models, external validation of Cox models is sparsely treated in the literature. Successful validation of a model means achieving satisfactory discrimination and calibration (prediction accuracy) in the validation sample. Validating Cox models is not straightforward because event probabilities are estimated relative to an unspecified baseline function. Methods We describe statistical approaches to external validation of a published Cox model according to the level of published information, specifically (1) the prognostic index only, (2) the prognostic index together with Kaplan-Meier curves for risk groups, and (3) the first two plus the baseline survival curve (the estimated survival function at the mean prognostic index across the sample). The most challenging task, requiring level 3 information, is assessing calibration, for which we suggest a method of approximating the baseline survival function. Results We apply the methods to two comparable datasets in primary breast cancer, treating one as derivation and the other as validation sample. Results are presented for discrimination and calibration. We demonstrate plots of survival probabilities that can assist model evaluation. Conclusions Our validation methods are applicable to a wide range of prognostic studies and provide researchers with a toolkit for external validation of a published Cox model. PMID:23496923
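
    Level 1 validation (discrimination from the prognostic index alone) can be sketched as below, assuming the published model is reported as a linear predictor; the coefficients and data are synthetic, and the third-party lifelines package supplies Harrell's C:

    ```python
    import numpy as np
    from lifelines.utils import concordance_index

    # Published (hypothetical) Cox coefficients for two covariates.
    beta = np.array([0.7, -0.4])

    rng = np.random.default_rng(5)
    n = 400
    X = rng.normal(size=(n, 2))                  # validation-sample covariates
    pi = X @ beta                                # prognostic index (higher = worse)

    # Synthetic survival times consistent with the model, plus censoring.
    t = rng.exponential(scale=np.exp(-pi))       # proportional-hazards times
    c = rng.exponential(scale=2.0, size=n)       # censoring times
    time = np.minimum(t, c)
    event = (t <= c).astype(int)

    # Harrell's C: concordance_index expects scores where higher means longer
    # survival, so pass the negated prognostic index.
    print(f"C-index in validation sample: {concordance_index(time, -pi, event):.3f}")
    ```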

  19. A theoretical model of job retention for home health care nurses.

    PubMed

    Ellenbecker, Carol Hall

    2004-08-01

    Predicted severe nursing shortages and an increasing demand for home health care services have made the retention of experienced, qualified nursing staff a priority for health care organizations. The purpose of this paper is to describe a theoretical model of job retention for home health care nurses. The theoretical model is an integration of the findings of empirical research related to intent to stay and retention, components of Neal's theory of home health care nursing practice and findings from earlier work to develop an instrument to measure home health care nurses' job satisfaction. The theoretical model identifies antecedents to job satisfaction of home health care nurses. The antecedents are intrinsic and extrinsic job characteristics. The model also proposes that job satisfaction is directly related to retention and indirectly related to retention though intent to stay. Individual nurse characteristics are indirectly related to retention through intent to stay. The individual characteristic of tenure is indirectly related to retention through autonomy, as an intrinsic characteristic of job satisfaction, and intent to stay. The proposed model can be used to guide research that explores gaps in knowledge about intent to stay and retention among home health care nurses.

  20. Applicability Analysis of Validation Evidence for Biomedical Computational Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pathmanathan, Pras; Gray, Richard A.; Romero, Vicente J.

    Computational modeling has the potential to revolutionize medicine the way it transformed engineering. However, despite decades of work, there has only been limited progress to successfully translate modeling research to patient care. One major difficulty which often occurs with biomedical computational models is an inability to perform validation in a setting that closely resembles how the model will be used. For example, for a biomedical model that makes in vivo clinically relevant predictions, direct validation of predictions may be impossible for ethical, technological, or financial reasons. Unavoidable limitations inherent to the validation process lead to challenges in evaluating the credibility of biomedical model predictions. Therefore, when evaluating biomedical models, it is critical to rigorously assess applicability, that is, the relevance of the computational model, and its validation evidence, to the proposed context of use (COU). However, there are no well-established methods for assessing applicability. In this paper, we present a novel framework for performing applicability analysis and demonstrate its use with a medical device computational model. The framework provides a systematic, step-by-step method for breaking down the broad question of applicability into a series of focused questions, which may be addressed using supporting evidence and subject matter expertise. The framework can be used for model justification, model assessment, and validation planning. While motivated by biomedical models, it is relevant to a broad range of disciplines and underlying physics. Finally, the proposed applicability framework could help overcome some of the barriers inherent to validation of, and aid clinical implementation of, biomedical models.

  1. Applicability Analysis of Validation Evidence for Biomedical Computational Models

    DOE PAGES

    Pathmanathan, Pras; Gray, Richard A.; Romero, Vicente J.; ...

    2017-09-07

    Computational modeling has the potential to revolutionize medicine the way it transformed engineering. However, despite decades of work, there has only been limited progress to successfully translate modeling research to patient care. One major difficulty which often occurs with biomedical computational models is an inability to perform validation in a setting that closely resembles how the model will be used. For example, for a biomedical model that makes in vivo clinically relevant predictions, direct validation of predictions may be impossible for ethical, technological, or financial reasons. Unavoidable limitations inherent to the validation process lead to challenges in evaluating the credibility of biomedical model predictions. Therefore, when evaluating biomedical models, it is critical to rigorously assess applicability, that is, the relevance of the computational model, and its validation evidence to the proposed context of use (COU). However, there are no well-established methods for assessing applicability. In this paper, we present a novel framework for performing applicability analysis and demonstrate its use with a medical device computational model. The framework provides a systematic, step-by-step method for breaking down the broad question of applicability into a series of focused questions, which may be addressed using supporting evidence and subject matter expertise. The framework can be used for model justification, model assessment, and validation planning. While motivated by biomedical models, it is relevant to a broad range of disciplines and underlying physics. Finally, the proposed applicability framework could help overcome some of the barriers inherent to validation of, and aid clinical implementation of, biomedical models.

  2. Cross-validation to select Bayesian hierarchical models in phylogenetics.

    PubMed

    Duchêne, Sebastián; Duchêne, David A; Di Giallonardo, Francesca; Eden, John-Sebastian; Geoghegan, Jemma L; Holt, Kathryn E; Ho, Simon Y W; Holmes, Edward C

    2016-05-26

    Recent developments in Bayesian phylogenetic models have increased the range of inferences that can be drawn from molecular sequence data. Accordingly, model selection has become an important component of phylogenetic analysis. Methods of model selection generally consider the likelihood of the data under the model in question. In the context of Bayesian phylogenetics, the most common approach involves estimating the marginal likelihood, which is typically done by integrating the likelihood across model parameters, weighted by the prior. Although this method is accurate, it is sensitive to the presence of improper priors. We explored an alternative approach based on cross-validation that is widely used in evolutionary analysis. This involves comparing models according to their predictive performance. We analysed simulated data and a range of viral and bacterial data sets using a cross-validation approach to compare a variety of molecular clock and demographic models. Our results show that cross-validation can be effective in distinguishing between strict- and relaxed-clock models and in identifying demographic models that allow growth in population size over time. In most of our empirical data analyses, the model selected using cross-validation was able to match that selected using marginal-likelihood estimation. The accuracy of cross-validation appears to improve with longer sequence data, particularly when distinguishing between relaxed-clock models. Cross-validation is a useful method for Bayesian phylogenetic model selection. This method can be readily implemented even when considering complex models where selecting an appropriate prior for all parameters may be difficult.
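
    The toy sketch below illustrates the principle the paper applies: candidate models are scored by mean held-out log-likelihood under k-fold cross-validation. It is a loose stand-in only; the paper compares full Bayesian molecular-clock and demographic models, whereas here a one-parameter and a more flexible two-parameter distribution are compared on simulated rate data.

    ```python
    # Hedged sketch of cross-validation model selection by predictive
    # performance. Data and candidate models are illustrative, not the
    # Bayesian phylogenetic models of the paper.
    import numpy as np
    from scipy import stats
    from sklearn.model_selection import KFold

    rng = np.random.default_rng(0)
    rates = rng.gamma(shape=2.0, scale=0.5, size=200)  # simulated lineage rates

    def cv_score(data, fit, logpdf, k=5):
        """Mean held-out log-likelihood under k-fold cross-validation."""
        total = 0.0
        for train, test in KFold(k, shuffle=True, random_state=0).split(data):
            params = fit(data[train])
            total += logpdf(data[test], params).sum()
        return total / len(data)

    one_param = cv_score(rates, lambda d: stats.expon.fit(d, floc=0),
                         lambda d, p: stats.expon.logpdf(d, *p))
    two_param = cv_score(rates, lambda d: stats.gamma.fit(d, floc=0),
                         lambda d, p: stats.gamma.logpdf(d, *p))
    print(f"one-parameter: {one_param:.3f}  flexible: {two_param:.3f}")
    ```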

  3. Simple theoretical models for composite rotor blades

    NASA Technical Reports Server (NTRS)

    Valisetty, R. R.; Rehfield, L. W.

    1984-01-01

    The development of theoretical rotor blade structural models for designs based upon composite construction is discussed. Care was exercised to include a number of nonclassical effects that previous experience indicated would be potentially important to account for. A model, representative of the size of a main rotor blade, is analyzed in order to assess the importance of various influences. The findings of this model study suggest that for the slenderness and closed cell construction considered, the refinements are of little importance and a classical type theory is adequate. The potential of elastic tailoring is dramatically demonstrated, so the generality of arbitrary ply layup in the cell wall is needed to exploit this opportunity.

  4. Theoretical modeling of critical temperature increase in metamaterial superconductors

    NASA Astrophysics Data System (ADS)

    Smolyaninov, Igor; Smolyaninova, Vera

    Recent experiments have demonstrated that the metamaterial approach is capable of a drastic increase of the critical temperature Tc of epsilon-near-zero (ENZ) metamaterial superconductors. For example, tripling of the critical temperature has been observed in Al-Al2O3 ENZ core-shell metamaterials. Here, we perform theoretical modelling of the Tc increase in metamaterial superconductors based on the Maxwell-Garnett approximation of their dielectric response function. Good agreement is demonstrated between theoretical modelling and experimental results in both aluminum- and tin-based metamaterials. Taking advantage of the demonstrated success of this model, the critical temperature of hypothetical niobium-, MgB2- and H2S-based metamaterial superconductors is evaluated. The MgB2-based metamaterial superconductors are projected to reach the liquid nitrogen temperature range. In the case of an H2S-based metamaterial, Tc appears to reach 250 K. This work was supported in part by NSF Grant DMR-1104676 and the School of Emerging Technologies at Towson University.
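
    A minimal sketch of the Maxwell-Garnett step named in the abstract: the effective permittivity of inclusions of permittivity eps_i at volume fraction f in a host eps_m, scanned here for an epsilon-near-zero (ENZ) response. The permittivity values are invented for illustration and are not the paper's inputs.

    ```python
    # Hedged sketch of the Maxwell-Garnett effective-medium approximation.
    import numpy as np

    def maxwell_garnett(eps_i: complex, eps_m: complex, f: float) -> complex:
        """Effective permittivity of a two-phase composite."""
        beta = (eps_i - eps_m) / (eps_i + 2 * eps_m)
        return eps_m * (1 + 2 * f * beta) / (1 - f * beta)

    # Scan the fill fraction for an ENZ (Re eps_eff near zero) response.
    eps_metal, eps_dielectric = -2.0 + 0.1j, 3.1 + 0.0j  # hypothetical values
    for f in np.linspace(0.1, 0.9, 9):
        eps_eff = maxwell_garnett(eps_metal, eps_dielectric, f)
        print(f"f={f:.1f}  Re(eps_eff)={eps_eff.real:+.3f}")
    ```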

  5. A validation study of a stochastic model of human interaction

    NASA Astrophysics Data System (ADS)

    Burchfield, Mitchel Talmadge

    The purpose of this dissertation is to validate a stochastic model of human interactions which is part of a developmentalism paradigm. Incorporating elements of ancient and contemporary philosophy and science, developmentalism defines human development as a progression of increasing competence and utilizes compatible theories of developmental psychology, cognitive psychology, educational psychology, social psychology, curriculum development, neurology, psychophysics, and physics. To validate a stochastic model of human interactions, the study addressed four research questions: (a) Does attitude vary over time? (b) What are the distributional assumptions underlying attitudes? (c) Does the stochastic model, $N\int_{-\infty}^{\infty}\varphi(\chi,\tau)\,\Psi(\tau)\,d\tau$, have utility for the study of attitudinal distributions and dynamics? (d) Are the Maxwell-Boltzmann, Fermi-Dirac, and Bose-Einstein theories applicable to human groups? Approximately 25,000 attitude observations were made using the Semantic Differential Scale. Positions of individuals varied over time and the logistic model predicted observed distributions with correlations between 0.98 and 1.0, with estimated standard errors significantly less than the magnitudes of the parameters. The results bring into question the applicability of Fisherian research designs (Fisher, 1922, 1928, 1938) for behavioral research based on the apparent failure of two fundamental assumptions-the noninteractive nature of the objects being studied and normal distribution of attributes. The findings indicate that individual belief structures are representable in terms of a psychological space which has the same or similar properties as physical space. The psychological space not only has dimension, but individuals interact by force equations similar to those described in theoretical physics models. Nonlinear regression techniques were used to estimate Fermi-Dirac parameters from the data. The model explained a high degree
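
    As a small illustration of the nonlinear-regression step mentioned at the end of the abstract, the sketch below fits Fermi-Dirac parameters to synthetic attitude-distribution data with scipy's curve_fit; all data and parameter values are invented.

    ```python
    # Hedged sketch: estimating Fermi-Dirac parameters by nonlinear
    # regression, with synthetic data standing in for attitude observations.
    import numpy as np
    from scipy.optimize import curve_fit

    def fermi_dirac(x, mu, T):
        """Occupation probability with location mu and 'temperature' T."""
        return 1.0 / (np.exp((x - mu) / T) + 1.0)

    x = np.linspace(-3, 3, 60)
    rng = np.random.default_rng(1)
    y = fermi_dirac(x, 0.4, 0.7) + rng.normal(0, 0.02, x.size)  # synthetic

    (mu_hat, T_hat), cov = curve_fit(fermi_dirac, x, y, p0=(0.0, 1.0))
    se = np.sqrt(np.diag(cov))  # standard errors vs parameter magnitudes
    print(f"mu={mu_hat:.3f}±{se[0]:.3f}  T={T_hat:.3f}±{se[1]:.3f}")
    ```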

  6. Model Validation Against The Modelers’ Data Archive

    DTIC Science & Technology

    2014-08-01

    completion of the planned Jack Rabbit 2 field trials. The relevant task for the effort addressed here is Task 4 of the current Interagency Agreement, as...readily simulates the Prairie Grass sulfur dioxide plumes. Also, Jack Rabbit II field trials are set to be completed during FY16. Once these data are...available, they will also be used to validate the combined models. This validation may prove to be more useful, as the Jack Rabbit II will release

  7. Theoretical Model of Development of Information Competence among Students Enrolled in Elective Courses

    ERIC Educational Resources Information Center

    Zhumasheva, Anara; Zhumabaeva, Zaida; Sakenov, Janat; Vedilina, Yelena; Zhaxylykova, Nuriya; Sekenova, Balkumis

    2016-01-01

    The current study focuses on the research topic of creating a theoretical model of development of information competence among students enrolled in elective courses. In order to examine specific features of the theoretical model of development of information competence among students enrolled in elective courses, we performed an analysis of…

  8. Application of aerosol speciation data as an in situ dust proxy for validation of the Dust Regional Atmospheric Model (DREAM)

    NASA Astrophysics Data System (ADS)

    Shaw, Patrick

    The Dust REgional Atmospheric Model (DREAM) predicts concentrations of mineral dust aerosols in time and space, but validation is challenging with current in situ particulate matter (PM) concentration measurements. Measured levels of ambient PM often contain anthropogenic components as well as windblown mineral dust. In this study, two approaches to model validation were performed with data from preexisting air quality monitoring networks: using hourly concentrations of total PM with aerodynamic diameter less than 2.5 μm (PM2.5); and using a daily averaged speciation-derived soil component. Validation analyses were performed for point locations within the cities of El Paso (TX), Austin (TX), Phoenix (AZ), Salt Lake City (UT) and Bakersfield (CA) for most of 2006. Hourly modeled PM2.5 did not validate at all with hourly observations among the sites (combined R < 0.00, N = 24,302 hourly values). Aerosol chemical speciation data distinguished mineral (soil) dust from anthropogenic ambient PM. As expected, statistically significant improvements in correlation among all stations (combined R = 0.16, N = 343 daily values) were found when the soil component alone was used to validate DREAM. The validation biases that result from anthropogenic aerosols were also reduced using the soil component. This is seen in the reduction of the root mean square error between modeled and observed values from the hourly comparison (RMSE_hourly = 18.6 μg m⁻³) to the daily speciation-based comparison (RMSE_soil = 12.0 μg m⁻³). However, the lack of a total reduction in RMSE indicates there is still room for improvement in the model. While the soil component is the theoretical proxy of choice for a dust transport model, the current sparse and infrequent sampling is not ideal for routine hourly air quality forecast validation.
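
    A minimal sketch of the validation statistics used in this study, Pearson correlation R and RMSE between modeled and observed concentrations; the column names and input file are hypothetical, and only the computation is the point.

    ```python
    # Hedged sketch: correlation and RMSE between modeled and observed PM.
    import numpy as np
    import pandas as pd

    def validation_stats(df: pd.DataFrame, model_col: str, obs_col: str):
        """Return (Pearson R, RMSE) over rows where both series exist."""
        pair = df[[model_col, obs_col]].dropna()
        r = pair[model_col].corr(pair[obs_col])
        rmse = np.sqrt(((pair[model_col] - pair[obs_col]) ** 2).mean())
        return r, rmse

    # Hypothetical usage with hourly data resampled to daily means:
    # hourly = pd.read_csv("hourly_pm25.csv", parse_dates=["time"])
    # daily = hourly.resample("D", on="time").mean()
    # print(validation_stats(daily, "dream_soil", "obs_soil"))
    ```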

  9. Development and validation of the Bullying and Cyberbullying Scale for Adolescents: A multi-dimensional measurement model.

    PubMed

    Thomas, Hannah J; Scott, James G; Coates, Jason M; Connor, Jason P

    2018-05-03

    Intervention on adolescent bullying is reliant on valid and reliable measurement of victimization and perpetration experiences across different behavioural expressions. This study developed and validated a survey tool that integrates measurement of both traditional and cyber bullying to test a theoretically driven multi-dimensional model. Adolescents from 10 mainstream secondary schools completed a baseline and follow-up survey (N = 1,217; mean age = 14 years; 66.2% male). The Bullying and Cyberbullying Scale for Adolescents (BCS-A) developed for this study comprised parallel victimization and perpetration subscales, each with 20 items. Additional measures of bullying (Olweus Global Bullying and the Forms of Bullying Scale [FBS]), as well as measures of internalizing and externalizing problems, school connectedness, social support, and personality, were used to further assess validity. Factor structure was determined, and then the suitability of items was assessed according to the following criteria: (1) factor interpretability, (2) item correlations, (3) model parsimony, and (4) measurement equivalence across victimization and perpetration experiences. The final models comprised four factors: physical, verbal, relational, and cyber. The final scale was revised to two 13-item subscales. The BCS-A demonstrated acceptable concurrent and convergent validity (internalizing and externalizing problems, school connectedness, social support, and personality), as well as predictive validity over 6 months. The BCS-A has sound psychometric properties. This tool establishes measurement equivalence across types of involvement and behavioural forms common among adolescents. An improved measurement method could add greater rigour to the evaluation of intervention programmes and also enable interventions to be tailored to subscale profiles. © 2018 The British Psychological Society.

  10. A beginner's guide to writing the nursing conceptual model-based theoretical rationale.

    PubMed

    Gigliotti, Eileen; Manister, Nancy N

    2012-10-01

    Writing the theoretical rationale for a study can be a daunting prospect for novice researchers. Nursing's conceptual models provide excellent frameworks for placement of study variables, but moving from the very abstract concepts of the nursing model to the less abstract concepts of the study variables is difficult. Similar to the five-paragraph essay used by writing teachers to assist beginning writers to construct a logical thesis, the authors of this column present guidelines that beginners can follow to construct their theoretical rationale. This guide can be used with any nursing conceptual model but Neuman's model was chosen here as the exemplar.

  11. Modeling complex treatment strategies: construction and validation of a discrete event simulation model for glaucoma.

    PubMed

    van Gestel, Aukje; Severens, Johan L; Webers, Carroll A B; Beckers, Henny J M; Jansonius, Nomdo M; Schouten, Jan S A G

    2010-01-01

    Discrete event simulation (DES) modeling has several advantages over simpler modeling techniques in health economics, such as increased flexibility and the ability to model complex systems. Nevertheless, these benefits may come at the cost of reduced transparency, which may compromise the model's face validity and credibility. We aimed to produce a transparent report on the construction and validation of a DES model using a recently developed model of ocular hypertension and glaucoma. Current evidence of associations between prognostic factors and disease progression in ocular hypertension and glaucoma was translated into DES model elements. The model was extended to simulate treatment decisions and effects. Utility and costs were linked to disease status and treatment, and clinical and health economic outcomes were defined. The model was validated at several levels. The soundness of design and the plausibility of input estimates were evaluated in interdisciplinary meetings (face validity). Individual patients were traced throughout the simulation under a multitude of model settings to debug the model, and the model was run with a variety of extreme scenarios to compare the outcomes with prior expectations (internal validity). Finally, several intermediate (clinical) outcomes of the model were compared with those observed in experimental or observational studies (external validity) and the feasibility of evaluating hypothetical treatment strategies was tested. The model performed well in all validity tests. Analyses of hypothetical treatment strategies took about 30 minutes per cohort and led to plausible health-economic outcomes. There is added value of DES models in complex treatment strategies such as glaucoma. Achieving transparency in model structure and outcomes may require some effort in reporting and validating the model, but it is feasible.
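
    For readers unfamiliar with DES, the toy sketch below (using the simpy library) shows the bare mechanics of such a model: simulated patients move through disease stages with random sojourn times, and treatment stretches those times. All stages, rates, and effect sizes are invented and far simpler than the glaucoma model described.

    ```python
    # Hedged toy DES: patients progress through invented disease stages;
    # treatment stretches sojourn times by a hypothetical factor.
    import random
    import simpy

    random.seed(0)
    STAGES = ["ocular hypertension", "early glaucoma", "advanced glaucoma"]

    def patient(env, name, treated, log):
        for stage in STAGES:
            log.append((env.now, name, stage))
            mean_years = 6.0 * (1.5 if treated else 1.0)  # invented effect
            yield env.timeout(random.expovariate(1.0 / mean_years))

    env = simpy.Environment()
    log = []
    for i in range(3):
        env.process(patient(env, f"patient-{i}", treated=(i % 2 == 0), log=log))
    env.run(until=30)  # simulate 30 years
    for t, name, stage in sorted(log):
        print(f"year {t:5.1f}: {name} -> {stage}")
    ```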

  12. Theoretical accuracy in cosmological growth estimation

    NASA Astrophysics Data System (ADS)

    Bose, Benjamin; Koyama, Kazuya; Hellwing, Wojciech A.; Zhao, Gong-Bo; Winther, Hans A.

    2017-07-01

    We elucidate the importance of the consistent treatment of gravity-model specific nonlinearities when estimating the growth of cosmological structures from redshift space distortions (RSD). Within the context of standard perturbation theory (SPT), we compare the predictions of two theoretical templates with redshift space data from COLA (comoving Lagrangian acceleration) simulations in the normal branch of DGP gravity (nDGP) and general relativity (GR). Using COLA for these comparisons is validated using a suite of full N-body simulations for the same theories. The two theoretical templates correspond to the standard general relativistic perturbation equations and those same equations modeled within nDGP. Gravitational clustering nonlinear effects are accounted for by modeling the power spectrum up to one-loop order and redshift space clustering anisotropy is modeled using the Taruya, Nishimichi and Saito (TNS) RSD model. Using this approach, we attempt to recover the simulation's fiducial logarithmic growth parameter f. By assigning the simulation data with errors representing an idealized survey with a volume of 10 Gpc³/h³, we find the GR template is unable to recover the fiducial f to within 1σ at z = 1 when we match the data up to kmax = 0.195 h/Mpc. On the other hand, the DGP template recovers the fiducial value within 1σ. Further, we conduct the same analysis for sets of mock data generated for generalized models of modified gravity using SPT, where again we analyze the GR template's ability to recover the fiducial value. We find that for models with enhanced gravitational nonlinearity, the theoretical bias of the GR template becomes significant for stage IV surveys. Thus, we show that for the future large data volume galaxy surveys, the self-consistent modeling of non-GR gravity scenarios will be crucial in constraining theory parameters.

  13. Self-Assembled Magnetic Surface Swimmers: Theoretical Model

    NASA Astrophysics Data System (ADS)

    Aranson, Igor; Belkin, Maxim; Snezhko, Alexey

    2009-03-01

    The mechanisms of self-propulsion of living microorganisms are a fascinating phenomenon attracting enormous attention in the physics community. A new type of self-assembled micro-swimmers, magnetic snakes, is an excellent tool to model locomotion in a simple table-top experiment. The snakes self-assemble from a dispersion of magnetic microparticles suspended on the liquid-air interface and subjected to an alternating magnetic field. Formation and dynamics of these swimmers are captured in the framework of theoretical model coupling paradigm equation for the amplitude of surface waves, conservation law for the density of particles, and the Navier-Stokes equation for hydrodynamic flows. The results of continuum modeling are supported by hybrid molecular dynamics simulations of magnetic particles floating on the surface of fluid.

  14. Corrigendum to "Microstructural Characterization of Metal Foams: An Examination of the Applicability of the Theoretical Models for Modeling Foams"

    NASA Technical Reports Server (NTRS)

    Raj, Sai V.

    2011-01-01

    Establishing the geometry of foam cells is useful in developing microstructure-based acoustic and structural models. Since experimental data on the geometry of the foam cells are limited, most modeling efforts use an idealized three-dimensional, space-filling Kelvin tetrakaidecahedron. The validity of this assumption is investigated in the present paper. Several FeCrAlY foams with relative densities varying between 3 and 15% and cells per mm (c.p.mm.) varying between 0.2 and 3.9 c.p.mm. were microstructurally evaluated. The number of edges per face for each foam specimen was counted by approximating the cell faces by regular polygons, where the number of cell faces measured varied between 207 and 745. The present observations revealed that 50-57% of the cell faces were pentagonal while 24-28% were quadrilateral and 15-22% were hexagonal. The present measurements are shown to be in excellent agreement with literature data. It is demonstrated that the Kelvin model, as well as other proposed theoretical models, cannot accurately describe the FeCrAlY foam cell structure. Instead, it is suggested that the ideal foam cell geometry consists of 11 faces with 3 quadrilateral, 6 pentagonal faces and 2 hexagonal faces consistent with the 3-6-2 Matzke cell

  15. Theoretical model for a Faraday anomalous dispersion optical filter

    NASA Technical Reports Server (NTRS)

    Yin, B.; Shay, T. M.

    1991-01-01

    A model for the Faraday anomalous dispersion optical filter is presented. The model predicts a bandwidth of 0.6 GHz and a transmission peak of 0.98 for a filter operating on the Cs (D2) line. The model includes hyperfine effects and is valid for arbitrary magnetic fields.

  16. Spectrum analysis of radar life signal in the three kinds of theoretical models

    NASA Astrophysics Data System (ADS)

    Yang, X. F.; Ma, J. F.; Wang, D.

    2017-02-01

    In the single-frequency continuous-wave radar life detection system based on the Doppler effect, the theoretical model of the radar life signal is expressed by a real function, and this model exhibits a phenomenon that cannot be confirmed by experiment: when the phase generated by the distance between the measured object and the radar measuring head is an integer multiple of π, the main frequency spectrum of the life signal (respiration and heartbeat) is absent from the radar life signal; if this phase is an odd multiple of π/2, the main frequency spectrum of the respiration and heartbeat frequencies is strongest. In this paper, we take the Doppler effect as the basic theory and use three different mathematical expressions (real function, complex exponential function, and Bessel function expansion) to establish theoretical models of the radar life signal. Simulation analysis reveals that the Bessel expansion form of the theoretical model solves the problem of the real function form. Compared with the theoretical model of the complex exponential function, the derived spectral lines are greatly reduced in the Bessel expansion model, which is more consistent with the actual situation.
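
    The "null point" phenomenon analysed here follows from the Jacobi-Anger expansion, cos(θ0 + β sin ωt) = Σ_n J_n(β) cos(θ0 + nωt): when the fixed phase θ0 is an integer multiple of π, the odd harmonics (including the respiration fundamental) vanish. Below is a small numerical check with illustrative parameters.

    ```python
    # Hedged numerical check of the null-point effect in a phase-modulated
    # radar return. Sample rate, respiration rate, and modulation index are
    # illustrative.
    import numpy as np

    fs, fr = 100.0, 0.3          # sample rate (Hz), respiration rate (Hz)
    t = np.arange(0, 60, 1 / fs)
    beta = 0.6                   # modulation index ~ 4*pi*chest_amp/lambda

    def line_amplitude(theta0):
        """Spectral amplitude at the respiration frequency."""
        x = np.cos(theta0 + beta * np.sin(2 * np.pi * fr * t))
        spec = np.abs(np.fft.rfft(x)) / len(t)
        k = int(round(fr * 60))  # FFT bin of fr for a 60 s record
        return spec[k]

    print("theta0 = pi   :", line_amplitude(np.pi))      # fundamental vanishes
    print("theta0 = pi/2 :", line_amplitude(np.pi / 2))  # fundamental strongest
    ```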

  17. Healing from Childhood Sexual Abuse: A Theoretical Model

    ERIC Educational Resources Information Center

    Draucker, Claire Burke; Martsolf, Donna S.; Roller, Cynthia; Knapik, Gregory; Ross, Ratchneewan; Stidham, Andrea Warner

    2011-01-01

    Childhood sexual abuse is a prevalent social and health care problem. The processes by which individuals heal from childhood sexual abuse are not clearly understood. The purpose of this study was to develop a theoretical model to describe how adults heal from childhood sexual abuse. Community recruitment for an ongoing broader project on sexual…

  18. External model validation of binary clinical risk prediction models in cardiovascular and thoracic surgery.

    PubMed

    Hickey, Graeme L; Blackstone, Eugene H

    2016-08-01

    Clinical risk-prediction models serve an important role in healthcare. They are used for clinical decision-making and measuring the performance of healthcare providers. To establish confidence in a model, external model validation is imperative. When designing such an external model validation study, thought must be given to patient selection, risk factor and outcome definitions, missing data, and the transparent reporting of the analysis. In addition, there are a number of statistical methods available for external model validation. Execution of a rigorous external validation study rests in proper study design, application of suitable statistical methods, and transparent reporting. Copyright © 2016 The American Association for Thoracic Surgery. Published by Elsevier Inc. All rights reserved.
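
    As a hedged sketch of two statistics commonly reported in such external validation studies, the code below computes the AUC together with the calibration intercept and slope obtained by regressing the observed outcome on the logit of the predicted risk. Inputs are hypothetical, and this is an illustration rather than the authors' prescribed procedure.

    ```python
    # Hedged sketch: discrimination (AUC) and calibration intercept/slope
    # for external validation of a binary risk prediction model.
    import numpy as np
    import statsmodels.api as sm
    from sklearn.metrics import roc_auc_score

    def external_validation(y: np.ndarray, predicted_risk: np.ndarray):
        """y: 0/1 outcomes; predicted_risk: model probabilities in (0, 1)."""
        auc = roc_auc_score(y, predicted_risk)
        lp = np.log(predicted_risk / (1 - predicted_risk))  # logit of risk
        fit = sm.Logit(y, sm.add_constant(lp)).fit(disp=0)
        intercept, slope = fit.params  # ideal values: 0 and 1
        return auc, intercept, slope
    ```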

  19. Couples coping with cancer: exploration of theoretical frameworks from dyadic studies.

    PubMed

    Regan, Tim W; Lambert, Sylvie D; Kelly, Brian; Falconier, Mariana; Kissane, David; Levesque, Janelle V

    2015-12-01

    A diagnosis of cancer and subsequent treatment are distressing not only for the person directly affected, but also for their intimate partner. The aim of this review is to (a) identify the main theoretical frameworks underpinning research addressing dyadic coping among couples affected by cancer, (b) summarise the evidence supporting the concepts described in these theoretical frameworks, and (c) examine the similarities and differences between these theoretical perspectives. A literature search was undertaken to identify descriptive studies published between 1990 and 2013 (English and French) that examined the interdependence of patients' and partners' coping, and the impact of coping on psychosocial outcomes. Data were extracted using a standardised form and reviewed by three of the authors. Twenty-three peer-reviewed manuscripts were identified, from which seven theoretical perspectives were derived: Relationship-Focused Coping, Transactional Model of Stress and Coping, Systemic-Transactional Model (STM) of dyadic coping, Collaborative Coping, Relationship Intimacy model, Communication models, and Coping Congruence. Although these theoretical perspectives emphasised different aspects of coping, a number of conceptual commonalities were noted. This review identified key theoretical frameworks of dyadic coping used in cancer. Evidence indicates that responses within the couple that inhibit open communication between partner and patient are likely to have an adverse impact on psychosocial outcomes. Models that incorporate the interdependence of emotional responses and coping behaviours within couples have an emerging evidence base in psycho-oncology and may have greatest validity and clinical utility in this setting. Copyright © 2015 John Wiley & Sons, Ltd.

  20. Development of a Conservative Model Validation Approach for Reliable Analysis

    DTIC Science & Technology

    2015-01-01

    ...obtain a conservative simulation model for reliable design even with limited experimental data. Very little research has taken into account the... In Section 3, the proposed conservative model validation is briefly compared to the conventional model validation approach. Section 4 describes how to account...

  1. Predictability analysis and validation of a low-dimensional model - an application to the dynamics of cereal crops observed from satellite

    NASA Astrophysics Data System (ADS)

    Mangiarotti, Sylvain; Drapeau, Laurent

    2013-04-01

    The global modeling approach aims to obtain parsimonious models of observed dynamics from few or single time series (Letellier et al. 2009). Specific algorithms were developed and validated for this purpose (Mangiarotti et al. 2012a). This approach was applied to the dynamics of cereal crops in semi-arid region using the vegetation index derived from satellite data as a proxy of the dynamics. A low-dimensional autonomous model could be obtained. The corresponding attractor is characteristic of weakly dissipative chaos and exhibits a toroidal-like structure. At present, only a few theoretical cases of such chaos are known, and none was obtained from real world observations. Under smooth conditions, a robust validation of three-dimensional chaotic models can usually be performed based on the topological approach (Gilmore 1998). Such an approach becomes more difficult for weakly dissipative systems, and almost impossible under noisy observational conditions. For this reason, another validation approach is developed which consists in comparing the forecasting skill of the model to other forecasts for which no dynamical model is required. A data assimilation process is associated to the model to estimate the model's skill; several schemes are tested (simple re-initialization, Extended and Ensemble Kalman Filters and Back and Forth Nudging). Forecasts without model are performed based on the search of analogous states in the phase space (Mangiarotti et al. 2012b). The comparison reveals the quality of the model's forecasts at short to moderate horizons and helps validate the model. These results suggest that the dynamics of cereal crops can be reasonably approximated by low-dimensional chaotic models, and also bring out powerful arguments for chaos. Chaotic models have often been used as benchmark to test data assimilation schemes; the present work shows that such tests may not only have a theoretical interest, but also almost direct applicative potential. Moreover

  3. Proposed Core Competencies and Empirical Validation Procedure in Competency Modeling: Confirmation and Classification

    PubMed Central

    Baczyńska, Anna K.; Rowiński, Tomasz; Cybis, Natalia

    2016-01-01

    Competency models provide insight into key skills which are common to many positions in an organization. Moreover, there is a range of competencies that is used by many companies. Researchers have developed core competency terminology to underline their cross-organizational value. The article presents a theoretical model of core competencies consisting of two main higher-order competencies called performance and entrepreneurship. Each of them consists of three elements: the performance competency includes cooperation, organization of work and goal orientation, while entrepreneurship includes innovativeness, calculated risk-taking and pro-activeness. However, there is lack of empirical validation of competency concepts in organizations and this would seem crucial for obtaining reliable results from organizational research. We propose a two-step empirical validation procedure: (1) confirmation factor analysis, and (2) classification of employees. The sample consisted of 636 respondents (M = 44.5; SD = 15.1). Participants were administered a questionnaire developed for the study purpose. The reliability, measured by Cronbach’s alpha, ranged from 0.60 to 0.83 for six scales. Next, we tested the model using a confirmatory factor analysis. The two separate, single models of performance and entrepreneurial orientations fit quite well to the data, while a complex model based on the two single concepts needs further research. In the classification of employees based on the two higher order competencies we obtained four main groups of employees. Their profiles relate to those found in the literature, including so-called niche finders and top performers. Some proposal for organizations is discussed. PMID:27014111
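
    As a small companion to the reliability figures quoted, the sketch below computes Cronbach's alpha for one scale from an item-response matrix (rows = respondents, columns = items); the data layout is an assumption, not taken from the study.

    ```python
    # Hedged sketch: Cronbach's alpha for a single competency scale.
    import numpy as np

    def cronbach_alpha(items: np.ndarray) -> float:
        """items: respondents x items matrix of scale responses."""
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_vars / total_var)

    # Hypothetical usage with simulated 5-point responses:
    rng = np.random.default_rng(0)
    responses = rng.integers(1, 6, size=(200, 6)).astype(float)
    print(f"alpha = {cronbach_alpha(responses):.2f}")
    ```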

  4. Towards a theoretical model on medicines as a health need.

    PubMed

    Vargas-Peláez, Claudia Marcela; Soares, Luciano; Rover, Marina Raijche Mattozo; Blatt, Carine Raquel; Mantel-Teeuwisse, Aukje; Rossi Buenaventura, Francisco Augusto; Restrepo, Luis Guillermo; Latorre, María Cristina; López, José Julián; Bürgin, María Teresa; Silva, Consuelo; Leite, Silvana Nair; Mareni Rocha, Farias

    2017-04-01

    Medicines are considered one of the main tools of western medicine to resolve health problems. Currently, medicines represent an important share of the countries' healthcare budget. In the Latin America region, access to essential medicines is still a challenge, although countries have established some measures in recent years in order to guarantee equitable access to medicines. A theoretical model is proposed for analysing the social, political, and economic factors that modulate the role of medicines as a health need and their influence on the accessibility and access to medicines. The model was built based on a narrative review about health needs, and followed the conceptual modelling methodology for theory-building. The theoretical model considers elements (stakeholders, policies) that modulate the perception towards medicines as a health need from two perspectives - health and market - at three levels: international, national and local levels. The perception towards medicines as a health need is described according to Bradshaw's categories: felt need, normative need, comparative need and expressed need. When those different categories applied to medicines coincide, the patients get access to the medicines they perceive as a need, but when the categories do not coincide, barriers to access to medicines are created. Our theoretical model, which holds a broader view about the access to medicines, emphasises how power structures, interests, interdependencies, values and principles of the stakeholders could influence the perception towards medicines as a health need and the access to medicines in Latin American countries. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. Applicability of Monte Carlo cross validation technique for model development and validation using generalised least squares regression

    NASA Astrophysics Data System (ADS)

    Haddad, Khaled; Rahman, Ataur; A Zaman, Mohammad; Shrestha, Surendra

    2013-03-01

    In regional hydrologic regression analysis, model selection and validation are regarded as important steps. Here, the model selection is usually based on some measures of goodness-of-fit between the model prediction and observed data. In Regional Flood Frequency Analysis (RFFA), leave-one-out (LOO) validation or a fixed percentage leave out validation (e.g., 10%) is commonly adopted to assess the predictive ability of regression-based prediction equations. This paper develops a Monte Carlo Cross Validation (MCCV) technique (which has widely been adopted in Chemometrics and Econometrics) in RFFA using Generalised Least Squares Regression (GLSR) and compares it with the most commonly adopted LOO validation approach. The study uses simulated and regional flood data from the state of New South Wales in Australia. It is found that when developing hydrologic regression models, application of the MCCV is likely to result in a more parsimonious model than the LOO. It has also been found that the MCCV can provide a more realistic estimate of a model's predictive ability when compared with the LOO.
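
    A minimal sketch contrasting the two schemes compared in the paper, leave-one-out versus Monte Carlo cross-validation (repeated random train/test splits), with ordinary least squares standing in for the generalised least squares regression actually used; all data are simulated.

    ```python
    # Hedged sketch: LOO vs Monte Carlo cross-validation on simulated data.
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import LeaveOneOut, ShuffleSplit, cross_val_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(60, 3))  # toy catchment descriptors
    y = X @ np.array([1.0, 0.5, 0.0]) + rng.normal(scale=0.3, size=60)

    loo = cross_val_score(LinearRegression(), X, y, cv=LeaveOneOut(),
                          scoring="neg_mean_squared_error")
    mccv = cross_val_score(LinearRegression(), X, y,
                           cv=ShuffleSplit(n_splits=200, test_size=0.3,
                                           random_state=0),
                           scoring="neg_mean_squared_error")
    print(f"LOO MSE:  {-loo.mean():.3f}")
    print(f"MCCV MSE: {-mccv.mean():.3f}")
    ```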

  6. Model Validation | Center for Cancer Research

    Cancer.gov

    Research Investigation and Animal Model Validation: This activity is also under development and thus far has included increasing pathology resources, delivering pathology services, and using imaging and surgical methods to develop and refine animal models in collaboration with other CCR investigators.

  7. Quantitative model validation of manipulative robot systems

    NASA Astrophysics Data System (ADS)

    Kartowisastro, Iman Herwidiana

    This thesis is concerned with applying the distortion quantitative validation technique to a robot manipulative system with revolute joints. Using the distortion technique to validate a model quantitatively, the model parameter uncertainties are taken into account in assessing the faithfulness of the model, and this approach is relatively more objective than the common visual comparison method. The industrial robot is represented by the TQ MA2000 robot arm. Details of the mathematical derivation of the distortion technique are given, explaining the required distortion of the constant parameters within the model and the assessment of model adequacy. Due to the complexity of a robot model, only the first three degrees of freedom are considered, where all links are assumed rigid. The modelling involves the Newton-Euler approach to obtain the dynamics model, and the Denavit-Hartenberg convention is used throughout the work. The conventional feedback control system is used in developing the model. The system's behaviour under parameter changes is investigated because some parameters are redundant; this analysis allows the most important parameters to be distorted to be selected, and leads to a new term, the fundamental parameters. The transfer function approach has been chosen to validate an industrial robot quantitatively against the measured data due to its practicality. Initially, the assessment of the model fidelity criterion indicated that the model was not capable of explaining the transient record in terms of the model parameter uncertainties. Further investigations led to significant improvements of the model and better understanding of the model properties. After several improvements in the model, the fidelity criterion obtained was almost satisfied. Although the fidelity criterion is slightly less than unity, it has been shown that the distortion technique can be applied in a robot manipulative system. Using the validated model, the importance of

  8. Economic analysis of model validation for a challenge problem

    DOE PAGES

    Paez, Paul J.; Paez, Thomas L.; Hasselman, Timothy K.

    2016-02-19

    It is now commonplace for engineers to build mathematical models of the systems they are designing, building, or testing. And, it is nearly universally accepted that phenomenological models of physical systems must be validated prior to use for prediction in consequential scenarios. Yet, there are certain situations in which testing only or no testing and no modeling may be economically viable alternatives to modeling and its associated testing. This paper develops an economic framework within which benefit–cost can be evaluated for modeling and model validation relative to other options. The development is presented in terms of a challenge problem. As a result, we provide a numerical example that quantifies when modeling, calibration, and validation yield higher benefit–cost than a testing only or no modeling and no testing option.
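
    A toy numerical rendering of the benefit-cost comparison the abstract describes; every number below is invented purely to show the shape of the calculation, not the paper's challenge problem.

    ```python
    # Hedged toy benefit-cost comparison across the three options named in
    # the abstract. Costs, probabilities, and loss are all invented.
    options = {
        "no modeling, no testing": {"cost": 0.0, "p_avoid": 0.50},
        "testing only":            {"cost": 2.0, "p_avoid": 0.80},
        "model + validation":      {"cost": 3.5, "p_avoid": 0.95},
    }
    FAILURE_LOSS = 20.0  # consequence of failure, same units as cost

    for name, o in options.items():
        expected_net = o["p_avoid"] * FAILURE_LOSS - o["cost"]
        print(f"{name:>24}: expected net benefit = {expected_net:5.1f}")
    ```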

  9. Model-Based Verification and Validation of Spacecraft Avionics

    NASA Technical Reports Server (NTRS)

    Khan, M. Omair; Sievers, Michael; Standley, Shaun

    2012-01-01

    Verification and Validation (V&V) at JPL is traditionally performed on flight or flight-like hardware running flight software. For some time, the complexity of avionics has increased exponentially while the time allocated for system integration and associated V&V testing has remained fixed. There is an increasing need to perform comprehensive system level V&V using modeling and simulation, and to use scarce hardware testing time to validate models; the norm for thermal and structural V&V for some time. Our approach extends model-based V&V to electronics and software through functional and structural models implemented in SysML. We develop component models of electronics and software that are validated by comparison with test results from actual equipment. The models are then simulated enabling a more complete set of test cases than possible on flight hardware. SysML simulations provide access and control of internal nodes that may not be available in physical systems. This is particularly helpful in testing fault protection behaviors when injecting faults is either not possible or potentially damaging to the hardware. We can also model both hardware and software behaviors in SysML, which allows us to simulate hardware and software interactions. With an integrated model and simulation capability we can evaluate the hardware and software interactions and identify problems sooner. The primary missing piece is validating SysML model correctness against hardware; this experiment demonstrated such an approach is possible.

  10. A Theoretical Model for the Practice of Residential Treatment.

    ERIC Educational Resources Information Center

    Miskimins, R. W.

    1990-01-01

    Presents theoretical model describing practice of psychiatric residential treatment for children and adolescents. Emphasis is on 40 practice principles, guiding concepts which dictate specific treatment techniques and administrative procedures for Southern Oregon Adolescent Study and Treatment Center. Groups principles into six clusters: program…

  11. [Validity evidence of the Health-Related Quality of Life for Drug Abusers Test based on the Biaxial Model of Addiction].

    PubMed

    Lozano, Oscar M; Rojas, Antonio J; Pérez, Cristino; González-Sáiz, Francisco; Ballesta, Rosario; Izaskun, Bilbao

    2008-05-01

    The aim of this work is to show evidence of the validity of the Health-Related Quality of Life for Drug Abusers Test (HRQoLDA Test). This test was developed to measure specific HRQoL for drug abusers, within the theoretical addiction framework of the biaxial model. The sample comprised 138 patients diagnosed with opiate drug dependence. In this study, the following constructs and variables of the biaxial model were measured: severity of dependence, physical health status, psychological adjustment and substance consumption. Results indicate that the HRQoLDA Test scores are related to dependency and consumption-related problems. Multiple regression analysis reveals that HRQoL can be predicted from drug dependence, physical health status and psychological adjustment. These results contribute empirical evidence of the theoretical relationships established between HRQoL and the biaxial model, and they support the interpretation of the HRQoLDA Test to measure HRQoL in drug abusers, thus providing a test to measure this specific construct in this population.

  12. Theoretical models for supercritical fluid extraction.

    PubMed

    Huang, Zhen; Shi, Xiao-Han; Jiang, Wei-Juan

    2012-08-10

    For the proper design of supercritical fluid extraction processes, it is essential to have a sound knowledge of the mass transfer mechanism of the extraction process and the appropriate mathematical representation. In this paper, the advances and applications of kinetic models for describing supercritical fluid extraction from various solid matrices have been presented. The theoretical models overviewed here include the hot ball diffusion, broken and intact cell, shrinking core and some relatively simple models. Mathematical representations of these models have been interpreted in detail, as well as their assumptions, parameter identifications and application examples. The extraction process of the analyte solute from the solid matrix by means of supercritical fluid includes the dissolution of the analyte from the solid, the analyte diffusion in the matrix and its transport to the bulk supercritical fluid. Mechanisms involved in a mass transfer model are discussed in terms of external mass transfer resistance, internal mass transfer resistance, solute-solid interactions and axial dispersion. The correlations of the external mass transfer coefficient and axial dispersion coefficient with certain dimensionless numbers are also discussed. Among these models, the broken and intact cell model seems to be the most relevant mathematical model as it is able to provide a realistic description of the plant material structure for better understanding the mass-transfer kinetics and thus it has been widely employed for modeling supercritical fluid extraction of natural materials. Copyright © 2012 Elsevier B.V. All rights reserved.
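
    As one concrete instance of the models surveyed, the sketch below evaluates the classic hot-ball diffusion series solution for the fraction of solute remaining in a spherical particle; the diffusivity and particle radius are illustrative values, not from the paper.

    ```python
    # Hedged sketch of the hot-ball diffusion model: residual solute fraction
    # in a sphere decays as a series of exponentials. D and r are invented.
    import numpy as np

    def hot_ball_residual(t, D, r, n_terms=50):
        """Fraction of extractable solute remaining at time t (seconds)."""
        n = np.arange(1, n_terms + 1)
        terms = np.exp(-(n**2) * np.pi**2 * D * t / r**2) / n**2
        return (6 / np.pi**2) * terms.sum()

    D, r = 1e-11, 0.5e-3  # diffusivity (m^2/s), radius (m): illustrative
    for minutes in (5, 15, 30, 60):
        frac = hot_ball_residual(minutes * 60, D, r)
        print(f"{minutes:3d} min: {100 * (1 - frac):5.1f}% extracted")
    ```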

  13. Culture and Developmental Trajectories: A Discussion on Contemporary Theoretical Models

    ERIC Educational Resources Information Center

    de Carvalho, Rafael Vera Cruz; Seidl-de-Moura, Maria Lucia; Martins, Gabriela Dal Forno; Vieira, Mauro Luís

    2014-01-01

    This paper aims to describe, compare and discuss the theoretical models proposed by Patricia Greenfield, Çigdem Kagitçibasi and Heidi Keller. Their models have the common goal of understanding the developmental trajectories of self based on dimensions of autonomy and relatedness that are structured according to specific cultural and environmental…

  14. Validating the Mexican American Intergenerational Caregiving Model

    ERIC Educational Resources Information Center

    Escandon, Socorro

    2011-01-01

    The purpose of this study was to substantiate and further develop a previously formulated conceptual model of Role Acceptance in Mexican American family caregivers by exploring the theoretical strengths of the model. The sample consisted of women older than 21 years of age who self-identified as Hispanic, were related through consanguinal or…

  15. A theoretical model of water and trade

    NASA Astrophysics Data System (ADS)

    Dang, Qian; Konar, Megan; Reimer, Jeffrey J.; Di Baldassarre, Giuliano; Lin, Xiaowen; Zeng, Ruijie

    2016-03-01

    Water is an essential input for agricultural production. Agriculture, in turn, is globalized through the trade of agricultural commodities. In this paper, we develop a theoretical model that emphasizes four tradeoffs involving water-use decision-making that are important yet not always considered in a consistent framework. One tradeoff focuses on competition for water among different economic sectors. A second tradeoff examines the possibility that certain types of agricultural investments can offset water use. A third tradeoff explores the possibility that the rest of the world can be a source of supply or demand for a country's water-using commodities. The fourth tradeoff concerns how variability in water supplies influences farmer decision-making. We show conditions under which trade liberalization affects water use. Two policy scenarios to reduce water use are evaluated. First, we derive a target tax that reduces water use without offsetting the gains from trade liberalization, although important tradeoffs exist between economic performance and resource use. Second, we show how subsidization of water-saving technologies can allow producers to use less water without reducing agricultural production, making such subsidization an indirect means of influencing water use decision-making. Finally, we outline conditions under which riskiness of water availability affects water use. These theoretical model results generate hypotheses that can be tested empirically in future work.

  16. Category-theoretic models of algebraic computer systems

    NASA Astrophysics Data System (ADS)

    Kovalyov, S. P.

    2016-01-01

    A computer system is said to be algebraic if it contains nodes that implement unconventional computation paradigms based on universal algebra. A category-based approach to modeling such systems that provides a theoretical basis for mapping tasks to these systems' architecture is proposed. The construction of algebraic models of general-purpose computations involving conditional statements and overflow control is formally described by a reflector in an appropriate category of algebras. It is proved that this reflector takes the modulo ring whose operations are implemented in the conventional arithmetic processors to the Łukasiewicz logic matrix. Enrichments of the set of ring operations that form bases in the Łukasiewicz logic matrix are found.

  17. Real-time remote scientific model validation

    NASA Technical Reports Server (NTRS)

    Frainier, Richard; Groleau, Nicolas

    1994-01-01

    This paper describes flight results from the use of a CLIPS-based validation facility to compare analyzed data from a space life sciences (SLS) experiment to an investigator's preflight model. The comparison, performed in real-time, either confirms or refutes the model and its predictions. This result then becomes the basis for continuing or modifying the investigator's experiment protocol. Typically, neither the astronaut crew in Spacelab nor the ground-based investigator team are able to react to their experiment data in real time. This facility, part of a larger science advisor system called Principal Investigator in a Box, was flown on the space shuttle in October, 1993. The software system aided the conduct of a human vestibular physiology experiment and was able to outperform humans in the tasks of data integrity assurance, data analysis, and scientific model validation. Of twelve preflight hypotheses associated with investigator's model, seven were confirmed and five were rejected or compromised.

  18. Validation of Computational Models in Biomechanics

    PubMed Central

    Henninger, Heath B.; Reese, Shawn P.; Anderson, Andrew E.; Weiss, Jeffrey A.

    2010-01-01

    The topics of verification and validation (V&V) have increasingly been discussed in the field of computational biomechanics, and many recent articles have applied these concepts in an attempt to build credibility for models of complex biological systems. V&V are evolving techniques that, if used improperly, can lead to false conclusions about a system under study. In basic science these erroneous conclusions may lead to failure of a subsequent hypothesis, but they can have more profound effects if the model is designed to predict patient outcomes. While several authors have reviewed V&V as they pertain to traditional solid and fluid mechanics, it is the intent of this manuscript to present them in the context of computational biomechanics. Specifically, the task of model validation will be discussed with a focus on current techniques. It is hoped that this review will encourage investigators to engage and adopt the V&V process in an effort to increase peer acceptance of computational biomechanics models. PMID:20839648

  19. Theoretical modeling and experimental analysis of solar still integrated with evacuated tubes

    NASA Astrophysics Data System (ADS)

    Panchal, Hitesh; Awasthi, Anuradha

    2017-06-01

    In this present research work, theoretical modeling of a single slope, single basin solar still integrated with evacuated tubes has been performed based on energy balance equations. Major variables like water temperature, inner glass cover temperature and distillate output have been computed based on theoretical modeling. The experimental setup has been made from locally available materials and installed at Gujarat Power Engineering and Research Institute, Mehsana, Gujarat, India (23.5880°N, 72.3693°E) with 0.04 m depth during a 6-month time interval. From the series of experiments, a considerable increase in the average distillate output of the solar still integrated with evacuated tubes was found, not only during the daytime but also at night. In all experimental cases, the coefficient of correlation (r) and the root mean square percentage deviation (e) between theoretical modeling and the experimental study showed good agreement, with 0.97 < r < 0.98 and 10.22% < e < 38.4%, respectively.
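
    A short sketch of the two agreement statistics quoted, assuming the usual definitions of r and of the root mean square percentage deviation (the abstract does not give the exact formula, so the percentage-deviation convention below is an assumption); the input arrays are hypothetical.

    ```python
    # Hedged sketch: coefficient of correlation r and root mean square
    # percentage deviation e between theoretical and experimental output.
    import numpy as np

    def agreement(theoretical: np.ndarray, experimental: np.ndarray):
        """Return (r, e) for paired theoretical/experimental values."""
        r = np.corrcoef(theoretical, experimental)[0, 1]
        pct = 100.0 * (theoretical - experimental) / theoretical
        e = np.sqrt(np.mean(pct**2))  # assumed RMS-percent convention
        return r, e
    ```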

  20. A Theoretically Consistent Framework for Modelling Lagrangian Particle Deposition in Plant Canopies

    NASA Astrophysics Data System (ADS)

    Bailey, Brian N.; Stoll, Rob; Pardyjak, Eric R.

    2018-06-01

    We present a theoretically consistent framework for modelling Lagrangian particle deposition in plant canopies. The primary focus is on describing the probability of particles encountering canopy elements (i.e., potential deposition), and provides a consistent means for including the effects of imperfect deposition through any appropriate sub-model for deposition efficiency. Some aspects of the framework draw upon an analogy to radiation propagation through a turbid medium with which to develop model theory. The present method is compared against one of the most commonly used heuristic Lagrangian frameworks, namely that originally developed by Legg and Powell (Agricultural Meteorology, 1979, Vol. 20, 47-67), which is shown to be theoretically inconsistent. A recommendation is made to discontinue the use of this heuristic approach in favour of the theoretically consistent framework developed herein, which is no more difficult to apply under equivalent assumptions. The proposed framework has the additional advantage that it can be applied to arbitrary canopy geometries given readily measurable parameters describing vegetation structure.

  1. A Model of Resource Allocation in Public School Districts: A Theoretical and Empirical Analysis.

    ERIC Educational Resources Information Center

    Chambers, Jay G.

    This paper formulates a comprehensive model of resource allocation in a local public school district. The theoretical framework specified could be applied equally well to any number of local public social service agencies. Section 1 develops the theoretical model describing the process of resource allocation. This involves the determination of the…

  2. Validation of Model Forecasts of the Ambient Solar Wind

    NASA Technical Reports Server (NTRS)

    Macneice, P. J.; Hesse, M.; Kuznetsova, M. M.; Rastaetter, L.; Taktakishvili, A.

    2009-01-01

    Independent and automated validation is a vital step in the progression of models from the research community into operational forecasting use. In this paper we describe a program in development at the CCMC to provide just such a comprehensive validation for models of the ambient solar wind in the inner heliosphere. We have built upon previous efforts published in the community, sharpened their definitions, and completed a baseline study. We also provide first results from this program of the comparative performance of the MHD models available at the CCMC against that of the Wang-Sheeley-Arge (WSA) model. An important goal of this effort is to provide a consistent validation to all available models. Clearly exposing the relative strengths and weaknesses of the different models will enable forecasters to craft more reliable ensemble forecasting strategies. Models of the ambient solar wind are developing rapidly as a result of improvements in data supply, numerical techniques, and computing resources. It is anticipated that in the next five to ten years, the MHD based models will supplant semi-empirical potential based models such as the WSA model, as the best available forecast models. We anticipate that this validation effort will track this evolution and so assist policy makers in gauging the value of past and future investment in modeling support.
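
    One way such comparative validation is often summarised is a skill score of each model against a reference model; the sketch below, with the WSA model as the assumed reference, is an illustration rather than the CCMC program's actual metric.

    ```python
    # Hedged sketch: mean-squared-error skill score of a candidate model
    # against a reference (here assumed to be the WSA model). Inputs are
    # hypothetical time series of, e.g., solar wind speed.
    import numpy as np

    def skill_score(observed, model, reference):
        """1 = perfect forecast; 0 = no better than the reference model."""
        mse_model = np.mean((np.asarray(model) - np.asarray(observed)) ** 2)
        mse_ref = np.mean((np.asarray(reference) - np.asarray(observed)) ** 2)
        return 1.0 - mse_model / mse_ref
    ```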

  3. Experimental Validation of a Branched Solution Model for Magnetosonic Ionization Waves in Plasma Accelerators

    NASA Astrophysics Data System (ADS)

    Underwood, Thomas; Loebner, Keith; Cappelli, Mark

    2015-11-01

    Detailed measurements of the thermodynamic and electrodynamic plasma state variables within the plume of a pulsed plasma accelerator are presented. A quadruple Langmuir probe operating in current-saturation mode is used to obtain time resolved measurements of the plasma density, temperature, potential, and velocity along the central axis of the accelerator. This data is used in conjunction with a fast-framing, intensified CCD camera to develop and validate a model predicting the existence of two distinct types of ionization waves corresponding to the upper and lower solution branches of the Hugoniot curve. A deviation of less than 8% is observed between the quasi-steady, one-dimensional theoretical model and the experimentally measured plume velocity. This work is supported by the U.S. Department of Energy Stewardship Science Academic Program in addition to the National Defense Science Engineering Graduate Fellowship.

  4. Prospective validation of pathologic complete response models in rectal cancer: Transferability and reproducibility.

    PubMed

    van Soest, Johan; Meldolesi, Elisa; van Stiphout, Ruud; Gatta, Roberto; Damiani, Andrea; Valentini, Vincenzo; Lambin, Philippe; Dekker, Andre

    2017-09-01

    Multiple models have been developed to predict pathologic complete response (pCR) in locally advanced rectal cancer patients. Unfortunately, validation of these models normally omits the implications of cohort differences on prediction model performance. In this work, we perform a prospective validation of three pCR models, including information on whether this validation targets the transferability or the reproducibility (cohort differences) of the given models. We applied a novel methodology, the cohort differences model, to predict whether a patient belongs to the training or to the validation cohort. If the cohort differences model performs well, it would suggest a large difference in cohort characteristics, meaning we would validate the transferability of the model rather than its reproducibility. We tested our method in a prospective validation of three existing models for pCR prediction in 154 patients. Our results showed a large difference between training and validation cohorts for one of the three tested models [Area under the Receiver Operating Curve (AUC) of the cohort differences model: 0.85], signaling that the validation leans towards transferability. Two of the three models had a lower AUC at validation (0.66 and 0.58); one model showed a higher AUC in the validation cohort (0.70). We have successfully applied a new methodology in the validation of three prediction models, which allows us to indicate whether a validation targeted transferability (large differences between training/validation cohorts) or reproducibility (small cohort differences). © 2017 American Association of Physicists in Medicine.
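
    The "cohort differences model" described above amounts to training a classifier to distinguish the two cohorts and reading off its AUC. Below is a minimal sketch on simulated data, assuming logistic regression as the membership classifier (the record does not specify one); the cohort sizes and feature shift are placeholders.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import cross_val_predict

    # Sketch of a cohort-differences model: predict cohort membership from
    # patient features. AUC near 0.5 -> similar cohorts (reproducibility);
    # high AUC -> large cohort differences (transferability).
    rng = np.random.default_rng(0)
    X_train = rng.normal(0.0, 1.0, size=(200, 5))   # training cohort
    X_valid = rng.normal(0.5, 1.2, size=(154, 5))   # validation cohort (shifted)

    X = np.vstack([X_train, X_valid])
    cohort = np.concatenate([np.zeros(len(X_train)), np.ones(len(X_valid))])

    # Cross-validated membership probabilities avoid optimistic AUCs.
    p_member = cross_val_predict(LogisticRegression(max_iter=1000), X, cohort,
                                 cv=5, method="predict_proba")[:, 1]
    print(f"cohort-differences AUC: {roc_auc_score(cohort, p_member):.2f}")
    ```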

  5. Theoretical study of gas hydrate decomposition kinetics--model development.

    PubMed

    Windmeier, Christoph; Oellrich, Lothar R

    2013-10-10

    In order to provide an estimate of the order of magnitude of intrinsic gas hydrate dissolution and dissociation kinetics, the "Consecutive Desorption and Melting Model" (CDM) is developed by applying only theoretical considerations. The process of gas hydrate decomposition is assumed to comprise two consecutive and repetitive quasi-chemical reaction steps: desorption of the guest molecule followed by local solid-body melting. The individual kinetic steps are modeled according to the "Statistical Rate Theory of Interfacial Transport" and the Wilson-Frenkel approach. All remaining required model parameters are linked directly to geometric considerations and a thermodynamic gas hydrate equilibrium model.

  6. A Game Theoretic Model of Thermonuclear Cyberwar

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Soper, Braden C.

    In this paper we propose a formal game theoretic model of thermonuclear cyberwar based on ideas found in [1] and [2]. Our intention is that such a game will act as a first step toward building more complete formal models of Cross-Domain Deterrence (CDD). We believe the proposed thermonuclear cyberwar game is an ideal place to start on such an endeavor because the game can be fashioned in a way that is closely related to the classical models of nuclear deterrence [4–6], but with obvious modifications that will help to elucidate the complexities introduced by a second domain. We start with the classical bimatrix nuclear deterrence game based on the game of chicken, but introduce uncertainty via a left-of-launch cyber capability that one or both players may possess.
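
    A toy version of the setup is easy to write down. The sketch below encodes a symmetric game of chicken as a bimatrix and perturbs the row player's escalation payoffs with a left-of-launch failure probability; all payoff values, the parameter q, and the decision to leave the column player's payoffs untouched are illustrative assumptions, not the paper's model.

    ```python
    import numpy as np

    # Toy bimatrix game of chicken. Rows/columns: 0 = Swerve, 1 = Escalate.
    A = np.array([[ 0, -1],     # row player's payoffs
                  [ 1, -10]])
    B = A.T                     # symmetric game: column player's payoffs

    # A left-of-launch cyber capability: with probability q the row
    # player's escalation silently fails and pays off like backing down.
    # (For simplicity the column player's payoffs are left unchanged.)
    q = 0.3
    A_cyber = A.astype(float).copy()
    A_cyber[1, :] = (1 - q) * A[1, :] + q * A[0, :]

    def pure_nash(A, B):
        """Enumerate pure-strategy Nash equilibria of a 2x2 bimatrix game."""
        eqs = []
        for i in range(2):
            for j in range(2):
                if A[i, j] >= A[1 - i, j] and B[i, j] >= B[i, 1 - j]:
                    eqs.append((i, j))
        return eqs

    print("equilibria without cyber:", pure_nash(A, B))
    print("equilibria with cyber:   ", pure_nash(A_cyber, B))
    ```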

  7. A theoretical model to describe progressions and regressions for exercise rehabilitation.

    PubMed

    Blanchard, Sam; Glasgow, Phil

    2014-08-01

    This article aims to describe a new theoretical model to simplify and aid visualisation of the clinical reasoning process involved in progressing a single exercise. Exercise prescription is a core skill for physiotherapists but is an area that is lacking in theoretical models to assist clinicians when designing exercise programs to aid rehabilitation from injury. Historical models of periodization and motor learning theories lack any visual aids to assist clinicians. The concept of the proposed model is that new stimuli can be added or exchanged with other stimuli, either intrinsic or extrinsic to the participant, in order to gradually progress an exercise whilst remaining safe and effective. The proposed model maintains the core skills of physiotherapists by assisting clinical reasoning skills, exercise prescription and goal setting. It is not limited to any one pathology or rehabilitation setting and can be adapted by clinicians of any skill level. Copyright © 2014 Elsevier Ltd. All rights reserved.

  8. Theoretical models for application in school health education research.

    PubMed

    Parcel, G S

    1984-01-01

    Theoretical models that may be useful for research studies in school health education are reviewed. Selected, well-defined theories include social learning theory, problem-behavior theory, the theory of reasoned action, communications theory, coping theory, social competence, and social and family theories. Also reviewed are multiple-theory models, including models of health-related behavior, the PRECEDE Framework, social-psychological approaches and the Activated Health Education Model. Two major reviews of teaching models are also discussed. The paper concludes with a brief outline of the general applications of theory to the field of school health education, including applications to basic research, development and design of interventions, program evaluation, and program utilization.

  9. Reality-Theoretical Models-Mathematics: A Ternary Perspective on Physics Lessons in Upper-Secondary School

    ERIC Educational Resources Information Center

    Hansson, Lena; Hansson, Örjan; Juter, Kristina; Redfors, Andreas

    2015-01-01

    This article discusses the role of mathematics during physics lessons in upper-secondary school. Mathematics is an inherent part of theoretical models in physics and makes powerful predictions of natural phenomena possible. Ability to use both theoretical models and mathematics is central in physics. This paper takes as a starting point that the…

  10. Particle Engulfment and Pushing By Solidifying Interfaces - Recent Theoretical and Experimental Developments

    NASA Technical Reports Server (NTRS)

    Stefanescu, D. M.; Catalina, A. V.; Juretzko, Frank R.; Sen, Subhayu; Curreri, P. A.

    2003-01-01

    The objectives of the work on Particle Engulfment and Pushing by Solidifying Interfaces (PEP) include: 1) to obtain fundamental understanding of the physics of particle pushing and engulfment, 2) to develop mathematical models to describe the phenomenon, and 3) to perform critical experiments in the microgravity environment of space to provide benchmark data for model validation. Successful completion of this project will yield vital information relevant to a diverse range of terrestrial applications. With PEP being a long-term research effort, this report will focus on advances in the theoretical treatment of the solid/liquid interface interaction with an approaching particle, experimental validation of some aspects of the developed models, and the experimental design aspects of future experiments to be performed on board the International Space Station.

  11. Assessing Discriminative Performance at External Validation of Clinical Prediction Models

    PubMed Central

    Nieboer, Daan; van der Ploeg, Tjeerd; Steyerberg, Ewout W.

    2016-01-01

    Introduction External validation studies are essential to study the generalizability of prediction models. Recently a permutation test, focusing on discrimination as quantified by the c-statistic, was proposed to judge whether a prediction model is transportable to a new setting. We aimed to evaluate this test and compare it to previously proposed procedures to judge any changes in c-statistic from development to external validation setting. Methods We compared the use of the permutation test to the use of benchmark values of the c-statistic following from a previously proposed framework to judge transportability of a prediction model. In a simulation study we developed a prediction model with logistic regression on a development set and validated it in the validation set. We concentrated on two scenarios: 1) the case-mix was more heterogeneous and predictor effects were weaker in the validation set compared to the development set, and 2) the case-mix was less heterogeneous in the validation set and predictor effects were identical in the validation and development set. Furthermore we illustrated the methods in a case study using 15 datasets of patients suffering from traumatic brain injury. Results The permutation test indicated that the validation and development set were homogeneous in scenario 1 (in almost all simulated samples) and heterogeneous in scenario 2 (in 17%-39% of simulated samples). Previously proposed benchmark values of the c-statistic and the standard deviation of the linear predictors correctly pointed at the more heterogeneous case-mix in scenario 1 and the less heterogeneous case-mix in scenario 2. Conclusion The recently proposed permutation test may provide misleading results when externally validating prediction models in the presence of case-mix differences between the development and validation population. To correctly interpret the c-statistic found at external validation it is crucial to disentangle case-mix differences from incorrect regression coefficients.

  12. Assessing Discriminative Performance at External Validation of Clinical Prediction Models.

    PubMed

    Nieboer, Daan; van der Ploeg, Tjeerd; Steyerberg, Ewout W

    2016-01-01

    External validation studies are essential to study the generalizability of prediction models. Recently a permutation test, focusing on discrimination as quantified by the c-statistic, was proposed to judge whether a prediction model is transportable to a new setting. We aimed to evaluate this test and compare it to previously proposed procedures to judge any changes in c-statistic from development to external validation setting. We compared the use of the permutation test to the use of benchmark values of the c-statistic following from a previously proposed framework to judge transportability of a prediction model. In a simulation study we developed a prediction model with logistic regression on a development set and validated it in the validation set. We concentrated on two scenarios: 1) the case-mix was more heterogeneous and predictor effects were weaker in the validation set compared to the development set, and 2) the case-mix was less heterogeneous in the validation set and predictor effects were identical in the validation and development set. Furthermore we illustrated the methods in a case study using 15 datasets of patients suffering from traumatic brain injury. The permutation test indicated that the validation and development set were homogeneous in scenario 1 (in almost all simulated samples) and heterogeneous in scenario 2 (in 17%-39% of simulated samples). Previously proposed benchmark values of the c-statistic and the standard deviation of the linear predictors correctly pointed at the more heterogeneous case-mix in scenario 1 and the less heterogeneous case-mix in scenario 2. The recently proposed permutation test may provide misleading results when externally validating prediction models in the presence of case-mix differences between the development and validation population. To correctly interpret the c-statistic found at external validation it is crucial to disentangle case-mix differences from incorrect regression coefficients.
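
    The permutation logic can be illustrated in a few lines: pool the development and validation patients, repeatedly reassign the set labels at random, and compare the observed drop in c-statistic with its permutation distribution. The sketch below uses simulated data and a fixed linear predictor; details of the originally proposed test may differ.

    ```python
    import numpy as np
    from sklearn.metrics import roc_auc_score

    # Sketch of a permutation test on the c-statistic (AUC) at external
    # validation. Data are simulated; the published procedure may differ.
    rng = np.random.default_rng(1)

    def simulate(n, beta_true, beta_model):
        X = rng.normal(size=(n, 3))
        p = 1.0 / (1.0 + np.exp(-(X @ beta_true)))
        return X @ beta_model, rng.binomial(1, p)   # linear predictor, outcome

    beta = np.array([1.0, 0.8, 0.5])
    lp_dev, y_dev = simulate(500, beta, beta)
    lp_val, y_val = simulate(300, 0.6 * beta, beta)  # weaker true effects

    observed = roc_auc_score(y_dev, lp_dev) - roc_auc_score(y_val, lp_val)

    lp_all = np.concatenate([lp_dev, lp_val])
    y_all = np.concatenate([y_dev, y_val])
    n_dev, diffs = y_dev.size, []
    for _ in range(2000):                    # permute the set labels
        idx = rng.permutation(y_all.size)
        d, v = idx[:n_dev], idx[n_dev:]
        diffs.append(roc_auc_score(y_all[d], lp_all[d])
                     - roc_auc_score(y_all[v], lp_all[v]))

    p_value = np.mean(np.abs(diffs) >= abs(observed))
    print(f"observed AUC drop {observed:.3f}, permutation p = {p_value:.3f}")
    ```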

  13. Cross-validation pitfalls when selecting and assessing regression and classification models.

    PubMed

    Krstajic, Damjan; Buturovic, Ljubomir J; Leahy, David E; Thomas, Simon

    2014-03-29

    We address the problem of selecting and assessing classification and regression models using cross-validation. Current state-of-the-art methods can yield models with high variance, rendering them unsuitable for a number of practical applications including QSAR. In this paper we describe and evaluate best practices which improve reliability and increase confidence in selected models. A key operational component of the proposed methods is cloud computing which enables routine use of previously infeasible approaches. We describe in detail an algorithm for repeated grid-search V-fold cross-validation for parameter tuning in classification and regression, and we define a repeated nested cross-validation algorithm for model assessment. As regards variable selection and parameter tuning we define two algorithms (repeated grid-search cross-validation and double cross-validation), and provide arguments for using the repeated grid-search in the general case. We show results of our algorithms on seven QSAR datasets. The variation of the prediction performance, which is the result of choosing different splits of the dataset in V-fold cross-validation, needs to be taken into account when selecting and assessing classification and regression models. We demonstrate the importance of repeating cross-validation when selecting an optimal model, as well as the importance of repeating nested cross-validation when assessing a prediction error.
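
    Below is a minimal sketch of the repeated nested cross-validation idea, with grid search in the inner loop for tuning and the outer loop reserved for assessment, repeated over fresh random splits. The dataset, model, and parameter grid are placeholders rather than the authors' QSAR setup.

    ```python
    import numpy as np
    from sklearn.datasets import make_regression
    from sklearn.linear_model import Ridge
    from sklearn.model_selection import GridSearchCV, KFold, cross_val_score

    # Repeated nested CV: inner loop tunes parameters, outer loop estimates
    # prediction error, and the whole procedure is repeated with new splits
    # so the split-to-split variation of the error estimate is visible.
    X, y = make_regression(n_samples=200, n_features=20, noise=10.0,
                           random_state=0)
    param_grid = {"alpha": [0.01, 0.1, 1.0, 10.0]}

    mse_per_repeat = []
    for repeat in range(5):                               # repeated splits
        inner = KFold(n_splits=5, shuffle=True, random_state=repeat)
        outer = KFold(n_splits=5, shuffle=True, random_state=100 + repeat)
        model = GridSearchCV(Ridge(), param_grid, cv=inner)   # tuning
        scores = cross_val_score(model, X, y, cv=outer,       # assessment
                                 scoring="neg_mean_squared_error")
        mse_per_repeat.append(-scores.mean())

    print(f"nested-CV MSE: {np.mean(mse_per_repeat):.1f} "
          f"+/- {np.std(mse_per_repeat):.1f} across repeats")
    ```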

  14. A patient-centered pharmacy services model of HIV patient care in community pharmacy settings: a theoretical and empirical framework.

    PubMed

    Kibicho, Jennifer; Owczarzak, Jill

    2012-01-01

    Reflecting trends in health care delivery, pharmacy practice has shifted from a drug-specific to a patient-centered model of care, aimed at improving the quality of patient care and reducing health care costs. In this article, we outline a theoretical model of patient-centered pharmacy services (PCPS), based on in-depth, qualitative interviews with a purposive sample of 28 pharmacists providing care to HIV-infected patients in specialty, semispecialty, and nonspecialty pharmacy settings. Data analysis was an interactive process informed by pharmacists' interviews and a review of the general literature on patient centered care, including Medication Therapy Management (MTM) services. Our main finding was that the current models of pharmacy services, including MTM, do not capture the range of pharmacy services in excess of mandated drug dispensing services. In this article, we propose a theoretical PCPS model that reflects the actual services pharmacists provide. The model includes five elements: (1) addressing patients as whole, contextualized persons; (2) customizing interventions to unique patient circumstances; (3) empowering patients to take responsibility for their own health care; (4) collaborating with clinical and nonclinical providers to address patient needs; and (5) developing sustained relationships with patients. The overarching goal of PCPS is to empower patients to take responsibility for their own health care and self-manage their HIV infection. Our findings provide the foundation for future studies regarding how widespread these practices are in diverse community settings, the validity of the proposed PCPS model, the potential for standardizing pharmacist practices, and the feasibility of a PCPS framework to reimburse pharmacists' services.

  15. Validation of Magnetospheric Magnetohydrodynamic Models

    NASA Astrophysics Data System (ADS)

    Curtis, Brian

    Magnetospheric magnetohydrodynamic (MHD) models are commonly used for both prediction and modeling of Earth's magnetosphere. To date, very little validation has been performed to determine their limits, uncertainties, and differences. In this work, we applied, for the first time, several validation techniques commonly used in the atmospheric sciences to MHD-based models of Earth's magnetosphere. The validation techniques of parameter variability/sensitivity analysis and comparison to other models were used on the OpenGGCM, BATS-R-US, and SWMF magnetospheric MHD models to answer several questions about how these models compare. The questions include: (1) the difference between the models' predictions prior to and following a reversal of Bz in the upstream interplanetary magnetic field (IMF) from positive to negative, (2) the influence of the preconditioning duration, and (3) the differences between models under extreme solar wind conditions. A differencing visualization tool was developed and used to address these three questions. We find: (1) For a reversal in IMF Bz from positive to negative, the OpenGGCM magnetopause is closest to Earth as it has the weakest magnetic pressure near Earth. The differences in magnetopause positions between BATS-R-US and SWMF are explained by the influence of the ring current, which is included in SWMF. Densities are highest for SWMF and lowest for OpenGGCM. The OpenGGCM tail currents differ significantly from BATS-R-US and SWMF; (2) A longer preconditioning time allowed the magnetosphere to relax more, giving different positions for the magnetopause with all three models before the IMF Bz reversal. There were differences greater than 100% for all three models before the IMF Bz reversal. The differences in the current sheet region for the OpenGGCM were small after the IMF Bz reversal. The BATS-R-US and SWMF differences decreased after the IMF Bz reversal to near zero; (3) For extreme conditions in the solar

  16. Theoretical Analysis and Design of Ultrathin Broadband Optically Transparent Microwave Metamaterial Absorbers

    PubMed Central

    Deng, Ruixiang; Li, Meiling; Muneer, Badar; Zhu, Qi; Shi, Zaiying; Song, Lixin; Zhang, Tao

    2018-01-01

    Optically Transparent Microwave Metamaterial Absorbers (OTMMAs) are of significant use in both civil and military fields. In this paper, an equivalent circuit model is adopted as a springboard to navigate the design of an OTMMA. The physical model and absorption mechanisms of an ideal lightweight ultrathin OTMMA are comprehensively researched. Both the theoretical value of the equivalent resistance and the quantitative relation between the equivalent inductance and equivalent capacitance are derived for design. Frequency-dependent characteristics of the theoretical equivalent resistance are also investigated. Based on this theoretical work, an effective and controllable design approach is proposed. To validate the approach, a wideband OTMMA is designed, fabricated, analyzed and tested. The results reveal that high absorption of more than 90% can be achieved in the whole 6~18 GHz band. The fabricated OTMMA also has an optical transparency up to 78% at 600 nm and is much thinner and lighter than its counterparts. PMID:29324686
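
    The equivalent-circuit view in the abstract can be exercised numerically: treat the patterned sheet as a series RLC branch in parallel with the input impedance of the grounded dielectric spacer, then read off absorption from the reflection coefficient. All component and substrate values below are illustrative assumptions, not the paper's derived parameters.

    ```python
    import numpy as np

    # Equivalent-circuit sketch of a thin grounded absorber: sheet branch
    # Zs = R + jwL + 1/(jwC) in parallel with the grounded-spacer input
    # impedance; absorption = 1 - |reflection|^2 (ground plane -> no
    # transmission). Values are illustrative, not from the paper.
    eta0 = 376.73            # free-space wave impedance (ohm)
    c0 = 2.998e8             # speed of light (m/s)
    eps_r, d = 3.0, 1.5e-3   # spacer permittivity and thickness (m)
    R, L, C = 377.0, 2e-9, 0.1e-12   # series RLC of the patterned sheet

    f = np.linspace(4e9, 20e9, 500)
    w = 2 * np.pi * f
    Zs = R + 1j * w * L + 1 / (1j * w * C)                 # sheet branch
    beta = w * np.sqrt(eps_r) / c0                         # phase constant
    Zd = 1j * (eta0 / np.sqrt(eps_r)) * np.tan(beta * d)   # grounded spacer
    Zin = Zs * Zd / (Zs + Zd)                              # parallel combo

    gamma = (Zin - eta0) / (Zin + eta0)       # reflection, normal incidence
    absorption = 1 - np.abs(gamma) ** 2
    print(f"peak absorption {absorption.max():.2f} "
          f"at {f[absorption.argmax()] / 1e9:.1f} GHz")
    ```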

  17. Theoretical Analysis and Design of Ultrathin Broadband Optically Transparent Microwave Metamaterial Absorbers.

    PubMed

    Deng, Ruixiang; Li, Meiling; Muneer, Badar; Zhu, Qi; Shi, Zaiying; Song, Lixin; Zhang, Tao

    2018-01-11

    Optically Transparent Microwave Metamaterial Absorbers (OTMMAs) are of significant use in both civil and military fields. In this paper, an equivalent circuit model is adopted as a springboard to navigate the design of an OTMMA. The physical model and absorption mechanisms of an ideal lightweight ultrathin OTMMA are comprehensively researched. Both the theoretical value of the equivalent resistance and the quantitative relation between the equivalent inductance and equivalent capacitance are derived for design. Frequency-dependent characteristics of the theoretical equivalent resistance are also investigated. Based on this theoretical work, an effective and controllable design approach is proposed. To validate the approach, a wideband OTMMA is designed, fabricated, analyzed and tested. The results reveal that high absorption of more than 90% can be achieved in the whole 6~18 GHz band. The fabricated OTMMA also has an optical transparency up to 78% at 600 nm and is much thinner and lighter than its counterparts.

  18. Solar Sail Models and Test Measurements Correspondence for Validation Requirements Definition

    NASA Technical Reports Server (NTRS)

    Ewing, Anthony; Adams, Charles

    2004-01-01

    Solar sails are being developed as a mission-enabling technology in support of future NASA science missions. Current efforts have advanced solar sail technology sufficient to justify a flight validation program. A primary objective of this activity is to test and validate solar sail models that are currently under development so that they may be used with confidence in future science mission development (e.g., scalable to larger sails). Both system and model validation requirements must be defined early in the program to guide design cycles and to ensure that relevant and sufficient test data will be obtained to conduct model validation to the level required. A process of model identification, model input/output documentation, model sensitivity analyses, and test measurement correspondence is required so that decisions can be made to satisfy validation requirements within program constraints.

  19. Validity of empirical models of exposure in asphalt paving

    PubMed Central

    Burstyn, I; Boffetta, P; Burr, G; Cenni, A; Knecht, U; Sciarra, G; Kromhout, H

    2002-01-01

    Aims: To investigate the validity of empirical models of exposure to bitumen fume and benzo(a)pyrene, developed for a historical cohort study of asphalt paving in Western Europe. Methods: Validity was evaluated using data from the USA, Italy, and Germany not used to develop the original models. Correlation between observed and predicted exposures was examined. Bias and precision were estimated. Results: Models were imprecise. Furthermore, predicted bitumen fume exposures tended to be lower (-70%) than concentrations found during paving in the USA. This apparent bias might be attributed to differences between Western European and USA paving practices. Evaluation of the validity of the benzo(a)pyrene exposure model revealed an effect of re-paving similar to that expected and a larger than expected effect of tar use. Overall, the benzo(a)pyrene models underestimated exposures by 51%. Conclusions: Possible bias as a result of underestimation of the impact of coal tar on benzo(a)pyrene exposure levels must be explored in sensitivity analysis of the exposure–response relation. Validation of the models, albeit limited, increased our confidence in their applicability to exposure assessment in the historical cohort study of cancer risk among asphalt workers. PMID:12205236

  20. Decision support models for solid waste management: Review and game-theoretic approaches

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karmperis, Athanasios C., E-mail: athkarmp@mail.ntua.gr; Army Corps of Engineers, Hellenic Army General Staff, Ministry of Defence; Aravossis, Konstantinos

    Highlights: ► The mainly used decision support frameworks for solid waste management are reviewed. ► The LCA, CBA and MCDM models are presented and their strengths, weaknesses, similarities and possible combinations are analyzed. ► The game-theoretic approach in a solid waste management context is presented. ► The waste management bargaining game is introduced as a specific decision support framework. ► Cooperative and non-cooperative game-theoretic approaches to decision support for solid waste management are discussed. - Abstract: This paper surveys decision support models that are commonly used in the solid waste management area. Most models are mainly developed within three decision support frameworks, which are the life-cycle assessment, the cost–benefit analysis and the multi-criteria decision-making. These frameworks are reviewed and their strengths and weaknesses as well as their critical issues are analyzed, while their possible combinations and extensions are also discussed. Furthermore, the paper presents how cooperative and non-cooperative game-theoretic approaches can be used for the purpose of modeling and analyzing decision-making in situations with multiple stakeholders. Specifically, since a waste management model is sustainable when considering not only environmental and economic but also social aspects, the waste management bargaining game is introduced as a specific decision support framework in which future models can be developed.

  1. Toward a Theoretical Model of Employee Turnover: A Human Resource Development Perspective

    ERIC Educational Resources Information Center

    Peterson, Shari L.

    2004-01-01

    This article sets forth the Organizational Model of Employee Persistence, influenced by traditional turnover models and a student attrition model. The model was developed to clarify the impact of organizational practices on employee turnover from a human resource development (HRD) perspective and provide a theoretical foundation for research on…

  2. Validation techniques of agent based modelling for geospatial simulations

    NASA Astrophysics Data System (ADS)

    Darvishi, M.; Ahmadi, G.

    2014-10-01

    One of the most interesting aspects of modelling and simulation is describing real-world phenomena that have specific properties, especially those that occur at large scales and have dynamic and complex behaviours. Studying these phenomena in the laboratory is costly and in most cases impossible. Therefore, miniaturizing world phenomena within the framework of a model in order to simulate the real phenomena is a reasonable and scientific approach to understanding the world. Agent-based modelling and simulation (ABMS) is a modelling method comprising multiple interacting agents. It has been used in different areas, for instance geographic information systems (GIS), biology, economics, social science and computer science. The emergence of ABM toolkits in GIS software libraries (e.g. ESRI's ArcGIS, OpenMap, GeoTools, etc.) for geospatial modelling is an indication of users' growing interest in the special capabilities of ABMS. Since ABMS is inherently similar to human cognition, it can be built easily and is applicable to a wider range of applications than traditional simulation. A key challenge for ABMS, however, is the difficulty of validation and verification. Because of frequently emerging patterns, strong dynamics in the system and the complex nature of ABMS, it is hard to validate and verify ABMS by conventional validation methods. Attempts to find appropriate validation techniques for ABM therefore seem necessary. In this paper, after reviewing the principles and concepts of ABM and its applications, the validation techniques and challenges of ABM validation are discussed.

  3. A Formal Approach to Empirical Dynamic Model Optimization and Validation

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G; Morelli, Eugene A.; Kenny, Sean P.; Giesy, Daniel P.

    2014-01-01

    A framework was developed for the optimization and validation of empirical dynamic models subject to an arbitrary set of validation criteria. The validation requirements imposed upon the model, which may involve several sets of input-output data and arbitrary specifications in time and frequency domains, are used to determine if model predictions are within admissible error limits. The parameters of the empirical model are estimated by finding the parameter realization for which the smallest of the margins of requirement compliance is as large as possible. The uncertainty in the value of this estimate is characterized by studying the set of model parameters yielding predictions that comply with all the requirements. Strategies are presented for bounding this set, studying its dependence on admissible prediction error set by the analyst, and evaluating the sensitivity of the model predictions to parameter variations. This information is instrumental in characterizing uncertainty models used for evaluating the dynamic model at operating conditions differing from those used for its identification and validation. A practical example based on the short period dynamics of the F-16 is used for illustration.
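
    The estimation criterion described above, maximizing the smallest margin of requirement compliance, reduces to a max-min optimization. The sketch below is a loose stand-in under toy assumptions (an exponential-decay model, two hand-written requirements, and a generic Nelder-Mead solver), not the paper's F-16 application.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Max-min parameter estimation: choose parameters so the smallest
    # margin of requirement compliance is as large as possible. Model,
    # requirements, and data below are toy stand-ins.
    t = np.linspace(0.0, 5.0, 100)
    data = 2.0 * np.exp(-0.7 * t) \
           + 0.05 * np.random.default_rng(2).normal(size=t.size)

    def margins(theta):
        """Margin of each requirement; positive means the requirement holds."""
        a, k = theta
        pred = a * np.exp(-k * t)
        req_time = 0.2 - np.max(np.abs(pred - data))   # time-domain error bound
        req_gain = 0.5 - abs(pred[0] - data[0])        # initial-value bound
        return np.array([req_time, req_gain])

    # Maximize the smallest margin (minimize its negative).
    res = minimize(lambda th: -margins(th).min(), x0=[1.0, 1.0],
                   method="Nelder-Mead")
    print("estimate:", res.x, "worst-case margin:", margins(res.x).min())
    ```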

  4. An Approach to Comprehensive and Sustainable Solar Wind Model Validation

    NASA Astrophysics Data System (ADS)

    Rastaetter, L.; MacNeice, P. J.; Mays, M. L.; Boblitt, J. M.; Wiegand, C.

    2017-12-01

    The number of models of the corona and inner heliosphere and of their updates and upgrades grows steadily, as does the number and character of the model inputs. Maintaining up to date validation of these models, in the face of this constant model evolution, is a necessary but very labor intensive activity. In the last year alone, both NASA's LWS program and the CCMC's ongoing support of model forecasting activities at NOAA SWPC have sought model validation reports on the quality of all aspects of the community's coronal and heliospheric models, including both ambient and CME related wind solutions at L1. In this presentation I will give a brief review of the community's previous model validation results of L1 wind representation. I will discuss the semi-automated web based system we are constructing at the CCMC to present comparative visualizations of all interesting aspects of the solutions from competing models. This system is designed to be easily queried to provide the essential comprehensive inputs to repeat and update previous validation studies and support extensions to them. I will illustrate this by demonstrating how the system is being used to support the CCMC/LWS Model Assessment Forum teams focused on the ambient and time dependent corona and solar wind, including CME arrival time and IMF Bz. I will also discuss plans to extend the system to include results from the Forum teams addressing SEP model validation.

  5. Parameterization of Model Validating Sets for Uncertainty Bound Optimizations. Revised

    NASA Technical Reports Server (NTRS)

    Lim, K. B.; Giesy, D. P.

    2000-01-01

    Given measurement data, a nominal model and a linear fractional transformation uncertainty structure with an allowance on unknown but bounded exogenous disturbances, easily computable tests for the existence of a model validating uncertainty set are given. Under mild conditions, these tests are necessary and sufficient for the case of complex, nonrepeated, block-diagonal structure. For the more general case which includes repeated and/or real scalar uncertainties, the tests are only necessary but become sufficient if a collinearity condition is also satisfied. With the satisfaction of these tests, it is shown that a parameterization of all model validating sets of plant models is possible. The new parameterization is used as a basis for a systematic way to construct or perform uncertainty tradeoff with model validating uncertainty sets which have specific linear fractional transformation structure for use in robust control design and analysis. An illustrative example which includes a comparison of candidate model validating sets is given.

  6. Validation of the Poisson Stochastic Radiative Transfer Model

    NASA Technical Reports Server (NTRS)

    Zhuravleva, Tatiana; Marshak, Alexander

    2004-01-01

    A new approach to validation of the Poisson stochastic radiative transfer method is proposed. In contrast to other validations of stochastic models, the main parameter of the Poisson model responsible for cloud geometrical structure - the cloud aspect ratio - is determined entirely by matching measurements and calculations of the direct solar radiation. If measurements of the direct solar radiation are unavailable, it was shown that there is a range of aspect ratios that allows the stochastic model to accurately approximate the average measurements of surface downward and cloud-top upward fluxes. Realizations of the fractionally integrated cascade model are taken as a prototype of real measurements.

  7. Nursing management of sensory overload in psychiatry – Theoretical densification and modification of the framework model

    PubMed

    Scheydt, Stefan; Needham, Ian; Behrens, Johann

    2017-01-01

    Background: Within the scope of a research project on sensory overload and stimulus regulation, a theoretical framework model of the nursing care of patients with sensory overload in psychiatry was developed. In a second step, this theoretical model was to be theoretically densified and, if necessary, modified. Aim: Empirical verification as well as modification, enhancement and theoretical densification of the framework model of nursing care of patients with sensory overload in psychiatry. Method: Analysis of 8 expert interviews using summarizing and structuring content analysis methods based on Meuser and Nagel (2009) and Mayring (2010). Results: The developed framework model (Scheydt et al., 2016b) could be empirically verified, theoretically densified and extended by one category (perception modulation). Thus, four categories of nursing care of patients with sensory overload can be described in inpatient psychiatry: removal from stimuli, modulation of environmental factors, perceptual modulation, and coping support / helping patients to help themselves. Conclusions: Based on the methodological approach, a relatively well-saturated, credible conceptualization of a theoretical model describing the nursing care of patients with sensory overload in inpatient psychiatry could be worked out. In further steps, these measures have to be further developed, implemented and evaluated with regard to their efficacy.

  8. AdViSHE: A Validation-Assessment Tool of Health-Economic Models for Decision Makers and Model Users.

    PubMed

    Vemer, P; Corro Ramos, I; van Voorn, G A K; Al, M J; Feenstra, T L

    2016-04-01

    A trade-off exists between building confidence in health-economic (HE) decision models and the use of scarce resources. To address this trade-off, we aimed to create a practical tool providing model users with a structured view of the validation status of HE decision models. A Delphi panel was organized, complemented by a workshop during an international conference. The proposed tool was constructed iteratively based on comments from, and the discussion amongst, panellists. During the Delphi process, comments were solicited on the importance and feasibility of possible validation techniques for modellers, their relevance for decision makers, and the overall structure and formulation of the tool. The panel consisted of 47 experts in HE modelling and HE decision making from various professional and international backgrounds. In addition, 50 discussants actively engaged in the discussion at the conference workshop and returned 19 questionnaires with additional comments. The final version consists of 13 items covering all relevant aspects of HE decision models: the conceptual model, the input data, the implemented software program, and the model outcomes. Assessment of the Validation Status of Health-Economic decision models (AdViSHE) is a validation-assessment tool in which model developers report in a systematic way both on the validation efforts performed and on their outcomes. Subsequently, model users can establish whether confidence in the model is justified or whether additional validation efforts should be undertaken. In this way, AdViSHE enhances transparency of the validation status of HE models and supports efficient model validation.

  9. Geographic and temporal validity of prediction models: Different approaches were useful to examine model performance

    PubMed Central

    Austin, Peter C.; van Klaveren, David; Vergouwe, Yvonne; Nieboer, Daan; Lee, Douglas S.; Steyerberg, Ewout W.

    2017-01-01

    Objective Validation of clinical prediction models traditionally refers to the assessment of model performance in new patients. We studied different approaches to geographic and temporal validation in the setting of multicenter data from two time periods. Study Design and Setting We illustrated different analytic methods for validation using a sample of 14,857 patients hospitalized with heart failure at 90 hospitals in two distinct time periods. Bootstrap resampling was used to assess internal validity. Meta-analytic methods were used to assess geographic transportability. Each hospital was used once as a validation sample, with the remaining hospitals used for model derivation. Hospital-specific estimates of discrimination (c-statistic) and calibration (calibration intercepts and slopes) were pooled using random effects meta-analysis methods. I2 statistics and prediction interval width quantified geographic transportability. Temporal transportability was assessed using patients from the earlier period for model derivation and patients from the later period for model validation. Results Estimates of reproducibility, pooled hospital-specific performance, and temporal transportability were on average very similar, with c-statistics of 0.75. Between-hospital variation was moderate according to I2 statistics and prediction intervals for c-statistics. Conclusion This study illustrates how performance of prediction models can be assessed in settings with multicenter data at different time periods. PMID:27262237

  10. Anticipatory Cognitive Systems: a Theoretical Model

    NASA Astrophysics Data System (ADS)

    Terenzi, Graziano

    This paper deals with the problem of understanding anticipation in biological and cognitive systems. It is argued that a physical theory can be considered as biologically plausible only if it incorporates the ability to describe systems which exhibit anticipatory behaviors. The paper introduces a cognitive-level description of anticipation and provides a simple theoretical characterization of anticipatory systems on this level. Specifically, a simple model of a formal anticipatory neuron and a model (i.e. the τ-mirror architecture) of an anticipatory neural network which is based on the former are introduced and discussed. The basic feature of this architecture is that a part of the network learns to represent the behavior of the other part over time, thus constructing an implicit model of its own functioning. As a consequence, the network is capable of self-representation; anticipation, on a macroscopic level, is nothing but a consequence of anticipation on a microscopic level. Some learning algorithms are also discussed together with related experimental tasks and possible integrations. The outcome of the paper is a formal characterization of anticipation in cognitive systems which aims at being incorporated in a comprehensive and more general physical theory.

  11. Experimental and theoretical investigations on the validity of the geometrical optics model for calculating the stability of optical traps.

    PubMed

    Schut, T C; Hesselink, G; de Grooth, B G; Greve, J

    1991-01-01

    We have developed a computer program based on the geometrical optics approach proposed by Roosen to calculate the forces on dielectric spheres in focused laser beams. We have explicitly taken into account the polarization of the laser light and the divergence of the laser beam. The model can be used to evaluate the stability of optical traps in a variety of different optical configurations. Our calculations explain the experimental observation by Ashkin that a stable single-beam optical trap, without the help of the gravitation force, can be obtained with a strongly divergent laser beam. Our calculations also predict a different trap stability in the directions orthogonal and parallel to the polarization direction of the incident light. Different experimental methods were used to test the predictions of the model for the gravity trap. A new method for measuring the radiation force along the beam axis in both the stable and unstable regions is presented. Measurements of the radiation force on polystyrene spheres with diameters of 7.5 and 32 microns in a TEM00-mode laser beam showed a good qualitative correlation with the predictions and a slight quantitative difference. The validity of the geometrical approximations involved in the model will be discussed for spheres of different sizes and refractive indices.

  12. Developing a theoretical model and questionnaire survey instrument to measure the success of electronic health records in residential aged care.

    PubMed

    Yu, Ping; Qian, Siyu

    2018-01-01

    Electronic health records (EHR) are introduced into healthcare organizations worldwide to improve patient safety, healthcare quality and efficiency. A rigorous evaluation of this technology is important to reduce potential negative effects on patients and staff, to provide decision makers with accurate information for system improvement and to ensure return on investment. Therefore, this study develops a theoretical model and questionnaire survey instrument to assess the success of organizational EHR in routine use from the viewpoint of nursing staff in residential aged care homes. The proposed research model incorporates six variables of the reformulated DeLone and McLean information systems success model: system quality, information quality, service quality, use, user satisfaction and net benefits. Two additional variables, training and self-efficacy, were also incorporated into the model. A questionnaire survey instrument was designed to measure the eight variables in the model. After a pilot test, the measurement scale was used to collect data from 243 nursing staff members in 10 residential aged care homes belonging to three management groups in Australia. Partial least squares path modeling was conducted to validate the model. The validated EHR systems success model predicts the impact of the four antecedent variables (training, self-efficacy, system quality and information quality) on the net benefits, the indicator of EHR systems success, through the mediating variables use and user satisfaction. A 24-item measurement scale was developed to quantitatively evaluate the performance of an EHR system. The parsimonious EHR systems success model and the measurement scale can be used to benchmark EHR systems success across organizations and units and over time.

  13. The neural mediators of kindness-based meditation: a theoretical model

    PubMed Central

    Mascaro, Jennifer S.; Darcher, Alana; Negi, Lobsang T.; Raison, Charles L.

    2015-01-01

    Although kindness-based contemplative practices are increasingly employed by clinicians and cognitive researchers to enhance prosocial emotions, social cognitive skills, and well-being, and as a tool to understand the basic workings of the social mind, we lack a coherent theoretical model with which to test the mechanisms by which kindness-based meditation may alter the brain and body. Here, we link contemplative accounts of compassion and loving-kindness practices with research from social cognitive neuroscience and social psychology to generate predictions about how diverse practices may alter brain structure and function and related aspects of social cognition. Contingent on the nuances of the practice, kindness-based meditation may enhance the neural systems related to faster and more basic perceptual or motor simulation processes, simulation of another’s affective body state, slower and higher-level perspective-taking, modulatory processes such as emotion regulation and self/other discrimination, and combinations thereof. This theoretical model will be discussed alongside best practices for testing such a model and potential implications and applications of future work. PMID:25729374

  14. Virtual Model Validation of Complex Multiscale Systems: Applications to Nonlinear Elastostatics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oden, John Tinsley; Prudencio, Ernest E.; Bauman, Paul T.

    We propose a virtual statistical validation process as an aid to the design of experiments for the validation of phenomenological models of the behavior of material bodies, with focus on those cases in which knowledge of the fabrication process used to manufacture the body can provide information on the micro-molecular-scale properties underlying macroscale behavior. One example is given by models of elastomeric solids fabricated using polymerization processes. We describe a framework for model validation that involves Bayesian updates of parameters in statistical calibration and validation phases. The process enables the quantification of uncertainty in quantities of interest (QoIs) and the determination of model consistency using tools of statistical information theory. We assert that microscale information drawn from molecular models of the fabrication of the body provides a valuable source of prior information on parameters as well as a means for estimating model bias and designing virtual validation experiments to provide information gain over calibration posteriors.

  15. Theoretical Evaluation Self-Test (Test): A Preliminary Validation Study

    ERIC Educational Resources Information Center

    Coleman, Daniel

    2004-01-01

    Over nearly 40 years, several scales have been developed to measure therapist theoretical orientation (Poznanski & McLennan, 1995). This study, unlike previous efforts, focuses on "community clinicians"--social workers and other mental health professionals (such as psychologists, counselors, psychiatrists, and psychiatric nurses) who…

  16. Validation of urban freeway models. [supporting datasets

    DOT National Transportation Integrated Search

    2015-01-01

    The goal of the SHRP 2 Project L33 Validation of Urban Freeway Models was to assess and enhance the predictive travel time reliability models developed in the SHRP 2 Project L03, Analytic Procedures for Determining the Impacts of Reliability Mitigati...

  17. Imitative Modeling as a Theoretical Base for Instructing Language-Disordered Children

    ERIC Educational Resources Information Center

    Courtright, John A.; Courtright, Illene C.

    1976-01-01

    A modification of A. Bandura's social learning theory (imitative modeling) was employed as a theoretical base for language instruction with eight language disordered children (5 to 10 years old). (Author/SBH)

  18. [Elaboration and validation of a tool to measure psychological well-being: WBMMS].

    PubMed

    Massé, R; Poulin, C; Dassa, C; Lambert, J; Bélair, S; Battaglini, M A

    1998-01-01

    Psychological well-being scales used in epidemiologic surveys usually show high construct validity. The content validation, however, is less convincing since these scales rest on lists of items that reflect the theoretical model of the authors. In this study we present results of the construct and criterion validation of a new Well-Being Manifestations Measure Scale (WBMMS) founded on an initial list of manifestations derived from an original content validation in a general population. It is concluded that national and public health epidemiologic surveys should include both measures of positive and negative mental health.

  19. A Theoretical Model for Thin Film Ferroelectric Coupled Microstripline Phase Shifters

    NASA Technical Reports Server (NTRS)

    Romanofsky, R. R.; Quereshi, A. H.

    2000-01-01

    Novel microwave phase shifters consisting of coupled microstriplines on thin ferroelectric films have been demonstrated recently. A theoretical model useful for predicting the propagation characteristics (insertion phase shift, dielectric loss, impedance, and bandwidth) is presented here. The model is based on a variational solution for line capacitance and coupled strip transmission line theory.

  20. Multiple Versus Single Set Validation of Multivariate Models to Avoid Mistakes.

    PubMed

    Harrington, Peter de Boves

    2018-01-02

    Validation of multivariate models is of current importance for a wide range of chemical applications. Although important, it is often neglected: the common practice is to use a single external validation set for evaluation. This approach is deficient and may mislead investigators with results that are specific to the single validation set of data. In addition, no statistics are available regarding the precision of a derived figure of merit (FOM). A statistical approach using bootstrapped Latin partitions is advocated. This validation method makes efficient use of the data because each object is used once for validation. It was reviewed a decade earlier, but primarily for the optimization of chemometric models; this review presents the reasons it should be used for generalized statistical validation. Average FOMs with confidence intervals are reported, and powerful matched-sample statistics may be applied for comparing models and methods. Examples demonstrate the problems with single validation sets.
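
    Below is a rough sketch of the advocated scheme, using stratified k-fold splits as a stand-in for Latin partitions (both validate each object exactly once while preserving class proportions) and repeating the partitioning to obtain a confidence interval on the figure of merit; the dataset and model are placeholders.

    ```python
    import numpy as np
    from sklearn.datasets import load_iris
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import StratifiedKFold

    # Repeated partition-based validation: every object is validated once
    # per partition set, the random partitioning is repeated, and the FOM
    # is reported with a confidence interval rather than as a single number.
    X, y = load_iris(return_X_y=True)
    n_boot, n_parts, foms = 20, 4, []

    for b in range(n_boot):
        skf = StratifiedKFold(n_splits=n_parts, shuffle=True, random_state=b)
        y_pred = np.empty_like(y)
        for train, test in skf.split(X, y):      # each object validated once
            model = LogisticRegression(max_iter=1000).fit(X[train], y[train])
            y_pred[test] = model.predict(X[test])
        foms.append(accuracy_score(y, y_pred))

    mean, sd = np.mean(foms), np.std(foms, ddof=1)
    lo, hi = np.percentile(foms, [2.5, 97.5])
    print(f"accuracy {mean:.3f} +/- {sd:.3f}, 95% interval [{lo:.3f}, {hi:.3f}]")
    ```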

  1. Graph theoretical model of a sensorimotor connectome in zebrafish.

    PubMed

    Stobb, Michael; Peterson, Joshua M; Mazzag, Borbala; Gahtan, Ethan

    2012-01-01

    Mapping the detailed connectivity patterns (connectomes) of neural circuits is a central goal of neuroscience. The best quantitative approach to analyzing connectome data is still unclear but graph theory has been used with success. We present a graph theoretical model of the posterior lateral line sensorimotor pathway in zebrafish. The model includes 2,616 neurons and 167,114 synaptic connections. Model neurons represent known cell types in zebrafish larvae, and connections were set stochastically following rules based on biological literature. Thus, our model is a uniquely detailed computational representation of a vertebrate connectome. The connectome has low overall connection density, with 2.45% of all possible connections, a value within the physiological range. We used graph theoretical tools to compare the zebrafish connectome graph to small-world, random and structured random graphs of the same size. For each type of graph, 100 randomly generated instantiations were considered. Degree distribution (the number of connections per neuron) varied more in the zebrafish graph than in same size graphs with less biological detail. There was high local clustering and a short average path length between nodes, implying a small-world structure similar to other neural connectomes and complex networks. The graph was found not to be scale-free, in agreement with some other neural connectomes. An experimental lesion was performed that targeted three model brain neurons, including the Mauthner neuron, known to control fast escape turns. The lesion decreased the number of short paths between sensory and motor neurons analogous to the behavioral effects of the same lesion in zebrafish. This model is expandable and can be used to organize and interpret a growing database of information on the zebrafish connectome.
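
    The graph-theoretic comparisons described above are straightforward to reproduce in miniature. The sketch below builds a scaled-down random directed graph at the quoted ~2.45% connection density, computes density, clustering, and average path length, and mimics a lesion by deleting high-degree nodes; the sizes and the hub-removal rule are illustrative, not the paper's 2,616-neuron model.

    ```python
    import networkx as nx

    # Scaled-down stand-in for a connectome graph: 300 nodes at ~2.45%
    # directed connection density (values are not from the paper).
    n, density = 300, 0.0245
    G = nx.gnp_random_graph(n, density, seed=3, directed=True)

    print(f"density: {nx.density(G):.4f}")
    print(f"mean clustering: {nx.average_clustering(G.to_undirected()):.3f}")
    giant = max(nx.strongly_connected_components(G), key=len)
    print(f"avg path length (giant component): "
          f"{nx.average_shortest_path_length(G.subgraph(giant)):.2f}")

    # Toy 'lesion': remove the three highest out-degree nodes and check
    # how the shortest path between two fixed nodes lengthens.
    hubs = [u for u, _ in sorted(G.out_degree, key=lambda kv: -kv[1])[:3]]
    candidates = [v for v in giant if v not in hubs]
    s, t = candidates[0], candidates[1]
    G_lesioned = G.copy()
    G_lesioned.remove_nodes_from(hubs)

    def dist(g, a, b):
        try:
            return nx.shortest_path_length(g, a, b)
        except nx.NetworkXNoPath:
            return float("inf")

    print(f"path {s}->{t}: before={dist(G, s, t)}, "
          f"after lesion={dist(G_lesioned, s, t)}")
    ```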

  2. Outward Bound Outcome Model Validation and Multilevel Modeling

    ERIC Educational Resources Information Center

    Luo, Yuan-Chun

    2011-01-01

    This study was intended to measure construct validity for the Outward Bound Outcomes Instrument (OBOI) and to predict outcome achievement from individual characteristics and course attributes using multilevel modeling. A sample of 2,340 participants was collected by Outward Bound USA between May and September 2009 using the OBOI. Two phases of…

  3. Validation of Community Models: Identifying Events in Space Weather Model Timelines

    NASA Technical Reports Server (NTRS)

    MacNeice, Peter

    2009-01-01

    I develop and document a set of procedures which test the quality of predictions of solar wind speed and polarity of the interplanetary magnetic field (IMF) made by coupled models of the ambient solar corona and heliosphere. The Wang-Sheeley-Arge (WSA) model is used to illustrate the application of these validation procedures. I present an algorithm which detects transitions of the solar wind from slow to high speed. I also present an algorithm which processes the measured polarity of the outward directed component of the IMF. This removes high-frequency variations to expose the longer-scale changes that reflect IMF sector changes. I apply these algorithms to WSA model predictions made using a small set of photospheric synoptic magnetograms obtained by the Global Oscillation Network Group as input to the model. The results of this preliminary validation of the WSA model (version 1.6) are summarized.
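
    An event-detection pass of this kind can be reduced to a few lines: smooth the speed timeline, then flag hours where the wind was slow a fixed interval earlier and is fast now. The thresholds, window, and synthetic timeline below are illustrative assumptions, not the documented WSA procedure.

    ```python
    import numpy as np

    # Flag slow-to-fast solar wind transitions in a synthetic timeline.
    hours = np.arange(0, 27 * 24)                  # one Carrington rotation
    rng = np.random.default_rng(4)
    speed = (350 + 250 * (np.sin(4 * np.pi * hours / hours.size) > 0.6)
             + rng.normal(0, 20, hours.size))      # two high-speed streams

    SLOW, FAST, WINDOW_H = 400.0, 500.0, 24        # km/s thresholds, hours
    smoothed = np.convolve(speed, np.ones(6) / 6, mode="same")

    events = []
    for i in range(WINDOW_H, smoothed.size):
        if smoothed[i - WINDOW_H] < SLOW and smoothed[i] > FAST:
            if not events or i - events[-1] > WINDOW_H:   # de-duplicate
                events.append(i)
    print("high-speed-stream onsets at hours:", events)
    ```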

  4. Speech chronemics--a hidden dimension of speech. Theoretical background, measurement and clinical validity.

    PubMed

    Krüger, H P

    1989-02-01

    The term "speech chronemics" is introduced to characterize a research strategy which extracts from the physical qualities of the speech signal only the pattern of ons ("speaking") and offs ("pausing"). The research in this field can be structured into the methodological dimension "unit of time", "number of speakers", and "quality of the prosodic measures". It is shown that a researcher's actual decision for one method largely determines the outcome of his study. Then, with the Logoport a new portable measurement device is presented. It enables the researcher to study speaking behavior over long periods of time (up to 24 hours) in the normal environment of his subjects. Two experiments are reported. The first shows the validity of articulation pauses for variations in the physiological state of the organism. The second study proves a new betablocking agent to have sociotropic effects: in a long-term trial socially high-strung subjects showed an improved interaction behavior (compared to placebo and socially easy-going persons) in their everyday life. Finally, the need for a comprehensive theoretical foundation and for standardization of measurement situations and methods is emphasized.

  5. TEST OF A THEORETICAL COMMUTER EXPOSURE MODEL TO VEHICLE EXHAUST IN TRAFFIC

    EPA Science Inventory

    A theoretical model of commuter exposure is presented as a box or cell model with the automobile passenger compartment representing the microenvironment exposed to CO concentrations resulting from vehicle exhaust leaks and emissions from traffic. Equations which describe this sit...
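
    The box/cell formulation suggests the standard well-mixed balance dC/dt = (Q/V)(C_road - C) + S/V for cabin CO. Below is a minimal forward-Euler sketch with illustrative parameter values (not taken from the EPA model):

    ```python
    import numpy as np

    # Single-cell (box) model of a car passenger compartment: cabin CO
    # evolves by air exchange with on-road air plus any in-cabin leak
    # source. All parameter values are illustrative.
    V = 3.0          # cabin volume (m^3)
    Q = 0.02         # ventilation flow (m^3/s)
    S = 0.0          # in-cabin CO source from exhaust leaks (mg/s)
    C_road = 15.0    # on-road CO concentration (mg/m^3)

    dt, T = 1.0, 1800.0                 # 1-s steps over a 30-min commute
    t = np.arange(0.0, T, dt)
    C = np.zeros_like(t)                # cabin CO, starts clean

    for k in range(1, t.size):
        # dC/dt = (Q/V) * (C_road - C) + S/V   (well-mixed box)
        dCdt = (Q / V) * (C_road - C[k - 1]) + S / V
        C[k] = C[k - 1] + dCdt * dt

    print(f"cabin CO after 30 min: {C[-1]:.1f} mg/m^3 "
          f"(time constant V/Q = {V / Q / 60:.1f} min)")
    ```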

  6. Distance Education in Taiwan: A Model Validated.

    ERIC Educational Resources Information Center

    Shih, Mei-Yau; Zvacek, Susan M.

    The Triad Perspective Model of Distance Education (TPMDE) guides researchers in developing research questions, gathering data, and producing a comprehensive description of a distance education program. It was developed around three theoretical perspectives: (1) curriculum development theory (Tyler's four questions, 1949); (2) systems theory…

  7. An alternative theoretical model for an anomalous hollow beam.

    PubMed

    Cai, Yangjian; Wang, Zhaoying; Lin, Qiang

    2008-09-15

    An alternative and convenient theoretical model is proposed to describe a flexible anomalous hollow beam of elliptical symmetry with an elliptical solid core, which was observed in experiment recently (Phys. Rev. Lett. 94 (2005) 134802). In this model, the electric field of the anomalous hollow beam is expressed as a finite sum of elliptical Gaussian modes. Flattopped beams, dark hollow beams and Gaussian beams are special cases of our model. Analytical propagation formulae for coherent and partially coherent anomalous hollow beams passing through astigmatic ABCD optical systems are derived. Some numerical examples are calculated to show the propagation and focusing properties of coherent and partially coherent anomalous hollow beams.

  8. Validating Computational Human Behavior Models: Consistency and Accuracy Issues

    DTIC Science & Technology

    2004-06-01

    includes a discussion of SME demographics, content, and organization of the datasets. This research generalizes data from two pilot studies and two base...meet requirements for validating the varied and complex behavioral models. Through a series of empirical studies, this research identifies subject...

  9. Adolescent Personality: A Five-Factor Model Construct Validation

    ERIC Educational Resources Information Center

    Baker, Spencer T.; Victor, James B.; Chambers, Anthony L.; Halverson, Jr., Charles F.

    2004-01-01

    The purpose of this study was to investigate convergent and discriminant validity of the five-factor model of adolescent personality in a school setting using three different raters (methods): self-ratings, peer ratings, and teacher ratings. The authors investigated validity through a multitrait-multimethod matrix and a confirmatory factor…

  10. Theoretical modelling on thermal expansion of Al, Ag and Cu nanomaterials

    NASA Astrophysics Data System (ADS)

    Manu, Mehul; Dubey, Vikash

    2018-05-01

    A simple theoretical model is developed for calculating the coefficient of volume thermal expansion (CTE) and the volume thermal expansion (VTE) of Al, Ag and Cu nanomaterials, treating the cluster as a cubo-octahedral structure and varying the temperature and the cluster size. At room temperature, the CTE decreases sharply with cluster size below 20-25 nm, and the decrease becomes slower above 20-25 nm. The VTE likewise varies with temperature and cluster size. At a fixed cluster size, the VTE increases with temperature below the melting temperature and varies linearly with temperature. At a constant temperature, the VTE decreases rapidly with increasing cluster size below 20-25 nm, and above 20-25 nm the decrease becomes slower with increasing cluster size. Thermal expansion arises from the anharmonicity of the interatomic interaction: as the temperature rises, the amplitude of crystal lattice vibration increases and the equilibrium distance shifts outward, because the repulsion at short range is stronger than the corresponding attraction at larger separations, so atoms spend more time at distances greater than the original spacing. With the cubo-octahedral cluster structure, the model predictions for the CTE and the VTE are in good agreement with the available experimental data, which supports the validity of the model.

  11. A theoretical model for smoking prevention studies in preteen children.

    PubMed

    McGahee, T W; Kemp, V; Tingen, M

    2000-01-01

    The age of smoking onset continues to decline, with the prime age of tobacco use initiation being 12-14 years. A weakness of the limited research conducted on smoking prevention programs designed for preteen children (ages 10-12) is the lack of a well-defined theoretical basis. A theoretical perspective is needed in order to make a meaningful transition from empirical analysis to application of knowledge. Bandura's Social Cognitive Theory (1977, 1986), the Theory of Reasoned Action (Ajzen & Fishbein, 1980), and other literature linking various concepts to smoking behaviors in preteens were used to develop a model that may be useful for smoking prevention studies in preteen children.

  12. Simple control-theoretic models of human steering activity in visually guided vehicle control

    NASA Technical Reports Server (NTRS)

    Hess, Ronald A.

    1991-01-01

    A simple control theoretic model of human steering or control activity in the lateral-directional control of vehicles such as automobiles and rotorcraft is discussed. The term 'control theoretic' is used to emphasize the fact that the model is derived from a consideration of well-known control system design principles as opposed to psychological theories regarding egomotion, etc. The model is employed to emphasize the 'closed-loop' nature of tasks involving the visually guided control of vehicles upon, or in close proximity to, the earth and to hypothesize how changes in vehicle dynamics can significantly alter the nature of the visual cues which a human might use in such tasks.
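
    To make the closed-loop point concrete, here is a deliberately crude simulation in the same spirit (not Hess's model): the vehicle's lateral dynamics are idealized as a double integrator, and the human as a proportional-plus-rate compensator with a fixed reaction delay. All gains and parameters are invented.

```python
# Toy closed-loop lateral steering simulation (illustrative only): the vehicle
# is a double integrator (steering command -> lateral acceleration) and the
# human is a proportional-plus-rate compensator with a reaction delay.
dt, T = 0.01, 20.0                            # time step and duration, s
Kp, Kd, delay_s, k_veh = 0.4, 0.6, 0.2, 2.0   # invented gains and parameters
lag = int(delay_s / dt)

y, v = 0.0, 0.0                    # lateral position (m) and velocity (m/s)
y_ref = 1.0                        # commanded lane offset, m
queue = [0.0] * lag                # models the human's reaction delay
for _ in range(int(T / dt)):
    u = Kp * (y_ref - y) - Kd * v  # compensatory "driver" output
    queue.append(u)
    a = k_veh * queue.pop(0)       # delayed command drives the vehicle
    v += a * dt
    y += v * dt
print(round(y, 3))                 # settles near the 1 m command
```

    Changing k_veh (the vehicle dynamics) forces different Kp/Kd choices to keep the loop stable, which is exactly the kind of cue-adaptation argument the abstract hypothesizes.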

  13. A Model for Estimating the Reliability and Validity of Criterion-Referenced Measures.

    ERIC Educational Resources Information Center

    Edmonston, Leon P.; Randall, Robert S.

    A decision model designed to determine the reliability and validity of criterion-referenced measures (CRMs) is presented. General procedures which pertain to the model are discussed with respect to measures of relationship, reliability, validity (content, criterion-oriented, and construct validation), and item analysis. The decision model is presented in…

  14. A utility-theoretic model for QALYs and willingness to pay.

    PubMed

    Klose, Thomas

    2003-01-01

    Despite the widespread use of quality-adjusted life years (QALY) in economic evaluation studies, their utility-theoretic foundation remains unclear. A model for preferences over health, money, and time is presented in this paper. Under the usual assumptions of the original QALY model, an additively separable representation of the utilities in different periods exists. In contrast to the usual assumption that QALY weights depend solely on aspects of health-related quality of life, wealth-standardized QALY weights might vary with the wealth level in the presented extension of the original QALY model, resulting in an inconsistent measurement of QALYs. Further assumptions are presented to make the measurement of QALYs consistent with lifetime preferences over health and money. Even under these strict assumptions, QALYs and WTP (which can also be defined in this utility-theoretic model) are not equivalent preference-based measures of the effects of health technologies on an individual level. The results suggest that the individual WTP per QALY can depend on the magnitude of the QALY gain as well as on the disease burden when health influences the marginal utility of wealth. Further research on this structural aspect of preferences over health and wealth, and on quantifying its impact, seems indicated. Copyright 2002 John Wiley & Sons, Ltd.
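
    The separability argument can be made concrete in a few lines of notation. The following is a hedged sketch in generic notation, not the paper's formal development: the standard QALY model assumes wealth-independent period utilities, while the extension lets the QALY weight depend on wealth, so that standardizing weights at different wealth levels yields different QALY totals for the same health profile.

```latex
% Standard QALY form: additively separable, wealth-independent weights.
U_{\mathrm{QALY}} = \sum_{t=1}^{T} \delta^{t}\, q(h_t)
% Extension sketched above: period utility over health AND wealth, with
% wealth-standardized weights (h^* denotes full health):
U = \sum_{t=1}^{T} \delta^{t}\, u(h_t, w_t),
\qquad
q(h_t \mid w) = \frac{u(h_t, w)}{u(h^{*}, w)},
% so q(h | w) generally varies with w, making QALY measurement inconsistent
% unless further assumptions restrict the form of u.
```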

  15. Development and application of theoretical models for Rotating Detonation Engine flowfields

    NASA Astrophysics Data System (ADS)

    Fievisohn, Robert

    As turbine and rocket engine technology matures, performance increases between successive generations of engine development are becoming smaller. One means of accomplishing significant gains in thermodynamic performance and power density is to use detonation-based heat release instead of deflagration. This work is focused on developing and applying theoretical models to aid in the design and understanding of Rotating Detonation Engines (RDEs). In an RDE, a detonation wave travels circumferentially along the bottom of an annular chamber where continuous injection of fresh reactants sustains the detonation wave. RDEs are currently being designed, tested, and studied as a viable option for developing a new generation of turbine and rocket engines that make use of detonation heat release. One of the main challenges in the development of RDEs is to understand the complex flowfield inside the annular chamber. While simplified models are desirable for obtaining timely performance estimates for design analysis, one-dimensional models may not be adequate as they do not provide flow structure information. In this work, a two-dimensional physics-based model is developed, which is capable of modeling the curved oblique shock wave, exit swirl, counter-flow, detonation inclination, and varying pressure along the inflow boundary. This is accomplished by using a combination of shock-expansion theory, Chapman-Jouguet detonation theory, the Method of Characteristics (MOC), and other compressible flow equations to create a shock-fitted numerical algorithm and generate an RDE flowfield. This novel approach provides a numerically efficient model that can provide performance estimates as well as details of the large-scale flow structures in seconds on a personal computer. Results from this model are validated against high-fidelity numerical simulations that may require a high-performance computing framework to provide similar performance estimates. This work provides a designer a new
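
    One ingredient named above, Chapman-Jouguet detonation theory, is compact enough to show directly. The snippet uses the classic one-gamma perfect-gas CJ result (a textbook simplification, not the dissertation's shock-fitted algorithm); the heat release, temperature, and gas properties are illustrative.

```python
import math

def cj_speed(q, T1=300.0, gamma=1.29, R=287.0):
    """Chapman-Jouguet detonation speed for a perfect gas with heat release q.

    Classic one-gamma textbook result:
        M_CJ = sqrt(1 + Q) + sqrt(Q), with Q = (gamma^2 - 1) q / (2 gamma R T1)
    q in J/kg, T1 in K, R in J/(kg K); parameter values here are assumptions.
    """
    Q = (gamma**2 - 1.0) * q / (2.0 * gamma * R * T1)
    m_cj = math.sqrt(1.0 + Q) + math.sqrt(Q)
    a1 = math.sqrt(gamma * R * T1)          # upstream sound speed
    return m_cj * a1

# Example: q ~ 3 MJ/kg gives a CJ speed of roughly 2 km/s, the right order
# of magnitude for hydrocarbon-air detonations.
print(cj_speed(3.0e6))
```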

  16. Refinement, Validation and Benchmarking of a Model for E-Government Service Quality

    NASA Astrophysics Data System (ADS)

    Magoutas, Babis; Mentzas, Gregoris

    This paper presents the refinement and validation of a model for Quality of e-Government Services (QeGS). We build on our previous work, in which a conceptual model was identified, and focus here on the confirmatory phase of the model development process in order to arrive at a valid and reliable QeGS model. The validated model, which benchmarked very positively against similar models found in the literature, can be used for measuring QeGS in a reliable and valid manner. This will form the basis for a continuous quality improvement process, unleashing the full potential of e-government services for both citizens and public administrations.

  17. Theoretical modeling of the catch-slip bond transition in biological adhesion

    NASA Astrophysics Data System (ADS)

    Gunnerson, Kim; Pereverzev, Yuriy; Prezhdo, Oleg

    2006-05-01

    The mechanism by which leukocytes leave the blood stream and enter inflamed tissue is called extravasation. This process is facilitated by the ability of selectin proteins, produced by the endothelial cells of blood vessels, to form transient bonds with the leukocytes. In the case of P-selectin, the protein bonds with P-selectin glycoprotein ligand-1 (PSGL-1) produced by the leukocyte. Recent atomic force microscopy and flow chamber analyses of the binding of P-selectin to PSGL-1 provide evidence for an unusual biphasic catch-bond/slip-bond behavior in response to the strength of the exerted force. This biphasic process is not well understood. Several theoretical models describe this phenomenon, using different profiles for the potential energy landscape and for how it changes under force. We are exploring these changes using molecular dynamics. We will present a simple theoretical model and share some of our early MD results.
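
    A common minimal description of the biphasic behavior, associated with two of the authors above, is a two-pathway picture in which force suppresses dissociation through a catch pathway and accelerates it through a slip pathway. The sketch below uses that standard form with invented parameters, not fitted P-selectin/PSGL-1 values.

```python
import numpy as np

kB_T = 4.1e-21   # thermal energy at ~300 K, J (~4.1 pN nm)

def bond_lifetime(f, k_c=20.0, x_c=-2.0e-9, k_s=0.3, x_s=0.5e-9):
    """Mean bond lifetime under force f (N) in a two-pathway catch-slip model.

    Dissociation through the 'catch' pathway is suppressed by force (negative
    distance x_c) while the 'slip' pathway is accelerated (positive x_s);
    their sum gives the biphasic lifetime-vs-force curve. Rates in 1/s,
    distances in m; all parameter values are illustrative assumptions.
    """
    k_off = k_c * np.exp(x_c * f / kB_T) + k_s * np.exp(x_s * f / kB_T)
    return 1.0 / k_off

forces = np.linspace(0.0, 60e-12, 7)          # 0-60 pN
print(np.round(bond_lifetime(forces), 3))     # rises, peaks near ~10 pN, falls
```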

  18. SPR Hydrostatic Column Model Verification and Validation.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bettin, Giorgia; Lord, David; Rudeen, David Keith

    2015-10-01

    A Hydrostatic Column Model (HCM) was developed to help differentiate between normal "tight" well behavior and small-leak behavior under nitrogen for testing the pressure integrity of crude oil storage wells at the U.S. Strategic Petroleum Reserve. This effort was motivated by steady, yet distinct, pressure behavior of a series of Big Hill caverns that have been placed under nitrogen for extended periods of time. This report describes the HCM model, its functional requirements, the model structure, and the verification and validation process. Different modes of operation are also described, which illustrate how the software can be used to model extended nitrogen monitoring and Mechanical Integrity Tests by predicting wellhead pressures along with nitrogen interface movements. Model verification has shown that the program runs correctly and is implemented as intended. The cavern BH101 long-term nitrogen test was used to validate the model, which showed very good agreement with measured data. This supports the claim that the model is, in fact, capturing the relevant physical phenomena and can be used to make accurate predictions of both wellhead pressure and interface movements.
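
    The core hydrostatic bookkeeping behind such a model is simple to illustrate. The toy function below is only a sketch under assumed, invented densities and column heights; the actual HCM additionally handles temperature, fluid compressibility, and interface movement.

```python
# Minimal hydrostatic-column bookkeeping (illustrative only; not the SPR HCM).
G = 9.81  # gravitational acceleration, m/s^2

def wellhead_pressure(p_cavern, segments):
    """Wellhead pressure = cavern pressure minus the weight of the fluid column.

    segments : list of (density_kg_m3, height_m) from cavern depth upward,
               e.g. brine, then crude oil, then the nitrogen blanket.
    """
    return p_cavern - G * sum(rho * h for rho, h in segments)

# Example with invented values: 600 m brine, 200 m oil, 100 m nitrogen
# below a 12 MPa cavern gives a wellhead pressure of about 3.1 MPa.
print(wellhead_pressure(12.0e6, [(1200.0, 600.0), (850.0, 200.0), (150.0, 100.0)]))
```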

  19. Experimental and Theoretical Basis for a Closed-Form Spectral BRDF Model

    DTIC Science & Technology

    2015-09-17

    Dissertation: Experimental and Theoretical Basis for a Closed-Form Spectral BRDF Model, by Samuel D. Butler, BS, MS, Major, USAF (AFIT-ENP-DS-15-S-021); presented to the Faculty of the Graduate School of Engineering and Management, Air Force Institute of Technology, Air University. Committee chairman: Michael A. Marciniak, PhD; Kevin...

  20. Longitudinal Models of Reliability and Validity: A Latent Curve Approach.

    ERIC Educational Resources Information Center

    Tisak, John; Tisak, Marie S.

    1996-01-01

    Dynamic generalizations of reliability and validity that will incorporate longitudinal or developmental models, using latent curve analysis, are discussed. A latent curve model formulated to depict change is incorporated into the classical definitions of reliability and validity. The approach is illustrated with sociological and psychological…

  1. Construct Validity of the Autism Impact Measure (AIM).

    PubMed

    Mazurek, Micah O; Carlson, Coleen; Baker-Ericzén, Mary; Butter, Eric; Norris, Megan; Kanne, Stephen

    2018-01-17

    The Autism Impact Measure (AIM) was designed to track incremental change in frequency and impact of core ASD symptoms. The current study examined the structural and convergent validity of the AIM in a large sample of children with ASD. The results of a series of exploratory and confirmatory factor analyses yielded a final model with five theoretically and empirically meaningful subdomains: Repetitive Behavior, Atypical Behavior, Communication, Social Reciprocity, and Peer Interaction. The final model showed very good fit both overall and for each of the five factors, indicating excellent structural validity. AIM subdomain scores were significantly correlated with measures of similar constructs across all five domains. The results provide further support for the psychometric properties of the AIM.

  2. Redesigning Orientation in an Intensive Care Unit Using 2 Theoretical Models.

    PubMed

    Kozub, Elizabeth; Hibanada-Laserna, Maribel; Harget, Gwen; Ecoff, Laurie

    2015-01-01

    To accommodate a higher demand for critical care nurses, an orientation program in a surgical intensive care unit was revised and streamlined. Two theoretical models served as a foundation for the revision and resulted in clear clinical benchmarks for orientation progress evaluation. The purpose of the project was to integrate theoretical frameworks into practice to improve the unit orientation program. Performance improvement methods served as a framework for the revision, and outcomes were measured before and after implementation. The revised orientation program increased 1- and 2-year nurse retention and decreased turnover. Critical care knowledge increased after orientation for both the preintervention and postintervention groups. Incorporating a theoretical basis for orientation has been shown to be successful in increasing the number of nurses completing orientation and improving retention, turnover rates, and knowledge gained.

  3. Molecular dynamics simulations of theoretical cellulose nanotube models.

    PubMed

    Uto, Takuya; Kodama, Yuta; Miyata, Tatsuhiko; Yui, Toshifumi

    2018-06-15

    Nanotubes are remarkable nanoscale architectures for a wide range of potential applications. In the present paper, we report a molecular dynamics (MD) study of theoretical cellulose nanotube (CelNT) models to evaluate their dynamic behavior in solution (either chloroform or benzene). Based on the one-quarter chain staggering relationship, we constructed six CelNT models by combining the two chain polarities (parallel (P) and antiparallel (AP)) and three symmetry operations (helical right (HR), helical left (HL), and rotation (R)) to generate a circular arrangement of molecular chains. Among the four models that retained the tubular form (P-HR, P-HL, P-R, and AP-R), the P-R and AP-R models have the lowest steric energies in benzene and chloroform, respectively. The structural features of the CelNT models were characterized in terms of the hydroxymethyl group conformation and intermolecular hydrogen bonds. Solvent structuring occurred more clearly with benzene than with chloroform, suggesting that the CelNT models may disperse in benzene. Copyright © 2018 Elsevier Ltd. All rights reserved.

  4. Microstructural Characterization of Metal Foams: An Examination of the Applicability of the Theoretical Models for Modeling Foams. Revision 1

    NASA Technical Reports Server (NTRS)

    Raj, S. V.

    2011-01-01

    Establishing the geometry of foam cells is useful in developing microstructure-based acoustic and structural models. Since experimental data on the geometry of the foam cells are limited, most modeling efforts use an idealized three-dimensional, space-filling Kelvin tetrakaidecahedron. The validity of this assumption is investigated in the present paper. Several FeCrAlY foams with relative densities varying between 3 and 15 percent and cells per mm (c.p.mm.) varying between 0.2 and 3.9 c.p.mm. were microstructurally evaluated. The number of edges per face for each foam specimen was counted by approximating the cell faces by regular polygons, where the number of cell faces measured varied between 207 and 745. The present observations revealed that 50 to 57 percent of the cell faces were pentagonal while 24 to 28 percent were quadrilateral and 15 to 22 percent were hexagonal. The present measurements are shown to be in excellent agreement with literature data. It is demonstrated that the Kelvin model, as well as other proposed theoretical models, cannot accurately describe the FeCrAlY foam cell structure. Instead, it is suggested that the ideal foam cell geometry consists of 11 faces with three quadrilateral, six pentagonal faces and two hexagonal faces consistent with the 3-6-2 Matzke cell. A compilation of 90 years of experimental data reveals that the average number of cell faces decreases linearly with the increasing ratio of quadrilateral to pentagonal faces. It is concluded that the Kelvin model is not supported by these experimental data.

  5. An Emerging Theoretical Model of Music Therapy Student Development.

    PubMed

    Dvorak, Abbey L; Hernandez-Ruiz, Eugenia; Jang, Sekyung; Kim, Borin; Joseph, Megan; Wells, Kori E

    2017-07-01

    Music therapy students negotiate a complex relationship with music and its use in clinical work throughout their education and training. This distinct, pervasive, and evolving relationship suggests a developmental process unique to music therapy. The purpose of this grounded theory study was to create a theoretical model of music therapy students' developmental process, beginning with a study within one large Midwestern university. Participants (N = 15) were music therapy students who completed one 60-minute intensive interview, followed by a 20-minute member check meeting. Recorded interviews were transcribed, analyzed, and coded using open and axial coding. The theoretical model that emerged was a six-step sequential developmental progression that included the following themes: (a) Personal Connection, (b) Turning Point, (c) Adjusting Relationship with Music, (d) Growth and Development, (e) Evolution, and (f) Empowerment. The first three steps are linear; development continues in a cyclical process among the last three steps. As the cycle continues, music therapy students continue to grow and develop their skills, leading to increased empowerment, and more specifically, increased self-efficacy and competence. Further exploration of the model is needed to inform educators' and other key stakeholders' understanding of student needs and concerns as they progress through music therapy degree programs. © the American Music Therapy Association 2017. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com

  6. Model-based verification and validation of the SMAP uplink processes

    NASA Astrophysics Data System (ADS)

    Khan, M. O.; Dubos, G. F.; Tirona, J.; Standley, S.

    Model-Based Systems Engineering (MBSE) is being used increasingly within the spacecraft design community because of its benefits when compared to document-based approaches. As the complexity of projects expands dramatically with continually increasing computational power and technology infusion, the time and effort needed for verification and validation (V&V) increases geometrically. Using simulation to perform design validation with system-level models earlier in the life cycle stands to bridge the gap between design of the system (based on system-level requirements) and verifying those requirements/validating the system as a whole. This case study stands as an example of how a project can validate a system-level design earlier in the project life cycle than traditional V&V processes allow by using simulation on a system model. Specifically, this paper describes how simulation was added to a system model of the Soil Moisture Active-Passive (SMAP) mission's uplink process. Also discussed are the advantages and disadvantages of the methods employed and the lessons learned, which are intended to benefit future model-based and simulation-based development efforts.

  7. The Theoretical Basis of the Effective School Improvement Model (ESI)

    ERIC Educational Resources Information Center

    Scheerens, Jaap; Demeuse, Marc

    2005-01-01

    This article describes the process of theoretical reflection that preceded the development and empirical verification of a model of "effective school improvement". The focus is on basic mechanisms that could be seen as underlying "getting things in motion" and change in education systems. Four mechanisms are distinguished:…

  8. Path Analysis Tests of Theoretical Models of Children's Memory Performance

    ERIC Educational Resources Information Center

    DeMarie, Darlene; Miller, Patricia H.; Ferron, John; Cunningham, Walter R.

    2004-01-01

    Path analysis was used to test theoretical models of relations among variables known to predict differences in children's memory--strategies, capacity, and metamemory. Children in kindergarten to fourth grade (chronological ages 5 to 11) performed different memory tasks. Several strategies (i.e., sorting, clustering, rehearsal, and self-testing)…

  9. A Game-Theoretic Model of Grounding for Referential Communication Tasks

    ERIC Educational Resources Information Center

    Thompson, William

    2009-01-01

    Conversational grounding theory proposes that language use is a form of rational joint action, by which dialog participants systematically and collaboratively add to their common ground of shared knowledge and beliefs. Following recent work applying "game theory" to pragmatics, this thesis develops a game-theoretic model of grounding that…

  10. E-Learning Systems Support of Collaborative Agreements: A Theoretical Model

    ERIC Educational Resources Information Center

    Aguirre, Sandra; Quemada, Juan

    2012-01-01

    This paper introduces a theoretical model for developing integrated degree programmes through e-learning systems as stipulated by a collaboration agreement signed by two universities. We have analysed several collaboration agreements between universities at the national, European, and transatlantic level as well as various e-learning frameworks. A…

  11. Developing a theoretical model and questionnaire survey instrument to measure the success of electronic health records in residential aged care

    PubMed Central

    Yu, Ping; Qian, Siyu

    2018-01-01

    Electronic health records (EHR) are introduced into healthcare organizations worldwide to improve patient safety, healthcare quality and efficiency. A rigorous evaluation of this technology is important to reduce potential negative effects on patients and staff, to provide decision makers with accurate information for system improvement and to ensure return on investment. Therefore, this study develops a theoretical model and questionnaire survey instrument to assess the success of organizational EHR in routine use from the viewpoint of nursing staff in residential aged care homes. The proposed research model incorporates six variables in the reformulated DeLone and McLean information systems success model: system quality, information quality, service quality, use, user satisfaction and net benefits. Two further variables, training and self-efficacy, were also incorporated into the model. A questionnaire survey instrument was designed to measure the eight variables in the model. After a pilot test, the measurement scale was used to collect data from 243 nursing staff members in 10 residential aged care homes belonging to three management groups in Australia. Partial least squares path modeling was conducted to validate the model. The validated EHR systems success model predicts the impact of the four antecedent variables (training, self-efficacy, system quality and information quality) on the net benefits, the indicator of EHR systems success, through the mediating variables use and user satisfaction. A 24-item measurement scale was developed to quantitatively evaluate the performance of an EHR system. The parsimonious EHR systems success model and the measurement scale can be used to benchmark EHR systems success across organizations and units and over time. PMID:29315323

  12. Control Oriented Modeling and Validation of Aeroservoelastic Systems

    NASA Technical Reports Server (NTRS)

    Crowder, Marianne; deCallafon, Raymond (Principal Investigator)

    2002-01-01

    Lightweight aircraft design emphasizes the reduction of structural weight to maximize aircraft efficiency and agility at the cost of increasing the likelihood of structural dynamic instabilities. To ensure flight safety, extensive flight testing and active structural servo control strategies are required to explore and expand the boundary of the flight envelope. Aeroservoelastic (ASE) models can provide online flight monitoring of dynamic instabilities to reduce flight time testing and increase flight safety. The success of ASE models is determined by the ability to take into account varying flight conditions and the possibility to perform flight monitoring under the presence of active structural servo control strategies. In this continued study, these aspects are addressed by developing specific methodologies and algorithms for control relevant robust identification and model validation of aeroservoelastic structures. The closed-loop model robust identification and model validation are based on a fractional model approach where the model uncertainties are characterized in a closed-loop relevant way.

  13. Exploring Environmental Factors in Nursing Workplaces That Promote Psychological Resilience: Constructing a Unified Theoretical Model.

    PubMed

    Cusack, Lynette; Smith, Morgan; Hegney, Desley; Rees, Clare S; Breen, Lauren J; Witt, Regina R; Rogers, Cath; Williams, Allison; Cross, Wendy; Cheung, Kin

    2016-01-01

    Building nurses' resilience to complex and stressful practice environments is necessary to keep skilled nurses in the workplace and to ensure safe patient care. A unified theoretical framework, titled the Health Services Workplace Environmental Resilience Model (HSWERM), is presented to explain the environmental factors in the workplace that promote nurses' resilience. The framework builds on a previously-published theoretical model of individual resilience, which identified the key constructs of psychological resilience as self-efficacy, coping and mindfulness, but did not examine environmental factors in the workplace that promote nurses' resilience. This unified theoretical framework was developed through a synthesis of the literature, drawing on data from international studies and literature reviews on the nursing workforce in hospitals. The most frequent workplace environmental factors were identified, extracted and clustered in alignment with the key constructs of psychological resilience. Six major organizational concepts emerged that related to a positive resilience-building workplace and formed the foundation of the theoretical model. Three concepts related to nursing staff support (professional, practice, personal) and three related to nursing staff development (professional, practice, personal) within the workplace environment. The unified theoretical model incorporates these concepts within the workplace context, linking them to the nurse and then to personal resilience and workplace outcomes; its use has the potential to increase staff retention and quality of patient care.

  14. Exploring Environmental Factors in Nursing Workplaces That Promote Psychological Resilience: Constructing a Unified Theoretical Model

    PubMed Central

    Cusack, Lynette; Smith, Morgan; Hegney, Desley; Rees, Clare S.; Breen, Lauren J.; Witt, Regina R.; Rogers, Cath; Williams, Allison; Cross, Wendy; Cheung, Kin

    2016-01-01

    Building nurses' resilience to complex and stressful practice environments is necessary to keep skilled nurses in the workplace and to ensure safe patient care. A unified theoretical framework, titled the Health Services Workplace Environmental Resilience Model (HSWERM), is presented to explain the environmental factors in the workplace that promote nurses' resilience. The framework builds on a previously-published theoretical model of individual resilience, which identified the key constructs of psychological resilience as self-efficacy, coping and mindfulness, but did not examine environmental factors in the workplace that promote nurses' resilience. This unified theoretical framework was developed through a synthesis of the literature, drawing on data from international studies and literature reviews on the nursing workforce in hospitals. The most frequent workplace environmental factors were identified, extracted and clustered in alignment with the key constructs of psychological resilience. Six major organizational concepts emerged that related to a positive resilience-building workplace and formed the foundation of the theoretical model. Three concepts related to nursing staff support (professional, practice, personal) and three related to nursing staff development (professional, practice, personal) within the workplace environment. The unified theoretical model incorporates these concepts within the workplace context, linking them to the nurse and then to personal resilience and workplace outcomes; its use has the potential to increase staff retention and quality of patient care. PMID:27242567

  15. Design and validation of diffusion MRI models of white matter

    NASA Astrophysics Data System (ADS)

    Jelescu, Ileana O.; Budde, Matthew D.

    2017-11-01

    Diffusion MRI is arguably the method of choice for characterizing white matter microstructure in vivo. Over the typical duration of diffusion encoding, the displacement of water molecules is conveniently on a length scale similar to that of the underlying cellular structures. Moreover, water molecules in white matter are largely compartmentalized which enables biologically-inspired compartmental diffusion models to characterize and quantify the true biological microstructure. A plethora of white matter models have been proposed. However, overparameterization and mathematical fitting complications encourage the introduction of simplifying assumptions that vary between different approaches. These choices impact the quantitative estimation of model parameters with potential detriments to their biological accuracy and promised specificity. First, we review biophysical white matter models in use and recapitulate their underlying assumptions and realms of applicability. Second, we present up-to-date efforts to validate parameters estimated from biophysical models. Simulations and dedicated phantoms are useful in assessing the performance of models when the ground truth is known. However, the biggest challenge remains the validation of the “biological accuracy” of estimated parameters. Complementary techniques such as microscopy of fixed tissue specimens have facilitated direct comparisons of estimates of white matter fiber orientation and densities. However, validation of compartmental diffusivities remains challenging, and complementary MRI-based techniques such as alternative diffusion encodings, compartment-specific contrast agents and metabolites have been used to validate diffusion models. Finally, white matter injury and disease pose additional challenges to modeling, which are also discussed. This review aims to provide an overview of the current state of models and their validation and to stimulate further research in the field to solve the remaining open

  16. Design and validation of diffusion MRI models of white matter

    PubMed Central

    Jelescu, Ileana O.; Budde, Matthew D.

    2018-01-01

    Diffusion MRI is arguably the method of choice for characterizing white matter microstructure in vivo. Over the typical duration of diffusion encoding, the displacement of water molecules is conveniently on a length scale similar to that of the underlying cellular structures. Moreover, water molecules in white matter are largely compartmentalized which enables biologically-inspired compartmental diffusion models to characterize and quantify the true biological microstructure. A plethora of white matter models have been proposed. However, overparameterization and mathematical fitting complications encourage the introduction of simplifying assumptions that vary between different approaches. These choices impact the quantitative estimation of model parameters with potential detriments to their biological accuracy and promised specificity. First, we review biophysical white matter models in use and recapitulate their underlying assumptions and realms of applicability. Second, we present up-to-date efforts to validate parameters estimated from biophysical models. Simulations and dedicated phantoms are useful in assessing the performance of models when the ground truth is known. However, the biggest challenge remains the validation of the “biological accuracy” of estimated parameters. Complementary techniques such as microscopy of fixed tissue specimens have facilitated direct comparisons of estimates of white matter fiber orientation and densities. However, validation of compartmental diffusivities remains challenging, and complementary MRI-based techniques such as alternative diffusion encodings, compartment-specific contrast agents and metabolites have been used to validate diffusion models. Finally, white matter injury and disease pose additional challenges to modeling, which are also discussed. This review aims to provide an overview of the current state of models and their validation and to stimulate further research in the field to solve the remaining open
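
    As a concrete example of the compartmental models discussed in these two records, the sketch below implements the simple "ball and stick" signal equation: one isotropic compartment plus one zero-radius "stick" along the fiber direction. The volume fraction, diffusivity, and fiber direction are assumed values; this is one of the simplest members of the model family, not a recommended model.

```python
import numpy as np

def ball_and_stick(bvals, bvecs, f=0.6, d=1.7e-3, n=(1.0, 0.0, 0.0)):
    """Normalized diffusion MRI signal for a two-compartment ball-and-stick model.

    bvals in s/mm^2, bvecs as unit gradient directions, d in mm^2/s,
    f = stick (intra-axonal) volume fraction, n = fiber direction.
    """
    n = np.asarray(n, dtype=float)
    n /= np.linalg.norm(n)
    cos2 = (np.asarray(bvecs) @ n) ** 2            # (g . n)^2 per gradient
    stick = np.exp(-np.asarray(bvals) * d * cos2)  # diffusion along n only
    ball = np.exp(-np.asarray(bvals) * d)          # isotropic compartment
    return f * stick + (1.0 - f) * ball

bvals = np.array([1000.0, 1000.0])
bvecs = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])  # along / across fiber
print(ball_and_stick(bvals, bvecs))   # across-fiber signal is the higher one
```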

  17. The Role of Structural Models in the Solar Sail Flight Validation Process

    NASA Technical Reports Server (NTRS)

    Johnston, John D.

    2004-01-01

    NASA is currently soliciting proposals via the New Millennium Program ST-9 opportunity for a potential Solar Sail Flight Validation (SSFV) experiment to develop and operate in space a deployable solar sail that can be steered and provides measurable acceleration. The approach planned for this experiment is to test and validate models and processes for solar sail design, fabrication, deployment, and flight. These models and processes would then be used to design, fabricate, and operate scalable solar sails for future space science missions. There are six validation objectives planned for the ST9 SSFV experiment: 1) Validate solar sail design tools and fabrication methods; 2) Validate controlled deployment; 3) Validate in-space structural characteristics (focus of poster); 4) Validate solar sail attitude control; 5) Validate solar sail thrust performance; 6) Characterize the sail's electromagnetic interaction with the space environment. This poster presents a top-level assessment of the role of structural models in the validation process for in-space structural characteristics.

  18. Experience of validation and tuning of turbulence models as applied to the problem of boundary layer separation on a finite-width wedge

    NASA Astrophysics Data System (ADS)

    Babulin, A. A.; Bosnyakov, S. M.; Vlasenko, V. V.; Engulatova, M. F.; Matyash, S. V.; Mikhailov, S. V.

    2016-06-01

    Modern differential turbulence models are validated by computing a separation zone generated in the supersonic flow past a compression wedge lying on a plate of finite width. The results of three- and two-dimensional computations based on the (q-ω), SST, and Spalart-Allmaras turbulence models are compared with experimental data obtained for 8°, 25°, and 45° wedges by A.A. Zheltovodov at the Institute of Theoretical and Applied Mechanics of the Siberian Branch of the Russian Academy of Sciences. An original law-of-the-wall boundary condition and modifications of the SST model intended for improving the quality of the computed separation zone are described.

  19. Meta-Theoretical Contributions to the Constitution of a Model-Based Didactics of Science

    NASA Astrophysics Data System (ADS)

    Ariza, Yefrin; Lorenzano, Pablo; Adúriz-Bravo, Agustín

    2016-10-01

    There is nowadays consensus in the community of didactics of science (i.e. science education understood as an academic discipline) regarding the need to include the philosophy of science in didactical research, science teacher education, curriculum design, and the practice of science education in all educational levels. Some authors have identified an ever-increasing use of the concept of 'theoretical model', stemming from the so-called semantic view of scientific theories. However, it can be recognised that, in didactics of science, there are over-simplified transpositions of the idea of model (and of other meta-theoretical ideas). In this sense, contemporary philosophy of science is often blurred or distorted in the science education literature. In this paper, we address the discussion around some meta-theoretical concepts that are introduced into didactics of science due to their perceived educational value. We argue for the existence of a 'semantic family', and we characterise four different versions of semantic views existing within the family. In particular, we seek to contribute to establishing a model-based didactics of science mainly supported in this semantic family.

  20. Modeling and validating the cost and clinical pathway of colorectal cancer.

    PubMed

    Joranger, Paal; Nesbakken, Arild; Hoff, Geir; Sorbye, Halfdan; Oshaug, Arne; Aas, Eline

    2015-02-01

    Cancer is a major cause of morbidity and mortality, and colorectal cancer (CRC) is the third most common cancer in the world. The estimated costs of CRC treatment vary considerably, and if CRC costs in a model are based on empirically estimated total costs of stage I, II, III, or IV treatments, then they lack the flexibility to capture future changes in CRC treatment. The purpose was 1) to describe how to model CRC costs and survival and 2) to validate the model in a transparent and reproducible way. We applied a semi-Markov model with 70 health states that tracked age and time since specific health states (using tunnels and a 3-dimensional data matrix). The model parameters are based on an observational study at Oslo University Hospital (2049 CRC patients), the National Patient Register, literature, and expert opinion. The target population was patients diagnosed with CRC. The model followed patients diagnosed with CRC from age 70 until death or age 100. The study took the perspective of health care payers. The model was validated for face validity, internal and external validity, and cross-validity. The validation showed a satisfactory match with other models and empirical estimates for both cost and survival time, without any preceding calibration of the model. The model can be used to 1) address a range of CRC-related themes (general model) such as survival and evaluation of the costs of treatment and prevention measures; 2) make predictions from intermediate to final outcomes; 3) estimate changes in resource use and costs due to changing guidelines; and 4) adjust for future changes in treatment and trends over time. The model is adaptable to other populations. © The Author(s) 2014.
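
    The tunnel-state device mentioned above can be illustrated at toy scale. The cohort trace below uses four invented states and transition probabilities (versus the paper's 70 states) merely to show how a tunnel lets a transition probability depend on time since entering a state.

```python
import numpy as np

# Toy cohort trace with a 'tunnel' so transition probabilities can depend on
# time since recurrence -- the same device, on a far larger scale, as the
# 70-state semi-Markov model described above. All numbers are invented.
# States: 0 Well, 1 Recurrence yr 1 (tunnel), 2 Recurrence yr 2+, 3 Dead.
P = np.array([
    [0.90, 0.06, 0.00, 0.04],   # Well
    [0.00, 0.00, 0.70, 0.30],   # Recurrence, first year (tunnel)
    [0.00, 0.00, 0.85, 0.15],   # Recurrence, later years
    [0.00, 0.00, 0.00, 1.00],   # Dead (absorbing)
])
cost = np.array([500.0, 40000.0, 8000.0, 0.0])   # annual cost per state

state = np.array([1.0, 0.0, 0.0, 0.0])           # whole cohort starts Well
total_cost, life_years = 0.0, 0.0
for year in range(30):
    total_cost += float(state @ cost)
    life_years += float(state[:3].sum())
    state = state @ P                            # advance the cohort one year
print(round(total_cost), round(life_years, 1))
```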

  1. Models, validation, and applied geochemistry: Issues in science, communication, and philosophy

    USGS Publications Warehouse

    Nordstrom, D. Kirk

    2012-01-01

    Models have become so fashionable that many scientists and engineers cannot imagine working without them. The predominant use of computer codes to execute model calculations has blurred the distinction between code and model. The recent controversy regarding model validation has brought into question what we mean by a ‘model’ and by ‘validation.’ It has become apparent that the usual meaning of validation may be common in engineering practice and seems useful in legal practice but it is contrary to scientific practice and brings into question our understanding of science and how it can best be applied to such problems as hazardous waste characterization, remediation, and aqueous geochemistry in general. This review summarizes arguments against using the phrase model validation and examines efforts to validate models for high-level radioactive waste management and for permitting and monitoring open-pit mines. Part of the controversy comes from a misunderstanding of ‘prediction’ and the need to distinguish logical from temporal prediction. Another problem stems from the difference in the engineering approach contrasted with the scientific approach. The reductionist influence on the way we approach environmental investigations also limits our ability to model the interconnected nature of reality. Guidelines are proposed to improve our perceptions and proper utilization of models. Use of the word ‘validation’ is strongly discouraged when discussing model reliability.

  2. Modeling and Validation of a Three-Stage Solidification Model for Sprays

    NASA Astrophysics Data System (ADS)

    Tanner, Franz X.; Feigl, Kathleen; Windhab, Erich J.

    2010-09-01

    A three-stage freezing model and its validation are presented. In the first stage, the cooling of the droplet down to the freezing temperature is described as a convective heat transfer process in turbulent flow. In the second stage, when the droplet has reached the freezing temperature, the solidification process is initiated via nucleation and crystal growth. The latent heat release is related to the amount of heat convected away from the droplet and the rate of solidification is expressed with a freezing progress variable. After completion of the solidification process, in stage three, the cooling of the solidified droplet (particle) is described again by a convective heat transfer process until the particle approaches the temperature of the gaseous environment. The model has been validated by experimental data of a single cocoa butter droplet suspended in air. The subsequent spray validations have been performed with data obtained from a cocoa butter melt in an experimental spray tower using the open-source computational fluid dynamics code KIVA-3.
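
    The three-stage structure maps naturally onto a lumped-capacitance time march. The sketch below is an illustrative reduction, not the validated spray model: convective cooling to the freezing temperature, latent-heat release tracked by a progress variable chi, then convective cooling of the solid. All property values are invented, not cocoa-butter data.

```python
# Toy lumped-capacitance version of a three-stage droplet freezing model:
# stage 1 liquid cooling, stage 2 latent-heat release via a progress
# variable chi, stage 3 solid cooling; all heat leaves by convection.
m, cp, L = 1.0e-7, 2000.0, 1.5e5   # mass (kg), specific heat (J/kg K), latent heat (J/kg)
hA = 2.0e-5                         # convective conductance h*A, W/K (assumed)
T_gas, T_freeze = 280.0, 305.0      # gas and freezing temperatures, K

T, chi, dt = 320.0, 0.0, 0.01       # droplet temperature, frozen fraction, step (s)
for _ in range(40000):              # 400 s of simulated time
    q = hA * (T - T_gas)            # heat convected away, W
    if T > T_freeze and chi == 0.0:                   # stage 1: liquid cooling
        T = max(T - q * dt / (m * cp), T_freeze)
    elif chi < 1.0:                                   # stage 2: solidification
        chi = min(1.0, chi + q * dt / (m * L))
    else:                                             # stage 3: solid cooling
        T -= q * dt / (m * cp)
print(round(T, 1), chi)             # approaches T_gas with chi == 1.0
```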

  3. Validation of recent geopotential models in Tierra Del Fuego

    NASA Astrophysics Data System (ADS)

    Gomez, Maria Eugenia; Perdomo, Raul; Del Cogliano, Daniel

    2017-10-01

    This work presents a validation study of global geopotential models (GGM) in the region of Fagnano Lake, located in the southern Andes. This is an excellent area for this type of validation because it is surrounded by the Andes Mountains, and there is no terrestrial gravity or GNSS/levelling data. However, there are mean lake level (MLL) observations, and its surface is assumed to be almost equipotential. Furthermore, in this article, we propose improved geoid solutions through the Residual Terrain Modelling (RTM) approach. Using a global geopotential model, the results achieved allow us to conclude that it is possible to use this technique to extend an existing geoid model to those regions that lack any information (neither gravimetric nor GNSS/levelling observations). As GGMs have evolved, our results have improved progressively. While the validation of EGM2008 with MLL data shows a standard deviation of 35 cm, GOCO05C shows a deviation of 13 cm, similar to the results obtained on land.

  4. Modeling and validation of spectral BRDF on material surface of space target

    NASA Astrophysics Data System (ADS)

    Hou, Qingyu; Zhi, Xiyang; Zhang, Huili; Zhang, Wei

    2014-11-01

    Methods for modeling and validating the spectral BRDF of space-target surface materials are presented. First, the microscopic characteristics of the materials were analyzed: a fiber-optic spectrometer was used to measure the directional reflectivity of typical material surfaces, and atomic force microscopy was used to measure the surface structure, establishing that the surfaces are isotropic and that the microscopic surface-element heights follow a Gaussian distribution. A spectral BRDF model was then constructed on the assumptions of an isotropic surface and Gaussian-distributed micro-facets; it describes both smooth and rough surfaces well and is therefore appropriate for the surface materials of space targets. Finally, a laboratory spectral BRDF measurement platform was set up, comprising a tungsten-halogen lamp illumination system, a fiber-optic spectrometer detection system, and mechanical positioning systems, with the entire measurement sequence controlled and the data collected automatically by computer. A yellow thermal-control material and a solar cell were measured, yielding the relationship between reflection angle and BRDF at three wavelengths (380 nm, 550 nm, 780 nm), and the difference between the model values and the measured data was evaluated by the relative RMS error. The analysis shows a relative RMS error of less than 6%, supporting the correctness of the spectral BRDF model.
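
    A minimal stand-in for a Gaussian micro-facet BRDF of the kind described is easy to write down. The in-plane sketch below uses the textbook Beckmann (Gaussian-slope) distribution with shadowing, masking, and polarization omitted; the roughness and Fresnel values are invented, and this is not the authors' model.

```python
import numpy as np

def microfacet_brdf(theta_i, theta_r, m=0.25, f0=0.05):
    """In-plane microfacet BRDF with a Beckmann (Gaussian-slope) distribution.

    theta_i, theta_r in radians, measured from the surface normal on opposite
    sides; m = RMS micro-facet slope; f0 = Fresnel reflectance (assumed flat).
    """
    theta_h = 0.5 * (theta_r - theta_i)   # facet-normal tilt; 0 at specular
    d = np.exp(-np.tan(theta_h) ** 2 / m**2) / (np.pi * m**2 * np.cos(theta_h) ** 4)
    return f0 * d / (4.0 * np.cos(theta_i) * np.cos(theta_r))

angles = np.radians([0.0, 10.0, 20.0, 30.0, 40.0])
print(microfacet_brdf(np.radians(20.0), angles))  # peaks at the 20 deg specular lobe
```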

  5. Comparison and validation of point spread models for imaging in natural waters.

    PubMed

    Hou, Weilin; Gray, Deric J; Weidemann, Alan D; Arnone, Robert A

    2008-06-23

    It is known that scattering by particulates within natural waters is the main cause of the blur in underwater images. Underwater images can be better restored or enhanced with knowledge of the point spread function (PSF) of the water. This will extend the performance range as well as the information retrieval from underwater electro-optical systems, which is critical in many civilian and military applications, including target and especially mine detection, search and rescue, and diver visibility. A better understanding of the physical process involved also helps to predict system performance and simulate it accurately on demand. This effort first reviews several PSF models, introducing a semi-analytical PSF expressed in terms of the optical properties of the medium: scattering albedo, mean scattering angle, and optical range. The models under comparison include the empirical model of Duntley and a modified PSF model by Dolin et al., with numerical integration of the analytical forms from Wells serving as a benchmark of theoretical results. On the experimental side, in addition to Duntley's data, we validate the above models against measured point spread functions and against Monte Carlo simulations driven by field-measured scattering properties. Results from these comparisons suggest that the three parameters listed above are both necessary and sufficient to model PSFs. The simplified approach introduced also provides adequate accuracy and flexibility for imaging applications, as shown by examples of restored underwater images.
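
    Once a PSF is in hand, restoration of the kind motivated above is typically a frequency-domain operation. The sketch below shows standard Wiener deconvolution under an assumed scalar signal-to-noise ratio; it illustrates why a good PSF model matters and is not the paper's restoration pipeline.

```python
import numpy as np

def wiener_restore(blurred, psf, snr=100.0):
    """Frequency-domain Wiener deconvolution of a blurred underwater image.

    blurred and psf are 2-D arrays of equal shape; the PSF is assumed to be
    centered and normalized to sum to 1. snr is an assumed scalar
    signal-to-noise ratio (a generic sketch, not the paper's method).
    """
    H = np.fft.fft2(np.fft.ifftshift(psf))         # PSF -> transfer function
    G = np.fft.fft2(blurred)
    W = np.conj(H) / (np.abs(H) ** 2 + 1.0 / snr)  # Wiener filter
    return np.real(np.fft.ifft2(W * G))
```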

  6. Control Theoretic Modeling and Generated Flow Patterns of a Fish-Tail Robot

    NASA Astrophysics Data System (ADS)

    Massey, Brian; Morgansen, Kristi; Dabiri, Dana

    2003-11-01

    Many real-world engineering problems involve understanding and manipulating fluid flows. One of the challenges to further progress in the area of active flow control is the lack of appropriate models that are amenable to control-theoretic studies and algorithm design and also incorporate reasonably realistic fluid dynamic effects. We focus here on modeling and model-verification of bio-inspired actuators (fish-fin type structures) used to control fluid dynamic artifacts that will affect speed, agility, and stealth of Underwater Autonomous Vehicles (UAVs). Vehicles using fish-tail type systems are more maneuverable, can turn in much shorter and more constrained spaces, have lower drag, are quieter and potentially more efficient than those using propellers. We will present control-theoretic models for a simple prototype coupled fluid and mechanical actuator where fluid effects are crudely modeled by assuming only lift, drag, and added mass, while neglecting boundary effects. These models will be tested with different control input parameters on an experimental fish-tail robot with the resulting flow captured with DPIV. Relations between the model, the control function choices, the obtained thrust and drag, and the corresponding flow patterns will be presented and discussed.

  7. The Designing of CALM (Computer Anxiety and Learning Measure): Validation of a Multidimensional Measure of Anxiety and Cognitions Relating to Adult Learning of Computing Skills Using Structural Equation Modeling.

    ERIC Educational Resources Information Center

    McInerney, Valentina; Marsh, Herbert W.; McInerney, Dennis M.

    This paper discusses the process through which a powerful multidimensional measure of affect and cognition in relation to adult learning of computing skills was derived from its early theoretical stages to its validation using structural equation modeling. The discussion emphasizes the importance of ensuring a strong substantive base from which to…

  8. New accurate theoretical line lists of ¹²CH₄ and ¹³CH₄ in the 0-13400 cm⁻¹ range: Application to the modeling of methane absorption in Titan's atmosphere

    NASA Astrophysics Data System (ADS)

    Rey, Michaël; Nikitin, Andrei V.; Bézard, Bruno; Rannou, Pascal; Coustenis, Athena; Tyuterev, Vladimir G.

    2018-03-01

    The spectrum of methane is very important for the analysis and modeling of Titan's atmosphere, but its insufficient knowledge in the near infrared, with the absence of reliable absorption coefficients, is an important limitation. In order to help the astronomy community analyze high-quality spectra, we report in the present work the first accurate theoretical methane line lists (T = 50-350 K) of ¹²CH₄ and ¹³CH₄ up to 13400 cm⁻¹ (> 0.75 μm). These lists are built from extensive variational calculations using our recent ab initio potential and dipole moment surfaces and will be freely accessible via the TheoReTS information system (http://theorets.univ-reims.fr, http://theorets.tsu.ru). Validation of these lists is presented throughout the present paper. For the sample of lines whose upper energies were available from published analyses of experimental laboratory ¹²CH₄ spectra, small empirical corrections in positions were introduced that could be useful for future high-resolution applications. We finally apply the TheoReTS line lists to model Titan spectra as observed by VIMS and by DISR, onboard Cassini and Huygens respectively. These data are used to check that the TheoReTS line lists are able to model the observations. We also make comparisons with other experimental or theoretical line lists. It appears that TheoReTS gives very reliable results, better than ExoMol and even than HITRAN2012, except around 1.6 μm where it gives very similar results. We conclude that TheoReTS is suitable for the modeling of planetary radiative transfer and photometry. A re-analysis of spectra recorded by the DISR instrument during the descent of the Huygens probe suggests that the CH₄ mixing ratio decreases with altitude in Titan's stratosphere, reaching a value of ∼10⁻² above the 110 km altitude.

  9. Achievement Goals and Discrete Achievement Emotions: A Theoretical Model and Prospective Test

    ERIC Educational Resources Information Center

    Pekrun, Reinhard; Elliot, Andrew J.; Maier, Markus A.

    2006-01-01

    A theoretical model linking achievement goals to discrete achievement emotions is proposed. The model posits relations between the goals of the trichotomous achievement goal framework and 8 commonly experienced achievement emotions organized in a 2 (activity/outcome focus) x 2 (positive/negative valence) taxonomy. Two prospective studies tested…

  10. Organizational Learning and Product Design Management: Towards a Theoretical Model.

    ERIC Educational Resources Information Center

    Chiva-Gomez, Ricardo; Camison-Zornoza, Cesar; Lapiedra-Alcami, Rafael

    2003-01-01

    Case studies of four Spanish ceramics companies were used to construct a theoretical model of 14 factors essential to organizational learning. One set of factors is related to the conceptual-analytical phase of the product design process and the other to the creative-technical phase. All factors contributed to efficient product design management…

  11. Theoretical modeling and experimental analyses of laminated wood composite poles

    Treesearch

    Cheng Piao; Todd F. Shupe; Vijaya Gopu; Chung Y. Hse

    2005-01-01

    Wood laminated composite poles consist of trapezoid-shaped wood strips bonded with synthetic resin. The thick-walled hollow poles had adequate strength and stiffness properties and were a promising substitute for solid wood poles. It was necessary to develop theoretical models to facilitate the manufacture and future installation and maintenance of this novel...

  12. Validation of the Work-Life Balance Culture Scale (WLBCS).

    PubMed

    Nitzsche, Anika; Jung, Julia; Kowalski, Christoph; Pfaff, Holger

    2014-01-01

    The purpose of this paper is to describe the theoretical development and initial validation of the newly developed Work-Life Balance Culture Scale (WLBCS), an instrument for measuring an organizational culture that promotes the work-life balance of employees. In Study 1 (N=498), the scale was developed and its factorial validity tested through exploratory factor analyses. In Study 2 (N=513), confirmatory factor analysis (CFA) was performed to examine model fit and retest the dimensional structure of the instrument. To assess construct validity, a priori hypotheses were formulated and subsequently tested using correlation analyses. Exploratory and confirmatory factor analyses revealed a one-factor model. Results of the bivariate correlation analyses may be interpreted as preliminary evidence of the scale's construct validity. The five-item WLBCS is a new and efficient instrument with good overall quality. Its conciseness makes it particularly suitable for use in employee surveys to gain initial insight into a company's perceived work-life balance culture.

  13. Cost model validation: a technical and cultural approach

    NASA Technical Reports Server (NTRS)

    Hihn, J.; Rosenberg, L.; Roust, K.; Warfield, K.

    2001-01-01

    This paper summarizes how JPL's parametric mission cost model (PMCM) has been validated using both formal statistical methods and a variety of peer and management reviews in order to establish organizational acceptance of the cost model estimates.

  14. Pharmacokinetic modeling of gentamicin in treatment of infective endocarditis: Model development and validation of existing models.

    PubMed

    Gomes, Anna; van der Wijk, Lars; Proost, Johannes H; Sinha, Bhanu; Touw, Daan J

    2017-01-01

    Gentamicin shows large variations in half-life and volume of distribution (Vd) within and between individuals. Thus, monitoring and accurately predicting serum levels are required to optimize effectiveness and minimize toxicity. Currently, two population pharmacokinetic models are applied for predicting gentamicin doses in adults. For endocarditis patients the optimal model is unknown. We aimed at: 1) creating an optimal model for endocarditis patients; and 2) assessing whether the endocarditis and existing models can accurately predict serum levels. We performed a retrospective observational two-cohort study: one cohort to parameterize the endocarditis model by iterative two-stage Bayesian analysis, and a second cohort to validate and compare all three models. The Akaike Information Criterion and the weighted sum of squares of the residuals divided by the degrees of freedom were used to select the endocarditis model. Median Prediction Error (MDPE) and Median Absolute Prediction Error (MDAPE) were used to test all models with the validation dataset. We built the endocarditis model based on data from the modeling cohort (65 patients) with a fixed 0.277 L/h/70kg metabolic clearance, 0.698 (±0.358) renal clearance as fraction of creatinine clearance, and Vd 0.312 (±0.076) L/kg corrected lean body mass. External validation with data from 14 validation cohort patients showed a similar predictive power of the endocarditis model (MDPE -1.77%, MDAPE 4.68%) as compared to the intensive-care (MDPE -1.33%, MDAPE 4.37%) and standard (MDPE -0.90%, MDAPE 4.82%) models. All models acceptably predicted pharmacokinetic parameters for gentamicin in endocarditis patients. However, these patients appear to have an increased Vd, similar to intensive care patients. Vd mainly determines the height of peak serum levels, which in turn correlate with bactericidal activity. In order to maintain simplicity, we advise to use the existing intensive-care model in clinical practice to avoid

  15. Global Sensitivity Analysis of Environmental Models: Convergence, Robustness and Validation

    NASA Astrophysics Data System (ADS)

    Sarrazin, Fanny; Pianosi, Francesca; Khorashadi Zadeh, Farkhondeh; Van Griensven, Ann; Wagener, Thorsten

    2015-04-01

    Global Sensitivity Analysis aims to characterize the impact that variations in model input factors (e.g. the parameters) have on the model output (e.g. simulated streamflow). In sampling-based Global Sensitivity Analysis, the sample size has to be chosen carefully in order to obtain reliable sensitivity estimates while spending computational resources efficiently. Furthermore, insensitive parameters are typically identified through the definition of a screening threshold: the theoretical value of their sensitivity index is zero, but in a sampling-based framework they regularly take non-zero values. However, little guidance is available for these two steps in environmental modelling. The objective of the present study is to support modellers in making appropriate choices, regarding both sample size and screening threshold, so that a robust sensitivity analysis can be implemented. We performed sensitivity analysis for the parameters of three hydrological models with increasing levels of complexity (Hymod, HBV and SWAT), and tested three widely used sensitivity analysis methods (Elementary Effect Test or method of Morris, Regional Sensitivity Analysis, and Variance-Based Sensitivity Analysis). We defined criteria based on a bootstrap approach to assess three different types of convergence: the convergence of the value of the sensitivity indices, of the ranking (the ordering among the parameters) and of the screening (the identification of the insensitive parameters). We investigated the screening threshold through the definition of a validation procedure. The results showed that full convergence of the value of the sensitivity indices is not necessarily needed to rank or to screen the model input factors. Furthermore, typical values of the sample sizes that are reported in the literature can be well below the sample sizes that actually ensure convergence of ranking and screening.
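
    The bootstrap-based convergence check can be sketched as follows; the toy model and the squared-correlation sensitivity index are illustrative stand-ins for the hydrological models and the Morris/RSA/variance-based indices used in the study:

    ```python
    # Resample the model evaluations, recompute a sensitivity index, and check
    # how stable the resulting parameter ranking is across resamples.
    import numpy as np

    rng = np.random.default_rng(1)
    N, D = 2000, 3                                       # sample size, parameters
    X = rng.uniform(size=(N, D))
    Y = 5*X[:, 0] + 2*X[:, 1] + 0.1*rng.normal(size=N)   # toy "model" output

    def sens_index(Xs, Ys):
        # squared linear correlation as a crude sensitivity index
        return np.array([np.corrcoef(Xs[:, j], Ys)[0, 1]**2
                         for j in range(Xs.shape[1])])

    full_ranking = np.argsort(-sens_index(X, Y))
    B, agree = 200, 0                                    # bootstrap resamples
    for _ in range(B):
        idx = rng.integers(0, N, N)
        agree += np.array_equal(np.argsort(-sens_index(X[idx], Y[idx])),
                                full_ranking)
    print(f"ranking reproduced in {agree}/{B} resamples")
    ```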

  16. Predicting the ungauged basin: model validation and realism assessment

    NASA Astrophysics Data System (ADS)

    van Emmerik, Tim; Mulder, Gert; Eilander, Dirk; Piet, Marijn; Savenije, Hubert

    2016-04-01

    The hydrological decade on Predictions in Ungauged Basins (PUB) [1] led to many new insights in model development, calibration strategies, data acquisition and uncertainty analysis. Because only a limited number of studies on genuinely ungauged basins have been published, model validation and realism assessment of model outcomes have not been discussed to a great extent. With this study [2] we aim to contribute to the discussion on how one can determine the value and validity of a hydrological model developed for an ungauged basin. As in many cases no local, or even regional, data are available, alternative methods should be applied. Using a PUB case study in a genuinely ungauged basin in southern Cambodia, we give several examples of how one can use different types of soft data to improve model design, calibrate and validate the model, and assess the realism of the model output. A rainfall-runoff model was coupled to an irrigation reservoir, allowing the use of additional and unconventional data. The model was mainly forced with remote sensing data, and local knowledge was used to constrain the parameters. Model realism assessment was done using data from surveys. This resulted in a successful reconstruction of the reservoir dynamics, and revealed the different hydrological characteristics of the two topographical classes. We do not present a generic approach that can be transferred to other ungauged catchments, but we aim to show how clever model design and alternative data acquisition can result in a valuable hydrological model for ungauged catchments. [1] Sivapalan, M., Takeuchi, K., Franks, S., Gupta, V., Karambiri, H., Lakshmi, V., et al. (2003). IAHS decade on predictions in ungauged basins (PUB), 2003-2012: shaping an exciting future for the hydrological sciences. Hydrol. Sci. J. 48, 857-880. doi: 10.1623/hysj.48.6.857.51421 [2] van Emmerik, T., Mulder, G., Eilander, D., Piet, M. and Savenije, H. (2015). Predicting the ungauged basin: model validation and realism assessment

  17. Validation and calibration of structural models that combine information from multiple sources.

    PubMed

    Dahabreh, Issa J; Wong, John B; Trikalinos, Thomas A

    2017-02-01

    Mathematical models that attempt to capture structural relationships between their components and combine information from multiple sources are increasingly used in medicine. Areas covered: We provide an overview of methods for model validation and calibration and survey studies comparing alternative approaches. Expert commentary: Model validation entails a confrontation of models with data, background knowledge, and other models, and can inform judgments about model credibility. Calibration involves selecting parameter values to improve the agreement of model outputs with data. When the goal of modeling is quantitative inference on the effects of interventions or forecasting, calibration can be viewed as estimation. This view clarifies issues related to parameter identifiability and facilitates formal model validation and the examination of consistency among different sources of information. In contrast, when the goal of modeling is the generation of qualitative insights about the modeled phenomenon, calibration is a rather informal process for selecting inputs that result in model behavior that roughly reproduces select aspects of the modeled phenomenon and cannot be equated to an estimation procedure. Current empirical research on validation and calibration methods consists primarily of methodological appraisals or case-studies of alternative techniques and cannot address the numerous complex and multifaceted methodological decisions that modelers must make. Further research is needed on different approaches for developing and validating complex models that combine evidence from multiple sources.
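
    When calibration is viewed as estimation, it amounts to choosing parameter values that minimize the discrepancy between model outputs and data. A minimal sketch, with a hypothetical exponential model and made-up calibration targets:

    ```python
    # Calibration-as-estimation: least-squares fit of model parameters to data.
    import numpy as np
    from scipy.optimize import least_squares

    t = np.array([0.0, 1.0, 2.0, 4.0])
    targets = np.array([1.00, 0.62, 0.38, 0.14])   # observed calibration data

    def model(theta, t):
        a, k = theta
        return a * np.exp(-k * t)

    fit = least_squares(lambda th: model(th, t) - targets, x0=[1.0, 0.5])
    print("calibrated parameters:", fit.x)
    ```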

  18. Web-Based Virtual Patients in Nursing Education: Development and Validation of Theory-Anchored Design and Activity Models

    PubMed Central

    2014-01-01

    Background Research has shown that nursing students find it difficult to translate and apply their theoretical knowledge in a clinical context. Virtual patients (VPs) have been proposed as a learning activity that can support nursing students in their learning of scientific knowledge and help them integrate theory and practice. Although VPs are increasingly used in health care education, they still lack a systematic consistency that would allow their reuse outside of their original context. There is therefore a need to develop a model for the development and implementation of VPs in nursing education. Objective The aim of this study was to develop and evaluate a virtual patient model optimized to the learning and assessment needs in nursing education. Methods The process of modeling started by reviewing theoretical frameworks reported in the literature and used by practitioners when designing learning and assessment activities. The Outcome-Present State Test (OPT) model was chosen as the theoretical framework. The model was then, in an iterative manner, developed and optimized to the affordances of virtual patients. Content validation was performed with faculty both in terms of the relevance of the chosen theories but also its applicability in nursing education. The virtual patient nursing model was then instantiated in two VPs. The students’ perceived usefulness of the VPs was investigated using a questionnaire. The result was analyzed using descriptive statistics. Results A virtual patient Nursing Design Model (vpNDM) composed of three layers was developed. Layer 1 contains the patient story and ways of interacting with the data, Layer 2 includes aspects of the iterative process of clinical reasoning, and finally Layer 3 includes measurable outcomes. A virtual patient Nursing Activity Model (vpNAM) was also developed as a guide when creating VP-centric learning activities. The students perceived the global linear VPs as a relevant learning activity for the

  19. Application of a theoretical model to evaluate COPD disease management.

    PubMed

    Lemmens, Karin M M; Nieboer, Anna P; Rutten-Van Mölken, Maureen P M H; van Schayck, Constant P; Asin, Javier D; Dirven, Jos A M; Huijsman, Robbert

    2010-03-26

    Disease management programmes are heterogeneous in nature and often lack a theoretical basis. An evaluation model has been developed in which theoretically driven inquiries link disease management interventions to outcomes. The aim of this study is to methodically evaluate the impact of a disease management programme for patients with chronic obstructive pulmonary disease (COPD) on process, intermediate and final outcomes of care in a general practice setting. A quasi-experimental study was performed with 12-month follow-up of 189 COPD patients in primary care in the Netherlands. The programme included patient education, protocolised assessment and treatment of COPD, structural follow-up and coordination by practice nurses at 3, 6 and 12 months. Data on intermediate outcomes (knowledge, psychosocial mediators, self-efficacy and behaviour) and final outcomes (dyspnoea, quality of life, measured by the CRQ and CCQ, and patient experiences) were obtained from questionnaires and electronic registries. Implementation of the programme was associated with significant improvements in dyspnoea (p < 0.001) and patient experiences (p < 0.001). No significant improvement was found in mean quality of life scores. Improvements were found in several intermediate outcomes, including investment beliefs (p < 0.05), disease-specific knowledge (p < 0.01; p < 0.001) and medication compliance (p < 0.01). Overall, process improvement was established. The model showed associations between significantly improved intermediate outcomes and improvements in quality of life and dyspnoea. The application of a theory-driven model enhances the design and evaluation of disease management programmes aimed at improving health outcomes. This study supports the notion that a theoretical approach strengthens the evaluation designs of complex interventions. Moreover, it provides prudent evidence that the implementation of COPD disease management programmes can positively influence outcomes of care.

  1. Application of a theoretical model to evaluate COPD disease management

    PubMed Central

    2010-01-01

    Background Disease management programmes are heterogeneous in nature and often lack a theoretical basis. An evaluation model has been developed in which theoretically driven inquiries link disease management interventions to outcomes. The aim of this study is to methodically evaluate the impact of a disease management programme for patients with chronic obstructive pulmonary disease (COPD) on process, intermediate and final outcomes of care in a general practice setting. Methods A quasi-experimental study was performed with 12-month follow-up of 189 COPD patients in primary care in the Netherlands. The programme included patient education, protocolised assessment and treatment of COPD, structural follow-up and coordination by practice nurses at 3, 6 and 12 months. Data on intermediate outcomes (knowledge, psychosocial mediators, self-efficacy and behaviour) and final outcomes (dyspnoea, quality of life, measured by the CRQ and CCQ, and patient experiences) were obtained from questionnaires and electronic registries. Results Implementation of the programme was associated with significant improvements in dyspnoea (p < 0.001) and patient experiences (p < 0.001). No significant improvement was found in mean quality of life scores. Improvements were found in several intermediate outcomes, including investment beliefs (p < 0.05), disease-specific knowledge (p < 0.01; p < 0.001) and medication compliance (p < 0.01). Overall, process improvement was established. The model showed associations between significantly improved intermediate outcomes and improvements in quality of life and dyspnoea. Conclusions The application of a theory-driven model enhances the design and evaluation of disease management programmes aimed at improving health outcomes. This study supports the notion that a theoretical approach strengthens the evaluation designs of complex interventions. Moreover, it provides prudent evidence that the implementation of COPD disease management programmes can

  2. Validation analysis of probabilistic models of dietary exposure to food additives.

    PubMed

    Gilsenan, M B; Thompson, R L; Lambe, J; Gibney, M J

    2003-10-01

    The validity of a range of simple conceptual models designed specifically for the estimation of food additive intakes using probabilistic analysis was assessed. Modelled intake estimates that fell below traditional conservative point estimates of intake and above 'true' additive intakes (calculated from a reference database at brand level) were considered to be in a valid region. Models were developed for 10 food additives by combining food intake data, the probability of an additive being present in a food group and additive concentration data. Food intake and additive concentration data were entered as raw data or as a lognormal distribution, and the probability of an additive being present was entered based on the per cent brands or the per cent eating occasions within a food group that contained an additive. Since the three model components assumed two possible modes of input, the validity of eight (2³) model combinations was assessed. All model inputs were derived from the reference database. An iterative approach was employed in which the validity of individual model components was assessed first, followed by validation of full conceptual models. While the distribution of intake estimates from models fell below conservative intakes, which assume that the additive is present at maximum permitted levels (MPLs) in all foods in which it is permitted, intake estimates were not consistently above 'true' intakes. These analyses indicate the need for more complex models for the estimation of food additive intakes using probabilistic analysis. Such models should incorporate information on market share and/or brand loyalty.
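
    One of the eight model combinations can be sketched as a simple Monte Carlo simulation; the lognormal parameters and presence probability below are hypothetical, not values from the reference database:

    ```python
    # Sketch of one model combination: lognormal food intake x Bernoulli
    # presence of the additive x lognormal concentration.
    import numpy as np

    rng = np.random.default_rng(2)
    n = 100_000                                              # eating occasions
    intake_g  = rng.lognormal(mean=4.0, sigma=0.6, size=n)   # food intake, g
    present   = rng.random(n) < 0.35                         # additive present?
    conc_mgkg = rng.lognormal(mean=2.5, sigma=0.4, size=n)   # concentration, mg/kg

    additive_mg = intake_g / 1000.0 * conc_mgkg * present
    print("mean intake (mg):", additive_mg.mean())
    print("97.5th percentile (mg):", np.percentile(additive_mg, 97.5))
    ```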

  3. Empirical validation of an agent-based model of wood markets in Switzerland

    PubMed Central

    Hilty, Lorenz M.; Lemm, Renato; Thees, Oliver

    2018-01-01

    We present an agent-based model of wood markets and show our efforts to validate this model using empirical data from different sources, including interviews, workshops, experiments, and official statistics. Our own surveys closed gaps where data were not available. Our approach to model validation used a variety of techniques, including the replication of historical production amounts, prices, and survey results, as well as a historical case study of a large sawmill entering the market and becoming insolvent only a few years later. Validating the model using this case provided additional insights, showing how the model can be used to simulate scenarios of resource availability and resource allocation. We conclude that the outcome of the rigorous validation qualifies the model to simulate scenarios concerning resource availability and allocation in our study region. PMID:29351300

  4. CheS-Mapper 2.0 for visual validation of (Q)SAR models

    PubMed Central

    2014-01-01

    Background Sound statistical validation is important to evaluate and compare the overall performance of (Q)SAR models. However, classical validation does not support the user in better understanding the properties of the model or the underlying data. Even though a number of visualization tools for analyzing (Q)SAR information in small molecule datasets exist, integrated visualization methods that allow the investigation of model validation results are still lacking. Results We propose visual validation as an approach for the graphical inspection of (Q)SAR model validation results. The approach applies the 3D viewer CheS-Mapper, an open-source application for the exploration of small molecules in virtual 3D space. The present work describes the new functionalities in CheS-Mapper 2.0 that facilitate the analysis of (Q)SAR information and allow the visual validation of (Q)SAR models. The tool enables the comparison of model predictions to the actual activity in feature space. The approach is generic: it is model-independent and can handle physico-chemical and structural input features as well as quantitative and qualitative endpoints. Conclusions Visual validation with CheS-Mapper enables analyzing (Q)SAR information in the data and indicates how this information is employed by the (Q)SAR model. It reveals whether the endpoint is modeled too specifically or too generically and highlights common properties of misclassified compounds. Moreover, the researcher can use CheS-Mapper to inspect how the (Q)SAR model predicts activity cliffs. The CheS-Mapper software is freely available at http://ches-mapper.org. Graphical abstract: Comparing actual and predicted activity values with CheS-Mapper.

  5. Evaluating the Theoretic Adequacy and Applied Potential of Computational Models of the Spacing Effect.

    PubMed

    Walsh, Matthew M; Gluck, Kevin A; Gunzelmann, Glenn; Jastrzembski, Tiffany; Krusmark, Michael

    2018-06-01

    The spacing effect is among the most widely replicated empirical phenomena in the learning sciences, and its relevance to education and training is readily apparent. Yet successful applications of spacing effect research to education and training are rare. Computational modeling can provide the crucial link between a century of accumulated experimental data on the spacing effect and the emerging interest in using that research to enable adaptive instruction. In this paper, we review relevant literature and identify 10 criteria for rigorously evaluating computational models of the spacing effect. Five relate to evaluating the theoretic adequacy of a model, and five relate to evaluating its application potential. We use these criteria to evaluate a novel computational model of the spacing effect called the Predictive Performance Equation (PPE). The PPE combines elements of earlier models of learning and memory, including the General Performance Equation, Adaptive Control of Thought-Rational, and the New Theory of Disuse, giving rise to a novel computational account of the spacing effect that performs favorably across the complete sets of theoretic and applied criteria. We implemented two other previously published computational models of the spacing effect and compared them to the PPE using the theoretic and applied criteria as guides. Copyright © 2018 Cognitive Science Society, Inc.

  6. Developing, Testing, and Using Theoretical Models for Promoting Quality in Education

    ERIC Educational Resources Information Center

    Creemers, Bert; Kyriakides, Leonidas

    2015-01-01

    This paper argues that the dynamic model of educational effectiveness can be used to establish stronger links between educational effectiveness research (EER) and school improvement. It provides research evidence to support the validity of the model. Thus, the importance of using the dynamic model to establish an evidence-based and theory-driven…

  7. [Theoretical model study about the application risk of high risk medical equipment].

    PubMed

    Shang, Changhao; Yang, Fenghui

    2014-11-01

    This study establishes a theoretical model for monitoring the application risk of high-risk medical equipment at the site of use. The application site is regarded as a system composed of several sub-systems, and every sub-system consists of several risk-estimation indicators. After each indicator is quantized, the quantized values are multiplied by their corresponding weights and summed, yielding the risk estimate of each sub-system. Following the same calculation, the sub-system risk estimates are multiplied by their corresponding weights and summed; this cumulative sum is the status indicator of the high-risk medical equipment at the application site, and it reflects the equipment's application risk. The resulting model can dynamically and specifically monitor the application risk of high-risk medical equipment on site.
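
    The two-level weighted aggregation described above reduces to a pair of weighted sums; a minimal sketch with hypothetical indicator values and weights:

    ```python
    # Two-level weighted aggregation: indicators -> sub-system risks -> status.
    import numpy as np

    # quantized indicator values and weights for two hypothetical sub-systems
    indicators  = [np.array([0.8, 0.6, 0.9]), np.array([0.5, 0.7])]
    ind_weights = [np.array([0.5, 0.3, 0.2]), np.array([0.6, 0.4])]
    sub_weights = np.array([0.7, 0.3])      # weights of the sub-systems

    sub_risk = np.array([v @ w for v, w in zip(indicators, ind_weights)])
    status = sub_risk @ sub_weights         # status indicator of the equipment
    print("sub-system risks:", sub_risk, "status indicator:", status)
    ```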

  8. Description of a Website Resource for Turbulence Modeling Verification and Validation

    NASA Technical Reports Server (NTRS)

    Rumsey, Christopher L.; Smith, Brian R.; Huang, George P.

    2010-01-01

    The activities of the Turbulence Model Benchmarking Working Group - which is a subcommittee of the American Institute of Aeronautics and Astronautics (AIAA) Fluid Dynamics Technical Committee - are described. The group's main purpose is to establish a web-based repository for Reynolds-averaged Navier-Stokes turbulence model documentation, including verification and validation cases. This turbulence modeling resource has been established based on feedback from a survey on what is needed to achieve consistency and repeatability in turbulence model implementation and usage, and to document and disseminate information on new turbulence models or improvements to existing models. The various components of the website are described in detail: description of turbulence models, turbulence model readiness rating system, verification cases, validation cases, validation databases, and turbulence manufactured solutions. An outline of future plans of the working group is also provided.

  9. Modeling and Validation of Microwave Ablations with Internal Vaporization

    PubMed Central

    Chiang, Jason; Birla, Sohan; Bedoya, Mariajose; Jones, David; Subbiah, Jeyam; Brace, Christopher L.

    2014-01-01

    Numerical simulation is increasingly being utilized for computer-aided design of treatment devices, analysis of ablation growth, and clinical treatment planning. Simulation models to date have incorporated electromagnetic wave propagation and heat conduction, but not other relevant physics such as water vaporization and mass transfer. Such physical changes are particularly noteworthy during the intense heat generation associated with microwave heating. In this work, a numerical model was created that integrates microwave heating with water vapor generation and transport by using porous media assumptions in the tissue domain. The heating physics of the water vapor model was validated through temperature measurements taken at locations 5, 10 and 20 mm away from the heating zone of the microwave antenna in a homogenized ex vivo bovine liver setup. Cross-sectional area of water vapor transport was validated through intra-procedural computed tomography (CT) during microwave ablations in homogenized ex vivo bovine liver. Iso-density contours from CT images were compared to vapor concentration contours from the numerical model at intermittent time points using the Jaccard Index. In general, there was an improving correlation in ablation size dimensions as the ablation procedure proceeded, with a Jaccard Index of 0.27, 0.49, 0.61, 0.67 and 0.69 at 1, 2, 3, 4, and 5 minutes. This study demonstrates the feasibility and validity of incorporating water vapor concentration into thermal ablation simulations and validating such models experimentally. PMID:25330481
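
    The Jaccard Index used to compare the CT iso-density and simulated vapor-concentration contours is the intersection-over-union of two binary regions; a minimal sketch with hypothetical masks:

    ```python
    # Jaccard index between two binary masks (e.g. CT contour vs. model contour).
    import numpy as np

    def jaccard(a: np.ndarray, b: np.ndarray) -> float:
        a, b = a.astype(bool), b.astype(bool)
        union = np.logical_or(a, b).sum()
        return np.logical_and(a, b).sum() / union if union else 1.0

    m1 = np.zeros((64, 64), bool); m1[20:40, 20:40] = True   # hypothetical masks
    m2 = np.zeros((64, 64), bool); m2[24:44, 22:42] = True
    print("Jaccard index:", round(jaccard(m1, m2), 2))
    ```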

  10. Highlights of Transient Plume Impingement Model Validation and Applications

    NASA Technical Reports Server (NTRS)

    Woronowicz, Michael

    2011-01-01

    This paper describes highlights of an ongoing validation effort conducted to assess the viability of applying a set of analytic point source transient free molecule equations to model behavior ranging from molecular effusion to rocket plumes. The validation effort includes encouraging comparisons to both steady and transient studies involving experimental data and direct simulation Monte Carlo results. Finally, this model is applied to describe features of two exotic transient scenarios involving NASA Goddard Space Flight Center satellite programs.

  11. Assessment of bronchial wall thickness and lumen diameter in human adults using multi-detector computed tomography: comparison with theoretical models

    PubMed Central

    Montaudon, M; Desbarats, P; Berger, P; de Dietrich, G; Marthan, R; Laurent, F

    2007-01-01

    A thickened bronchial wall is the morphological substratum of most diseases of the airway. Theoretical and clinical models of bronchial morphometry have so far focused on bronchial lumen diameter, and bronchial length and angles, mainly assessed from bronchial casts. However, these models do not provide information on bronchial wall thickness. This paper reports in vivo values of cross-sectional wall area, lumen area, wall thickness and lumen diameter in ten healthy subjects as assessed by multi-detector computed tomography. A validated dedicated software package was used to measure these morphometric parameters up to the 14th bronchial generation, with respect to Weibel's model of bronchial morphometry, and up to the 12th according to Boyden's classification. Measured lumen diameters and homothety ratios were compared with theoretical values obtained from previously published studies, and no difference was found when considering dichotomic division of the bronchial tree. Mean wall area, lumen area, wall thickness and lumen diameter were then provided according to bronchial generation order, and mean homothety ratios were computed for wall area, lumen area and wall thickness as well as equations giving the mean value of each parameter for a given bronchial generation with respect to its value in generation 0 (trachea). Multi-detector computed tomography measurements of bronchial morphometric parameters may help to improve our knowledge of bronchial anatomy in vivo, our understanding of the pathophysiology of bronchial diseases and the evaluation of pharmacological effects on the bronchial wall. PMID:17919291
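
    Equations of the kind mentioned above, giving a parameter's mean value in a given generation relative to generation 0, take the geometric form X(z) = X(0)·h^z with homothety ratio h. A sketch using Weibel's classic diameter ratio 2^(-1/3) ≈ 0.79, an illustrative textbook value rather than one measured in this study:

    ```python
    # Homothety-ratio scaling law X(z) = X(0) * h**z across generations.
    import numpy as np

    d0 = 18.0                      # generation-0 (trachea) lumen diameter, mm
    h = 2 ** (-1 / 3)              # homothety ratio for diameter, ~0.794
    generations = np.arange(15)
    diameters = d0 * h ** generations
    for z, d in zip(generations[:5], diameters[:5]):
        print(f"generation {z}: mean lumen diameter ~ {d:.1f} mm")
    ```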

  12. PACIC Instrument: disentangling dimensions using published validation models.

    PubMed

    Iglesias, K; Burnand, B; Peytremann-Bridevaux, I

    2014-06-01

    To better understand the structure of the Patient Assessment of Chronic Illness Care (PACIC) instrument and, more specifically, to test all published validation models using a single data set and appropriate statistical tools. Validation study using data from a cross-sectional survey. A population-based sample of non-institutionalized adults with diabetes residing in Switzerland (canton of Vaud). French version of the 20-item PACIC instrument (5-point response scale). We conducted validation analyses using confirmatory factor analysis (CFA). The original five-dimension model and other published models were tested with three types of CFA: based on (i) a Pearson estimator of the variance-covariance matrix, (ii) a polychoric correlation matrix and (iii) a likelihood estimation with a multinomial distribution for the manifest variables. All models were assessed using loadings and goodness-of-fit measures. The analytical sample included 406 patients. Mean age was 64.4 years and 59% were men. Medians of item responses varied between 1 and 4 (range 1-5), and the proportion of missing values ranged between 5.7 and 12.3%. Strong floor and ceiling effects were present. Even though loadings of the tested models were relatively high, the only model showing acceptable fit was the 11-item single-dimension model. PACIC was associated with the expected variables of the field. Our results showed that the model considering 11 items in a single dimension exhibited the best fit for our data. A single score, in complement to the consideration of single-item results, might be used instead of the five dimensions usually described. © The Author 2014. Published by Oxford University Press in association with the International Society for Quality in Health Care; all rights reserved.
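
    A rough first-pass unidimensionality check (not the CFA with Pearson or polychoric estimators performed in the study) can be sketched on simulated Likert-type items via the first eigenvalue of the correlation matrix and Cronbach's alpha:

    ```python
    # Crude unidimensionality check on simulated item data.
    import numpy as np

    rng = np.random.default_rng(3)
    n, k = 406, 11                             # analytical sample, 11 items
    latent = rng.normal(size=n)
    X = latent[:, None] * 0.7 + rng.normal(scale=0.7, size=(n, k))

    R = np.corrcoef(X, rowvar=False)
    eigvals = np.linalg.eigvalsh(R)[::-1]      # descending eigenvalues
    print("variance share of 1st factor:", eigvals[0] / k)

    item_var = X.var(axis=0, ddof=1)
    alpha = k / (k - 1) * (1 - item_var.sum() / X.sum(axis=1).var(ddof=1))
    print("Cronbach's alpha:", alpha)
    ```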

  13. Cross-validation of an employee safety climate model in Malaysia.

    PubMed

    Bahari, Siti Fatimah; Clarke, Sharon

    2013-06-01

    Whilst substantial research has investigated the nature of safety climate, and its importance as a leading indicator of organisational safety, much of this research has been conducted with Western industrial samples. The current study focuses on the cross-validation of a safety climate model in the non-Western industrial context of Malaysian manufacturing. The first-order factorial validity of Cheyne et al.'s (1998) [Cheyne, A., Cox, S., Oliver, A., Tomas, J.M., 1998. Modelling safety climate in the prediction of levels of safety activity. Work and Stress, 12(3), 255-271] model was tested, using confirmatory factor analysis, in a Malaysian sample. Results showed that the model fit indices were below accepted levels, indicating that the original Cheyne et al. (1998) safety climate model was not supported. An alternative three-factor model was developed using exploratory factor analysis. Although these findings are not consistent with previously reported cross-validation studies, we argue that previous studies have focused on validation across Western samples, and that the current study demonstrates the need to take account of cultural factors in the development of safety climate models intended for use in non-Western contexts. The results have important implications for the transferability of existing safety climate models across cultures (for example, in global organisations) and highlight the need for future research to examine cross-cultural issues in relation to safety climate. Copyright © 2013 National Safety Council and Elsevier Ltd. All rights reserved.

  14. Using Bogner and Wiseman's Model of Ecological Values to Measure the Impact of an Earth Education Programme on Children's Environmental Perceptions

    ERIC Educational Resources Information Center

    Johnson, Bruce; Manoli, Constantinos C.

    2008-01-01

    Investigating the effects of educational programmes on children's environmental perceptions has been hampered by the lack of good theoretical models and valid instruments. In the present study, Bogner and Wiseman's Model of Ecological Values provided a well-developed theoretical model. A validated instrument based on Bogner's Environmental…

  15. Theoretical Assessment of the Impact of Climatic Factors in a Vibrio Cholerae Model.

    PubMed

    Kolaye, G; Damakoa, I; Bowong, S; Houe, R; Békollè, D

    2018-05-04

    A mathematical model for Vibrio cholerae (V. cholerae) in a closed environment is considered, with the aim of investigating the impact of climatic factors, which exert a direct influence on the bacterial metabolism and on the bacterial reservoir capacity. We first propose a V. cholerae mathematical model in a closed environment. A sensitivity analysis using the eFAST method was performed to identify the most important parameters of the model. We then extend this V. cholerae model by taking into account climatic factors that influence the bacterial reservoir capacity. We present the theoretical analysis of the model. More precisely, we compute equilibria and study their stability. The stability of equilibria was investigated using the theory of periodic cooperative systems with a concave nonlinearity. The theoretical results are supported by numerical simulations, which further suggest the necessity of implementing sanitation campaigns in aquatic environments, using suitable products against the bacteria during the growth periods of the aquatic reservoirs.

  16. Thermodynamic Properties of CO2 Capture Reaction by Solid Sorbents: Theoretical Predictions and Experimental Validations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duan, Yuhua; Luebke, David; Pennline, Henry

    2012-01-01

    It is generally accepted that current technologies for capturing CO2 are still too energy intensive. Hence, there is a critical need for development of new materials that can capture CO2 reversibly with acceptable energy costs. Accordingly, solid sorbents have been proposed for CO2 capture applications through a reversible chemical transformation. By combining thermodynamic database mining with first-principles density functional theory and phonon lattice dynamics calculations, a theoretical screening methodology to identify the most promising CO2 sorbent candidates from the vast array of possible solid materials has been proposed and validated. The calculated thermodynamic properties of different classes of solid materials versus temperature and pressure changes were further used to evaluate the equilibrium properties for the CO2 adsorption/desorption cycles. According to the requirements imposed by the pre- and post-combustion technologies, and based on our calculated thermodynamic properties for the CO2 capture reactions by the solids of interest, we were able to screen only those solid materials for which lower capture energy costs are expected at the desired pressure and temperature conditions. These CO2 sorbent candidates were further considered for experimental validation. In this presentation, we first introduce our screening methodology, validating it against a dataset of alkali and alkaline-earth metal oxides, hydroxides and bicarbonates whose thermodynamic properties are available. Then, by studying a series of lithium silicates, we found that increasing the Li2O/SiO2 ratio in the lithium silicates increases their corresponding turnover temperatures for CO2 capture reactions. Compared to anhydrous K2CO3, the dehydrated K2CO3·1.5H2O can only be applied for post-combustion CO2 capture technology at temperatures lower than its phase
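
    The screening criterion can be sketched numerically: for a candidate capture reaction, the turnover temperature is where the reaction free energy ΔG(T) = ΔH - TΔS crosses zero at the operating CO2 pressure. The ΔH and ΔS values below are hypothetical placeholders, not computed DFT/phonon results:

    ```python
    # Turnover temperature of a capture reaction: T where dG(T) = dH - T*dS = 0.
    import numpy as np

    dH = -140e3     # reaction enthalpy, J/mol CO2 (capture is exothermic)
    dS = -160.0     # reaction entropy, J/(mol K)

    T = np.linspace(300, 1200, 1000)
    dG = dH - T * dS
    turnover = T[np.argmin(np.abs(dG))]
    print(f"turnover temperature ~ {turnover:.0f} K")   # dH/dS = 875 K here
    ```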

  17. Using airborne laser scanning profiles to validate marine geoid models

    NASA Astrophysics Data System (ADS)

    Julge, Kalev; Gruno, Anti; Ellmann, Artu; Liibusk, Aive; Oja, Tõnis

    2014-05-01

    Airborne laser scanning (ALS) is a remote sensing method which utilizes LiDAR (Light Detection And Ranging) technology. The datasets collected are important sources for a large range of scientific and engineering applications. Mostly the ALS is used to measure terrain surfaces for the compilation of Digital Elevation Models, but it can also be used in other applications. This contribution focuses on the use of an ALS system for measuring sea surface heights and validating gravimetric geoid models over marine areas. This is based on the ALS ability to register echoes of the LiDAR pulse from the water surface. A case study was carried out to analyse the possibilities of validating marine geoid models by using ALS profiles. A test area at the southern shores of the Gulf of Finland was selected for regional geoid validation. ALS measurements were carried out by the Estonian Land Board in spring 2013 at different altitudes and using different scan rates. The one-wavelength Leica ALS50-II laser scanner on board a small aircraft was used to determine the sea level (with respect to the GRS80 reference ellipsoid), which follows roughly the equipotential surface of the Earth's gravity field. For the validation a high-resolution (1'x2') regional gravimetric GRAV-GEOID2011 model was used. This geoid model covers the entire area of Estonia and the surrounding waters of the Baltic Sea. The fit between the geoid model and GNSS/levelling data within the Estonian dry land revealed an RMS of residuals of ±1… ±2 cm. Note that such a fitting validation cannot proceed over marine areas. Therefore, an ALS observation-based methodology was developed to evaluate the GRAV-GEOID2011 quality over marine areas. The accuracy of the acquired ALS datasets was analyzed, and an optimal width of the nadir corridor containing good-quality ALS data was determined. The impact of ALS scan angle range and flight altitude on the obtainable vertical accuracy was investigated as well. The quality of the point cloud is analysed by cross
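
    The validation arithmetic reduces to differencing ALS-derived sea-surface heights against modelled geoid heights and examining the residuals; treating the remaining sea-surface topography as a constant offset, as in the sketch below, is a simplification of ours:

    ```python
    # Residual RMS between ALS sea-surface heights and geoid-model heights.
    import numpy as np

    h_als   = np.array([18.52, 18.55, 18.49, 18.57, 18.53])  # hypothetical, m
    N_geoid = np.array([18.30, 18.34, 18.28, 18.33, 18.31])  # geoid heights, m

    res = h_als - N_geoid
    res -= res.mean()              # remove mean dynamic topography / offset
    rms = np.sqrt(np.mean(res**2))
    print(f"residual RMS: {100 * rms:.1f} cm")
    ```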

  18. [Health, environment and nursing. Philosophical and theoretical foundations for the development and validation of a nursing interface terminology. Part III].

    PubMed

    Juvé-Udina, Maria-Eulàlia

    2012-06-01

    This manuscript is the third of a triad of papers introducing the philosophical and theoretical approaches that support the development and validation of a nursing interface terminology as a standard vocabulary designed to ease data entry into electronic health records, to produce information and to generate knowledge. To analyze the philosophical and theoretical approaches considered in the development of a new nursing interface terminology called ATIC. Review, analysis and discussion of the main philosophical orientations, high and mid-range theories and nursing scientific literature to develop an interpretative conceptualization of the metaparadigm concepts "Health", "Environment" and "Nursing". In the two previous papers, the ATIC terminology, its foundation on pragmatism, holism, post-positivism and constructivism, and the construction of the meaning of the concept "Individual" are discussed. In this third paper, Health is conceptualized as a multidimensional balance state, and the concepts of Partial health status, Disease and Being ill are explored within it. The analysis of the Environment theories drives its conceptualization as a group of variables that have the potential to affect health status. In this orientation, Nursing is understood as the scientific discipline focused on the study of health status in the particular environment and experience of individuals, groups, communities or societies. The ATIC terminology is rooted in an eclectic philosophical and theoretical foundation, allowing it to be used from different trends within the totality paradigm.

  19. Mechanics of interstitial-lymphatic fluid transport: theoretical foundation and experimental validation.

    PubMed

    Swartz, M A; Kaipainen, A; Netti, P A; Brekken, C; Boucher, Y; Grodzinsky, A J; Jain, R K

    1999-12-01

    Interstitial fluid movement is intrinsically linked to lymphatic drainage. However, their relationship is poorly understood, and associated pathologies are mostly untreatable. In this work we test the hypothesis that bulk tissue fluid movement can be evaluated in situ and described by a linear biphasic theory which integrates the regulatory function of the lymphatics with the mechanical stresses of the tissue. To accomplish this, we develop a novel experimental and theoretical model using the skin of the mouse tail. We then use the model to demonstrate how interstitial-lymphatic fluid movement depends on a balance between the elasticity, hydraulic conductivity, and lymphatic conductance as well as to demonstrate how chronic swelling (edema) alters the equipoise between tissue fluid balance parameters. Specifically, tissue fluid equilibrium is perturbed with a continuous interstitial infusion of saline into the tip of the tail. The resulting gradients in tissue stress are measured in terms of interstitial fluid pressure using a servo-null system. These measurements are then fit to the theory to provide in vivo estimates of the tissue hydraulic conductivity, elastic modulus, and overall resistance to lymphatic drainage. Additional experiments are performed on edematous tails to show that although chronic swelling causes an increase in the hydraulic conductivity, its greatly increased distensibility (due to matrix remodeling) dampens the driving forces for fluid movement and leads to fluid stagnation. This model is useful for examining potential treatments for edema and lymphatic disorders as well as substances which may alter tissue fluid balance and/or lymphatic drainage.

  20. Prediction of enzyme classes from 3D structure: a general model and examples of experimental-theoretic scoring of peptide mass fingerprints of Leishmania proteins.

    PubMed

    Concu, Riccardo; Dea-Ayuela, Maria A; Perez-Montoto, Lazaro G; Bolas-Fernández, Francisco; Prado-Prado, Francisco J; Podda, Gianni; Uriarte, Eugenio; Ubeira, Florencio M; González-Díaz, Humberto

    2009-09-01

    The number of protein and peptide structures included in the Protein Data Bank (PDB) and GenBank without functional annotation has increased. Consequently, there is a high demand for theoretical models to predict these functions. Here, we trained and validated, with an external set, a Markov Chain Model (MCM) that classifies proteins by their possible mechanism of action according to Enzyme Classification (EC) number. The methodology proposed is essentially new, and enables prediction of all EC classes with a single equation, without the need for an equation for each class or nonlinear models with multiple outputs. In addition, the model may be used to predict whether one peptide presents a positive or negative contribution to the activity of the same EC class. The model predicts the first EC number for 106 out of 151 (70.2%) oxidoreductases, 178/178 (100%) transferases, 223/223 (100%) hydrolases, 64/85 (75.3%) lyases, 74/74 (100%) isomerases, and 100/100 (100%) ligases, as well as 745/811 (91.9%) nonenzymes. It is important to underline that this method may help us predict new enzyme proteins or select peptide candidates that improve enzyme activity, which may be of interest for the prediction of new drugs or drug targets. To illustrate the model's application, we report the 2D-Electrophoresis (2DE) isolation from Leishmania infantum as well as MALDI-TOF mass spectrum characterization and a theoretical study of the Peptide Mass Fingerprints (PMFs) of a new protein sequence. The theoretical study focused on MASCOT, BLAST alignment, and alignment-free QSAR prediction of the contribution of 29 peptides found in the PMF of the new protein to specific enzyme action. This combined strategy may be used to identify and predict peptides of prokaryote and eukaryote parasites and their hosts as well as other superior organisms, which may be of interest in drug development or target identification.
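
    Aggregating the per-class counts quoted above gives the overall accuracy; a one-line check:

    ```python
    # Quick arithmetic check on the reported per-class results.
    correct = [106, 178, 223, 64, 74, 100, 745]
    totals  = [151, 178, 223, 85, 74, 100, 811]
    print(f"overall accuracy: {sum(correct) / sum(totals):.1%}")   # -> 91.9%
    ```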

  1. Theoretical analysis of Lumry-Eyring models in differential scanning calorimetry

    PubMed Central

    Sanchez-Ruiz, Jose M.

    1992-01-01

    A theoretical analysis of several protein denaturation models (Lumry-Eyring models) that include a rate-limited step leading to an irreversibly denatured state of the protein (the final state) has been carried out. The differential scanning calorimetry transitions predicted for these models can be broadly classified into four groups: situations A, B, C, and C′. (A) The transition is calorimetrically irreversible but the rate-limited, irreversible step takes place with significant rate only at temperatures slightly above those corresponding to the transition. Equilibrium thermodynamics analysis is permissible. (B) The transition is distorted by the occurrence of the rate-limited step; nevertheless, it contains thermodynamic information about the reversible unfolding of the protein, which could be obtained upon the appropriate data treatment. (C) The heat absorption is entirely determined by the kinetics of formation of the final state and no thermodynamic information can be extracted from the calorimetric transition; the rate-determining step is the irreversible process itself. (C′) same as C, but, in this case, the rate-determining step is a previous step in the unfolding pathway. It is shown that ligand and protein concentration effects on transitions corresponding to situation C (strongly rate-limited transitions) are similar to those predicted by equilibrium thermodynamics for simple reversible unfolding models. It has been widely held in recent literature that experimentally observed ligand and protein concentration effects support the applicability of equilibrium thermodynamics to irreversible protein denaturation. The theoretical analysis reported here disfavors this claim. PMID:19431826
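
    A "situation C" transition can be sketched by integrating first-order irreversible kinetics with an Arrhenius rate under a constant scan rate and differentiating the converted fraction; the kinetic and enthalpy parameters below are illustrative only, not values from the paper:

    ```python
    # Strongly rate-limited DSC transition: N -> F with Arrhenius rate k(T).
    import numpy as np

    R = 8.314                      # gas constant, J/(mol K)
    Ea, A = 300e3, 1e45            # Arrhenius parameters (illustrative)
    dH = 400e3                     # denaturation enthalpy, J/mol (illustrative)
    beta = 1.0 / 60                # scan rate: 1 K/min, in K/s

    T = np.linspace(320, 360, 4000)
    k = A * np.exp(-Ea / (R * T))
    alpha = np.zeros_like(T)       # fraction in the final (irreversible) state
    dt = (T[1] - T[0]) / beta      # time spent per temperature step
    for i in range(1, T.size):
        alpha[i] = 1.0 - (1.0 - alpha[i - 1]) * np.exp(-k[i - 1] * dt)
    cp_excess = dH * np.gradient(alpha, T)   # apparent excess heat capacity
    print(f"peak at {T[np.argmax(cp_excess)]:.1f} K")
    ```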

  2. Underwater photogrammetric theoretical equations and technique

    NASA Astrophysics Data System (ADS)

    Fan, Ya-bing; Huang, Guiping; Qin, Gui-qin; Chen, Zheng

    2011-12-01

    In order to achieve a high level of accuracy in underwater close-range photogrammetry, this article studies three varieties of model equations corresponding to different ways of imaging through the water surface. First, the paper carefully analyzes two of the theoretical equations, finds that they have serious limitations in practical application, and then studies the third model equation in depth. Second, a special measurement project was designed accordingly. Finally, a rigid antenna was measured by underwater photogrammetry. The experimental results show that the precision of the 3D coordinate measurement is 0.94 mm, which validates the availability and operability of this third equation in practical application. It satisfies the measurement requirements of refraction correction and improves the accuracy of underwater close-range photogrammetry, while offering strong anti-jamming performance and stability.
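
    The refraction correction at the heart of through-water imaging follows Snell's law at the air-water interface; the flat-surface, single-ray geometry below is our simplification for illustration, not the paper's third model equation:

    ```python
    # Snell's law at a flat air-water interface: n_air*sin(i) = n_water*sin(r).
    import numpy as np

    n_air, n_water = 1.000, 1.333
    i = np.radians(30.0)                          # incidence angle in air
    r = np.arcsin(n_air * np.sin(i) / n_water)    # refracted angle in water

    depth = 2.0                                   # true depth of the point, m
    # horizontal offset of the ray below the surface, with vs. without refraction
    x_refracted = depth * np.tan(r)
    x_straight  = depth * np.tan(i)
    print(f"refraction correction: {x_straight - x_refracted:.3f} m "
          f"at {depth} m depth")
    ```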

  3. Game Theoretic Modeling of Water Resources Allocation Under Hydro-Climatic Uncertainty

    NASA Astrophysics Data System (ADS)

    Brown, C.; Lall, U.; Siegfried, T.

    2005-12-01

    Typical hydrologic and economic modeling approaches rely on assumptions of climate stationarity and economic conditions of ideal markets and rational decision-makers. In this study, we incorporate hydroclimatic variability with a game theoretic approach to simulate and evaluate common water allocation paradigms. Game theory may be particularly appropriate for modeling water allocation decisions. First, a game theoretic approach allows economic analysis in situations where price theory does not apply, which is typically the case in water resources, where markets are thin, players are few, and rules of exchange are highly constrained by legal or cultural traditions. Previous studies confirm that game theory is applicable to water resources decision problems, yet applications and modeling based on these principles are only rarely observed in the literature. Second, there are numerous existing theoretical and empirical studies of specific games and human behavior that may be applied in the development of predictive water allocation models. With this framework, one can evaluate alternative orderings and rules regarding the fraction of available water that one is allowed to appropriate. Specific attributes of the players involved in water resources management complicate the determination of solutions to game theory models. While an analytical approach will be useful for providing general insights, the variety of preference structures of individual players in a realistic water scenario will likely require a simulation approach. We propose a simulation approach incorporating the rationality, self-interest and equilibrium concepts of game theory with an agent-based modeling framework that allows the distinct properties of each player to be expressed and allows the performance of the system to manifest the integrative effect of these factors. Underlying this framework, we apply a realistic representation of spatio-temporal hydrologic variability and incorporate the impact of

  4. Predicting Pilot Error in Nextgen: Pilot Performance Modeling and Validation Efforts

    NASA Technical Reports Server (NTRS)

    Wickens, Christopher; Sebok, Angelia; Gore, Brian; Hooey, Becky

    2012-01-01

    We review 25 articles presenting 5 general classes of computational models to predict pilot error. This more targeted review is placed within the context of the broader review of computational models of pilot cognition and performance, including such aspects as models of situation awareness or pilot-automation interaction. Particular emphasis is placed on the degree of validation of such models against empirical pilot data, and the relevance of the modeling and validation efforts to Next Gen technology and procedures.

  5. Theoretical and experimental study of a thruster discharging a weight

    NASA Astrophysics Data System (ADS)

    Michaels, Dan; Gany, Alon

    2014-06-01

    An innovative concept for a rocket type thruster that can be beneficial for spacecraft trajectory corrections and station keeping was investigated both experimentally and theoretically. It may also be useful for divert and attitude control systems (DACS). The thruster is based on a combustion chamber discharging a weight through an exhaust tube. Calculations with granular double-base propellant and a solid ejected weight reveal that a specific impulse based on the propellant mass of well above 400 s can be obtained. An experimental thruster was built in order to demonstrate the new idea and validate the model. The thruster impulse was measured both directly with a load cell and indirectly by using a pressure transducer and high speed photography of the weight as it exits the tube, with both ways producing very similar total impulse measurement. The good correspondence between the computations and the measured data validates the model as a useful tool for studying and designing such a thruster.
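
    The propellant-based specific impulse quoted above is the measured total impulse divided by the propellant weight, Isp = J/(m_p·g0); a quick check with hypothetical numbers of the right magnitude:

    ```python
    # Specific impulse from total impulse and propellant mass (hypothetical values).
    g0 = 9.80665                 # standard gravity, m/s^2
    total_impulse = 60.0         # measured impulse, N*s (hypothetical)
    m_propellant = 0.014         # propellant mass, kg (hypothetical)
    print("Isp =", total_impulse / (m_propellant * g0), "s")   # ~437 s
    ```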

  6. Patient perceptions of patient-centred care: empirical test of a theoretical model.

    PubMed

    Rathert, Cheryl; Williams, Eric S; McCaughey, Deirdre; Ishqaidef, Ghadir

    2015-04-01

    Patient perception measures are gaining increasing interest among scholars and practitioners. The aim of this study was to empirically examine a conceptual model of patient-centred care using patient perception survey data. Patient-centred care is one of the Institute of Medicine's objectives for improving health care in the 21st century. Patient interviews conducted by the Picker Institute/Commonwealth Fund in the 1980s resulted in a theoretical model and survey questions with dimensions and attributes patients defined as patient-centred. The present study used survey data from patients with overnight visits at 142 U.S. hospitals. Regression analysis found significant support for the theoretical model. Perceptions of emotional support had the strongest relationship with overall care ratings. Coordination of care and physical comfort were strongly related as well. Understanding how patients experience their care can help improve understanding of what patients believe is patient-centred, and of how care processes relate to important patient outcomes. © 2012 John Wiley & Sons Ltd.

  7. Is the Acute NMDA Receptor Hypofunction a Valid Model of Schizophrenia?

    PubMed Central

    Adell, Albert; Jiménez-Sánchez, Laura; López-Gil, Xavier; Romón, Tamara

    2012-01-01

    Several genetic, neurodevelopmental, and pharmacological animal models of schizophrenia have been established. This short review examines the validity of one of the most used pharmacological models of the illness, ie, the acute administration of N-methyl-D-aspartate (NMDA) receptor antagonists in rodents. In some cases, data on chronic or prenatal NMDA receptor antagonist exposure have been introduced for comparison. The face validity of acute NMDA receptor blockade is granted inasmuch as hyperlocomotion and stereotypies induced by phencyclidine, ketamine, and MK-801 are regarded as a surrogate for the positive symptoms of schizophrenia. In addition, the loss of parvalbumin-containing cells (which is one of the most compelling findings in postmortem schizophrenia brain) following NMDA receptor blockade adds construct validity to this model. However, the lack of changes in glutamic acid decarboxylase (GAD67) is at variance with human studies. It is possible that changes in GAD67 are more reflective of the neurodevelopmental condition of schizophrenia. Finally, the model also has predictive validity, in that its behavioral and transmitter activation in rodents are responsive to antipsychotic treatment. Overall, although not devoid of drawbacks, the acute administration of NMDA receptor antagonists can be considered a good model of schizophrenia bearing a satisfactory degree of validity. PMID:21965469

  8. Models and Messengers of Resilience: A Theoretical Model of College Students' Resilience, Regulatory Strategy Use, and Academic Achievement

    ERIC Educational Resources Information Center

    Johnson, Marcus L.; Taasoobshirazi, Gita; Kestler, Jessica L.; Cordova, Jackie R.

    2015-01-01

    We tested a theoretical model of college students' ratings of messengers of resilience and models of resilience, students' own perceived resilience, regulatory strategy use and achievement. A total of 116 undergraduates participated in this study. The results of a path analysis indicated that ratings of models of resilience had a direct effect on…

  9. How trees allocate carbon for optimal growth: insight from a game-theoretic model.

    PubMed

    Fu, Liyong; Sun, Lidan; Han, Hao; Jiang, Libo; Zhu, Sheng; Ye, Meixia; Tang, Shouzheng; Huang, Minren; Wu, Rongling

    2017-02-01

    How trees allocate photosynthetic products to primary height growth and secondary radial growth reflects their capacity to best use environmental resources. Despite substantial efforts to explore the tree height-diameter relationship empirically and through theoretical modeling, our understanding of the biological mechanisms that govern this phenomenon is still limited. By thinking of stem woody biomass production as an ecological system of apical and lateral growth components, we implement game theory to model and discern how these two components cooperate symbiotically with each other or compete for resources to determine the size of a tree stem. The resulting allometry game theory is further embedded within a genetic mapping and association paradigm, allowing the genetic loci mediating the carbon allocation of stemwood growth to be characterized and mapped throughout the genome. Allometry game theory was validated by analyzing mapping data of stem height and diameter growth over perennial seasons in a poplar tree. Several key quantitative trait loci were found to interpret the process and pattern of stemwood growth through regulating the ecological interactions of stem apical and lateral growth. The application of allometry game theory enables the prediction of the situations in which cooperation, competition or altruism is the optimal decision for a tree to make full use of the environmental resources it owns. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  10. Sensitivity of the ocean overturning circulation to wind and mixing: theoretical scalings and global ocean models

    NASA Astrophysics Data System (ADS)

    Nikurashin, Maxim; Gunn, Andrew

    2017-04-01

    The meridional overturning circulation (MOC) is a planetary-scale oceanic flow of direct importance to the climate system: it transports heat meridionally and regulates the exchange of CO2 with the atmosphere. The MOC is forced by wind, heat and freshwater fluxes at the surface, and turbulent mixing in the ocean interior. A number of conceptual theories for the sensitivity of the MOC to changes in forcing have recently been developed and tested with idealized numerical models. However, the skill of these simple conceptual theories in describing the MOC simulated by more complex global models remains largely unknown. In this study, we present a systematic comparison of the theoretical and modelled sensitivity of the MOC and the associated deep ocean stratification to vertical mixing and southern hemisphere westerlies. The results show that theories that simplify the ocean into a single-basin, zonally-symmetric box are generally in good agreement with a realistic, global ocean circulation model. Some disagreement occurs in the abyssal ocean, where complex bottom topography is not taken into account by the simple theories. Distinct regimes, in which the MOC has a different sensitivity to wind or mixing, as predicted by the simple theories, are also clearly shown by the global ocean model. The sensitivity of the Indo-Pacific, Atlantic, and global basins is analysed separately to validate the conceptual understanding of the upper and lower overturning cells in the theory.

  11. Information-theoretic model selection for optimal prediction of stochastic dynamical systems from data

    NASA Astrophysics Data System (ADS)

    Darmon, David

    2018-03-01

    In the absence of mechanistic or phenomenological models of real-world systems, data-driven models become necessary. The discovery of various embedding theorems in the 1980s and 1990s motivated a powerful set of tools for analyzing deterministic dynamical systems via delay-coordinate embeddings of observations of their component states. However, in many branches of science, the condition of operational determinism is not satisfied, and stochastic models must be brought to bear. For such stochastic models, the tool set developed for delay-coordinate embedding is no longer appropriate, and a new toolkit must be developed. We present an information-theoretic criterion, the negative log-predictive likelihood, for selecting the embedding dimension for a predictively optimal data-driven model of a stochastic dynamical system. We develop a nonparametric estimator for the negative log-predictive likelihood and compare its performance to a recently proposed criterion based on active information storage. Finally, we show how the output of the model selection procedure can be used to compare candidate predictors for a stochastic system to an information-theoretic lower bound.
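
    A rough numpy sketch of the criterion itself (not the authors' nonparametric estimator): a k-nearest-neighbour Gaussian plug-in predictor scores candidate embedding dimensions by out-of-sample negative log-predictive likelihood. All function names and parameter values below are invented for illustration.

```python
import numpy as np

def delay_embed(x, p):
    """Rows are length-p histories (x[t], ..., x[t+p-1])."""
    n = len(x)
    return np.column_stack([x[i:n - p + i] for i in range(p)])

def neg_log_pred_lik(x, p, k=15, train_frac=0.7):
    """Out-of-sample negative log-predictive likelihood of a k-NN
    Gaussian plug-in predictor built on p-dimensional delay vectors."""
    H, y = delay_embed(x, p), x[p:]              # histories and their successors
    n_tr = int(train_frac * len(y))
    Htr, ytr, Hte, yte = H[:n_tr], y[:n_tr], H[n_tr:], y[n_tr:]
    nll = 0.0
    for h, target in zip(Hte, yte):
        d = np.linalg.norm(Htr - h, axis=1)
        succ = ytr[np.argsort(d)[:k]]            # successors of the k nearest histories
        mu, sig = succ.mean(), max(succ.std(), 1e-3)
        nll += 0.5 * np.log(2 * np.pi * sig**2) + (target - mu) ** 2 / (2 * sig**2)
    return nll / len(yte)

# Noisy logistic map; the predictively optimal dimension should score lowest.
rng = np.random.default_rng(0)
x = np.empty(2000)
x[0] = 0.4
for t in range(1999):
    x[t + 1] = np.clip(3.9 * x[t] * (1 - x[t]) + 0.01 * rng.standard_normal(), 0.0, 1.0)
scores = {p: neg_log_pred_lik(x, p) for p in range(1, 6)}
print(min(scores, key=scores.get), scores)
```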

  12. Development and initial validation of the Impression Motivation in Sport Questionnaire-Team.

    PubMed

    Payne, Simon Mark; Hudson, Joanne; Akehurst, Sally; Ntoumanis, Nikos

    2013-06-01

    Impression motivation is an important individual difference variable that has been under-researched in sport psychology, partly due to having no appropriate measure. This study was conducted to design a measure of impression motivation in team-sport athletes. Construct validity checks decreased the initial pool of items, factor analysis (n = 310) revealed the structure of the newly developed scale, and exploratory structural equation modeling procedures (n = 406) resulted in a modified scale that retained theoretical integrity and psychometric parsimony. This process produced a 15-item, 4-factor model; the Impression Motivation in Sport Questionnaire-Team (IMSQ-T) is forwarded as a valid measure of the respondent's dispositional strength of motivation to use self-presentation in striving for four distinct interpersonal objectives: self-development, social identity development, avoidance of negative outcomes, and avoidance of damaging impressions. The availability of this measure has contributed to theoretical development, will facilitate research, and offers a tool for use in applied settings.

  13. Bayesian cross-entropy methodology for optimal design of validation experiments

    NASA Astrophysics Data System (ADS)

    Jiang, X.; Mahadevan, S.

    2006-07-01

    An important concern in the design of validation experiments is how to incorporate the mathematical model in the design in order to allow conclusive comparisons of model prediction with experimental output in model assessment. The classical experimental design methods are more suitable for phenomena discovery and may result in a subjective, expensive, time-consuming and ineffective design that may adversely impact these comparisons. In this paper, an integrated Bayesian cross-entropy methodology is proposed to perform the optimal design of validation experiments incorporating the computational model. The expected cross entropy, an information-theoretic distance between the distributions of model prediction and experimental observation, is defined as a utility function to measure the similarity of two distributions. A simulated annealing algorithm is used to find optimal values of input variables through minimizing or maximizing the expected cross entropy. The measured data after testing with the optimum input values are used to update the distribution of the experimental output using Bayes theorem. The procedure is repeated to adaptively design the required number of experiments for model assessment, each time ensuring that the experiment provides effective comparison for validation. The methodology is illustrated for the optimal design of validation experiments for a three-leg bolted joint structure and a composite helicopter rotor hub component.
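
    A toy one-dimensional sketch of the idea, assuming Gaussian distributions for the model prediction and the anticipated observation; the functions model and expmt, the noise levels, and the annealing schedule are all invented for illustration, and here the expected cross entropy is maximized to seek the most discriminating test input.

```python
import math, random

def gauss_cross_entropy(mu_p, s_p, mu_q, s_q):
    """Closed-form H(p, q) = -E_p[log q] for two Gaussians."""
    return 0.5 * math.log(2 * math.pi * s_q**2) \
        + (s_p**2 + (mu_p - mu_q)**2) / (2 * s_q**2)

model = lambda x: math.sin(x)             # computational model prediction (invented)
expmt = lambda x: math.sin(x) + 0.3 * x   # prior belief about the experiment (invented)

def utility(x):
    # information-theoretic distance between predicted and anticipated observations
    return gauss_cross_entropy(model(x), 0.1, expmt(x), 0.2)

# Simulated annealing over the design variable x in [0, 3].
random.seed(0)
x, T = 1.5, 1.0
best = (utility(x), x)
for _ in range(2000):
    cand = min(3.0, max(0.0, x + random.gauss(0, 0.2)))
    dU = utility(cand) - utility(x)
    if dU > 0 or random.random() < math.exp(dU / T):
        x = cand
    best = max(best, (utility(x), x))
    T *= 0.995
print(f"most informative test input: x = {best[1]:.2f}")
```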

  14. A new framework to enhance the interpretation of external validation studies of clinical prediction models.

    PubMed

    Debray, Thomas P A; Vergouwe, Yvonne; Koffijberg, Hendrik; Nieboer, Daan; Steyerberg, Ewout W; Moons, Karel G M

    2015-03-01

    It is widely acknowledged that the performance of diagnostic and prognostic prediction models should be assessed in external validation studies with independent data from "different but related" samples as compared with that of the development sample. We developed a framework of methodological steps and statistical methods for analyzing and enhancing the interpretation of results from external validation studies of prediction models. We propose to quantify the degree of relatedness between development and validation samples on a scale ranging from reproducibility to transportability by evaluating their corresponding case-mix differences. We subsequently assess the models' performance in the validation sample and interpret the performance in view of the case-mix differences. Finally, we may adjust the model to the validation setting. We illustrate this three-step framework with a prediction model for diagnosing deep venous thrombosis using three validation samples with varying case mix. While one external validation sample merely assessed the model's reproducibility, two other samples rather assessed model transportability. The performance in all validation samples was adequate, and the model did not require extensive updating to correct for miscalibration or poor fit to the validation settings. The proposed framework enhances the interpretation of findings at external validation of prediction models. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
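
    One way to operationalise the relatedness step is a "membership model" that tries to distinguish development from validation participants on case mix alone: a c-statistic near 0.5 suggests a reproducibility-type setting, while higher values point toward transportability. A sketch with synthetic case-mix data (the shift and scale values are invented):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(3)
dev = rng.normal(0.0, 1.0, size=(500, 4))   # development-sample case mix (synthetic)
val = rng.normal(0.4, 1.2, size=(300, 4))   # validation-sample case mix, shifted
X = np.vstack([dev, val])
y = np.r_[np.zeros(len(dev)), np.ones(len(val))]

# "Membership model": how well does case mix alone tell the samples apart?
prob = LogisticRegression(max_iter=1000).fit(X, y).predict_proba(X)[:, 1]
print(f"membership c-statistic: {roc_auc_score(y, prob):.2f}"
      "  (~0.5 suggests reproducibility, higher suggests transportability)")
```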

  15. A theoretical model of speed-dependent steering torque for rolling tyres

    NASA Astrophysics Data System (ADS)

    Wei, Yintao; Oertel, Christian; Liu, Yahui; Li, Xuebing

    2016-04-01

    It is well known that tyre steering torque is highly dependent on tyre rolling speed. In the limit case of a parking manoeuvre, the steering torque approaches its maximum; with increasing tyre speed, the steering torque decreases rapidly. Accurate modelling of this speed-dependent behaviour of the tyre steering torque is a key factor in calibrating the electric power steering (EPS) system and tuning the handling performance of vehicles. However, no satisfactory theoretical model can be found in the existing literature to explain this phenomenon. This paper proposes a new theoretical framework to model this important tyre behaviour, which includes three key factors: (1) three-dimensional transient tyre rolling kinematics with turn-slip; (2) dynamical force and moment generation; and (3) a mixed Lagrange-Euler method for solving the contact deformation. A nonlinear finite-element code has been developed to implement the proposed approach. The main mechanism behind the speed-dependent steering torque is found to be the turn-slip-related kinematics. This paper provides a theory to explain the complex mechanism of tyre steering torque generation, which helps in understanding the speed-dependent tyre steering torque, tyre road feeling and EPS calibration.

  16. MRI-based modeling for radiocarpal joint mechanics: validation criteria and results for four specimen-specific models.

    PubMed

    Fischer, Kenneth J; Johnson, Joshua E; Waller, Alexander J; McIff, Terence E; Toby, E Bruce; Bilgen, Mehmet

    2011-10-01

    The objective of this study was to validate the MRI-based joint contact modeling methodology in the radiocarpal joints by comparison of model results with invasive specimen-specific radiocarpal contact measurements from four cadaver experiments. We used a single validation criterion for multiple outcome measures to characterize the utility and overall validity of the modeling approach. For each experiment, a Pressurex film and a Tekscan sensor were sequentially placed into the radiocarpal joints during simulated grasp. Computer models were constructed based on MRI visualization of the cadaver specimens without load. Images were also acquired during the loaded configuration used with the direct experimental measurements. Geometric surface models of the radius, scaphoid and lunate (including cartilage) were constructed from the images acquired without the load. The carpal bone motions from the unloaded state to the loaded state were determined using a series of 3D image registrations. Cartilage thickness was assumed uniform at 1.0 mm with an effective compressive modulus of 4 MPa. Validation was based on experimental versus model contact area, contact force, average contact pressure and peak contact pressure for the radioscaphoid and radiolunate articulations. Contact area was also measured directly from images acquired under load and compared to the experimental and model data. Qualitatively, there was good correspondence between the MRI-based model data and experimental data, with consistent relative size, shape and location of radioscaphoid and radiolunate contact regions. Quantitative data from the model generally compared well with the experimental data for all specimens. Contact area from the MRI-based model was very similar to the contact area measured directly from the images. For all outcome measures except average and peak pressures, at least two specimen models met the validation criteria with respect to experimental measurements for both articulations

  17. Sound transmission through lightweight double-leaf partitions: theoretical modelling

    NASA Astrophysics Data System (ADS)

    Wang, J.; Lu, T. J.; Woodhouse, J.; Langley, R. S.; Evans, J.

    2005-09-01

    This paper presents theoretical modelling of the sound transmission loss through double-leaf lightweight partitions stiffened with periodically placed studs. First, by assuming that the effect of the studs can be replaced with elastic springs uniformly distributed between the sheathing panels, a simple smeared model is established. Second, periodic structure theory is used to develop a more accurate model taking account of the discrete placing of the studs. Both models treat incident sound waves in the horizontal plane only, for simplicity. The predictions of the two models are compared, to reveal the physical mechanisms determining sound transmission. The smeared model predicts relatively simple behaviour, in which the only conspicuous features are associated with coincidence effects with the two types of structural wave allowed by the partition model, and internal resonances of the air between the panels. In the periodic model, many more features are evident, associated with the structure of pass- and stop-bands for structural waves in the partition. The models are used to explain the effects of incidence angle and of the various system parameters. The predictions are compared with existing test data for steel plates with wooden stiffeners, and good agreement is obtained.

  18. Exploring patient satisfaction predictors in relation to a theoretical model.

    PubMed

    Grøndahl, Vigdis Abrahamsen; Hall-Lord, Marie Louise; Karlsson, Ingela; Appelgren, Jari; Wilde-Larsson, Bodil

    2013-01-01

    The aim is to describe patients' care quality perceptions and satisfaction and to explore potential patient satisfaction predictors (person-related conditions, external objective care conditions, and patients' perception of actual care received, "PR") in relation to a theoretical model. A cross-sectional design was used. Data were collected using one questionnaire combining questions from four instruments: Quality from patients' perspective; Sense of coherence; Big five personality traits; and Emotional stress reaction questionnaire (ESRQ), together with questions from previous research. In total, 528 patients (83.7 per cent response rate) from eight medical, three surgical and one medical/surgical ward in five Norwegian hospitals participated. Answers from 373 respondents with complete ESRQ questionnaires were analysed. Sequential multiple regression analysis with ESRQ as the dependent variable was run in three steps: person-related conditions, external objective care conditions, and PR (p < 0.05). Step 1 (person-related conditions) explained 51.7 per cent of the ESRQ variance. Step 2 (external objective care conditions) explained an additional 2.4 per cent. Step 3 (PR) gave no significant additional explanation (0.05 per cent). Steps 1 and 2 contributed statistically significantly to the model. Patients rated both quality of care and satisfaction highly. The paper shows that the theoretical model, using an emotion-oriented approach to assess patient satisfaction, can explain 54 per cent of patient satisfaction in a statistically significant manner.
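
    The three-step sequential (hierarchical) regression reported above can be reproduced in outline with statsmodels; the synthetic predictor blocks below merely stand in for the person-related, external, and PR variables, and every coefficient is invented.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 373
person   = rng.normal(size=(n, 3))   # stand-ins for sense of coherence, traits, ...
external = rng.normal(size=(n, 2))   # stand-ins for ward type, hospital, ...
pr       = rng.normal(size=(n, 1))   # stand-in for perceived actual care received
y = person @ [0.8, 0.5, 0.3] + external @ [0.2, 0.1] + rng.normal(size=n)

X, r2_prev = None, 0.0
for step, block in enumerate([person, external, pr], start=1):
    X = block if X is None else np.hstack([X, block])
    fit = sm.OLS(y, sm.add_constant(X)).fit()
    print(f"step {step}: R2 = {fit.rsquared:.3f} (delta R2 = {fit.rsquared - r2_prev:.3f})")
    r2_prev = fit.rsquared
```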

  19. Some guidance on preparing validation plans for the DART Full System Models.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gray, Genetha Anne; Hough, Patricia Diane; Hills, Richard Guy

    2009-03-01

    Planning is an important part of computational model verification and validation (V&V), and the requisite planning document is vital for effectively executing the plan. The document provides a means of communicating intent to the typically large group of people, from program management to analysts to test engineers, who must work together to complete the validation activities. This report provides guidelines for writing a validation plan. It describes the components of such a plan and includes important references and resources. While the initial target audience is the DART Full System Model teams in the nuclear weapons program, the guidelines are generally applicable to other modeling efforts. Our goal in writing this document is to provide a framework for consistency in validation plans across weapon systems, different types of models, and different scenarios. Specific details contained in any given validation plan will vary according to application requirements and available resources.

  20. Validation of an Evaluation Model for Learning Management Systems

    ERIC Educational Resources Information Center

    Kim, S. W.; Lee, M. G.

    2008-01-01

    This study aims to validate a model for evaluating learning management systems (LMS) used in e-learning fields. A survey of 163 e-learning experts, regarding 81 validation items developed through literature review, was used to ascertain the importance of the criteria. A concise list of explanatory constructs, including two principal factors, was…

  1. [Self-Determination in Medical Rehabilitation - Development of a Conceptual Model for Further Theoretical Discussion].

    PubMed

    Senin, Tatjana; Meyer, Thorsten

    2018-01-22

    The aim was to gather theoretical knowledge about self-determination and to develop a conceptual model for medical rehabilitation, which serves as a basis for discussion. We performed a literature search in electronic databases. Various theories and research results were adopted and transferred to the context of medical rehabilitation and into a conceptual model. The conceptual model of self-determination depicts, on a continuum, which forms of self-determination may be present in situations of medical rehabilitation treatment. The location on the continuum depends theoretically on the manifestation of certain internal and external factors that may influence each other. The model provides a first conceptualization of self-determination focused on medical rehabilitation, which should be further refined and tested empirically. © Georg Thieme Verlag KG Stuttgart · New York.

  2. A comparative study of a theoretical neural net model with MEG data from epileptic patients and normal individuals.

    PubMed

    Kotini, A; Anninos, P; Anastasiadis, A N; Tamiolakis, D

    2005-09-07

    The aim of this study was to compare a theoretical neural net model with MEG data from epileptic patients and normal individuals. Our experimental study population included 10 epilepsy sufferers and 10 healthy subjects. The recordings were obtained with a one-channel biomagnetometer SQUID in a magnetically shielded room. Using the method of χ2-fitting, it was found that the MEG amplitudes in epileptic patients and normal subjects had Poisson and Gauss distributions, respectively. The Poisson connectivity derived from the theoretical neural model represents the state of epilepsy, whereas the Gauss connectivity represents normal behavior. The MEG data obtained from epileptic areas had higher amplitudes than the MEG from normal regions and were comparable with the theoretical magnetic fields from Poisson and Gauss distributions. Furthermore, the magnetic field derived from the theoretical model had amplitudes of the same order as the recorded MEG from the 20 participants. The approximation of the theoretical neural net model to real MEG data provides information about the structure of brain function in epileptic and normal states, encouraging further studies.

  3. Validation of the Intrinsic Spirituality Scale (ISS) with Muslims.

    PubMed

    Hodge, David R; Zidan, Tarek; Husain, Altaf

    2015-12-01

    This study validates an existing spirituality measure, the intrinsic spirituality scale (ISS), for use with Muslims in the United States. A confirmatory factor analysis was conducted with a diverse sample of self-identified Muslims (N = 281). Factorial validity and reliability were assessed, along with criterion and concurrent validity. The measurement model fit the data well: normed χ2 = 2.50, CFI = 0.99, RMSEA = 0.07, and SRMR = 0.02. All 6 items that comprise the ISS demonstrated satisfactory levels of validity (λ > .70) and reliability (R2 > .50). The Cronbach's alpha obtained with the present sample was .93. Appropriate correlations with theoretically linked constructs demonstrated criterion and concurrent validity. The results suggest the ISS is a valid measure of spirituality in clinical settings with the rapidly growing Muslim population. The ISS may, for instance, provide an efficient screening tool to identify Muslims who are particularly likely to benefit from spiritually accommodative treatments. (c) 2015 APA, all rights reserved.

  4. Molprobity's ultimate rotamer-library distributions for model validation.

    PubMed

    Hintze, Bradley J; Lewis, Steven M; Richardson, Jane S; Richardson, David C

    2016-09-01

    Here we describe the updated MolProbity rotamer-library distributions derived from an order-of-magnitude larger and more stringently quality-filtered dataset of about 8000 (vs. 500) protein chains, and we explain the resulting changes and improvements to model validation as seen by users. To include only side-chains with satisfactory justification for their given conformation, we added residue-specific filters for electron-density value and model-to-density fit. The combined new protocol retains a million residues of data, while cleaning up false-positive noise in the multi-χ datapoint distributions. It enables unambiguous characterization of conformational clusters nearly 1000-fold less frequent than the most common ones. We describe examples of local interactions that favor these rare conformations, including the role of authentic covalent bond-angle deviations in enabling presumably strained side-chain conformations. Further, along with favored and outlier, an allowed category (0.3-2.0% occurrence in reference data) has been added, analogous to Ramachandran validation categories. The new rotamer distributions are used for current rotamer validation in MolProbity and PHENIX, and for rotamer choice in PHENIX model-building and refinement. The multi-dimensional χ distributions and Top8000 reference dataset are freely available on GitHub. These rotamers are termed "ultimate" because data sampling and quality are now fully adequate for this task, and also because we believe the future of conformational validation should integrate side-chain with backbone criteria. Proteins 2016; 84:1177-1189. © 2016 Wiley Periodicals, Inc.

  5. FDA 2011 process validation guidance: lifecycle compliance model.

    PubMed

    Campbell, Cliff

    2014-01-01

    This article has been written as a contribution to the industry's efforts in migrating from a document-driven to a data-driven compliance mindset. A combination of target product profile, control engineering, and general sum principle techniques is presented as the basis of a simple but scalable lifecycle compliance model in support of modernized process validation. Unit operations and significant variables occupy pole position within the model, documentation requirements being treated as a derivative or consequence of the modeling process. The quality system is repositioned as a subordinate of system quality, this being defined as the integral of related "system qualities". The article represents a structured interpretation of the U.S. Food and Drug Administration's 2011 Guidance for Industry on Process Validation and is based on the author's educational background and his manufacturing/consulting experience in the validation field. The U.S. Food and Drug Administration's Guidance for Industry on Process Validation (2011) provides a wide-ranging and rigorous outline of compliant drug manufacturing requirements relative to its 20th-century predecessor (1987). Its declared focus is patient safety, and it identifies three inter-related (and obvious) stages of the compliance lifecycle. Firstly, processes must be designed, both from a technical and quality perspective. Secondly, processes must be qualified, providing evidence that the manufacturing facility is fully "roadworthy" and fit for its intended purpose. Thirdly, processes must be verified, meaning that commercial batches must be monitored to ensure that processes remain in a state of control throughout their lifetime.

  6. Institutional Effectiveness: A Model for Planning, Assessment & Validation.

    ERIC Educational Resources Information Center

    Truckee Meadows Community Coll., Sparks, NV.

    The report presents Truckee Meadows Community College's (Nevada) model for assessing institutional effectiveness and validating the College's mission and vision, and the strategic plan for carrying out the institutional effectiveness model. It also outlines strategic goals for the years 1999-2001. From the system-wide directive that education…

  7. Validation of theoretical pathway between discrimination, diabetes self-care and glycemic control.

    PubMed

    Dawson, Aprill Z; Walker, Rebekah J; Campbell, Jennifer A; Egede, Leonard E

    2016-07-01

    This study examined the mechanisms through which discrimination influences diabetes self-care and glycemic control in patients with diabetes by using structural equation modeling. 615 patients were recruited from two adult primary care clinics in the southeastern United States. Measures were based on a theoretical model and included perceived discrimination, social support, social cohesion, and perceived stress. Structural equation modeling examined the relationship with diabetes self-care and glycemic control. The final model (χ2(211) = 328.82, p < 0.0001, R2 = 0.99, RMSEA = 0.03, and CFI = 0.98) shows that higher stress is directly and significantly related to decreased self-care (r = -0.59, p < 0.001) and increased HbA1c (r = 0.27, p < 0.05). There was no significant direct association between discrimination, social support or social cohesion, and glycemic control or self-care. There was, however, a direct significant association between increased discrimination (r = 0.46, p < 0.001), decreased social support (r = -0.34, p < 0.001), increased social cohesion (r = 0.14, p < 0.05) and increased stress. These results support the hypothesized pathway from discrimination to health outcomes, showing both a direct and an indirect influence, through stress, on HbA1c in adults with diabetes. Understanding the pathways through which discrimination influences diabetes outcomes is important for providing more comprehensive and effective care. These results suggest future interventions targeting patients with diabetes should take discrimination-induced stress into account. Copyright © 2016 Elsevier Inc. All rights reserved.

  8. An experimental and theoretical analysis of a foil-air bearing rotor system

    NASA Astrophysics Data System (ADS)

    Bonello, P.; Hassan, M. F. Bin

    2018-01-01

    Although there is considerable research on the experimental testing of foil-air bearing (FAB) rotor systems, only a small fraction has been correlated with simulations from a full nonlinear model that links the rotor, air film and foil domains, due to modelling complexity and computational burden. An approach for the simultaneous solution of the three domains as a coupled dynamical system, introduced by the first author and adopted by independent researchers, has recently demonstrated its capability to address this problem. This paper uses this approach, with further developments, in an experimental and theoretical study of a FAB-rotor test rig. The test rig is described in detail, including issues with its commissioning. The theoretical analysis uses a recently introduced modal-based bump foil model that accounts for interaction between the bumps and their inertia. The imposition of pressure constraints on the air film is found to delay the predicted onset of instability. The results lend experimental validation to a recent theoretically-based claim that the Gümbel condition may not be appropriate for a practical single-pad FAB. The satisfactory prediction of the salient features of the measured nonlinear behavior shows that the air film is indeed highly influential on the response, in contrast to an earlier finding.

  9. How Career Variety Promotes the Adaptability of Managers: A Theoretical Model

    ERIC Educational Resources Information Center

    Karaevli, Ayse; Tim Hall, Douglas T.

    2006-01-01

    This paper presents a theoretical model showing how managerial adaptability develops from career variety over the span of the person's career. By building on the literature of career theory, adult learning and development, and career adjustment, we offer a new conceptualization of managerial adaptability by identifying its behavioral, cognitive,…

  10. Models of the Bilingual Lexicon and Their Theoretical Implications for CLIL

    ERIC Educational Resources Information Center

    Heine, Lena

    2014-01-01

    Although many advances have been made in recent years concerning the theoretical dimensions of content and language integrated learning (CLIL), research still has to meet the necessity to come up with integrative models that adequately map the interrelation between content and language learning in CLIL contexts. This article will suggest that…

  11. Physics of human cooperation: experimental evidence and theoretical models

    NASA Astrophysics Data System (ADS)

    Sánchez, Angel

    2018-02-01

    In recent years, many physicists have used evolutionary game theory combined with a complex systems perspective in an attempt to understand social phenomena and challenges. Prominent among such phenomena is the issue of the emergence and sustainability of cooperation in a networked world of selfish or self-focused individuals. The vast majority of research done by physicists on these questions is theoretical, and is almost always posed in terms of agent-based models. Unfortunately, more often than not such models ignore a number of facts that are well established experimentally, and are thus rendered irrelevant to actual social applications. I here summarize some of the facts that any realistic model should incorporate and take into account, discuss important aspects underlying the relation between theory and experiments, and discuss future directions for research based on the available experimental knowledge.

  12. A Game-Theoretic Approach to Branching Time Abstract-Check-Refine Process

    NASA Technical Reports Server (NTRS)

    Wang, Yi; Tamai, Tetsuo

    2009-01-01

    As the complexity of software systems continues to grow, engineers face two serious problems: the state-space explosion problem and the problem of how to debug systems. In this paper, we propose a game-theoretic approach to full branching-time model checking on three-valued semantics. Three-valued models and logics provide successful abstraction that overcomes the state-space explosion problem. Game-style model checking generates counter-examples that can guide refinement or identify validated formulas, which addresses the system debugging problem. Furthermore, the output of our game-style method gives engineers significant information for detecting where errors have occurred and what their causes are.

  13. Validating a Technology Enhanced Student-Centered Learning Model

    ERIC Educational Resources Information Center

    Kang, Myunghee; Hahn, Jungsun; Chung, Warren

    2015-01-01

    The Technology Enhanced Student Centered Learning (TESCL) Model in this study presents the core factors that ensure the quality of learning in a technology-supported environment. Although the model was conceptually constructed using a student-centered learning framework and drawing upon previous studies, it should be validated through real-world…

  14. Theoretical framework and methodological development of common subjective health outcome measures in osteoarthritis: a critical review

    PubMed Central

    Pollard, Beth; Johnston, Marie; Dixon, Diane

    2007-01-01

    Subjective measures involving clinician ratings or patient self-assessments have become recognised as an important tool for the assessment of health outcome. The value of a health outcome measure is usually assessed by a psychometric evaluation of its reliability, validity and responsiveness. However, psychometric testing involves an accumulation of evidence and has recognised limitations. It has been suggested that an evaluation of how well a measure has been developed would be a useful additional criterion in assessing the value of a measure. This paper explored the theoretical background and methodological development of subjective health status measures commonly used in osteoarthritis research. Fourteen subjective health outcome measures commonly used in osteoarthritis research were examined. Each measure was explored on the basis of its i) theoretical framework (was there a definition of what was being assessed and was it part of a theoretical model?) and ii) methodological development (what was the scaling strategy, how were the items generated and reduced, what was the response format and what was the scoring method?). Only the AIMS, SF-36 and WHOQOL defined what they were assessing (i.e. the construct of interest) and no measure assessed was part of a theoretical model. None of the clinician-report measures appeared to have implemented a scaling procedure or described the rationale for the items selected or the scoring system. Of the patient self-report measures, the AIMS, MPQ, OXFORD, SF-36, WHOQOL and WOMAC appeared to follow a standard psychometric scaling method. The DRP and EuroQol used alternative scaling methods. The review highlighted the general lack of a theoretical framework for both clinician-report and patient self-report measures. This review also drew attention to the wide variation in the methodological development of commonly used measures in OA. While, in general, the patient self-report measures had good methodological development, the

  15. Perception of competence in middle school physical education: instrument development and validation.

    PubMed

    Scrabis-Fletcher, Kristin; Silverman, Stephen

    2010-03-01

    Perception of competence (POC) has been studied extensively in physical activity (PA) research, with similar instruments adapted for physical education (PE) research. Such instruments do not account for the unique PE learning environment. Therefore, an instrument was developed and its scores validated to measure POC in middle school PE. A multiphase design was used, consisting of an intensive theoretical review, elicitation study, prepilot study, pilot study, content validation study, and final validation study (N = 1281). Data analysis included a multistep iterative process to identify the best model fit. A three-factor model for POC was tested and resulted in root mean square error of approximation = .09, root mean square residual = .07, goodness-of-fit index = .90, and adjusted goodness-of-fit index = .86, values in the acceptable range (Hu & Bentler, 1999). A two-factor model was also tested and resulted in a good fit (two-factor fit index values = .05, .03, .98, .97, respectively). The results of this study suggest that an instrument using a three- or two-factor model provides reliable and valid scores for POC measurement in middle school PE.

  16. Finite Element Model Development and Validation for Aircraft Fuselage Structures

    NASA Technical Reports Server (NTRS)

    Buehrle, Ralph D.; Fleming, Gary A.; Pappa, Richard S.; Grosveld, Ferdinand W.

    2000-01-01

    The ability to extend the valid frequency range for finite element based structural dynamic predictions using detailed models of the structural components and attachment interfaces is examined for several stiffened aircraft fuselage structures. This extended dynamic prediction capability is needed for the integration of mid-frequency noise control technology. Beam, plate and solid element models of the stiffener components are evaluated. Attachment models between the stiffener and panel skin range from a line along the rivets of the physical structure to a constraint over the entire contact surface. The finite element models are validated using experimental modal analysis results. The increased frequency range results in a corresponding increase in the number of modes, modal density and spatial resolution requirements. In this study, conventional modal tests using accelerometers are complemented with Scanning Laser Doppler Velocimetry and Electro-Optic Holography measurements to further resolve the spatial response characteristics. Whenever possible, component and subassembly modal tests are used to validate the finite element models at lower levels of assembly. Normal mode predictions for different finite element representations of components and assemblies are compared with experimental results to assess the most accurate techniques for modeling aircraft fuselage type structures.

  17. A new method for assessing content validity in model-based creation and iteration of eHealth interventions.

    PubMed

    Kassam-Adams, Nancy; Marsac, Meghan L; Kohser, Kristen L; Kenardy, Justin A; March, Sonja; Winston, Flaura K

    2015-04-15

    The advent of eHealth interventions to address psychological concerns and health behaviors has created new opportunities, including the ability to optimize the effectiveness of intervention activities and then deliver these activities consistently to a large number of individuals in need. Given that eHealth interventions grounded in a well-delineated theoretical model for change are more likely to be effective, and that eHealth interventions can be costly to develop, assuring the match of final intervention content and activities to the underlying model is a key step. We propose to apply the concept of "content validity" as a crucial checkpoint to evaluate the extent to which proposed intervention activities in an eHealth intervention program are valid (eg, relevant and likely to be effective) for the specific mechanism of change that each is intended to target and the intended target population for the intervention. The aims of this paper are to define content validity as it applies to model-based eHealth intervention development, to present a feasible method for assessing content validity in this context, and to describe the implementation of this new method during the development of a Web-based intervention for children. We designed a practical 5-step method for assessing content validity in eHealth interventions: (1) define key intervention targets; (2) delineate intervention activity-target pairings; (3) identify experts; (4) use a survey tool to gather expert ratings of the relevance of each activity to its intended target, its likely effectiveness in achieving that target, and its appropriateness with a specific intended audience; and (5) use quantitative and qualitative results to identify intervention activities that may need modification. We applied this method during our development of the Coping Coach Web-based intervention for school-age children. In the evaluation of Coping Coach content validity, 15 experts from five countries
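
    One common way to turn such expert relevance ratings into a decision rule (a swapped-in standard technique, not necessarily the scoring used for Coping Coach) is the item-level content validity index: the proportion of experts rating an activity 3 or 4 on a 4-point relevance scale. All ratings below are synthetic.

```python
import numpy as np

# Rows: candidate intervention activities; columns: expert raters.
# Ratings on a 1-4 relevance scale (synthetic example values).
ratings = np.array([
    [4, 3, 4, 4, 3],
    [2, 3, 2, 1, 2],
    [4, 4, 3, 4, 4],
])
i_cvi = (ratings >= 3).mean(axis=1)   # item-level content validity index
flagged = np.where(i_cvi < 0.78)[0]   # commonly cited cutoff; depends on panel size
print("I-CVI per activity:", i_cvi, "| activities to revise:", flagged)
```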

  18. Effect of differentiation of self on adolescent risk behavior: test of the theoretical model.

    PubMed

    Knauth, Donna G; Skowron, Elizabeth A; Escobar, Melicia

    2006-01-01

    Innovative theoretical models are needed to explain the occurrence of high-risk sexual behaviors, alcohol and other-drug (AOD) use, and academic engagement among ethnically diverse, inner-city adolescents. The aim of this study was to test the credibility of a theoretical model based on the Bowen family systems theory to explain adolescent risk behavior. Specifically tested was the relationship between the predictor variables of differentiation of self, chronic anxiety, and social problem solving and the dependent variables of high-risk sexual behaviors, AOD use, and academic engagement. An ex post facto cross-sectional design was used to test the usefulness of the theoretical model. Data were collected from 161 racially/ethnically diverse, inner-city high school students, 14 to 19 years of age. Participants completed self-report written questionnaires, including the Differentiation of Self Inventory, State-Trait Anxiety Inventory, Social Problem Solving for Adolescents, Drug Involvement Scale for Adolescents, and the Sexual Behavior Questionnaire. Consistent with the model, higher levels of differentiation of self related to lower levels of chronic anxiety (p < .001) and higher levels of social problem solving (p < .01). Higher chronic anxiety was related to lower social problem solving (p < .001). A test of mediation showed that chronic anxiety mediates the relationship between differentiation of self and social problem solving (p < .001), indicating that differentiation influences social problem solving through chronic anxiety. Higher levels of social problem solving were related to less drug use (p < .05), less high-risk sexual behaviors (p < .01), and an increase in academic engagement (p < .01). Findings support the theoretical model's credibility and provide evidence that differentiation of self is an important cognitive factor that enables adolescents to manage chronic anxiety and motivates them to use effective problem solving, resulting in less
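
    The mediation claim (differentiation acting on social problem solving through chronic anxiety) can be illustrated with a product-of-coefficients (Sobel) test on synthetic data; the paper does not specify this exact test, and every coefficient below is invented.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 161
dsi = rng.normal(size=n)                                  # differentiation of self
anxiety = -0.6 * dsi + rng.normal(size=n)                 # mediator: chronic anxiety
sps = -0.5 * anxiety + 0.05 * dsi + rng.normal(size=n)    # outcome: problem solving

a_fit = sm.OLS(anxiety, sm.add_constant(dsi)).fit()                          # path a
b_fit = sm.OLS(sps, sm.add_constant(np.column_stack([anxiety, dsi]))).fit()  # path b
a, b = a_fit.params[1], b_fit.params[1]
se_a, se_b = a_fit.bse[1], b_fit.bse[1]
sobel_z = a * b / np.sqrt(b**2 * se_a**2 + a**2 * se_b**2)
print(f"indirect effect a*b = {a * b:.3f}, Sobel z = {sobel_z:.2f}")
```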

  19. Finite Element Model and Validation of Nasal Tip Deformation

    PubMed Central

    Manuel, Cyrus T; Harb, Rani; Badran, Alan; Ho, David; Wong, Brian JF

    2016-01-01

    Nasal tip mechanical stability is important for functional and cosmetic nasal airway surgery. Palpation of the nasal tip provides information on tip strength to the surgeon, though it is a purely subjective assessment. Providing a means to simulate nasal tip deformation with a validated model can offer a more objective approach to understanding the mechanics and nuances of nasal tip support and eventual nasal mechanics as a whole. Herein we present validation of a finite element (FE) model of the nose using physical measurements recorded using an ABS plastic-silicone nasal phantom. Three-dimensional photogrammetry was used to capture the geometry of the phantom at rest and while under steady-state load. The silicone used to make the phantom was mechanically tested and characterized using a linear elastic constitutive model. Surface point clouds of the silicone and FE model were compared for both the loaded and unloaded states. The average Hausdorff distance between actual measurements and FE simulations across the nose was 0.39 mm ± 1.04 mm, and deviated up to 2 mm at the outermost boundaries of the model. FE simulation and measurements were in near complete agreement in the immediate vicinity of the nasal tip, with millimeter accuracy. We have demonstrated validation of a two-component nasal FE model, which could be used to model more complex modes of deformation where direct measurement may be challenging. This is the first step in developing a nasal model to simulate nasal mechanics and ultimately the interaction between geometry and airflow. PMID:27633018

  20. Finite Element Model and Validation of Nasal Tip Deformation.

    PubMed

    Manuel, Cyrus T; Harb, Rani; Badran, Alan; Ho, David; Wong, Brian J F

    2017-03-01

    Nasal tip mechanical stability is important for functional and cosmetic nasal airway surgery. Palpation of the nasal tip provides information on tip strength to the surgeon, though it is a purely subjective assessment. Providing a means to simulate nasal tip deformation with a validated model can offer a more objective approach to understanding the mechanics and nuances of nasal tip support and eventual nasal mechanics as a whole. Herein we present validation of a finite element (FE) model of the nose using physical measurements recorded using an ABS plastic-silicone nasal phantom. Three-dimensional photogrammetry was used to capture the geometry of the phantom at rest and while under steady-state load. The silicone used to make the phantom was mechanically tested and characterized using a linear elastic constitutive model. Surface point clouds of the silicone and FE model were compared for both the loaded and unloaded states. The average Hausdorff distance between actual measurements and FE simulations across the nose was 0.39 ± 1.04 mm, and deviated up to 2 mm at the outermost boundaries of the model. FE simulation and measurements were in near complete agreement in the immediate vicinity of the nasal tip, with millimeter accuracy. We have demonstrated validation of a two-component nasal FE model, which could be used to model more complex modes of deformation where direct measurement may be challenging. This is the first step in developing a nasal model to simulate nasal mechanics and ultimately the interaction between geometry and airflow.
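
    Point-cloud comparisons like the one above are easy to reproduce in outline with scipy; the clouds below are synthetic stand-ins, and note that the paper reports an average deviation, whereas the classical Hausdorff distance is a worst-case measure.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.spatial.distance import directed_hausdorff

def symmetric_hausdorff(a, b):
    """Symmetric Hausdorff distance between two point clouds (n x 3)."""
    return max(directed_hausdorff(a, b)[0], directed_hausdorff(b, a)[0])

# Synthetic stand-ins (in mm) for the measured and simulated surface clouds.
rng = np.random.default_rng(0)
measured = rng.uniform(0.0, 50.0, size=(5000, 3))
simulated = measured + rng.normal(scale=0.4, size=(5000, 3))

print(f"worst-case (Hausdorff) deviation: {symmetric_hausdorff(measured, simulated):.2f} mm")
print(f"mean nearest-neighbour deviation: {cKDTree(simulated).query(measured)[0].mean():.2f} mm")
```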

  1. A Theoretical Sketch of Medical Professionalism as a Normative Complex

    ERIC Educational Resources Information Center

    Holtman, Matthew C.

    2008-01-01

    Validity arguments for assessment tools intended to measure medical professionalism suffer for lack of a clear theoretical statement of what professionalism is and how it should behave. Drawing on several decades of field research addressing deviance and informal social control among physicians, a theoretical sketch of professionalism is presented…

  2. Recent progress in the theoretical modelling of Cepheids and RR Lyrae stars

    NASA Astrophysics Data System (ADS)

    Marconi, Marcella

    2017-09-01

    Cepheids and RR Lyrae are among the most important primary distance indicators for calibrating the extragalactic distance ladder, and they are excellent stellar population tracers, for Population I and Population II respectively. In this paper I first mention some recent theoretical studies of Cepheids and RR Lyrae obtained with different theoretical tools. Then I focus attention on new results based on nonlinear convective pulsation models in the context of some international projects, including VMC@VISTA and the Gaia collaboration. The open problems for both Cepheids and RR Lyrae are briefly discussed, together with some challenging future applications.

  3. A Modified Theoretical Model of Intrinsic Hardness of Crystalline Solids

    PubMed Central

    Dai, Fu-Zhi; Zhou, Yanchun

    2016-01-01

    Super-hard materials have been extensively investigated due to their practical importance in numerous industrial applications. To stimulate the design and exploration of new super-hard materials, microscopic models that elucidate the fundamental factors controlling hardness are desirable. The present work modified the theoretical model of intrinsic hardness proposed by Gao. In the modification, we emphasize the critical role of appropriately decomposing a crystal to pseudo-binary crystals, which should be carried out based on the valence electron population of each bond. After modification, the model becomes self-consistent and predicts well the hardness values of many crystals, including crystals composed of complex chemical bonds. The modified model provides fundamental insights into the nature of hardness, which can facilitate the quest for intrinsic super-hard materials. PMID:27604165

  4. Developing a theoretical maintenance model for disordered eating in Type 1 diabetes.

    PubMed

    Treasure, J; Kan, C; Stephenson, L; Warren, E; Smith, E; Heller, S; Ismail, K

    2015-12-01

    According to the literature, eating disorders are an increasing problem for more than a quarter of people with Type 1 diabetes and they are associated with accentuated diabetic complications. The clinical outcomes in this group when given standard eating disorder treatments are disappointing. The Medical Research Council guidelines for developing complex interventions suggest that the first step is to develop a theoretical model. To review existing literature to build a theoretical maintenance model for disordered eating in people with Type 1 diabetes. The literature in diabetes relating to models of eating disorder (Fairburn's transdiagnostic model and the dual pathway model) and food addiction was examined and assimilated. The elements common to all eating disorder models include weight/shape concern and problems with mood regulation. The predisposing traits of perfectionism, low self-esteem and low body esteem and the interpersonal difficulties from the transdiagnostic model are also relevant to diabetes. The differences include the use of insulin mismanagement to compensate for breaking eating rules and the consequential wide variations in plasma glucose that may predispose to 'food addiction'. Eating disorder symptoms elicit emotionally driven reactions and behaviours from others close to the individual affected and these are accentuated in the context of diabetes. The next stage is to test the assumptions within the maintenance model with experimental medicine studies to facilitate the development of new technologies aimed at increasing inhibitory processes and moderating environmental triggers. © 2015 The Authors. Diabetic Medicine © 2015 Diabetes UK.

  5. Theoretical modeling of time-dependent skin temperature and heat losses during whole-body cryotherapy: A pilot study.

    PubMed

    Polidori, G; Marreiro, A; Pron, H; Lestriez, P; Boyer, F C; Quinart, H; Tourbah, A; Taïar, R

    2016-11-01

    This article establishes the basics of a theoretical model for the constitutive law that describes the skin temperature and thermolysis heat losses undergone by a subject during a session of whole-body cryotherapy (WBC). The study focuses on the few minutes during which the human body is subjected to a thermal shock. The relationship between skin temperature and thermolysis heat losses during this period is still unknown and has not yet been studied in the context of the whole human body. The analytical approach here is based on the hypothesis that the skin thermal shock during a WBC session can be thermally modelled by the sum of radiative and free-convective heat transfer functions. The validation of this approach, and the derivation of temporal evolution laws for both skin temperature and dissipated thermal power during the thermal shock, open many avenues for large-scale studies aimed at proposing individualized cryotherapy protocols as well as protocols intended for target populations. Furthermore, this study shows quantitatively the substantial imbalance between human metabolism and thermolysis during WBC, the explanation of which remains an open question. Copyright © 2016 Elsevier Ltd. All rights reserved.
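
    The hypothesized constitutive law, thermolysis as the sum of radiative and free-convective losses, can be written down directly; the emissivity, body area, and convection coefficient below are generic round numbers, not values from the paper.

```python
import math

SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def thermolysis_power(T_skin_C, T_air_C, area=1.8, eps=0.98, h=4.0):
    """Total heat loss (W) as radiation plus free convection; area (m^2),
    emissivity eps, and convection coefficient h (W m^-2 K^-1) are generic
    assumed values, not parameters taken from the paper."""
    Ts, Ta = T_skin_C + 273.15, T_air_C + 273.15
    q_rad = eps * SIGMA * area * (Ts**4 - Ta**4)   # Stefan-Boltzmann exchange
    q_conv = h * area * (Ts - Ta)                  # Newtonian free convection
    return q_rad + q_conv

# Skin at 10 degC in a -110 degC whole-body cryotherapy chamber.
print(f"{thermolysis_power(10.0, -110.0):.0f} W")
```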

  6. A Theoretical Model for Estimation of Yield Strength of Fiber Metal Laminate

    NASA Astrophysics Data System (ADS)

    Bhat, Sunil; Nagesh, Suresh; Umesh, C. K.; Narayanan, S.

    2017-08-01

    The paper presents a theoretical model for estimating the yield strength of a fiber metal laminate. Principles of elasticity and the formulation of residual stress are employed to determine the stress state in the metal layer of the laminate, which is found to be higher than the stress applied over the laminate, resulting in a reduced yield strength of the laminate in comparison with that of the metal layer. The model is tested on a 4A-3/2 Glare laminate comprising three thin aerospace 2014-T6 aluminum alloy layers alternately bonded adhesively with two prepregs, each prepreg built up of three uni-directional glass fiber layers laid in the longitudinal and transverse directions. Laminates with prepregs of E-Glass and S-Glass fibers are investigated separately under uni-axial tension. Yield strengths of both Glare variants are found to be less than that of the aluminum alloy, with the use of S-Glass fiber resulting in a higher laminate yield strength than the use of E-Glass fiber. Results from finite element analysis and tensile tests conducted on the laminates substantiate the theoretical model.

  7. Likert or Not, Survey (In)Validation Requires Explicit Theories and True Grit

    ERIC Educational Resources Information Center

    McGrane, Joshua A.; Nowland, Trisha

    2017-01-01

    From the time of Likert (1932) on, attitudes of expediency regarding both theory and methodology became apparent with reference to survey construction and validation practices. In place of theory and more theoretically-minded methods, such as those found in the early work of Thurstone (1928) and Coombs (1964), statistical models and…

  8. Validation and Trustworthiness of Multiscale Models of Cardiac Electrophysiology

    PubMed Central

    Pathmanathan, Pras; Gray, Richard A.

    2018-01-01

    Computational models of cardiac electrophysiology have a long history in basic science applications and device design and evaluation, but have significant potential for clinical applications in all areas of cardiovascular medicine, including functional imaging and mapping, drug safety evaluation, disease diagnosis, patient selection, and therapy optimisation or personalisation. For all stakeholders to be confident in model-based clinical decisions, cardiac electrophysiological (CEP) models must be demonstrated to be trustworthy and reliable. Credibility, that is, belief in the predictive capability of a computational model, is primarily established by performing validation, in which model predictions are compared to experimental or clinical data. However, there are numerous challenges to performing validation for highly complex multi-scale physiological models such as CEP models. As a result, the credibility of CEP model predictions is usually founded upon a wide range of distinct factors, including various types of validation results, underlying theory, evidence supporting model assumptions, and evidence from model calibration, all at a variety of scales from ion channel to cell to organ. Consequently, the extent to which a CEP model can be trusted for a given application is often unclear, or a matter for debate. The aim of this article is to clarify the potential rationale for the trustworthiness of CEP models by reviewing evidence that has been (or could be) presented to support their credibility. We specifically address the complexity and multi-scale nature of CEP models, which makes traditional model evaluation difficult. In addition, we make explicit some of the credibility justification that we believe is implicitly embedded in the CEP modeling literature. Overall, we provide a fresh perspective on CEP model credibility, and build a depiction and categorisation of the wide-ranging body of credibility evidence for CEP models. This paper also represents a step

  9. Ethical decision-making climate in the ICU: theoretical framework and validation of a self-assessment tool.

    PubMed

    Van den Bulcke, Bo; Piers, Ruth; Jensen, Hanne Irene; Malmgren, Johan; Metaxa, Victoria; Reyners, Anna K; Darmon, Michael; Rusinova, Katerina; Talmor, Daniel; Meert, Anne-Pascale; Cancelliere, Laura; Zubek, Làszló; Maia, Paolo; Michalsen, Andrej; Decruyenaere, Johan; Kompanje, Erwin J O; Azoulay, Elie; Meganck, Reitske; Van de Sompel, Ariëlla; Vansteelandt, Stijn; Vlerick, Peter; Vanheule, Stijn; Benoit, Dominique D

    2018-02-23

    Literature depicts differences in ethical decision-making (EDM) between countries and intensive care units (ICUs). The aims were to better conceptualise the EDM climate in the ICU and to validate a tool to assess EDM climates. Using a modified Delphi method, we built a theoretical framework and a self-assessment instrument consisting of 35 statements. This Ethical Decision-Making Climate Questionnaire (EDMCQ) was developed to capture three EDM domains in healthcare: interdisciplinary collaboration and communication; leadership by physicians; and ethical environment. This instrument was subsequently validated among clinicians working in 68 adult ICUs in 13 European countries and the USA. Exploratory and confirmatory factor analyses were used to determine the structure of the EDM climate as perceived by clinicians. Measurement invariance was tested to make sure that variables used in the analysis were comparable constructs across different groups. Of 3610 nurses and 1137 physicians providing ICU bedside care, 2275 (63.1%) and 717 (62.9%) participated, respectively. Statistical analyses revealed that a shortened 32-item version of the EDMCQ scale provides a factorially valid measurement of seven facets of the extent to which clinicians perceive an EDM climate: self-reflective and empowering leadership by physicians; practice and culture of open interdisciplinary reflection; culture of not avoiding end-of-life decisions; culture of mutual respect within the interdisciplinary team; active involvement of nurses in end-of-life care and decision-making; active decision-making by physicians; and practice and culture of ethical awareness. Measurement invariance of the EDMCQ across occupational groups was shown, reflecting that nurses and physicians interpret the EDMCQ items in a similar manner. The 32-item version of the EDMCQ might enrich the EDM climate measurement, clinicians' behaviour and the performance of healthcare organisations. This instrument offers opportunities to develop tailored ICU

  10. Theoretical model predictions and experimental results for a wavelength switchable Tm:YAG laser.

    PubMed

    Niu, Yanxiong; Wang, Caili; Liu, Wenwen; Niu, Haisha; Xu, Bing; Man, Da

    2014-07-01

    We present a theoretical model study of a quasi-three-level laser with particular attention given to the Tm:YAG laser. The oscillating conditions of this laser were theoretically analyzed from the standpoint of the pump threshold while taking into account reabsorption loss. The laser oscillation at 2.02 μm, with its large stimulated-emission cross section, was suppressed by selecting the appropriate coating for the cavity mirrors; an efficient laser-diode side-pumped continuous-wave Tm:YAG crystal laser operating at 2.07 μm was then realized. Experiments with the Tm:YAG laser confirmed the accuracy of the model, which correctly predicted that the high Stark sub-level within the ³H₆ ground-state manifold has a low laser threshold and a long laser wavelength, achieved by decreasing the transmission of the output coupler.

  11. TSAR, a new graph-theoretical approach to computational modeling of protein side-chain flexibility: modeling of ionization properties of proteins.

    PubMed

    Stroganov, Oleg V; Novikov, Fedor N; Zeifman, Alexey A; Stroylov, Viktor S; Chilov, Ghermes G

    2011-09-01

    A new graph-theoretical approach called thermodynamic sampling of amino acid residues (TSAR) has been elaborated to explicitly account for protein side-chain flexibility in modeling conformation-dependent protein properties. In TSAR, a protein is viewed as a graph whose nodes correspond to structurally independent groups and whose edges connect the interacting groups. Each node has its set of states describing the conformation and ionization of the group, and each edge is assigned an array of pairwise interaction potentials between the adjacent groups. By treating the obtained graph as a belief network (a well-established mathematical abstraction), the partition function of each node is found. In the current work we used TSAR to calculate partition functions of the ionized forms of protein residues. A simplified version of a semi-empirical molecular mechanical scoring function, borrowed from our Lead Finder docking software, was used for energy calculations. The accuracy of the resulting model was validated on a set of 486 experimentally determined pKa values of protein residues. The average correlation coefficient (R) between calculated and experimental pKa values was 0.80, ranging from 0.95 (for Tyr) to 0.61 (for Lys). It appeared that the hydrogen bond interactions and the exhaustiveness of side-chain sampling made the most significant contributions to the accuracy of the pKa calculations. Copyright © 2011 Wiley-Liss, Inc.
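
    The per-node partition-function idea at the core of TSAR can be illustrated in a few lines. The sketch below Boltzmann-weights the discrete states of a single hypothetical titratable group and converts the neutral/ionized free-energy difference into a pKa; all energies, the model-compound pKa of 4.0, and the acid sign convention are illustrative assumptions, not values from the paper, and the real method couples many such nodes through a belief network.

```python
import math

# Minimal sketch of a per-node partition function: one titratable group with
# a few discrete states, each carrying an energy (kcal/mol) and an
# ionization flag. All numbers are hypothetical.
KT = 0.593  # kT at ~298 K, kcal/mol

states = [
    {"energy": 0.0, "ionized": False},  # neutral, preferred rotamer
    {"energy": 1.2, "ionized": False},  # neutral, strained rotamer
    {"energy": 0.8, "ionized": True},   # ionized, stabilized by an H-bond
    {"energy": 1.5, "ionized": True},   # ionized, alternative rotamer
]

Z_neutral = sum(math.exp(-s["energy"] / KT) for s in states if not s["ionized"])
Z_ionized = sum(math.exp(-s["energy"] / KT) for s in states if s["ionized"])

# Free-energy difference of ionization within this node's ensemble, and the
# resulting pKa shift relative to a hypothetical model compound (acid
# convention): delta_pKa = delta_G / (kT * ln 10).
dG = -KT * math.log(Z_ionized / Z_neutral)     # kcal/mol
pka = 4.0 + dG / (KT * math.log(10.0))         # 4.0 = hypothetical model pKa
print(f"Z_neutral={Z_neutral:.3f}, Z_ionized={Z_ionized:.3f}, pKa={pka:.2f}")
```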

  12. Development Mechanism of an Integrated Model for Training of a Specialist and Conceptual-Theoretical Activity of a Teacher

    ERIC Educational Resources Information Center

    Marasulov, Akhmat; Saipov, Amangeldi; Arymbayeva, Kulimkhan; Zhiyentayeva, Begaim; Demeuov, Akhan; Konakbaeva, Ulzhamal; Bekbolatova, Akbota

    2016-01-01

    The aim of the study is to examine the methodological and theoretical bases for the development mechanism of an integrated model for a specialist's training and a teacher's conceptual-theoretical activity. Using the methods of generalization of teaching experience, pedagogical modeling and forecasting, the authors determine the urgent problems…

  13. Using the split Hopkinson pressure bar to validate material models.

    PubMed

    Church, Philip; Cornish, Rory; Cullis, Ian; Gould, Peter; Lewtas, Ian

    2014-08-28

    This paper gives a discussion of the use of the split Hopkinson pressure bar with particular reference to the requirements of materials modelling at QinetiQ. The aim is to deploy validated material models for numerical simulations that are physically based and have as little characterization overhead as possible. In order to have confidence that the models have a wide range of applicability, this means, at most, characterizing the models at low rate and then validating them at high rate. The split Hopkinson pressure bar (SHPB) is ideal for this purpose. It is also a very useful tool for analysing material behaviour under non-shock wave loading. This requires understanding the output of the test and developing techniques for reliable comparison of simulations with SHPB data. For materials other than metals, comparison with an output stress versus strain curve is not sufficient, as the assumptions built into the classical analysis are generally violated. The method described in this paper compares the simulations with as much validation data as can be derived from the deployed instrumentation, including the raw strain gauge data on the input and output bars, which avoids any assumptions about stress equilibrium. One has to take into account Pochhammer-Chree oscillations and their effect on the specimen, and recognize that this is itself also a valuable validation test of the material model. © 2014 The Author(s) Published by the Royal Society. All rights reserved.
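
    For reference, the classical one-wave SHPB analysis whose assumptions the paper says are violated for non-metals reduces to three textbook relations. A minimal sketch with synthetic gauge signals and placeholder bar/specimen parameters:

```python
import numpy as np

# Classical one-wave SHPB analysis. Bar and specimen parameters and the
# gauge signals below are synthetic placeholders.
E_bar = 200e9                      # bar Young's modulus, Pa
c0 = 5000.0                        # bar wave speed, m/s
A_bar, A_spec = 2.0e-4, 1.0e-4     # cross-sectional areas, m^2
L_spec = 5.0e-3                    # specimen gauge length, m

t = np.linspace(0.0, 100e-6, 1001)           # time base, s
eps_R = -1e-3 * np.sin(np.pi * t / t[-1])    # reflected pulse (synthetic)
eps_T = 0.8e-3 * np.sin(np.pi * t / t[-1])   # transmitted pulse (synthetic)

strain_rate = -2.0 * c0 * eps_R / L_spec         # specimen strain rate, 1/s
strain = np.cumsum(strain_rate) * (t[1] - t[0])  # time-integrated strain
stress = E_bar * (A_bar / A_spec) * eps_T        # specimen stress, Pa

# The familiar stress-strain curve is stress plotted against strain; these
# formulas assume specimen stress equilibrium, the very assumption the
# raw-signal comparison advocated in the paper avoids.
```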

  14. Construct validity of the ovine model in endoscopic sinus surgery training.

    PubMed

    Awad, Zaid; Taghi, Ali; Sethukumar, Priya; Tolley, Neil S

    2015-03-01

    To demonstrate construct validity of the ovine model as a tool for training in endoscopic sinus surgery (ESS). Prospective, cross-sectional evaluation study. Over 18 consecutive months, trainees and experts were evaluated in their ability to perform a range of tasks (based on previous face validation and descriptive studies conducted by the same group) relating to ESS on the sheep-head model. Anonymized randomized video recordings of the above were assessed by two independent and blinded assessors. A validated assessment tool utilizing a five-point Likert scale was employed. Construct validity was calculated by comparing scores across training levels and experts using mean and interquartile range of global and task-specific scores. Subgroup analysis of the intermediate group ascertained previous experience. Nonparametric descriptive statistics were used, and analysis was carried out using SPSS version 21 (IBM, Armonk, NY). Reliability of the assessment tool was confirmed. The model discriminated well between different levels of expertise in global and task-specific scores. A positive correlation was noted between year in training and both global and task-specific scores (P < .001). Experience of the intermediate group was variable, and the number of ESS procedures performed under supervision had the highest impact on performance. This study describes an alternative model for ESS training and assessment. It is also the first to demonstrate construct validity of the sheep-head model for ESS training. © 2014 The American Laryngological, Rhinological and Otological Society, Inc.

  15. Community-wide validation of geospace model local K-index predictions to support model transition to operations

    NASA Astrophysics Data System (ADS)

    Glocer, A.; Rastätter, L.; Kuznetsova, M.; Pulkkinen, A.; Singer, H. J.; Balch, C.; Weimer, D.; Welling, D.; Wiltberger, M.; Raeder, J.; Weigel, R. S.; McCollough, J.; Wing, S.

    2016-07-01

    We present the latest result of a community-wide space weather model validation effort coordinated among the Community Coordinated Modeling Center (CCMC), NOAA Space Weather Prediction Center (SWPC), model developers, and the broader science community. Validation of geospace models is a critical activity for both building confidence in the science results produced by the models and in assessing the suitability of the models for transition to operations. Indeed, a primary motivation of this work is supporting NOAA/SWPC's effort to select a model or models to be transitioned into operations. Our validation efforts focus on the ability of the models to reproduce a regional index of geomagnetic disturbance, the local K-index. Our analysis includes six events representing a range of geomagnetic activity conditions and six geomagnetic observatories representing midlatitude and high-latitude locations. Contingency tables, skill scores, and distribution metrics are used for the quantitative analysis of model performance. We consider model performance on an event-by-event basis, aggregated over events, at specific station locations, and separated into high-latitude and midlatitude domains. A summary of results is presented in this report, and an online tool for detailed analysis is available at the CCMC.

  16. Community-Wide Validation of Geospace Model Local K-Index Predictions to Support Model Transition to Operations

    NASA Technical Reports Server (NTRS)

    Glocer, A.; Rastaetter, L.; Kuznetsova, M.; Pulkkinen, A.; Singer, H. J.; Balch, C.; Weimer, D.; Welling, D.; Wiltberger, M.; Raeder, J.

    2016-01-01

    We present the latest result of a community-wide space weather model validation effort coordinated among the Community Coordinated Modeling Center (CCMC), NOAA Space Weather Prediction Center (SWPC), model developers, and the broader science community. Validation of geospace models is a critical activity for both building confidence in the science results produced by the models and in assessing the suitability of the models for transition to operations. Indeed, a primary motivation of this work is supporting NOAA/SWPC's effort to select a model or models to be transitioned into operations. Our validation efforts focus on the ability of the models to reproduce a regional index of geomagnetic disturbance, the local K-index. Our analysis includes six events representing a range of geomagnetic activity conditions and six geomagnetic observatories representing midlatitude and high-latitude locations. Contingency tables, skill scores, and distribution metrics are used for the quantitative analysis of model performance. We consider model performance on an event-by-event basis, aggregated over events, at specific station locations, and separated into high-latitude and midlatitude domains. A summary of results is presented in this report, and an online tool for detailed analysis is available at the CCMC.
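
    The contingency-table metrics named in the two records above are standard verification statistics. A minimal sketch with invented event counts:

```python
# 2x2 event verification: hits (a), false alarms (b), misses (c), correct
# negatives (d). The counts below are made up for illustration.
a, b, c, d = 42, 18, 11, 129
n = a + b + c + d

pod = a / (a + c)     # probability of detection
far = b / (a + b)     # false alarm ratio
pofd = b / (b + d)    # probability of false detection

# Heidke skill score: accuracy relative to the random-chance expectation.
expected = ((a + b) * (a + c) + (c + d) * (b + d)) / n
hss = (a + d - expected) / (n - expected)
print(f"POD={pod:.2f} FAR={far:.2f} POFD={pofd:.2f} HSS={hss:.2f}")
```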

  17. Making Validated Educational Models Central in Preschool Standards.

    ERIC Educational Resources Information Center

    Schweinhart, Lawrence J.

    This paper presents some ideas to preschool educators and policy makers about how to make validated educational models central in standards for preschool education and care programs that are available to all 3- and 4-year-olds. Defining an educational model as a coherent body of program practices, curriculum content, program and child, and teacher…

  18. Theoretical model of gravitational perturbation of current collector axisymmetric flow field

    NASA Astrophysics Data System (ADS)

    Walker, John S.; Brown, Samuel H.; Sondergaard, Neal A.

    1989-03-01

    Some designs of liquid metal collectors in homopolar motors and generators are essentially rotating liquid metal fluids in cylindrical channels with free surfaces and will, at critical rotational speeds, become unstable. The role of gravity in modifying this ejection instability is investigated. Some gravitational effects can be theoretically treated by perturbation techniques on the axisymmetric base flow of the liquid metal. This leads to a modification of previously calculated critical current collector ejection values neglecting gravity effects. The derivation of the mathematical model which determines the perturbation of the liquid metal base flow due to gravitational effects is documented. Since gravity is a small force compared with the centrifugal effects, the base flow solutions can be expanded in inverse powers of the Froude number and modified liquid flow profiles can be determined as a function of the azimuthal angle. This model will be used in later work to theoretically study the effects of gravity on the ejection point of the current collector. A rederivation of the hydrodynamic instability threshold of a liquid metal current collector is presented.

  19. Theoretical model of gravitational perturbation of current collector axisymmetric flow field

    NASA Astrophysics Data System (ADS)

    Walker, John S.; Brown, Samuel H.; Sondergaard, Neal A.

    1990-05-01

    Some designs of liquid-metal current collectors in homopolar motors and generators are essentially rotating liquid-metal fluids in cylindrical channels with free surfaces and will, at critical rotational speeds, become unstable. An investigation at David Taylor Research Center is being performed to understand the role of gravity in modifying this ejection instability. Some gravitational effects can be theoretically treated by perturbation techniques on the axisymmetric base flow of the liquid metal. This leads to a modification of previously calculated critical-current-collector ejection values neglecting gravity effects. The purpose of this paper is to document the derivation of the mathematical model which determines the perturbation of the liquid-metal base flow due to gravitational effects. Since gravity is a small force compared with the centrifugal effects, the base flow solutions can be expanded in inverse powers of the Froude number and modified liquid-flow profiles can be determined as a function of the azimuthal angle. This model will be used in later work to theoretically study the effects of gravity on the ejection point of the current collector.

  20. Theoretical Model for Cellular Shapes Driven by Protrusive and Adhesive Forces

    PubMed Central

    Kabaso, Doron; Shlomovitz, Roie; Schloen, Kathrin; Stradal, Theresia; Gov, Nir S.

    2011-01-01

    The forces that arise from the actin cytoskeleton play a crucial role in determining the cell shape. These include protrusive forces due to actin polymerization and adhesion to the external matrix. We present here a theoretical model for the cellular shapes resulting from the feedback between the membrane shape and the forces acting on the membrane, mediated by curvature-sensitive membrane complexes of a convex shape. In previous theoretical studies we have investigated the regimes of linear instability where spontaneous formation of cellular protrusions is initiated. Here we calculate the evolution of a two-dimensional cell contour beyond the linear regime and determine the final steady-state shapes arising within the model. We find that shapes driven by adhesion or by actin polymerization (lamellipodia) have very different morphologies, as observed in cells. Furthermore, we find that as the strength of the protrusive forces diminishes, the system approaches stabilization of a periodic pattern of protrusions. This result can provide an explanation for a number of puzzling experimental observations regarding the dependence of cellular shape on the properties of the extracellular matrix. PMID:21573201

  1. Wireless Networks under a Backoff Attack: A Game Theoretical Perspective

    PubMed Central

    Parras, Juan; Zazo, Santiago

    2018-01-01

    We study a wireless sensor network using CSMA/CA in the MAC layer under a backoff attack: some of the sensors of the network are malicious and deviate from the defined contention mechanism. We use Bianchi’s network model to study the impact of the malicious sensors on the total network throughput, showing that it causes the throughput to be unfairly distributed among sensors. We model this conflict using game theory tools, where each sensor is a player. We obtain analytical solutions and propose an algorithm, based on Regret Matching, to learn the equilibrium of the game with an arbitrary number of players. Our approach is validated via simulations, showing that our theoretical predictions adjust to reality. PMID:29385752

  2. Wireless Networks under a Backoff Attack: A Game Theoretical Perspective.

    PubMed

    Parras, Juan; Zazo, Santiago

    2018-01-30

    We study a wireless sensor network using CSMA/CA in the MAC layer under a backoff attack: some of the sensors of the network are malicious and deviate from the defined contention mechanism. We use Bianchi's network model to study the impact of the malicious sensors on the total network throughput, showing that it causes the throughput to be unfairly distributed among sensors. We model this conflict using game theory tools, where each sensor is a player. We obtain analytical solutions and propose an algorithm, based on Regret Matching, to learn the equilibrium of the game with an arbitrary number of players. Our approach is validated via simulations, showing that our theoretical predictions adjust to reality.
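
    The heart of Bianchi's saturation model used in both records above is a fixed-point equation coupling the per-station transmission probability tau and the conditional collision probability p. A minimal numerical sketch (contention-window parameters are illustrative defaults, not the paper's):

```python
# Bianchi's saturation model for CSMA/CA: W is the minimum contention
# window, m the maximum backoff stage, n the number of saturated stations.
def tau_of_p(p, W=32, m=5):
    return (2.0 * (1.0 - 2.0 * p)) / (
        (1.0 - 2.0 * p) * (W + 1.0) + p * W * (1.0 - (2.0 * p) ** m)
    )

def solve_bianchi(n, W=32, m=5):
    # Bisection on f(p) = p - (1 - (1 - tau(p))**(n-1)), which is negative
    # near p = 0 and positive just below p = 0.5 for typical parameters.
    lo, hi = 1e-9, 0.5 - 1e-9
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        f = mid - (1.0 - (1.0 - tau_of_p(mid, W, m)) ** (n - 1))
        lo, hi = (lo, mid) if f > 0 else (mid, hi)
    p = 0.5 * (lo + hi)
    return tau_of_p(p, W, m), p

tau, p = solve_bianchi(n=10)
print(f"transmission probability tau={tau:.4f}, collision probability p={p:.4f}")
```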

  3. Web Based Semi-automatic Scientific Validation of Models of the Corona and Inner Heliosphere

    NASA Astrophysics Data System (ADS)

    MacNeice, P. J.; Chulaki, A.; Taktakishvili, A.; Kuznetsova, M. M.

    2013-12-01

    Validation is a critical step in preparing models of the corona and inner heliosphere for future roles supporting either or both the scientific research community and the operational space weather forecasting community. Validation of forecasting quality tends to focus on a short list of key features in the model solutions, with an unchanging order of priority. Scientific validation exposes a much larger range of physical processes and features, and as the models evolve to better represent features of interest, the research community tends to shift its focus to other areas which are less well understood and modeled. Given the more comprehensive and dynamic nature of scientific validation, and the limited resources available to the community to pursue this, it is imperative that the community establish a semi-automated process which engages the model developers directly in an ongoing and evolving validation process. In this presentation we describe the ongoing design and development of a web-based facility to enable this type of validation of models of the corona and inner heliosphere, the growing list of model results being generated, and the strategies we have been developing to account for model results that incorporate adaptively refined numerical grids.

  4. The experimental-theoretical model of the jet HF induction discharge of atmospheric pressure

    NASA Astrophysics Data System (ADS)

    Gainullin, R.; Kirpichnikov, A.

    2017-11-01

    The paper considers the experimental-theoretical model devised to determine the regularities of the quasi-stationary electromagnetic field structure of the HF induction (HFI) discharge burning in the inductor of finite dimensions at atmospheric pressure.

  5. Out-of-plane buckling of pantographic fabrics in displacement-controlled shear tests: experimental results and model validation

    NASA Astrophysics Data System (ADS)

    Barchiesi, Emilio; Ganzosch, Gregor; Liebold, Christian; Placidi, Luca; Grygoruk, Roman; Müller, Wolfgang H.

    2018-01-01

    Due to the latest advancements in 3D printing technology and rapid prototyping techniques, the production of materials with complex geometries has become more affordable than ever. Pantographic structures, because of their attractive features in both dynamics and statics and in both elastic and inelastic deformation regimes, deserve to be thoroughly investigated with experimental and theoretical tools. Herein, experimental results from displacement-controlled large-deformation shear loading tests of pantographic structures are reported. In particular, five differently sized samples are analyzed up to first rupture. Results show that the deformation behavior is strongly nonlinear, and the structures are capable of undergoing large elastic deformations without reaching complete failure. Finally, a cutting-edge model is validated by means of these experimental results.

  6. Predictive Validation of an Influenza Spread Model

    PubMed Central

    Hyder, Ayaz; Buckeridge, David L.; Leung, Brian

    2013-01-01

    Background Modeling plays a critical role in mitigating impacts of seasonal influenza epidemics. Complex simulation models are currently at the forefront of evaluating optimal mitigation strategies at multiple scales and levels of organization. Given their evaluative role, these models remain limited in their ability to predict and forecast future epidemics, leading some researchers and public-health practitioners to question their usefulness. The objective of this study is to evaluate the predictive ability of an existing complex simulation model of influenza spread. Methods and Findings We used extensive data on past epidemics to demonstrate the process of predictive validation. This involved generalizing an individual-based model for influenza spread and fitting it to laboratory-confirmed influenza infection data from a single observed epidemic (1998–1999). Next, we used the fitted model and modified two of its parameters based on data on real-world perturbations (vaccination coverage by age group and strain type). Simulating epidemics under these changes allowed us to estimate the deviation/error between the expected epidemic curve under perturbation and observed epidemics taking place from 1999 to 2006. Our model forecast absolute intensity and epidemic peak week several weeks in advance with reasonable reliability, depending on the method of forecasting (static or dynamic). Conclusions Good predictive ability of influenza epidemics is critical for implementing mitigation strategies in an effective and timely manner. Through the process of predictive validation applied to a current complex simulation model of influenza spread, we provided users of the model (e.g. public-health officials and policy-makers) with quantitative metrics and practical recommendations on mitigating impacts of seasonal influenza epidemics. This methodology may be applied to other models of communicable infectious diseases to test and potentially improve their predictive
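
    The deviation/error estimates described above can be made concrete with simple curve-comparison metrics. A sketch with invented weekly case counts (the study's actual metrics may differ):

```python
import numpy as np

# Compare a simulated epidemic curve against an observed one for peak
# timing, peak intensity, and overall fit. Counts are invented.
observed = np.array([3, 8, 21, 55, 90, 70, 40, 18, 7, 2])
predicted = np.array([4, 10, 30, 70, 95, 60, 30, 12, 5, 2])

peak_week_error = int(np.argmax(predicted)) - int(np.argmax(observed))
intensity_error = (predicted.max() - observed.max()) / observed.max()
rmse = float(np.sqrt(np.mean((predicted - observed) ** 2)))

print(f"peak week error = {peak_week_error} weeks")
print(f"relative peak intensity error = {intensity_error:+.1%}")
print(f"RMSE over the season = {rmse:.1f} cases/week")
```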

  7. Dementia Grief: A Theoretical Model of a Unique Grief Experience

    PubMed Central

    Blandin, Kesstan; Pepin, Renee

    2016-01-01

    Previous literature reveals a high prevalence of grief in dementia caregivers before physical death of the person with dementia that is associated with stress, burden, and depression. To date, theoretical models and therapeutic interventions with grief in caregivers have not adequately considered the grief process, but instead have focused on grief as a symptom that manifests within the process of caregiving. The Dementia Grief Model explicates the unique process of pre-death grief in dementia caregivers. In this paper we introduce the Dementia Grief Model, describe the unique characteristics dementia grief, and present the psychological states associated with the process of dementia grief. The model explicates an iterative grief process involving three states – separation, liminality, and re-emergence – each with a dynamic mechanism that facilitates or hinders movement through the dementia grief process. Finally, we offer potential applied research questions informed by the model. PMID:25883036

  8. Bolometric Luminosities of Peculiar Type II-P Supernovae: Observational and Theoretical Approaches

    NASA Astrophysics Data System (ADS)

    Lusk, Jeremy Alexander

    2018-01-01

    In the three decades since the explosion of SN 1987A, only a handful of other supernovae have been detected which are also thought to originate from blue supergiant progenitors. In this study, we use the five best-observed of these supernovae (SNe 1998A, 2000cb, 2006V, 2006au, and 2009E) to examine the bolometric properties of the class through observations and theoretical models. Several techniques for taking photometric observations and inferring bolometric luminosities have been used in the literature. Our newly improved Python package SuperBoL implements many of these techniques. The challenge remains that the true bolometric luminosity of the supernova cannot be directly observed. We must turn to theoretical models in order to examine the validity of the different observationally-based techniques. In this work, we make use of the NLTE generalized atmosphere code PHOENIX to produce synthetic spectra of known luminosity which match the observed supernova spectra. Synthetic photometry of these models is then used as input to SuperBoL to test the different observationally-based bolometric luminosity techniques.
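
    One of the observational techniques alluded to above is direct integration of broadband fluxes over wavelength. A sketch of that quasi-bolometric step with invented fluxes and distance; SuperBoL's actual interface is deliberately not reproduced here:

```python
import numpy as np

# Quasi-bolometric luminosity by trapezoidal integration of monochromatic
# fluxes over the observed bands, scaled by distance. All values invented.
wavelengths = np.array([0.44, 0.55, 0.64, 0.79, 1.25, 2.20]) * 1e-6  # m
flux_lambda = np.array([5.1, 6.0, 5.2, 3.8, 1.6, 0.4]) * 1e-7        # W m^-2 per m
distance = 3.0e23                                                    # m (~10 Mpc)

F_quasibol = np.trapz(flux_lambda, wavelengths)        # W m^-2, observed bands only
L_quasibol = 4.0 * np.pi * distance**2 * F_quasibol    # W
print(f"quasi-bolometric luminosity ~ {L_quasibol:.2e} W")
# 'Quasi' because only the observed bands are integrated; published
# techniques differ in how they correct for flux outside this range.
```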

  9. Falling Chains as Variable-Mass Systems: Theoretical Model and Experimental Analysis

    ERIC Educational Resources Information Center

    de Sousa, Celia A.; Gordo, Paulo M.; Costa, Pedro

    2012-01-01

    In this paper, we revisit, theoretically and experimentally, the fall of a folded U-chain and of a pile-chain. The model calculation implies the division of the whole system into two subsystems of variable mass, allowing us to explore the role of tensional contact forces at the boundary of the subsystems. This justifies, for instance, that the…

  10. Forward ultrasonic model validation using wavefield imaging methods

    NASA Astrophysics Data System (ADS)

    Blackshire, James L.

    2018-04-01

    The validation of forward ultrasonic wave propagation models in a complex titanium polycrystalline material system is accomplished using wavefield imaging methods. An innovative measurement approach is described that permits the visualization and quantitative evaluation of bulk elastic wave propagation and scattering behaviors in the titanium material for a typical focused immersion ultrasound measurement process. Results are provided for the determination and direct comparison of the ultrasonic beam's focal properties, mode-converted shear wave position and angle, and scattering and reflection from millimeter-sized microtexture regions (MTRs) within the titanium material. The approach and results are important with respect to understanding the root-cause backscatter signal responses generated in aerospace engine materials, where model-assisted methods are being used to understand the probabilistic nature of the backscatter signal content. Wavefield imaging methods are shown to be an effective means for corroborating and validating important forward model predictions in a direct manner using time- and spatially-resolved displacement field amplitude measurements.

  11. Adaptive selection and validation of models of complex systems in the presence of uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Farrell-Maupin, Kathryn; Oden, J. T.

    This study describes versions of OPAL, the Occam-Plausibility Algorithm in which the use of Bayesian model plausibilities is replaced with information theoretic methods, such as the Akaike Information Criterion and the Bayes Information Criterion. Applications to complex systems of coarse-grained molecular models approximating atomistic models of polyethylene materials are described. All of these model selection methods take into account uncertainties in the model, the observational data, the model parameters, and the predicted quantities of interest. A comparison of the models chosen by Bayesian model selection criteria and those chosen by the information-theoretic criteria is given.

  12. Adaptive selection and validation of models of complex systems in the presence of uncertainty

    DOE PAGES

    Farrell-Maupin, Kathryn; Oden, J. T.

    2017-08-01

    This study describes versions of OPAL, the Occam-Plausibility Algorithm in which the use of Bayesian model plausibilities is replaced with information theoretic methods, such as the Akaike Information Criterion and the Bayes Information Criterion. Applications to complex systems of coarse-grained molecular models approximating atomistic models of polyethylene materials are described. All of these model selection methods take into account uncertainties in the model, the observational data, the model parameters, and the predicted quantities of interest. A comparison of the models chosen by Bayesian model selection criteria and those chosen by the information-theoretic criteria is given.
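
    The two information-theoretic criteria named in the records above are straightforward to evaluate once each candidate model's maximized log-likelihood is known. A sketch with hypothetical candidates:

```python
import math

# AIC = 2k - 2*lnL and BIC = k*ln(n) - 2*lnL for candidate models with
# maximized log-likelihood lnL, parameter count k, and n observations.
# All numbers below are hypothetical.
candidates = {
    "coarse-grained A": {"loglik": -412.3, "k": 4},
    "coarse-grained B": {"loglik": -405.9, "k": 7},
    "coarse-grained C": {"loglik": -404.8, "k": 12},
}
n = 250  # number of observations

for name, m in candidates.items():
    aic = 2 * m["k"] - 2 * m["loglik"]
    bic = m["k"] * math.log(n) - 2 * m["loglik"]
    print(f"{name}: AIC={aic:.1f} BIC={bic:.1f}")
# BIC penalizes parameters more strongly than AIC for n > e^2, so it tends
# to select the sparser model on the same evidence.
```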

  13. Using plot experiments to test the validity of mass balance models employed to estimate soil redistribution rates from 137Cs and 210Pb(ex) measurements.

    PubMed

    Porto, Paolo; Walling, Des E

    2012-10-01

    Information on rates of soil loss from agricultural land is a key requirement for assessing both on-site soil degradation and potential off-site sediment problems. Many models and prediction procedures have been developed to estimate rates of soil loss and soil redistribution as a function of the local topography, hydrometeorology, soil type and land management, but empirical data remain essential for validating and calibrating such models and prediction procedures. Direct measurements using erosion plots are, however, costly, and the results obtained relate to a small enclosed area which may not be representative of the wider landscape. In recent years, the use of fallout radionuclides, and more particularly caesium-137 (137Cs) and excess lead-210 (210Pb(ex)), has been shown to provide a very effective means of documenting rates of soil loss and soil and sediment redistribution in the landscape. Several of the assumptions associated with the theoretical conversion models used with such measurements remain essentially unvalidated. This contribution describes the results of a measurement programme involving five experimental plots located in southern Italy, aimed at validating several of the basic assumptions commonly associated with the use of mass balance models for estimating rates of soil redistribution on cultivated land from 137Cs and 210Pb(ex) measurements. Overall, the results confirm the general validity of these assumptions and the importance of taking account of the fate of fresh fallout. However, further work is required to validate the conversion models employed in using fallout radionuclide measurements to document soil redistribution in the landscape, and this could usefully direct attention to different environments and to the validation of the final estimates of soil redistribution rate as well as the assumptions of the models employed. Copyright © 2012 Elsevier Ltd. All rights reserved.
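
    For orientation, the simplest published member of this family of conversion models, the proportional model, relates the measured inventory reduction directly to an erosion rate; the mass balance models tested in the study add terms for the fate of fresh fallout, which is why that assumption matters. One common form, quoted from the general literature rather than from this paper:

```latex
% Proportional conversion model for cultivated soils (general literature form).
\[
  Y \;=\; \frac{10\, B\, d}{T}\cdot\frac{X}{100},
\]
% where $Y$ is the mean annual soil loss (t\,ha$^{-1}$\,yr$^{-1}$), $B$ the
% bulk density (kg\,m$^{-3}$), $d$ the plough depth (m), $T$ the time elapsed
% since the onset of $^{137}$Cs fallout (yr), and $X$ the percentage reduction
% in total $^{137}$Cs inventory relative to the local reference inventory.
```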

  14. Understanding and Validity in Qualitative Research.

    ERIC Educational Resources Information Center

    Maxwell, Joseph A.

    1992-01-01

    Details the philosophical and practical dimensions of five types of validity used in qualitative research: descriptive, interpretive, theoretical, generalizable, and evaluative, with corresponding issues of understanding. Presents this typology as a checklist of the kinds of threats to validity that may arise. (SK)

  15. Theoretical Models and Operational Frameworks in Public Health Ethics

    PubMed Central

    Petrini, Carlo

    2010-01-01

    The article is divided into three sections: (i) an overview of the main ethical models in public health (theoretical foundations); (ii) a summary of several published frameworks for public health ethics (practical frameworks); and (iii) a few general remarks. Rather than maintaining the superiority of one position over the others, the main aim of the article is to summarize the basic approaches proposed thus far concerning the development of public health ethics by describing and comparing the various ideas in the literature. With this in mind, an extensive list of references is provided. PMID:20195441

  16. Health Professionals' Explanations of Suicidal Behaviour: Effects of Professional Group, Theoretical Intervention Model, and Patient Suicide Experience.

    PubMed

    Rothes, Inês Areal; Henriques, Margarida Rangel

    2017-12-01

    In a helping relationship with a suicidal person, theoretical models of suicidality can be essential to guide the health professional's understanding of the client/patient. The objectives of this study were to identify health professionals' explanations of suicidal behaviors and to study the effects of professional group, theoretical intervention models, and patient suicide experience on professionals' representations. Two hundred and forty-two health professionals filled out a self-report questionnaire. Exploratory principal components analysis was used. Five explanatory models were identified: psychological suffering, affective cognitive, sociocommunicational, adverse life events, and psychopathological. Results indicated that the psychological suffering and psychopathological models were the most valued by the professionals, while the sociocommunicational was seen as the least likely to explain suicidal behavior. Differences between professional groups were found. We concluded that training and reflection on theoretical models in general, and on communicative issues in particular, are needed in the education of health professionals.

  17. Three Dimensional Vapor Intrusion Modeling: Model Validation and Uncertainty Analysis

    NASA Astrophysics Data System (ADS)

    Akbariyeh, S.; Patterson, B.; Rakoczy, A.; Li, Y.

    2013-12-01

    Volatile organic chemicals (VOCs), such as chlorinated solvents and petroleum hydrocarbons, are prevalent groundwater contaminants due to their improper disposal and accidental spillage. In addition to contaminating groundwater, VOCs may partition into the overlying vadose zone and enter buildings through gaps and cracks in foundation slabs or basement walls, a process termed vapor intrusion. Vapor intrusion of VOCs has been recognized as a detrimental source for human exposures to potential carcinogenic or toxic compounds. The simulation of vapor intrusion from a subsurface source has been the focus of many studies to better understand the process and guide field investigation. While multiple analytical and numerical models were developed to simulate the vapor intrusion process, detailed validation of these models against well controlled experiments is still lacking, due to the complexity and uncertainties associated with site characterization and soil gas flux and indoor air concentration measurement. In this work, we present an effort to validate a three-dimensional vapor intrusion model based on a well-controlled experimental quantification of the vapor intrusion pathways into a slab-on-ground building under varying environmental conditions. Finally, a probabilistic approach based on Monte Carlo simulations is implemented to determine the probability distribution of indoor air concentration based on the most uncertain input parameters.
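
    The Monte Carlo step described above can be sketched generically: propagate uncertain inputs through the model and read off percentiles of the output. The one-line attenuation relation and all distributions below are hypothetical stand-ins for the 3-D model's inputs:

```python
import numpy as np

# Monte Carlo propagation through a deliberately simplified relation:
# indoor concentration = soil-gas source concentration * attenuation factor.
# Both input distributions are hypothetical.
rng = np.random.default_rng(42)
n = 100_000

c_source = rng.lognormal(mean=np.log(50.0), sigma=0.4, size=n)  # ug/m^3
alpha = rng.lognormal(mean=np.log(1e-4), sigma=0.8, size=n)     # attenuation
c_indoor = c_source * alpha

print(f"median indoor concentration: {np.median(c_indoor):.2e} ug/m^3")
print(f"95th percentile:             {np.percentile(c_indoor, 95):.2e} ug/m^3")
```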

  18. Software Validation via Model Animation

    NASA Technical Reports Server (NTRS)

    Dutle, Aaron M.; Munoz, Cesar A.; Narkawicz, Anthony J.; Butler, Ricky W.

    2015-01-01

    This paper explores a new approach to validating software implementations that have been produced from formally-verified algorithms. Although visual inspection gives some confidence that the implementations faithfully reflect the formal models, it does not provide complete assurance that the software is correct. The proposed approach, which is based on animation of formal specifications, compares the outputs computed by the software implementations on a given suite of input values to the outputs computed by the formal models on the same inputs, and determines if they are equal up to a given tolerance. The approach is illustrated on a prototype air traffic management system that computes simple kinematic trajectories for aircraft. Proofs for the mathematical models of the system's algorithms are carried out in the Prototype Verification System (PVS). The animation tool PVSio is used to evaluate the formal models on a set of randomly generated test cases. Output values computed by PVSio are compared against output values computed by the actual software. This comparison improves the assurance that the translation from formal models to code is faithful and that, for example, floating point errors do not greatly affect correctness and safety properties.
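
    The core comparison in this animation-based approach is easy to sketch: evaluate the formal model and the implementation on the same random inputs and check agreement within a tolerance. Both functions below are hypothetical placeholders, not the PVS artifacts from the paper:

```python
import math
import random

def model_output(x):     # stand-in for a value the formal model would yield
    return math.sin(x) * math.exp(-0.1 * x)

def software_output(x):  # stand-in for the floating-point implementation
    return math.sin(x) * math.exp(-0.1 * x) + 1e-12  # tiny rounding offset

TOL = 1e-9
random.seed(7)
cases = [random.uniform(-100.0, 100.0) for _ in range(10_000)]
failures = [x for x in cases
            if abs(model_output(x) - software_output(x)) > TOL]
print(f"{len(cases) - len(failures)}/{len(cases)} test cases within tolerance")
```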

  19. Modeling the economic impact of medication adherence in type 2 diabetes: a theoretical approach.

    PubMed

    Cobden, David S; Niessen, Louis W; Rutten, Frans Fh; Redekop, W Ken

    2010-09-07

    While strong correlations exist between medication adherence and health economic outcomes in type 2 diabetes, current economic analyses do not adequately consider them. We propose a new approach to incorporate adherence in cost-effectiveness analysis. We describe a theoretical approach to incorporating the effect of adherence when estimating the long-term costs and effectiveness of an antidiabetic medication. This approach was applied in a Markov model which includes common diabetic health states. We compared two treatments using hypothetical patient cohorts: injectable insulin (IDM) and oral (OAD) medications. Two analyses were performed, one which ignored adherence (analysis 1) and one which incorporated it (analysis 2). Results from the two analyses were then compared to explore the extent to which adherence may impact incremental cost-effectiveness ratios. In both analyses, IDM was more costly and more effective than OAD. When adherence was ignored, IDM generated an incremental cost-effectiveness of $12,097 per quality-adjusted life-year (QALY) gained versus OAD. Incorporation of adherence resulted in a slightly higher ratio ($16,241/QALY). This increase was primarily due to better adherence with OAD than with IDM, and the higher direct medical costs for IDM. Incorporating medication adherence into economic analyses can meaningfully influence the estimated cost-effectiveness of type 2 diabetes treatments, and should therefore be considered in health care decision-making. Future work on the impact of adherence on health economic outcomes, and validation of different approaches to modeling adherence, is warranted.
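
    The kind of calculation the record describes can be sketched with a toy Markov cohort model in which adherence attenuates each drug's benefit. Every state, probability, cost, utility, and adherence rate below is hypothetical:

```python
import numpy as np

# Toy three-state Markov cohort model; adherence scales each treatment's
# ability to avoid complications. All numbers are invented.
cycles, disc = 40, 0.03   # annual cycles, discount rate

def run(p_complication, annual_cost, adherence):
    p_c = p_complication + (1.0 - adherence) * 0.02   # imperfect adherence hurts
    P = np.array([[1 - p_c - 0.01, p_c,  0.01],       # controlled
                  [0.0,            0.95, 0.05],       # complication
                  [0.0,            0.0,  1.0]])       # dead (absorbing)
    utils = np.array([0.85, 0.60, 0.0])
    costs = np.array([annual_cost, annual_cost + 4000.0, 0.0])
    x = np.array([1.0, 0.0, 0.0])                     # cohort starts controlled
    total_cost = total_qaly = 0.0
    for t in range(cycles):
        total_cost += (x @ costs) / (1 + disc) ** t
        total_qaly += (x @ utils) / (1 + disc) ** t
        x = x @ P
    return total_cost, total_qaly

cost_oad, qaly_oad = run(p_complication=0.06, annual_cost=600.0, adherence=0.80)
cost_idm, qaly_idm = run(p_complication=0.04, annual_cost=2200.0, adherence=0.65)
icer = (cost_idm - cost_oad) / (qaly_idm - qaly_oad)
print(f"ICER (IDM vs OAD) = ${icer:,.0f} per QALY gained")
```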

  20. Assessing the reliability and validity of anti-tobacco attitudes/beliefs in the context of a campaign strategy.

    PubMed

    Arheart, Kristopher L; Sly, David F; Trapido, Edward J; Rodriguez, Richard D; Ellestad, Amy J

    2004-11-01

    To identify multi-item attitude/belief scales associated with the theoretical foundations of an anti-tobacco counter-marketing campaign and assess their reliability and validity. The data analyzed are from two state-wide, random, cross-sectional telephone surveys [n(S1)=1,079, n(S2)=1,150]. Items forming attitude/belief scales are identified using factor analysis. Reliability is assessed with Cronbach's alpha. Relationships among scales are explored using Pearson correlation. Validity is assessed by testing associations derived from the Centers for Disease Control and Prevention's (CDC) logic model for tobacco control program development and evaluation linking media exposure to attitudes/beliefs, and attitudes/beliefs to smoking-related behaviors. Adjusted odds ratios are employed for these analyses. Three factors emerged: traditional attitudes/beliefs about tobacco and tobacco use, tobacco industry manipulation, and anti-tobacco empowerment. Reliability coefficients are in the range of 0.70 and vary little between age groups. The factors are correlated with one another as hypothesized. Associations between media exposure and the attitude/belief scales, and between these scales and behaviors, are consistent with the CDC logic model. Using reliable, valid multi-item scales is theoretically and methodologically more sound than employing single-item measures of attitudes/beliefs. Methodological, theoretical and practical implications are discussed.
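
    Reliability via Cronbach's alpha follows directly from item and total-score variances. A sketch on simulated responses:

```python
import numpy as np

# Cronbach's alpha = k/(k-1) * (1 - sum(item variances) / variance of the
# summed scale). The response matrix is simulated from one latent factor.
rng = np.random.default_rng(0)
n_resp, k = 500, 7
latent = rng.normal(size=(n_resp, 1))
items = latent + rng.normal(scale=0.8, size=(n_resp, k))  # correlated items

item_vars = items.var(axis=0, ddof=1)
total_var = items.sum(axis=1).var(ddof=1)
alpha = (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)
print(f"Cronbach's alpha = {alpha:.2f}")  # ~0.9 for these simulated items
```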

  1. Calibration of Predictor Models Using Multiple Validation Experiments

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2015-01-01

    This paper presents a framework for calibrating computational models using data from several and possibly dissimilar validation experiments. The offset between model predictions and observations, which might be caused by measurement noise, model-form uncertainty, and numerical error, drives the process by which uncertainty in the model's parameters is characterized. The resulting description of uncertainty, along with the computational model, constitutes a predictor model. Two types of predictor models are studied: Interval Predictor Models (IPMs) and Random Predictor Models (RPMs). IPMs use sets to characterize uncertainty, whereas RPMs use random vectors. The propagation of a set through a model makes the response an interval-valued function of the state, whereas the propagation of a random vector yields a random process. Optimization-based strategies for calculating both types of predictor models are proposed. Whereas the formulations used to calculate IPMs target solutions leading to the interval-valued function of minimal spread containing all observations, those for RPMs seek to maximize the models' ability to reproduce the distribution of observations. Regarding RPMs, we choose a structure for the random vector (i.e., the assignment of probability to points in the parameter space) solely dependent on the prediction error. As such, the probabilistic description of uncertainty is not a subjective assignment of belief, nor is it expected to asymptotically converge to a fixed value; instead it reflects the model's ability to reproduce the experimental data. This framework enables evaluating the spread and distribution of the predicted response of target applications depending on the same parameters beyond the validation domain.
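
    The minimal-spread idea behind an IPM can be sketched as a small linear program: fit a band that contains every observation while minimizing its width. The constant-width linear band below is a simple special case for illustration, not the paper's general formulation:

```python
import numpy as np
from scipy.optimize import linprog

# Fit y in [a + b*x - w, a + b*x + w] with minimal half-width w such that
# the band contains all observations. Data are synthetic.
rng = np.random.default_rng(3)
x = rng.uniform(0, 10, size=60)
y = 2.0 + 0.7 * x + rng.normal(scale=0.5, size=60)

# Decision vector z = [a, b, w]; minimize w subject to
#   y_i - a - b*x_i <= w   and   a + b*x_i - y_i <= w   for all i.
c = np.array([0.0, 0.0, 1.0])
A_ub = np.vstack([
    np.column_stack([-np.ones_like(x), -x, -np.ones_like(x)]),  # -a - b*x - w <= -y
    np.column_stack([ np.ones_like(x),  x, -np.ones_like(x)]),  #  a + b*x - w <=  y
])
b_ub = np.concatenate([-y, y])
res = linprog(c, A_ub=A_ub, b_ub=b_ub,
              bounds=[(None, None), (None, None), (0.0, None)])
a, b, w = res.x
print(f"band: y in [{a:.2f} + {b:.2f}x - {w:.2f}, {a:.2f} + {b:.2f}x + {w:.2f}]")
```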

  2. Validation of the measure automobile emissions model : a statistical analysis

    DOT National Transportation Integrated Search

    2000-09-01

    The Mobile Emissions Assessment System for Urban and Regional Evaluation (MEASURE) model provides an external validation capability for the hot stabilized option; the model is one of several new modal emissions models designed to predict hot stabilized e...

  3. The interrogation decision-making model: A general theoretical framework for confessions.

    PubMed

    Yang, Yueran; Guyll, Max; Madon, Stephanie

    2017-02-01

    This article presents a new model of confessions referred to as the interrogation decision-making model. This model provides a theoretical umbrella with which to understand and analyze suspects' decisions to deny or confess guilt in the context of a custodial interrogation. The model draws upon expected utility theory to propose a mathematical account of the psychological mechanisms that underlie not only suspects' decisions to deny or confess guilt at any specific point during an interrogation, but also how confession decisions can change over time. Findings from the extant literature pertaining to confessions are considered to demonstrate how the model offers a comprehensive and integrative framework for organizing a range of effects within a limited set of model parameters. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
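
    A toy expected-utility comparison shows the mechanism the model formalizes; all utilities and probabilities below are invented for illustration and are not the paper's parameters:

```python
# Suspect weighs confessing against continuing to deny, given a subjective
# probability that denial leads to release. Numbers are invented.
def eu_deny(p_release, u_release, u_continue):
    # denying pays off if the interrogation ends, but costs if it drags on
    return p_release * u_release + (1.0 - p_release) * u_continue

def eu_confess(u_confess):
    return u_confess  # an immediate, certain outcome

# Early on, the suspect is optimistic about release, so denial dominates:
print(eu_deny(p_release=0.6, u_release=10.0, u_continue=-4.0))  #  4.4
print(eu_confess(u_confess=-2.0))                               # -2.0

# As hours pass, p_release drops, and the comparison flips; this is how an
# expected-utility account lets confession decisions change over time:
print(eu_deny(p_release=0.1, u_release=10.0, u_continue=-4.0))  # -2.6 < -2.0
```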

  4. A Model-Based Approach to Support Validation of Medical Cyber-Physical Systems

    PubMed Central

    Silva, Lenardo C.; Almeida, Hyggo O.; Perkusich, Angelo; Perkusich, Mirko

    2015-01-01

    Medical Cyber-Physical Systems (MCPS) are context-aware, life-critical systems with patient safety as the main concern, demanding rigorous processes for validation to guarantee user requirement compliance and specification-oriented correctness. In this article, we propose a model-based approach for early validation of MCPS, focusing on promoting reusability and productivity. It enables system developers to build MCPS formal models based on a library of patient and medical device models, and simulate the MCPS to identify undesirable behaviors at design time. Our approach has been applied to three different clinical scenarios to evaluate its reusability potential for different contexts. We have also validated our approach through an empirical evaluation with developers to assess productivity and reusability. Finally, our models have been formally verified considering functional and safety requirements and model coverage. PMID:26528982

  5. A Model-Based Approach to Support Validation of Medical Cyber-Physical Systems.

    PubMed

    Silva, Lenardo C; Almeida, Hyggo O; Perkusich, Angelo; Perkusich, Mirko

    2015-10-30

    Medical Cyber-Physical Systems (MCPS) are context-aware, life-critical systems with patient safety as the main concern, demanding rigorous processes for validation to guarantee user requirement compliance and specification-oriented correctness. In this article, we propose a model-based approach for early validation of MCPS, focusing on promoting reusability and productivity. It enables system developers to build MCPS formal models based on a library of patient and medical device models, and simulate the MCPS to identify undesirable behaviors at design time. Our approach has been applied to three different clinical scenarios to evaluate its reusability potential for different contexts. We have also validated our approach through an empirical evaluation with developers to assess productivity and reusability. Finally, our models have been formally verified considering functional and safety requirements and model coverage.

  6. A process improvement model for software verification and validation

    NASA Technical Reports Server (NTRS)

    Callahan, John; Sabolish, George

    1994-01-01

    We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and space station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.

  7. A process improvement model for software verification and validation

    NASA Technical Reports Server (NTRS)

    Callahan, John; Sabolish, George

    1994-01-01

    We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and Space Station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.

  8. Generalized Constitutive-Based Theoretical and Empirical Models for Hot Working Behavior of Functionally Graded Steels

    NASA Astrophysics Data System (ADS)

    Vanini, Seyed Ali Sadough; Abolghasemzadeh, Mohammad; Assadi, Abbas

    2013-07-01

    Functionally graded steels with graded ferritic and austenitic regions including bainite and martensite intermediate layers produced by electroslag remelting have attracted much attention in recent years. In this article, an empirical model based on the Zener-Hollomon (Z-H) constitutive equation with generalized material constants is presented to investigate the effects of temperature and strain rate on the hot working behavior of functionally graded steels. Next, a theoretical model, generalized by strain compensation, is developed for the flow stress estimation of functionally graded steels under hot compression based on the phase mixture rule and boundary layer characteristics. The model is used for different strains and grading configurations. Specifically, the results for αβγMγ steels from empirical and theoretical models showed excellent agreement with those of experiments of other references within acceptable error.
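
    The Zener-Hollomon parameter and the sinh-type constitutive law it enters are standard; a sketch with hypothetical material constants (not the paper's fitted values for functionally graded steels):

```python
import math

# Z = strain_rate * exp(Q/(R*T)); flow stress recovered by inverting
# strain_rate = A * sinh(alpha*sigma)**n * exp(-Q/(R*T)), i.e.
# sigma = asinh((Z/A)**(1/n)) / alpha. Constants are placeholders.
R = 8.314      # J/(mol K)
Q = 400e3      # apparent activation energy, J/mol
A = 1.0e15     # s^-1
alpha = 0.012  # MPa^-1
n = 5.0

def flow_stress(strain_rate, T_kelvin):
    Z = strain_rate * math.exp(Q / (R * T_kelvin))
    return math.asinh((Z / A) ** (1.0 / n)) / alpha  # MPa

for T in (1173.0, 1273.0, 1373.0):
    print(f"T={T:.0f} K: sigma ~ {flow_stress(1.0, T):.0f} MPa")
```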

  9. Development and validation of instrument for ergonomic evaluation of tablet arm chairs

    PubMed Central

    Tirloni, Adriana Seára; dos Reis, Diogo Cunha; Bornia, Antonio Cezar; de Andrade, Dalton Francisco; Borgatto, Adriano Ferreti; Moro, Antônio Renato Pereira

    2016-01-01

    The purpose of this study was to develop and validate an evaluation instrument for tablet arm chairs based on ergonomic requirements, focused on user perceptions and using Item Response Theory (IRT). This exploratory study involved 1,633 participants (university students and professors) in four steps: a pilot study (n=26), semantic validation (n=430), content validation (n=11) and construct validation (n=1,166). Samejima's graded response model was applied to validate the instrument. The results showed that all the steps (theoretical and practical) of the instrument's development and validation processes were successful and that the group of remaining items (n=45) had a high consistency (0.95). This instrument can be used in the furniture industry by engineers and product designers and in the purchasing process of tablet arm chairs for schools, universities and auditoriums. PMID:28337099
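
    Samejima's graded response model assigns category probabilities as differences of adjacent cumulative logistic curves. A sketch with hypothetical item parameters:

```python
import numpy as np

# Graded response model for one item with ordered categories: cumulative
# P*(X >= k) = logistic(a*(theta - b_k)), with P(X = k) given by adjacent
# differences. Discrimination a and thresholds b are hypothetical.
def grm_category_probs(theta, a=1.5, b=(-1.0, 0.0, 1.2)):
    b = np.asarray(b)
    p_star = 1.0 / (1.0 + np.exp(-a * (theta - b)))
    cum = np.concatenate(([1.0], p_star, [0.0]))  # P*(X>=0)=1, P*(X>=K)=0
    return cum[:-1] - cum[1:]                     # P(X = k), k = 0..K-1

for theta in (-1.5, 0.0, 1.5):
    probs = grm_category_probs(theta)
    print(f"theta={theta:+.1f}: " + " ".join(f"{p:.2f}" for p in probs))
```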

  10. Electronic health record acceptance by physicians: testing an integrated theoretical model.

    PubMed

    Gagnon, Marie-Pierre; Ghandour, El Kebir; Talla, Pascaline Kengne; Simonyan, David; Godin, Gaston; Labrecque, Michel; Ouimet, Mathieu; Rousseau, Michel

    2014-04-01

    Several countries are in the process of implementing an Electronic Health Record (EHR), but limited physicians' acceptance of this technology presents a serious threat to its successful implementation. The aim of this study was to identify the main determinants of physician acceptance of EHR in a sample of general practitioners and specialists of the Province of Quebec (Canada). We sent an electronic questionnaire to physician members of the Quebec Medical Association. We tested four theoretical models (Technology acceptance model (TAM), Extended TAM, Psychosocial Model, and Integrated Model) using path analysis and multiple linear regression analysis in order to identify the main determinants of physicians' intention to use the EHR. We evaluated the modifying effect of sociodemographic characteristics using multi-group analysis of structural weights invariance. A total of 157 questionnaires were returned. The four models performed well and explained between 44% and 55% of the variance in physicians' intention to use the EHR. The Integrated model performed the best and showed that perceived ease of use, professional norm, social norm, and demonstrability of the results are the strongest predictors of physicians' intention to use the EHR. Age, gender, previous experience and specialty modified the association between those determinants and intention. The proposed integrated theoretical model is useful in identifying which factors could motivate physicians from different backgrounds to use the EHR. Physicians who perceive the EHR to be easy to use, coherent with their professional norms, supported by their peers and patients, and able to demonstrate tangible results are more likely to accept this technology. Age, gender, specialty and experience should also be taken into account when developing EHR implementation strategies targeting physicians. Copyright © 2013 Elsevier Inc. All rights reserved.

  11. Theoretically Guided Analytical Method Development and Validation for the Estimation of Rifampicin in a Mixture of Isoniazid and Pyrazinamide by UV Spectrophotometer

    PubMed Central

    Khan, Mohammad F.; Rita, Shamima A.; Kayser, Md. Shahidulla; Islam, Md. Shariful; Asad, Sharmeen; Bin Rashid, Ridwan; Bari, Md. Abdul; Rahman, Muhammed M.; Al Aman, D. A. Anwar; Setu, Nurul I.; Banoo, Rebecca; Rashid, Mohammad A.

    2017-01-01

    A simple, rapid, economic, accurate, and precise method for the estimation of rifampicin in a mixture of isoniazid and pyrazinamide by UV spectrophotometeric technique (guided by the theoretical investigation of physicochemical properties) was developed and validated. Theoretical investigations revealed that isoniazid and pyrazinamide both were freely soluble in water and slightly soluble in ethyl acetate whereas rifampicin was practically insoluble in water but freely soluble in ethyl acetate. This indicates that ethyl acetate is an effective solvent for the extraction of rifampicin from a water mixture of isoniazid and pyrazinamide. Computational study indicated that pH range of 6.0–8.0 would favor the extraction of rifampicin. Rifampicin is separated from isoniazid and pyrazinamide at pH 7.4 ± 0.1 by extracting with ethyl acetate. The ethyl acetate was then analyzed at λmax of 344.0 nm. The developed method was validated for linearity, accuracy and precision according to ICH guidelines. The proposed method exhibited good linearity over the concentration range of 2.5–35.0 μg/mL. The intraday and inter-day precision in terms of % RSD ranged from 1.09 to 1.70% and 1.63 to 2.99%, respectively. The accuracy (in terms of recovery) of the method varied from of 96.7 ± 0.9 to 101.1 ± 0.4%. The LOD and LOQ were found to be 0.83 and 2.52 μg/mL, respectively. In addition, the developed method was successfully applied to determine rifampicin combination (isoniazid and pyrazinamide) brands available in Bangladesh. PMID:28503547

  12. Theoretically guided analytical method development and validation for the estimation of rifampicin in a mixture of isoniazid and pyrazinamide by UV spectrophotometer

    NASA Astrophysics Data System (ADS)

    Khan, Mohammad F.; Rita, Shamima A.; Kayser, Md. Shahidulla; Islam, Md. Shariful; Asad, Sharmeen; Bin Rashid, Ridwan; Bari, Md. Abdul; Rahman, Muhammed M.; Al Aman, D. A. Anwar; Setu, Nurul I.; Banoo, Rebecca; Rashid, Mohammad A.

    2017-04-01

    A simple, rapid, economic, accurate and precise method for the estimation of rifampicin in a mixture of isoniazid and pyrazinamide by UV spectrophotometeric technique (guided by the theoretical investigation of physicochemical properties) was developed and validated. Theoretical investigations revealed that isoniazid and pyrazinamide both were freely soluble in water and slightly soluble in ethyl acetate whereas rifampicin was practically insoluble in water but freely soluble in ethyl acetate. This indicates that ethyl acetate is an effective solvent for the extraction of rifampicin from a water mixture of isoniazid and pyrazinamide. Computational study indicated that pH range of 6.0-8.0 would favor the extraction of rifampicin. Rifampicin is separated from isoniazid and pyrazinamide at pH 7.4 ± 0.1 by extracting with ethyl acetate. The ethyl acetate was then analyzed at λmax of 344.0 nm. The developed method was validated for linearity, accuracy and precision according to ICH guidelines. The proposed method exhibited good linearity over the concentration range of 2.5 - 35.0 µg/mL. The intraday and inter-day precision in terms of % RSD ranged from 1.09 - 1.70% and 1.63 - 2.99%, respectively. The accuracy (in terms of recovery) of the method varied from of 96.7 ± 0.9 to 101.1 ± 0.4%. The LOD and LOQ were found to be 0.83 and 2.52 µg/mL, respectively. In addition, the developed method was successfully applied to assay rifampicin combination (isoniazid and pyrazinamide) brands available in Bangladesh.

  13. Theoretically Guided Analytical Method Development and Validation for the Estimation of Rifampicin in a Mixture of Isoniazid and Pyrazinamide by UV Spectrophotometer.

    PubMed

    Khan, Mohammad F; Rita, Shamima A; Kayser, Md Shahidulla; Islam, Md Shariful; Asad, Sharmeen; Bin Rashid, Ridwan; Bari, Md Abdul; Rahman, Muhammed M; Al Aman, D A Anwar; Setu, Nurul I; Banoo, Rebecca; Rashid, Mohammad A

    2017-01-01

    A simple, rapid, economic, accurate, and precise method for the estimation of rifampicin in a mixture of isoniazid and pyrazinamide by UV spectrophotometeric technique (guided by the theoretical investigation of physicochemical properties) was developed and validated. Theoretical investigations revealed that isoniazid and pyrazinamide both were freely soluble in water and slightly soluble in ethyl acetate whereas rifampicin was practically insoluble in water but freely soluble in ethyl acetate. This indicates that ethyl acetate is an effective solvent for the extraction of rifampicin from a water mixture of isoniazid and pyrazinamide. Computational study indicated that pH range of 6.0-8.0 would favor the extraction of rifampicin. Rifampicin is separated from isoniazid and pyrazinamide at pH 7.4 ± 0.1 by extracting with ethyl acetate. The ethyl acetate was then analyzed at λmax of 344.0 nm. The developed method was validated for linearity, accuracy and precision according to ICH guidelines. The proposed method exhibited good linearity over the concentration range of 2.5-35.0 μg/mL. The intraday and inter-day precision in terms of % RSD ranged from 1.09 to 1.70% and 1.63 to 2.99%, respectively. The accuracy (in terms of recovery) of the method varied from of 96.7 ± 0.9 to 101.1 ± 0.4%. The LOD and LOQ were found to be 0.83 and 2.52 μg/mL, respectively. In addition, the developed method was successfully applied to determine rifampicin combination (isoniazid and pyrazinamide) brands available in Bangladesh.
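
    The ICH-style figures of merit reported in these records follow directly from the calibration fit. A sketch on simulated calibration data using LOD = 3.3σ/S and LOQ = 10σ/S:

```python
import numpy as np

# Fit absorbance vs concentration, then take LOD = 3.3*sigma/S and
# LOQ = 10*sigma/S, where S is the slope and sigma the residual standard
# deviation. The calibration data are simulated, not the paper's.
conc = np.array([2.5, 5.0, 10.0, 15.0, 20.0, 25.0, 30.0, 35.0])  # ug/mL
rng = np.random.default_rng(1)
absorb = 0.024 * conc + 0.010 + rng.normal(scale=0.004, size=conc.size)

slope, intercept = np.polyfit(conc, absorb, 1)
residuals = absorb - (slope * conc + intercept)
sigma = residuals.std(ddof=2)  # ddof=2: two fitted parameters

lod = 3.3 * sigma / slope
loq = 10.0 * sigma / slope
print(f"slope={slope:.4f} AU/(ug/mL), LOD={lod:.2f} ug/mL, LOQ={loq:.2f} ug/mL")
```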

  14. The Space Weather Modeling Framework (SWMF): Models and Validation

    NASA Astrophysics Data System (ADS)

    Gombosi, Tamas; Toth, Gabor; Sokolov, Igor; de Zeeuw, Darren; van der Holst, Bart; Ridley, Aaron; Manchester, Ward, IV

    In the last decade our group at the Center for Space Environment Modeling (CSEM) has developed the Space Weather Modeling Framework (SWMF), which efficiently couples together different models describing the interacting regions of the space environment. Many of these domain models (such as the global solar corona, the inner heliosphere or the global magnetosphere) are based on MHD and are represented by our multiphysics code, BATS-R-US. SWMF is a powerful tool for coupling regional models describing the space environment from the solar photosphere to the bottom of the ionosphere. Presently, SWMF contains over a dozen components: the solar corona (SC), eruptive event generator (EE), inner heliosphere (IH), outer heliosphere (OH), solar energetic particles (SE), global magnetosphere (GM), inner magnetosphere (IM), radiation belts (RB), plasmasphere (PS), ionospheric electrodynamics (IE), polar wind (PW), upper atmosphere (UA) and lower atmosphere (LA). This talk will present an overview of SWMF, new results obtained with improved physics, as well as some validation studies.

  15. CFD Modeling Needs and What Makes a Good Supersonic Combustion Validation Experiment

    NASA Technical Reports Server (NTRS)

    Gaffney, Richard L., Jr.; Cutler, Andrew D.

    2005-01-01

    If a CFD code/model developer is asked what experimental data he wants to validate his code or numerical model, his answer will be: "Everything, everywhere, at all times." Since this is not possible, practical, or even reasonable, the developer must understand what can be measured within the limits imposed by the test article, the test location, the test environment and the available diagnostic equipment. At the same time, it is important for the experimentalist/diagnostician to understand what the CFD developer needs (as opposed to wants) in order to conduct a useful CFD validation experiment. If these needs are not known, it is possible to neglect easily measured quantities at locations needed by the developer, rendering the data set useless for validation purposes. It is also important for the experimentalist/diagnostician to understand what the developer is trying to validate so that the experiment can be designed to isolate (as much as possible) the effects of the particular physical phenomenon associated with the model to be validated. The probability of a successful validation experiment can be greatly increased if the two groups work together, each understanding the needs and limitations of the other.

  16. Cross-validation of the osmotic pressure based on Pitzer model with air humidity osmometry at high concentration of ammonium sulfate solutions.

    PubMed

    Wang, Xiao-Lan; Zhan, Ting-Ting; Zhan, Xian-Cheng; Tan, Xiao-Ying; Qu, Xiao-You; Wang, Xin-Yue; Li, Cheng-Rong

    2014-01-01

    The osmotic pressure of ammonium sulfate solutions has been measured by the well-established freezing point osmometry in dilute solutions, and we recently reported air humidity osmometry over a much wider range of concentration. Air humidity osmometry cross-validated the theoretical calculations of osmotic pressure based on the Pitzer model at high concentrations by the two one-sided tests (TOST) procedure for equivalence with multiple-testing corrections, where no other experimental method could serve as a reference for comparison. Although stricter equivalence criteria were established between the measurements of freezing point osmometry and the calculations based on the Pitzer model at low concentration, air humidity osmometry is the only currently available osmometry applicable to high concentrations and serves as an economical addition to standard osmometry.
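
    As an illustration of the TOST logic used here, a minimal sketch in Python, assuming paired model-vs-measurement differences and a pre-specified equivalence margin `delta` (both hypothetical; the paper's actual margins and multiple-testing corrections are not reproduced):

```python
import numpy as np
from scipy import stats

def tost_paired(x, y, delta, alpha=0.05):
    """Two one-sided tests for equivalence of paired measurements.

    H01: mean(x - y) <= -delta   vs   H11: mean(x - y) > -delta
    H02: mean(x - y) >= +delta   vs   H12: mean(x - y) < +delta
    Equivalence is declared when both one-sided nulls are rejected.
    """
    d = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
    n = d.size
    se = d.std(ddof=1) / np.sqrt(n)
    t_lower = (d.mean() + delta) / se          # test against the lower bound
    t_upper = (d.mean() - delta) / se          # test against the upper bound
    p_lower = stats.t.sf(t_lower, df=n - 1)    # P(T > t_lower)
    p_upper = stats.t.cdf(t_upper, df=n - 1)   # P(T < t_upper)
    return max(p_lower, p_upper) < alpha, (p_lower, p_upper)

# Hypothetical example: Pitzer-model osmotic pressures vs. air humidity osmometry
rng = np.random.default_rng(0)
pitzer = rng.normal(50.0, 1.0, size=12)
measured = pitzer + rng.normal(0.0, 0.5, size=12)
equivalent, pvals = tost_paired(pitzer, measured, delta=1.5)
print(equivalent, pvals)
```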

  17. A Conceptual Model of Career Development to Enhance Academic Motivation

    ERIC Educational Resources Information Center

    Collins, Nancy Creighton

    2010-01-01

    The purpose of this study was to develop, refine, and validate a conceptual model of career development to enhance the academic motivation of community college students. To achieve this end, a straw model was built from the theoretical and empirical research literature. The model was then refined and validated through three rounds of a Delphi…

  18. Measuring implementation behaviour of menu guidelines in the childcare setting: confirmatory factor analysis of a theoretical domains framework questionnaire (TDFQ).

    PubMed

    Seward, Kirsty; Wolfenden, Luke; Wiggers, John; Finch, Meghan; Wyse, Rebecca; Oldmeadow, Christopher; Presseau, Justin; Clinton-McHarg, Tara; Yoong, Sze Lin

    2017-04-04

    While there are a number of frameworks which focus on supporting the implementation of evidence-based approaches, few psychometrically valid measures exist to assess constructs within these frameworks. This study aimed to develop and psychometrically assess a scale measuring each domain of the Theoretical Domains Framework for use in assessing the implementation of dietary guidelines within a non-health-care setting (childcare services). A 75-item, 14-domain Theoretical Domains Framework Questionnaire (TDFQ) was developed and administered via telephone interview to 202 centre-based childcare service cooks who had a role in planning the service menu. Confirmatory factor analysis (CFA) was undertaken to assess the reliability, discriminant validity and goodness of fit of the 14-domain theoretical domains framework measure. For the CFA, five iterative processes of adjustment were undertaken in which 14 items were removed, resulting in a final measure consisting of 14 domains and 61 items. For the final measure: the chi-square goodness-of-fit statistic was 3447.19; the Standardized Root Mean Square Residual (SRMR) was 0.070; the Root Mean Square Error of Approximation (RMSEA) was 0.072; and the Comparative Fit Index (CFI) had a value of 0.78. While only one of the three indices supports goodness of fit of the measurement model tested, the 14-domain model with 61 items showed good discriminant validity and internally consistent items. Future research should aim to assess the psychometric properties of the developed TDFQ in other community-based settings.
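
    For reference, the two chi-square-based fit indices reported above are conventionally computed as follows (standard definitions, not taken from the paper), with \(\chi^2_t\), \(df_t\) for the tested model, \(\chi^2_0\), \(df_0\) for the baseline (independence) model, and sample size \(N\):

\[
\mathrm{RMSEA} = \sqrt{\frac{\max(\chi^2_t - df_t,\,0)}{df_t\,(N-1)}}, \qquad
\mathrm{CFI} = 1 - \frac{\max(\chi^2_t - df_t,\,0)}{\max(\chi^2_0 - df_0,\,0)}.
\]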

  19. Obesity in sub-Saharan Africa: development of an ecological theoretical framework.

    PubMed

    Scott, Alison; Ejikeme, Chinwe Stella; Clottey, Emmanuel Nii; Thomas, Joy Goens

    2013-03-01

    The prevalence of overweight and obesity is increasing in sub-Saharan Africa (SSA). There is a need for theoretical frameworks to catalyze further research and to inform the development of multi-level, context-appropriate interventions. In this commentary, we propose a preliminary ecological theoretical framework to conceptualize factors that contribute to increases in overweight and obesity in SSA. The framework is based on a Causality Continuum model [Coreil et al. Social and Behavioral Foundations of Public Health. Sage Publications, Thousand Oaks] that considers distant, intermediate and proximate influences. The influences incorporated in the model include globalization and urbanization as distant factors; occupation, social relationships, built environment and cultural perceptions of weight as intermediate factors; and caloric intake, physical inactivity and genetics as proximate factors. The model illustrates the interaction of factors along a continuum, from the individual to the global marketplace, in shaping trends in overweight and obesity in SSA. The framework will be presented, each influence elucidated and implications for research and intervention development discussed. There is a tremendous need for further research on obesity in SSA. An improved evidence base will serve to validate and develop the proposed framework further.

  20. SU-F-J-41: Experimental Validation of a Cascaded Linear System Model for MVCBCT with a Multi-Layer EPID

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hu, Y; Rottmann, J; Myronakis, M

    2016-06-15

    Purpose: The purpose of this study was to validate the use of a cascaded linear system model for MV cone-beam CT (CBCT) using a multi-layer (MLI) electronic portal imaging device (EPID) and provide experimental insight into image formation. A validated 3D model provides insight into salient factors affecting reconstructed image quality, allowing potential for optimizing detector design for CBCT applications. Methods: A cascaded linear system model was developed to investigate the potential improvement in reconstructed image quality for MV CBCT using an MLI EPID. Inputs to the three-dimensional (3D) model include the projection-space MTF and NPS. Experimental validation was performed on a prototype MLI detector installed on the portal imaging arm of a Varian TrueBeam radiotherapy system. CBCT scans of up to 898 projections over 360 degrees were acquired at exposures of 16 and 64 MU. Image volumes were reconstructed using a Feldkamp-type (FDK) filtered backprojection (FBP) algorithm. Flat-field images and scans of a Catphan model 604 phantom were acquired. The effect of 2×2 and 4×4 detector binning was also examined. Results: Using projection flat fields as an input, the modeled and measured NPS in the axial plane exhibit good agreement. Binning projection images was shown to improve axial-slice SDNR by a factor of approximately 1.4, an improvement largely driven by a decrease in image noise of roughly 20%. However, this effect is accompanied by a loss in image resolution. Conclusion: The measured axial NPS shows good agreement with the theoretical calculation using a linear system model. Binning of projection images improves the SNR of large objects on the Catphan phantom by decreasing noise. Specific imaging tasks will dictate the implementation of image binning for two-dimensional projection images. The project was partially supported by a grant from Varian Medical Systems, Inc. and grant No. R01CA188446-01 from the National Cancer
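
    A minimal sketch of how a 2D noise power spectrum can be estimated from a stack of flat-field projections, the kind of input a cascaded linear system analysis uses (generic method; the pixel pitch and image sizes below are hypothetical placeholders):

```python
import numpy as np

def nps_2d(flats, px=0.1, py=0.1):
    """Estimate the 2D NPS from a stack of flat-field images.

    flats : (n, Ny, Nx) array of flat-field realizations
    px, py: pixel pitch in mm (hypothetical values)
    """
    flats = np.asarray(flats, dtype=float)
    n, ny, nx = flats.shape
    noise = flats - flats.mean(axis=0)   # remove the mean (fixed-pattern) signal
    dft = np.fft.fft2(noise, axes=(-2, -1))
    # Average squared spectra over realizations; scale by pixel area / pixel count
    return (px * py / (nx * ny)) * np.mean(np.abs(dft) ** 2, axis=0)

# Hypothetical usage with synthetic white noise of variance 4.0
rng = np.random.default_rng(1)
flats = 100.0 + rng.normal(0.0, 2.0, size=(16, 128, 128))
nps = nps_2d(flats)
# For white noise the NPS integrates back to approximately the pixel variance:
print(nps.sum() / (0.1 * 0.1 * 128 * 128))  # ~4.0
```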

  1. Theoretical aspect of suitable spatial boundary condition specified for adjoint model on limited area

    NASA Astrophysics Data System (ADS)

    Wang, Yuan; Wu, Rongsheng

    2001-12-01

    Theoretical argumentation for the so-called suitable spatial boundary condition is conducted with the aid of a homotopy framework to demonstrate that the proposed boundary condition guarantees that over-specification of boundary conditions arising from an adjoint model on a limited area is no longer an issue, while preserving well-posedness and the optimal character of the boundary setting. The ill-posedness of over-specified spatial boundary conditions is, in a sense, inevitable for an adjoint model, since data assimilation processes have to adapt to prescribed observations that are typically over-specified at the spatial boundaries of the modeling domain. From the viewpoint of pragmatic implementation, the theoretical framework of our proposed condition for spatial boundaries can be reduced to a hybrid formulation of a nudging filter, a radiation condition taking account of ambient forcing, and a Dirichlet-type boundary condition compatible with the observations prescribed in the data assimilation procedure. All of these treatments, no doubt, are very familiar to mesoscale modelers.

  2. A Validated Open-Source Multisolver Fourth-Generation Composite Femur Model.

    PubMed

    MacLeod, Alisdair R; Rose, Hannah; Gill, Harinderjit S

    2016-12-01

    Synthetic biomechanical test specimens are frequently used for preclinical evaluation of implant performance, often in combination with numerical modeling, such as finite-element (FE) analysis. Commercial and freely available FE packages are widely used, with three FE packages in particular gaining popularity: abaqus (Dassault Systèmes, Johnston, RI), ansys (ANSYS, Inc., Canonsburg, PA), and febio (University of Utah, Salt Lake City, UT). To the best of our knowledge, no study has yet made a comparison of these three commonly used solvers. Additionally, despite the femur being the most extensively studied bone in the body, no freely available validated model exists. The primary aim of the study was to conduct a comparison of mesh convergence and strain prediction between the three solvers (abaqus, ansys, and febio) and to provide validated open-source models of a fourth-generation composite femur for use with all three FE packages. Second, we evaluated the geometric variability around the femoral neck region of the composite femurs. Experimental testing was conducted using fourth-generation Sawbones® composite femurs instrumented with strain gauges at four locations. A generic FE model and four specimen-specific FE models were created from CT scans. The study found that the three solvers produced excellent agreement, with strain predictions being within an average of 3.0% for all the solvers (r2 > 0.99) and 1.4% for the two commercial codes. The average of the root mean squared error against the experimental results was 134.5% (r2 = 0.29) for the generic model and 13.8% (r2 = 0.96) for the specimen-specific models. It was found that composite femurs had variations in cortical thickness around the neck of the femur of up to 48.4%. For the first time, an experimentally validated finite-element model of the femur is presented for use in three solvers. This model is freely available online along with all the supporting validation data.
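
    A small sketch of the kind of model-to-experiment comparison metrics quoted above (r² and RMSE expressed as a percentage), with hypothetical gauge readings; the paper's exact normalization of the RMSE is not stated, so the percentage-of-mean-absolute-strain convention here is an assumption:

```python
import numpy as np

def compare_strains(predicted, measured):
    """Return r^2 and RMSE as a percentage of the mean absolute measured strain."""
    p = np.asarray(predicted, float)
    m = np.asarray(measured, float)
    r2 = 1.0 - np.sum((m - p) ** 2) / np.sum((m - m.mean()) ** 2)
    rmse_pct = 100.0 * np.sqrt(np.mean((m - p) ** 2)) / np.mean(np.abs(m))
    return r2, rmse_pct

# Hypothetical microstrain readings at four gauge locations
measured = np.array([-820.0, 450.0, 610.0, -300.0])
predicted = np.array([-805.0, 462.0, 598.0, -310.0])
r2, rmse_pct = compare_strains(predicted, measured)
print(f"r2 = {r2:.3f}, RMSE = {rmse_pct:.1f}% of mean |strain|")
```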

  3. Validation of a partial coherence interferometry method for estimating retinal shape

    PubMed Central

    Verkicharla, Pavan K.; Suheimat, Marwan; Pope, James M.; Sepehrband, Farshid; Mathur, Ankit; Schmid, Katrina L.; Atchison, David A.

    2015-01-01

    To validate a simple partial coherence interferometry (PCI) based retinal shape method, estimates of retinal shape were determined in 60 young adults using off-axis PCI, with three stages of modeling using variants of the Le Grand model eye, and magnetic resonance imaging (MRI). Stages 1 and 2 involved a basic model eye without and with surface ray deviation, respectively, and Stage 3 used a model with individual ocular biometry and ray deviation at surfaces. Considering the theoretical uncertainty of MRI (12-14%), the results of the study indicate good agreement between MRI and all three stages of PCI modeling, with <4% and <7% differences in retinal shape along the horizontal and vertical meridians, respectively. Stages 2 and 3 gave slightly different retinal co-ordinates than Stage 1, and we recommend the intermediate Stage 2 as providing a simple and valid method of determining retinal shape from PCI data. PMID:26417496

  4. Validation of a partial coherence interferometry method for estimating retinal shape.

    PubMed

    Verkicharla, Pavan K; Suheimat, Marwan; Pope, James M; Sepehrband, Farshid; Mathur, Ankit; Schmid, Katrina L; Atchison, David A

    2015-09-01

    To validate a simple partial coherence interferometry (PCI) based retinal shape method, estimates of retinal shape were determined in 60 young adults using off-axis PCI, with three stages of modeling using variants of the Le Grand model eye, and magnetic resonance imaging (MRI). Stages 1 and 2 involved a basic model eye without and with surface ray deviation, respectively, and Stage 3 used a model with individual ocular biometry and ray deviation at surfaces. Considering the theoretical uncertainty of MRI (12-14%), the results of the study indicate good agreement between MRI and all three stages of PCI modeling, with <4% and <7% differences in retinal shape along the horizontal and vertical meridians, respectively. Stages 2 and 3 gave slightly different retinal co-ordinates than Stage 1, and we recommend the intermediate Stage 2 as providing a simple and valid method of determining retinal shape from PCI data.

  5. Education, Labour Market and Human Capital Models: Swedish Experiences and Theoretical Analyses.

    ERIC Educational Resources Information Center

    Sohlman, Asa

    An empirical study concerning development of the Swedish educational system from a labor market point of view, and a theoretical study on human capital models are discussed. In "Education and Labour Market; The Swedish Experience 1900-1975," attention is directed to the following concerns: the official educational policy regarding…

  6. Community-Based Participatory Research Conceptual Model: Community Partner Consultation and Face Validity.

    PubMed

    Belone, Lorenda; Lucero, Julie E; Duran, Bonnie; Tafoya, Greg; Baker, Elizabeth A; Chan, Domin; Chang, Charlotte; Greene-Moton, Ella; Kelley, Michele A; Wallerstein, Nina

    2016-01-01

    A national community-based participatory research (CBPR) team developed a conceptual model of CBPR partnerships to understand the contribution of partnership processes to improved community capacity and health outcomes. With the model primarily developed through academic literature and expert consensus building, we sought community input to assess face validity and acceptability. Our research team conducted semi-structured focus groups with six partnerships nationwide. Participants validated and expanded on existing model constructs and identified new constructs based on "real-world" praxis, resulting in a revised model. Four cross-cutting constructs were identified: trust development, capacity, mutual learning, and power dynamics. By empirically testing the model, we found community face validity and capacity to adapt the model to diverse contexts. We recommend partnerships use and adapt the CBPR model and its constructs, for collective reflection and evaluation, to enhance their partnering practices and achieve their health and research goals. © The Author(s) 2014.

  7. Information-Theoretic Performance Analysis of Sensor Networks via Markov Modeling of Time Series Data.

    PubMed

    Li, Yue; Jha, Devesh K; Ray, Asok; Wettergren, Thomas A

    2018-06-01

    This paper presents information-theoretic performance analysis of passive sensor networks for detection of moving targets. The proposed method falls largely under the category of data-level information fusion in sensor networks. To this end, a measure of information contribution for sensors is formulated in a symbolic dynamics framework. The network information state is approximately represented as the largest principal component of the time series collected across the network. To quantify each sensor's contribution for generation of the information content, Markov machine models as well as x-Markov (pronounced as cross-Markov) machine models, conditioned on the network information state, are constructed; the difference between the conditional entropies of these machines is then treated as an approximate measure of information contribution by the respective sensors. The x-Markov models represent the conditional temporal statistics given the network information state. The proposed method has been validated on experimental data collected from a local area network of passive sensors for target detection, where the statistical characteristics of environmental disturbances are similar to those of the target signal in the sense of time scale and texture. A distinctive feature of the proposed algorithm is that the network decisions are independent of the behavior and identity of the individual sensors, which is desirable from computational perspectives. Results are presented to demonstrate the proposed method's efficacy to correctly identify the presence of a target with very low false-alarm rates. The performance of the underlying algorithm is compared with that of a recent data-driven, feature-level information fusion algorithm. It is shown that the proposed algorithm outperforms the other algorithm.
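
    A minimal sketch of the entropy-difference idea, assuming symbolized (discrete) sensor streams and a scalar network information state; the symbolization and the PCA-based state construction from the paper are replaced by hypothetical binary streams here, so this is an analogue of the measure rather than the authors' exact algorithm:

```python
import numpy as np
from collections import Counter

def cond_entropy(seq, cond):
    """Estimate H(X_{t+1} | C_t) in bits from two aligned discrete sequences."""
    pairs = Counter(zip(cond[:-1], seq[1:]))
    marg = Counter(cond[:-1])
    n = len(seq) - 1
    h = 0.0
    for (c, x), k in pairs.items():
        h -= (k / n) * np.log2(k / marg[c])
    return h

def information_contribution(sensor, state):
    """Entropy of the sensor's own Markov machine minus that of the
    state-conditioned (x-Markov-like) machine; larger = more informative."""
    return cond_entropy(sensor, sensor) - cond_entropy(sensor, state)

# Hypothetical symbol streams: a persistent binary network state, a noisy sensor
rng = np.random.default_rng(2)
state = np.zeros(5000, dtype=int)
for t in range(1, state.size):
    state[t] = state[t - 1] if rng.random() < 0.9 else 1 - state[t - 1]
sensor = np.where(rng.random(state.size) < 0.8, state, 1 - state)
print(f"information contribution: {information_contribution(sensor, state):.3f} bits")
```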

  8. KINEROS2-AGWA: Model Use, Calibration, and Validation

    NASA Technical Reports Server (NTRS)

    Goodrich, D. C.; Burns, I. S.; Unkrich, C. L.; Semmens, D. J.; Guertin, D. P.; Hernandez, M.; Yatheendradas, S.; Kennedy, J. R.; Levick, L. R.

    2013-01-01

    KINEROS (KINematic runoff and EROSion) originated in the 1960s as a distributed event-based model that conceptualizes a watershed as a cascade of overland flow model elements that flow into trapezoidal channel model elements. KINEROS was one of the first widely available watershed models that interactively coupled a finite difference approximation of the kinematic overland flow equations to a physically based infiltration model. Development and improvement of KINEROS continued from the 1960s on a variety of projects for a range of purposes, which has resulted in a suite of KINEROS-based modeling tools. This article focuses on KINEROS2 (K2), a spatially distributed, event-based watershed rainfall-runoff and erosion model, and the companion ArcGIS-based Automated Geospatial Watershed Assessment (AGWA) tool. AGWA automates the time-consuming tasks of watershed delineation into distributed model elements and initial parameterization of these elements using commonly available, national GIS data layers. A variety of approaches have been used to calibrate and validate K2 successfully across a relatively broad range of applications (e.g., urbanization, pre- and post-fire, hillslope erosion, erosion from roads, runoff and recharge, and manure transport). The case studies presented in this article (1) compare lumped to stepwise calibration and validation of runoff and sediment at plot, hillslope, and small watershed scales; and (2) demonstrate an uncalibrated application to address relative change in watershed response to wildfire.
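
    For context, the kinematic overland flow equations that K2 approximates by finite differences take the standard textbook form (generic formulation, not K2's exact discretization), where \(h\) is flow depth, \(q\) discharge per unit width, and \(r - f\) the rainfall excess (rainfall rate minus infiltration rate supplied by the coupled infiltration model):

\[
\frac{\partial h}{\partial t} + \frac{\partial q}{\partial x} = r - f, \qquad q = \alpha h^{m},
\]

    with \(\alpha\) and \(m\) determined by the slope and hydraulic roughness of the overland flow element (e.g., via a Manning-type relation).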

  9. KINEROS2/AGWA: Model use, calibration and validation

    USGS Publications Warehouse

    Goodrich, D.C.; Burns, I.S.; Unkrich, C.L.; Semmens, Darius J.; Guertin, D.P.; Hernandez, M.; Yatheendradas, S.; Kennedy, Jeffrey R.; Levick, Lainie R.

    2012-01-01

    KINEROS (KINematic runoff and EROSion) originated in the 1960s as a distributed event-based model that conceptualizes a watershed as a cascade of overland flow model elements that flow into trapezoidal channel model elements. KINEROS was one of the first widely available watershed models that interactively coupled a finite difference approximation of the kinematic overland flow equations to a physically based infiltration model. Development and improvement of KINEROS continued from the 1960s on a variety of projects for a range of purposes, which has resulted in a suite of KINEROS-based modeling tools. This article focuses on KINEROS2 (K2), a spatially distributed, event-based watershed rainfall-runoff and erosion model, and the companion ArcGIS-based Automated Geospatial Watershed Assessment (AGWA) tool. AGWA automates the time-consuming tasks of watershed delineation into distributed model elements and initial parameterization of these elements using commonly available, national GIS data layers. A variety of approaches have been used to calibrate and validate K2 successfully across a relatively broad range of applications (e.g., urbanization, pre- and post-fire, hillslope erosion, erosion from roads, runoff and recharge, and manure transport). The case studies presented in this article (1) compare lumped to stepwise calibration and validation of runoff and sediment at plot, hillslope, and small watershed scales; and (2) demonstrate an uncalibrated application to address relative change in watershed response to wildfire.

  10. [Establishment of the mathematic model of total quantum statistical moment standard similarity for application to medical theoretical research].

    PubMed

    He, Fu-yuan; Deng, Kai-wen; Huang, Sheng; Liu, Wen-long; Shi, Ji-lian

    2013-09-01

    The paper aims to elucidate and establish a new mathematical model, the total quantum statistical moment standard similarity (TQSMSS), on the basis of the original total quantum statistical moment model, and to illustrate the application of the model to medical theoretical research. The model was established by combining the statistical moment principle with the properties of the normal distribution probability density function, and was then validated and illustrated using the pharmacokinetics of three ingredients in Buyanghuanwu decoction and three data analytical methods for them, and using analysis of chromatographic fingerprints for various extracts obtained with solvents of different solubility parameters dissolving the Buyanghuanwu-decoction extract. The established model consists of five main parameters: (1) total quantum statistical moment similarity ST, an overlapped area between two normal distribution probability density curves obtained by conversion of the two TQSM parameters; (2) total variability DT, a confidence limit of the standard normal accumulation probability equal to the absolute difference between the two normal accumulation probabilities within the integration interval bounded by the intersection of their curves; (3) total variable probability 1-SS, the standard normal distribution probability within the interval DT; (4) total variable probability (1-β)α; and (5) stable confident probability β(1-α), the correct probability for making positive and negative conclusions under confidence coefficient α. With the model, we analyzed the TQSMSS similarities of the pharmacokinetics of three ingredients in Buyanghuanwu decoction and of three data analytical methods for them, which were in the range of 0.3852-0.9875, illuminating their different pharmacokinetic behaviors; and the TQSMSS similarities (ST) of chromatographic fingerprints for various extracts obtained with solvents of different solubility parameters dissolving the Buyanghuanwu-decoction extract were in the range of 0.6842-0.9992, which showed different constituents
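
    The similarity ST described above is an overlapped area between two normal probability density curves; a minimal sketch of computing such an overlap for two fitted normal distributions (generic computation; the conversion from TQSM parameters to the (μ, σ) pairs used in the paper is not reproduced):

```python
import numpy as np
from scipy.stats import norm

def normal_overlap(mu1, sd1, mu2, sd2, n=20001):
    """Overlap area (0..1) of two normal pdfs via numerical integration of min(f, g)."""
    lo = min(mu1 - 6 * sd1, mu2 - 6 * sd2)
    hi = max(mu1 + 6 * sd1, mu2 + 6 * sd2)
    x = np.linspace(lo, hi, n)
    f = norm.pdf(x, mu1, sd1)
    g = norm.pdf(x, mu2, sd2)
    return np.trapz(np.minimum(f, g), x)

# Hypothetical (mean, sd) pairs standing in for two converted TQSM parameter sets
print(normal_overlap(10.0, 2.0, 12.0, 2.5))  # overlap for these made-up values
```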

  11. Validation of elk resource selection models with spatially independent data

    Treesearch

    Priscilla K. Coe; Bruce K. Johnson; Michael J. Wisdom; John G. Cook; Marty Vavra; Ryan M. Nielson

    2011-01-01

    Knowledge of how landscape features affect wildlife resource use is essential for informed management. Resource selection functions often are used to make and validate predictions about landscape use; however, resource selection functions are rarely validated with data from landscapes independent of those from which the models were built. This problem has severely...

  12. A validated approach for modeling collapse of steel structures

    NASA Astrophysics Data System (ADS)

    Saykin, Vitaliy Victorovich

    A civil engineering structure is faced with many hazardous conditions such as blasts, earthquakes, hurricanes, tornadoes, floods, and fires during its lifetime. Even though structures are designed for credible events that can happen during their lifetime, extreme events do happen and cause catastrophic failures. Understanding the causes and effects of structural collapse is now at the core of critical areas of national need. One factor that makes studying structural collapse difficult is the lack of full-scale structural collapse experimental test results against which researchers could validate their proposed collapse modeling approaches. The goal of this work is the creation of an element deletion strategy based on fracture models for use in validated prediction of collapse of steel structures. The current work reviews the state of the art of finite element deletion strategies for use in collapse modeling of structures. It is shown that current approaches to element deletion in collapse modeling do not take into account stress triaxiality in vulnerable areas of the structure, which is important for proper fracture and element deletion modeling. The report then reviews triaxiality and its role in fracture prediction. It is shown that fracture in ductile materials is a function of triaxiality. It is also shown that, depending on the triaxiality range, different fracture mechanisms are active and should be accounted for. An approach using semi-empirical fracture models as a function of triaxiality is employed. The models to determine fracture initiation, softening and subsequent finite element deletion are outlined. This procedure allows for stress-displacement softening at an integration point of a finite element in order to subsequently remove the element. This approach avoids abrupt changes in stress that would create dynamic instabilities, thus making the results more reliable and accurate. The calibration and validation of these models are
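
    A minimal sketch of a triaxiality-dependent fracture criterion with element deletion, in the spirit described above; this uses an illustrative Johnson-Cook-type fracture strain with constants of the order quoted for steels in the literature, not the report's calibrated models:

```python
import numpy as np

def fracture_strain(triaxiality, d1=0.05, d2=3.44, d3=-2.12):
    """Johnson-Cook-style fracture strain as a function of stress triaxiality eta:
    eps_f = d1 + d2 * exp(d3 * eta). Constants here are illustrative only."""
    return d1 + d2 * np.exp(d3 * triaxiality)

def update_damage(damage, deps_plastic, triaxiality):
    """Accumulate damage D += d(eps_p) / eps_f(eta); flag deletion when D >= 1."""
    damage += deps_plastic / fracture_strain(triaxiality)
    return damage, damage >= 1.0

# Hypothetical integration-point history: constant strain increments at eta = 0.6
damage, deleted = 0.0, False
while not deleted:
    damage, deleted = update_damage(damage, 0.01, 0.6)
print(f"element deleted after accumulated damage = {damage:.2f}")
```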

  13. Nonparametric model validations for hidden Markov models with applications in financial econometrics.

    PubMed

    Zhao, Zhibiao

    2011-06-01

    We address the nonparametric model validation problem for hidden Markov models with partially observable variables and hidden states. We achieve this goal by constructing a nonparametric simultaneous confidence envelope for transition density function of the observable variables and checking whether the parametric density estimate is contained within such an envelope. Our specification test procedure is motivated by a functional connection between the transition density of the observable variables and the Markov transition kernel of the hidden states. Our approach is applicable for continuous time diffusion models, stochastic volatility models, nonlinear time series models, and models with market microstructure noise.

  14. Nonparametric model validations for hidden Markov models with applications in financial econometrics

    PubMed Central

    Zhao, Zhibiao

    2011-01-01

    We address the nonparametric model validation problem for hidden Markov models with partially observable variables and hidden states. We achieve this goal by constructing a nonparametric simultaneous confidence envelope for transition density function of the observable variables and checking whether the parametric density estimate is contained within such an envelope. Our specification test procedure is motivated by a functional connection between the transition density of the observable variables and the Markov transition kernel of the hidden states. Our approach is applicable for continuous time diffusion models, stochastic volatility models, nonlinear time series models, and models with market microstructure noise. PMID:21750601

  15. A Historical Forcing Ice Sheet Model Validation Framework for Greenland

    NASA Astrophysics Data System (ADS)

    Price, S. F.; Hoffman, M. J.; Howat, I. M.; Bonin, J. A.; Chambers, D. P.; Kalashnikova, I.; Neumann, T.; Nowicki, S.; Perego, M.; Salinger, A.

    2014-12-01

    We propose an ice sheet model testing and validation framework for Greenland for the years 2000 to the present. Following Perego et al. (2014), we start with a realistic ice sheet initial condition that is in quasi-equilibrium with climate forcing from the late 1990's. This initial condition is integrated forward in time while simultaneously applying (1) surface mass balance forcing (van Angelen et al., 2013) and (2) outlet glacier flux anomalies, defined using a new dataset of Greenland outlet glacier flux for the past decade (Enderlin et al., 2014). Modeled rates of mass and elevation change are compared directly to remote sensing observations obtained from GRACE and ICESat. Here, we present a detailed description of the proposed validation framework including the ice sheet model and model forcing approach, the model-to-observation comparison process, and initial results comparing model output and observations for the time period 2000-2013.

  16. Validation and upgrading of physically based mathematical models

    NASA Technical Reports Server (NTRS)

    Duval, Ronald

    1992-01-01

    The validation of the results of physically-based mathematical models against experimental results was discussed. Systematic techniques are used for: (1) isolating subsets of the simulator mathematical model and comparing the response of each subset to its experimental response for the same input conditions; (2) evaluating the response error to determine whether it is the result of incorrect parameter values, incorrect structure of the model subset, or unmodeled external effects of cross coupling; and (3) modifying and upgrading the model and its parameter values to determine the most physically appropriate combination of changes.

  17. Improved Conceptual Models Methodology (ICoMM) for Validation of Non-Observable Systems

    DTIC Science & Technology

    2015-12-01

    Dissertation by Sang M. Sok, December 2015; distribution is unlimited. The improved conceptual model methodology (ICoMM) is developed in support of improving the structure of the conceptual model (CoM) for both face and

  18. Using Mathematics, Mathematical Applications, Mathematical Modelling, and Mathematical Literacy: A Theoretical Study

    ERIC Educational Resources Information Center

    Mumcu, Hayal Yavuz

    2016-01-01

    The purpose of this theoretical study is to explore the relationships between the concepts of using mathematics in the daily life, mathematical applications, mathematical modelling, and mathematical literacy. As these concepts are generally taken as independent concepts in the related literature, they are confused with each other and it becomes…

  19. Validation of the Continuum of Care Conceptual Model for Athletic Therapy

    PubMed Central

    Lafave, Mark R.; Butterwick, Dale; Eubank, Breda

    2015-01-01

    Utilization of conceptual models in field-based emergency care currently borrows from existing standards of the medical and paramedical professions. The purpose of this study was to develop and validate a comprehensive conceptual model that could account for injuries ranging from nonurgent to catastrophic events, including events that do not follow traditional medical or prehospital care protocols. The conceptual model should represent the continuum of care from the time of initial injury to an athlete's return to participation in their sport. Finally, the conceptual model should accommodate both novices and experts in the AT profession. This paper chronicles the content validation steps of the Continuum of Care Conceptual Model for Athletic Therapy (CCCM-AT). The stages of model development were domain and item generation, content expert validation using a three-stage modified Ebel procedure, and pilot testing. Only the final stage of the modified Ebel procedure reached the a priori 80% consensus threshold on three domains of interest: (1) heading descriptors; (2) the order of the model; and (3) the conceptual model as a whole. Future research is required to test the use of the CCCM-AT in order to understand its efficacy in teaching and practice within the AT discipline. PMID:26464897

  20. Validation of the World Health Organization's Quality of Life Questionnaire with parents of children with autistic disorder.

    PubMed

    Dardas, Latefa A; Ahmad, Muayyad M

    2014-09-01

    The World Health Organization's Quality of Life Questionnaire-BREF (WHOQOL-BREF) has been used in many studies that target parents of children with Autistic Disorder. However, the measure has yet to be validated and adapted for this sample group, whose daily experiences are considered substantially different from those of parents of children with typical development and parents of children with other disabilities. Therefore, this study was designed to examine the psychometric properties and the theoretical structure of the WHOQOL-BREF with a sample of 184 parents of children with Autistic Disorder. The factor structure of the WHOQOL-BREF was examined using exploratory and confirmatory factor analyses. Our analyses provided no evidence of a better model than the original 4-domain model. Nevertheless, some items in the measure were re-distributed to different domains based on theoretical meaning and/or clean loading criteria. With this model structure, the measure gained the validity required for use with parents of children with Autistic Disorder.

  1. Verifying and Validating Simulation Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hemez, Francois M.

    2015-02-23

    This presentation is a high-level discussion of the Verification and Validation (V&V) of computational models. Definitions of V&V are given to emphasize that “validation” is never performed in a vacuum; it accounts, instead, for the current state of knowledge in the discipline considered. In particular, comparisons between physical measurements and numerical predictions should account for their respective sources of uncertainty. The differences between error (bias), aleatoric uncertainty (randomness) and epistemic uncertainty (ignorance, lack of knowledge) are briefly discussed. Four types of uncertainty in physics and engineering are discussed: 1) experimental variability, 2) variability and randomness, 3) numerical uncertainty and 4) model-form uncertainty. Statistical sampling methods are available to propagate, and analyze, variability and randomness. Numerical uncertainty originates from the truncation error introduced by the discretization of partial differential equations in time and space. Model-form uncertainty is introduced by assumptions often formulated to render a complex problem more tractable and amenable to modeling and simulation. The discussion concludes with high-level guidance to assess the “credibility” of numerical simulations, which stems from the level of rigor with which these various sources of uncertainty are assessed and quantified.
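
    A minimal sketch of the statistical-sampling idea mentioned above, propagating input variability through a model by Monte Carlo; the model function and input distributions are made-up placeholders:

```python
import numpy as np

def model(e_modulus, load):
    """Stand-in computational model: a deflection-like response to two inputs."""
    return load / e_modulus * 1.0e6

rng = np.random.default_rng(42)
n = 100_000
# Aleatoric inputs sampled from assumed distributions (hypothetical values)
e_modulus = rng.normal(200e9, 10e9, size=n)   # Pa
load = rng.uniform(900.0, 1100.0, size=n)     # N

deflection = model(e_modulus, load)
print(f"mean = {deflection.mean():.4g}, std = {deflection.std(ddof=1):.4g}")
print("95% interval:", np.percentile(deflection, [2.5, 97.5]))
```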

  2. Item Development and Validity Testing for a Self- and Proxy Report: The Safe Driving Behavior Measure

    PubMed Central

    Classen, Sherrilene; Winter, Sandra M.; Velozo, Craig A.; Bédard, Michel; Lanford, Desiree N.; Brumback, Babette; Lutz, Barbara J.

    2010-01-01

    OBJECTIVE We report on item development and validity testing of a self-report older adult safe driving behaviors measure (SDBM). METHOD On the basis of theoretical frameworks (Precede–Proceed Model of Health Promotion, Haddon’s matrix, and Michon’s model), existing driving measures, and previous research and guided by measurement theory, we developed items capturing safe driving behavior. Item development was further informed by focus groups. We established face validity using peer reviewers and content validity using expert raters. RESULTS Peer review indicated acceptable face validity. Initial expert rater review yielded a scale content validity index (CVI) rating of 0.78, with 44 of 60 items rated ≥0.75. Sixteen unacceptable items (≤0.5) required major revision or deletion. The next CVI scale average was 0.84, indicating acceptable content validity. CONCLUSION The SDBM has relevance as a self-report to rate older drivers. Future pilot testing of the SDBM comparing results with on-road testing will define criterion validity. PMID:20437917
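
    For reference, a content validity index of the kind reported here is conventionally computed from expert relevance ratings; a sketch under the common convention that a rating of 3 or 4 on a 4-point scale counts as relevant (the SDBM study's exact rating rules are not reproduced):

```python
import numpy as np

def item_cvi(ratings):
    """Item CVI: proportion of experts rating the item 3 or 4 on a 4-point scale."""
    return float(np.mean(np.asarray(ratings) >= 3))

def scale_cvi(matrix):
    """Scale CVI (averaging method): mean of the item CVIs."""
    return float(np.mean([item_cvi(row) for row in matrix]))

# Hypothetical ratings: 4 items x 5 experts
ratings = [[4, 4, 3, 4, 2],
           [3, 3, 4, 4, 4],
           [2, 3, 2, 4, 3],
           [4, 4, 4, 3, 4]]
print([round(item_cvi(r), 2) for r in ratings], round(scale_cvi(ratings), 2))
```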

  3. Individual differences in processing styles: validity of the Rational-Experiential Inventory.

    PubMed

    Björklund, Fredrik; Bäckström, Martin

    2008-10-01

    In Study 1 (N= 203) the factor structure of a Swedish translation of Pacini and Epstein's Rational-Experiential Inventory (REI-40) was investigated using confirmatory factor analysis. The hypothesized model with rationality and experientiality as orthogonal factors had satisfactory fit to the data, significantly better than alternative models (with two correlated factors or a single factor). Inclusion of "ability" and "favorability" subscales for rationality and experientiality increased fit further. It was concluded that the structural validity of the REI is adequate. In Study 2 (N= 72) the REI-factors were shown to have theoretically meaningful correlations to other personality traits, indicating convergent and discriminant validity. Finally, scores on the rationality scale were negatively related to risky choice framing effects in Kahneman and Tversky's Asian disease task, indicating concurrent validity. On the basis of these findings it was concluded that the test has satisfactory psychometric properties.

  4. Development and Validation of an Observation System for Analyzing Teaching Roles.

    ERIC Educational Resources Information Center

    Southwell, Reba K.; Webb, Jeaninne N.

    The construction and validation of a theoretically based sign system for the analysis of teaching roles in childhood education is described. A theoretical and an empirical approach to validation were developed. In the first, the general concept of teacher role was identified as a viable construct for investigating characteristic patterns of classroom…

  5. [Social determinants of odontalgia in epidemiological studies: theoretical review and proposed conceptual model].

    PubMed

    Bastos, João Luiz Dornelles; Gigante, Denise Petrucci; Peres, Karen Glazer; Nedel, Fúlvio Borges

    2007-01-01

    The epidemiological literature has been limited by the absence of a theoretical framework reflecting the complexity of the causal mechanisms underlying the occurrence of health phenomena and disease conditions. In the field of oral epidemiology, such a lack of theory also prevails, since dental caries, the leading topic in oral research, has often been studied from a biological and reductionist viewpoint. One of the most important consequences of dental caries is dental pain (odontalgia), which has received little attention in studies with sophisticated theoretical models and powerful designs to establish causal relationships. The purpose of this study is to review the scientific literature on the determinants of odontalgia and to discuss theories proposed for the explanation of the phenomenon. Conceptual models and emerging theories on the social determinants of oral health are revised, in an attempt to build links with the bio-psychosocial pain model, proposing a more elaborate causal model for odontalgia. The framework suggests causal pathways between social structure and oral health through material, psychosocial and behavioral pathways. Aspects of the social structure are highlighted in order to relate them to odontalgia, stressing their importance in discussions of causal relationships in oral health research.

  6. I-15 San Diego, California, model validation and calibration report.

    DOT National Transportation Integrated Search

    2010-02-01

    The Integrated Corridor Management (ICM) initiative requires the calibration and validation of simulation models used in the Analysis, Modeling, and Simulation of Pioneer Site proposed integrated corridors. This report summarizes the results and proc...

  7. Animal models of binge drinking, current challenges to improve face validity.

    PubMed

    Jeanblanc, Jérôme; Rolland, Benjamin; Gierski, Fabien; Martinetti, Margaret P; Naassila, Mickael

    2018-05-05

    Binge drinking (BD), i.e., consuming a large amount of alcohol in a short period of time, is an increasing public health issue. Though no clear definition has been adopted worldwide, the speed of drinking seems to be a keystone of this behavior. Developing relevant animal models of BD is a priority for gaining a better characterization of the neurobiological and psychobiological mechanisms underlying this dangerous and harmful behavior. Until recently, preclinical research on BD was conducted mostly using forced administration of alcohol, but more recent studies have used scheduled access to alcohol to model more voluntary excessive intake and to achieve signs of intoxication that mimic the human behavior. The main challenges for future research are discussed regarding the need for good face validity, construct validity and predictive validity of animal models of BD. Copyright © 2018 Elsevier Ltd. All rights reserved.

  8. Techniques for Down-Sampling a Measured Surface Height Map for Model Validation

    NASA Technical Reports Server (NTRS)

    Sidick, Erkin

    2012-01-01

    This software allows one to down-sample a measured surface map for model validation, not only without introducing any re-sampling errors, but also while eliminating existing measurement noise and measurement errors. The software tool implementing the two new techniques can be used in all optical model validation processes involving large space optical surfaces.

  9. Validation Assessment of a Glass-to-Metal Seal Finite-Element Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jamison, Ryan Dale; Buchheit, Thomas E.; Emery, John M

    Sealing glasses are ubiquitous in high pressure and temperature engineering applications, such as hermetic feed-through electrical connectors. A common connector technology is the glass-to-metal seal, in which a metal shell compresses a sealing glass to create a hermetic seal. Though finite-element analysis has been used to understand and design glass-to-metal seals for many years, there has been little validation of these models. An indentation technique was employed to measure the residual stress on the surface of a simple glass-to-metal seal. Recently developed rate-dependent material models of both Schott 8061 glass and 304L VAR stainless steel have been applied to a finite-element model of the simple glass-to-metal seal. Model predictions of residual stress based on the evolution of the material models are shown and compared to the measured data. Validity of the finite-element predictions is discussed. It is shown that the finite-element model of the glass-to-metal seal accurately predicts the mean residual stress in the glass near the glass-to-metal interface and is valid for this quantity of interest.

  10. Toward Validation of the Genius Discipline-Specific Literacy Model

    ERIC Educational Resources Information Center

    Ellis, Edwin S.; Wills, Stephen; Deshler, Donald D.

    2011-01-01

    An analysis of the rationale and theoretical foundations of the Genius Discipline-specific Literacy Model and its use of SMARTvisuals to cue information-processing skills and strategies and focus attention on essential informational elements in high-frequency topics in history and the English language arts are presented. Quantitative data…

  11. External validation of EPIWIN biodegradation models.

    PubMed

    Posthumus, R; Traas, T P; Peijnenburg, W J G M; Hulzebos, E M

    2005-01-01

    The BIOWIN biodegradation models were evaluated for their suitability for regulatory purposes. BIOWIN includes the linear and non-linear BIODEG and MITI models for estimating the probability of rapid aerobic biodegradation, and an expert survey model for estimating primary and ultimate biodegradation. Experimental biodegradation data for 110 newly notified substances were compared with the estimations of the different models. The models were applied separately and in combinations to determine which model(s) showed the best performance. The results of this study were compared with the results of other validation studies and other biodegradation models. The BIOWIN models predict not-readily biodegradable substances with high accuracy, in contrast to readily biodegradable ones. In view of the high environmental concern about persistent chemicals, and given that not-readily biodegradable chemicals far outnumber readily biodegradable ones, a model is preferred that gives a minimum of false positives without a correspondingly high percentage of false negatives. A combination of the BIOWIN models (BIOWIN2 or BIOWIN6) showed the highest predictive value for not-readily biodegradability. However, the highest score for overall predictivity with the lowest percentage of false predictions was achieved by applying BIOWIN3 (pass level 2.75) and BIOWIN6.

  12. Exploring the relationship between volunteering and hospice sustainability in the UK: a theoretical model.

    PubMed

    Scott, Ros; Jindal-Snape, Divya; Manwaring, Gaye

    2018-05-02

    To explore the relationship between volunteering and the sustainability of UK voluntary hospices. A narrative literature review was conducted to inform the development of a theoretical model. Eight databases were searched: CINAHL (EBSCO), British Nursing Index, Intute: Health and Life Sciences, ERIC, SCOPUS, ASSIA (CSA), Cochrane Library and Google Scholar. A total of 90 documents were analysed. Emerging themes included the importance of volunteering to the hospice economy and workforce, the quality of services, and public and community support. Findings suggest that hospice sustainability is dependent on volunteers; however, the supply and retention of volunteers is affected by internal and external factors. A theoretical model was developed to illustrate the relationship between volunteering and hospice sustainability. It demonstrates the factors necessary for hospice sustainability and the reciprocal impact that these factors and volunteering have on each other. The model has a practical application as an assessment framework and strategic planning tool.

  13. Ventilation tube insertion simulation: a literature review and validity assessment of five training models.

    PubMed

    Mahalingam, S; Awad, Z; Tolley, N S; Khemani, S

    2016-08-01

    The objective of this study was to identify and investigate the face and content validity of ventilation tube insertion (VTI) training models described in the literature. A review of the literature was carried out to identify articles describing VTI simulators. Feasible models were replicated and assessed by a group of experts in a postgraduate simulation centre. Experts were defined as surgeons who had performed at least 100 VTIs on patients. Seventeen experts participated, ensuring sufficient statistical power for analysis. A standardised 18-item Likert-scale questionnaire was used. This addressed face validity (realism), global and task-specific content (suitability of the model for teaching) and curriculum recommendation. The search revealed eleven models, of which only five had associated validity data. Five models were found to be feasible to replicate. None of the tested models achieved face or global content validity. Only one model achieved task-specific validity, and hence there was no agreement on curriculum recommendation. The quality of the simulation models is moderate and there is room for improvement. There is a need for new models to be developed, or existing ones to be refined, in order to construct a more realistic training platform for VTI simulation. © 2015 John Wiley & Sons Ltd.

  14. A Validity Agenda for Growth Models: One Size Doesn't Fit All!

    ERIC Educational Resources Information Center

    Patelis, Thanos

    2012-01-01

    This is a keynote presentation given at AERA on developing a validity agenda for growth models in a large scale (e.g., state) setting. The emphasis of this presentation was to indicate that growth models and the validity agenda designed to provide evidence in supporting the claims to be made need to be personalized to meet the local or…

  15. [Comparison of the Wechsler Memory Scale-III and the Spain-Complutense Verbal Learning Test in acquired brain injury: construct validity and ecological validity].

    PubMed

    Luna-Lario, P; Pena, J; Ojeda, N

    2017-04-16

    To perform an in-depth examination of the construct validity and the ecological validity of the Wechsler Memory Scale-III (WMS-III) and the Spain-Complutense Verbal Learning Test (TAVEC). The sample consists of 106 adults with acquired brain injury who were treated in the Area of Neuropsychology and Neuropsychiatry of the Complejo Hospitalario de Navarra and displayed memory deficit as the main sequela, measured by means of specific memory tests. The construct validity is determined by examining the tasks required in each test over the basic theoretical models, comparing the performance according to the parameters offered by the tests, contrasting the severity indices of each test and analysing their convergence. The external validity is explored through the correlation between the tests and by using regression models. According to the results obtained, both the WMS-III and the TAVEC have construct validity. The TAVEC is more sensitive and captures not only the deficits in mnemonic consolidation, but also in the executive functions involved in memory. The working memory index of the WMS-III is useful for predicting the return to work at two years after the acquired brain injury, but none of the instruments anticipates the disability and dependence at least six months after the injury. We reflect upon the construct validity of the tests and their insufficient capacity to predict functionality when the sequelae become chronic.

  16. Theoretical model to explain the problem-solving process in physics

    NASA Astrophysics Data System (ADS)

    Lopez, Carlos

    2011-03-01

    This work reports a theoretical model developed with the aim of explaining the mental mechanisms of knowledge building during the problem-solving process in physics, using a hybrid assimilation-formation-of-concepts approach. The model has been termed conceptual chains and is represented by graphic diagrams of conceptual dependency, which have yielded information about the background knowledge required during the learning process, as well as about the formation of diverse structures that correspond to distinct ways of networking concepts. Additionally, the conceptual constructs of the model have been classified according to five types of knowledge. Evidence was found of the influence of these structures, as well as of the distinct types of knowledge, on the degree of difficulty of the problems. I am grateful to Laureate International Universities, Baltimore, MD, USA, for the funding granted for the accomplishment of this work.

  17. Theoretical results on the tandem junction solar cell based on its Ebers-Moll transistor model

    NASA Technical Reports Server (NTRS)

    Goradia, C.; Vaughn, J.; Baraona, C. R.

    1980-01-01

    A one-dimensional theoretical model of the tandem junction solar cell (TJC) with base resistivity greater than about 1 ohm-cm and under low-level injection has been derived. This model extends a previously published conceptual model which treats the TJC as an npn transistor. The model gives theoretical expressions for each of the Ebers-Moll-type currents of the illuminated TJC and allows for the calculation of the spectral response, Isc, Voc, FF and η under variation of one or more of the geometrical and material parameters and 1-MeV electron fluence. Results of computer calculations based on this model are presented and discussed. These results indicate that for space applications, both a high beginning-of-life efficiency (greater than 15% AM0) and a high radiation tolerance can be achieved only with thin (less than 50 microns) TJCs with high base resistivity (greater than 10 ohm-cm).
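
    For reference, the Ebers-Moll transistor equations on which the TJC model builds have the standard textbook form (the paper's photogenerated terms for the illuminated cell are not reproduced here):

\[
I_E = I_{ES}\left(e^{V_{BE}/V_T} - 1\right) - \alpha_R I_{CS}\left(e^{V_{BC}/V_T} - 1\right),
\]
\[
I_C = \alpha_F I_{ES}\left(e^{V_{BE}/V_T} - 1\right) - I_{CS}\left(e^{V_{BC}/V_T} - 1\right),
\]

    with the reciprocity relation \(\alpha_F I_{ES} = \alpha_R I_{CS}\) and thermal voltage \(V_T = kT/q\).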

  18. Comparison of statistical and theoretical habitat models for conservation planning: the benefit of ensemble prediction

    USGS Publications Warehouse

    Jones-Farrand, D. Todd; Fearer, Todd M.; Thogmartin, Wayne E.; Thompson, Frank R.; Nelson, Mark D.; Tirpak, John M.

    2011-01-01

    Selection of a modeling approach is an important step in the conservation planning process, but little guidance is available. We compared two statistical and three theoretical habitat modeling approaches representing those currently being used for avian conservation planning at landscape and regional scales: hierarchical spatial count (HSC), classification and regression tree (CRT), habitat suitability index (HSI), forest structure database (FS), and habitat association database (HA). We focused our comparison on models for five priority forest-breeding species in the Central Hardwoods Bird Conservation Region: Acadian Flycatcher, Cerulean Warbler, Prairie Warbler, Red-headed Woodpecker, and Worm-eating Warbler. Lacking complete knowledge on the distribution and abundance of each species with which we could illuminate differences between approaches and provide strong grounds for recommending one approach over another, we used two approaches to compare models: rank correlations among model outputs and comparison of spatial correspondence. In general, rank correlations were significantly positive among models for each species, indicating general agreement among the models. Worm-eating Warblers had the highest pairwise correlations, all of which were significant (P < 0.05). Red-headed Woodpeckers had the lowest agreement among models, suggesting greater uncertainty in the relative conservation value of areas within the region. We assessed model uncertainty by mapping the spatial congruence in priorities (i.e., top ranks) resulting from each model for each species and calculating the coefficient of variation across model ranks for each location. This allowed identification of areas more likely to be good targets of conservation effort for a species, those areas that were least likely, and those in between where uncertainty is higher and thus conservation action incorporates more risk. Based on our results, models developed independently for the same purpose
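
    A minimal sketch of the two comparison devices used above, Spearman rank correlations among model outputs and the coefficient of variation of ranks per location (generic computation with made-up scores for three hypothetical models):

```python
import numpy as np
from scipy.stats import spearmanr, rankdata

# Hypothetical habitat scores from three models over the same 8 locations
scores = np.array([
    [0.9, 0.1, 0.4, 0.8, 0.3, 0.7, 0.2, 0.6],   # model A
    [0.8, 0.2, 0.5, 0.9, 0.2, 0.6, 0.1, 0.7],   # model B
    [0.7, 0.3, 0.3, 0.9, 0.4, 0.8, 0.1, 0.5],   # model C
])

# Pairwise rank correlations among the models (columns = models)
rho, p = spearmanr(scores.T)
print("rank correlation matrix:\n", np.round(rho, 2))

# Rank locations within each model, then CV of ranks across models per location
ranks = np.vstack([rankdata(-row) for row in scores])   # rank 1 = best habitat
cv = ranks.std(axis=0, ddof=1) / ranks.mean(axis=0)
print("per-location CV of ranks:", np.round(cv, 2))     # low CV = models agree
```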

  19. Objective validation of central sensitization in the rat UVB and heat rekindling model

    PubMed Central

    Weerasinghe, NS; Lumb, BM; Apps, R; Koutsikou, S; Murrell, JC

    2014-01-01

    Background The UVB and heat rekindling (UVB/HR) model shows potential as a translatable inflammatory pain model. However, the occurrence of central sensitization in this model, a fundamental mechanism underlying chronic pain, has been debated. Face, construct and predictive validity are key requisites of animal models; electromyogram (EMG) recordings were utilized to objectively demonstrate validity of the rat UVB/HR model. Methods The UVB/HR model was induced on the heel of the hind paw under anaesthesia. Mechanical withdrawal thresholds (MWTs) were obtained from biceps femoris EMG responses to a gradually increasing pinch at the mid hind paw region under alfaxalone anaesthesia, 96 h after UVB irradiation. MWT was compared between UVB/HR and SHAM-treated rats (anaesthetic only). Underlying central mechanisms in the model were pharmacologically validated by MWT measurement following intrathecal N-methyl-d-aspartate (NMDA) receptor antagonist, MK-801, or saline. Results Secondary hyperalgesia was confirmed by a significantly lower pre-drug MWT {mean [±standard error of the mean (SEM)]} in UVB/HR [56.3 (±2.1) g/mm2, n = 15] compared with SHAM-treated rats [69.3 (±2.9) g/mm2, n = 8], confirming face validity of the model. Predictive validity was demonstrated by the attenuation of secondary hyperalgesia by MK-801, where mean (±SEM) MWT was significantly higher [77.2 (±5.9) g/mm2 n = 7] in comparison with pre-drug [57.8 (±3.5) g/mm2 n = 7] and saline [57.0 (±3.2) g/mm2 n = 8] at peak drug effect. The occurrence of central sensitization confirmed construct validity of the UVB/HR model. Conclusions This study used objective outcome measures of secondary hyperalgesia to validate the rat UVB/HR model as a translational model of inflammatory pain. What's already known about this topic? Most current animal chronic pain models lack translatability to human subjects. Primary hyperalgesia is an established feature of the UVB/heat rekindling

  20. The Roy Adaptation Model: A Theoretical Framework for Nurses Providing Care to Individuals With Anorexia Nervosa.

    PubMed

    Jennings, Karen M

    Using a nursing theoretical framework to understand, elucidate, and propose nursing research is fundamental to knowledge development. This article presents the Roy Adaptation Model as a theoretical framework to better understand individuals with anorexia nervosa during acute treatment, and the role of nursing assessments and interventions in the promotion of weight restoration. Nursing assessments and interventions situated within the Roy Adaptation Model take into consideration how weight restoration does not occur in isolation but rather reflects an adaptive process within external and internal environments, and has the potential for more holistic care.

  1. Validation by simulation of a clinical trial model using the standardized mean and variance criteria.

    PubMed

    Abbas, Ismail; Rovira, Joan; Casanovas, Josep

    2006-12-01

    To develop and validate a model of a clinical trial that evaluates the changes in cholesterol level as a surrogate marker for lipodystrophy in HIV subjects under alternative antiretroviral regimes, i.e., treatment with Protease Inhibitors vs. a combination of nevirapine and other antiretroviral drugs. Five simulation models were developed based on different assumptions on treatment variability and the pattern of cholesterol reduction over time. The last recorded cholesterol level, the difference from baseline, the average difference from baseline, and the level evolution are the considered endpoints. Specific validation criteria based on a standardized distance in means and variances within ±10% were used to compare the real and the simulated data. The validity criterion was met by all models for the considered endpoints. However, only two models met the validity criterion when all endpoints were considered. The model based on the assumption that within-subject variability of cholesterol levels changes over time is the one that minimizes the validity criterion, with a standardized distance within ±1%. Simulation is a useful technique for calibration, estimation, and evaluation of models, which allows us to relax the often overly restrictive assumptions regarding parameters required by analytical approaches. The validity criterion can also be used to select the preferred model for design optimization, until additional data are obtained allowing an external validation of the model.
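
    One plausible reading of the standardized-distance criterion is sketched below; the paper's exact standardization may differ, and the function names are mine.

      import numpy as np

      def standardized_distances(real, sim):
          """Standardized distances between real and simulated endpoint data."""
          d_mean = (np.mean(sim) - np.mean(real)) / np.std(real, ddof=1)
          d_var = (np.var(sim, ddof=1) - np.var(real, ddof=1)) / np.var(real, ddof=1)
          return d_mean, d_var

      def meets_criterion(real, sim, tol=0.10):
          """True if both distances lie within the +/-tol band (10% here)."""
          return all(abs(d) <= tol for d in standardized_distances(real, sim))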

  2. Game-Theoretic Models of Information Overload in Social Networks

    NASA Astrophysics Data System (ADS)

    Borgs, Christian; Chayes, Jennifer; Karrer, Brian; Meeder, Brendan; Ravi, R.; Reagans, Ray; Sayedi, Amin

    We study the effect of information overload on user engagement in an asymmetric social network like Twitter. We introduce simple game-theoretic models that capture rate competition between celebrities producing updates in such networks where users non-strategically choose a subset of celebrities to follow based on the utility derived from high quality updates as well as disutility derived from having to wade through too many updates. Our two variants model the two behaviors of users dropping some potential connections (followership model) or leaving the network altogether (engagement model). We show that under a simple formulation of celebrity rate competition, there is no pure strategy Nash equilibrium under the first model. We then identify special cases in both models when pure rate equilibria exist for the celebrities: For the followership model, we show existence of a pure rate equilibrium when there is a global ranking of the celebrities in terms of the quality of their updates to users. This result also generalizes to the case when there is a partial order consistent with all the linear orders of the celebrities based on their qualities to the users. Furthermore, these equilibria can be computed in polynomial time. For the engagement model, pure rate equilibria exist when all users are interested in the same number of celebrities, or when they are interested in at most two. Finally, we also give a finite though inefficient procedure to determine if pure equilibria exist in the general case of the followership model.
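
    For intuition, pure rate equilibria of a small discretized game of this kind can be checked by brute force. The payoff below is a toy attention-share function, not the paper's utility model; the celebrity count, rate grid and qualities are made up.

      import itertools

      RATES = [1, 2, 4, 8]        # candidate update rates (posts per day)
      QUALITY = [3.0, 2.0, 1.0]   # per-celebrity update quality (toy values)

      def payoff(i, prof):
          """Toy payoff: quality-weighted own rate, discounted by total volume
          (a stand-in for followers' disutility from too many updates)."""
          return QUALITY[i] * prof[i] / (1.0 + sum(prof))

      def pure_equilibria(n, rates=RATES):
          """Keep joint rate profiles with no profitable unilateral deviation."""
          eqs = []
          for prof in itertools.product(rates, repeat=n):
              if all(payoff(i, prof) >= payoff(i, prof[:i] + (r,) + prof[i + 1:]) - 1e-12
                     for i in range(n) for r in rates):
                  eqs.append(prof)
          return eqs

      print(pure_equilibria(3))   # -> [(8, 8, 8)] for this toy payoff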

  3. Validation of Community Models: 2. Development of a Baseline, Using the Wang-Sheeley-Arge Model

    NASA Technical Reports Server (NTRS)

    MacNeice, Peter

    2009-01-01

    This paper is the second in a series providing independent validation of community models of the outer corona and inner heliosphere. Here I present a comprehensive validation of the Wang-Sheeley-Arge (WSA) model. These results will serve as a baseline against which to compare the next generation of comparable forecasting models. The WSA model is used by a number of agencies to predict solar wind conditions at Earth up to 4 days into the future. Given its importance to both the research and forecasting communities, it is essential that its performance be measured systematically and independently. I offer just such an independent and systematic validation. I report skill scores for the model's predictions of wind speed and interplanetary magnetic field (IMF) polarity for a large set of Carrington rotations. The model was run in all its routinely used configurations. It ingests synoptic line-of-sight magnetograms. For this study I generated model results for monthly magnetograms from multiple observatories, spanning the Carrington rotation range from 1650 to 2074. I compare the influence of the different magnetogram sources and performance at quiet and active times. I also consider the ability of the WSA model to forecast both sharp transitions in wind speed from slow to fast wind and reversals in the polarity of the radial component of the IMF. These results will serve as a baseline against which to compare future versions of the model as well as the current and future generation of magnetohydrodynamic models under development for forecasting use.
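
    For readers who want to compute comparable metrics, the sketch below gives a generic MSE-based skill score against a reference forecast and a polarity hit rate. These are common choices, not necessarily the exact definitions used in the paper.

      import numpy as np

      def skill_score(obs, pred, ref):
          """MSE-based skill: 1 is perfect, 0 is no better than the reference."""
          return 1.0 - np.mean((pred - obs) ** 2) / np.mean((ref - obs) ** 2)

      def polarity_hit_rate(obs_br, pred_br):
          """Fraction of samples with the correct radial IMF polarity sign."""
          return np.mean(np.sign(pred_br) == np.sign(obs_br))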

  4. The Construct Validity of Attitudes toward Career Counseling Scale for Korean College Students

    ERIC Educational Resources Information Center

    Nam, Suk Kyung; In Park, Hyung

    2015-01-01

    This study aimed to examine the construct validity of the Attitudes Toward Career Counseling Scale (ATCCS) in Korea. In Study 1, confirmatory factor analysis (CFA) was used for testing the factor structure of the scale. The results supported a two-factor (value and stigma) model, which was theoretically driven from the original study. Results of…

  5. Development and validation of a cost-utility model for Type 1 diabetes mellitus.

    PubMed

    Wolowacz, S; Pearson, I; Shannon, P; Chubb, B; Gundgaard, J; Davies, M; Briggs, A

    2015-08-01

    To develop a health economic model to evaluate the cost-effectiveness of new interventions for Type 1 diabetes mellitus by their effects on long-term complications (measured through mean HbA1c) while capturing the impact of treatment on hypoglycaemic events. Through a systematic review, we identified complications associated with Type 1 diabetes mellitus and data describing the long-term incidence of these complications. An individual patient simulation model was developed and included the following complications: cardiovascular disease, peripheral neuropathy, microalbuminuria, end-stage renal disease, proliferative retinopathy, ketoacidosis, cataract, hypoglycaemia and adverse birth outcomes. Risk equations were developed from published cumulative incidence data and hazard ratios for the effect of HbA1c, age and duration of diabetes. We validated the model by comparing model predictions with observed outcomes from studies used to build the model (internal validation) and from other published data (external validation). We performed illustrative analyses for typical patient cohorts and a hypothetical intervention. Model predictions were within 2% of expected values in the internal validation and within 8% of observed values in the external validation (percentages represent absolute differences in the cumulative incidence). The model utilized high-quality, recent data specific to people with Type 1 diabetes mellitus. In the model validation, results deviated less than 8% from expected values. © 2014 Research Triangle Institute d/b/a RTI Health Solutions. Diabetic Medicine © 2014 Diabetes UK.
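
    A minimal sketch of how a risk equation of this general shape might apply a published hazard ratio for HbA1c to a baseline annual hazard follows. The functional form, names and numbers are assumptions for illustration, not the model's actual equations.

      import numpy as np

      def annual_event_probability(h0, hba1c, hr_per_point, ref_hba1c=7.0):
          """Scale a baseline annual hazard h0 by HR^(HbA1c - reference),
          then convert the hazard to an annual event probability."""
          h = h0 * hr_per_point ** (hba1c - ref_hba1c)
          return 1.0 - np.exp(-h)

      p = annual_event_probability(h0=0.01, hba1c=8.5, hr_per_point=1.3)
      print(f"annual complication risk: {p:.2%}")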

  6. Towards Automatic Validation and Healing of Citygml Models for Geometric and Semantic Consistency

    NASA Astrophysics Data System (ADS)

    Alam, N.; Wagner, D.; Wewetzer, M.; von Falkenhausen, J.; Coors, V.; Pries, M.

    2013-09-01

    A steadily growing number of application fields for large 3D city models have emerged in recent years. As in many other domains, data quality is recognized as a key factor for successful business, and quality management is mandatory in the production chain nowadays. Automated domain-specific tools are widely used for validation of business-critical data, but common standards defining correct geometric modeling are still not precise enough to provide a sound basis for data validation of 3D city models. Although the workflow for 3D city models is well established from data acquisition to processing, analysis and visualization, quality management is not yet a standard during this workflow. Processing data sets with unclear specification leads to erroneous results and application defects. We show that this problem persists even if the data are standard compliant. Validation results of real-world city models are presented to demonstrate the potential of the approach. A tool to repair the errors detected during the validation process is under development; first results are presented and discussed. The goal is to heal defects of the models automatically and export a corrected CityGML model.

  7. Validation of a Global Ionospheric Data Assimilation Model

    NASA Astrophysics Data System (ADS)

    Wilson, B.; Hajj, G.; Wang, C.; Pi, X.; Rosen, I.

    2003-04-01

    As the number of ground and space-based receivers tracking the global positioning system (GPS) steadily increases, and the quantity of other ionospheric remote sensing data such as measurements of airglow also increases, it is becoming possible to monitor changes in the ionosphere continuously and on a global scale with unprecedented accuracy and reliability. However, in order to make effective use of such a large volume of data for both ionospheric specification and forecast, it is important to develop a data-driven ionospheric model that is consistent with the underlying physical principles governing ionosphere dynamics. A fully 3-dimensional Global Assimilative Ionosphere Model (GAIM) is currently being developed by a joint University of Southern California and Jet Propulsion Laboratory team. GAIM uses a first-principles ionospheric physics model (“forward” model) and Kalman filtering and 4DVAR techniques to not only solve for densities on a 3D grid but also estimate key driving forces which are inputs to the theoretical model, such as the ExB drift, neutral wind, and production terms. The driving forces are estimated using an “adjoint equation” to compute the required partial derivatives, thereby greatly reducing the computational demands compared to other techniques. For estimation of the grid densities, GAIM uses an approximate Kalman filter implementation in which the portions of the covariance matrix that are retained (the off-diagonal elements) are determined by assumed but physical correlation lengths in the ionosphere. By selecting how sparse or full the covariance matrix is over repeated Kalman filter runs, one can fully investigate the tradeoff between estimation accuracy and computational speed. Although GAIM will ultimately use multiple data types and many data sources, we have performed a first study of quantitative accuracy by ingesting GPS-derived TEC observations from ground and space-based receivers and nighttime UV radiance data from

  8. Affective Change in Psychodynamic Psychotherapy: Theoretical Models and Clinical Approaches to Changing Emotions.

    PubMed

    Subic-Wrana, Claudia; Greenberg, Leslie S; Lane, Richard D; Michal, Matthias; Wiltink, Jörg; Beutel, Manfred E

    2016-09-01

    Affective change has been considered the hallmark of therapeutic change in psychoanalysis. Psychoanalytic writers have begun to incorporate theoretically the advanced understanding of emotional processing and transformation from the affective neurosciences. We ask whether this theoretical advancement is reflected in treatment techniques addressing the processing of emotion. We review psychoanalytic models of, and treatment recommendations for, maladaptive affect processing in the light of a neuroscientifically informed model of achieving psychotherapeutic change through activation and reconsolidation of emotional memory. Emotions tend to be treated like other mental contents, resulting in a lack of specific psychodynamic techniques for working with emotions. Manualized technical modifications addressing affect regulation have been successfully tested in patients with personality pathology, but not in psychodynamic treatments of axis I disorders. Emotional memories need to be activated in order to be modified; we therefore propose including techniques in psychodynamic therapy that stimulate emotional experience.

  9. Modern modeling techniques had limited external validity in predicting mortality from traumatic brain injury.

    PubMed

    van der Ploeg, Tjeerd; Nieboer, Daan; Steyerberg, Ewout W

    2016-10-01

    Prediction of medical outcomes may potentially benefit from using modern statistical modeling techniques. We aimed to externally validate modeling strategies for prediction of 6-month mortality of patients suffering from traumatic brain injury (TBI) with predictor sets of increasing complexity. We analyzed individual patient data from 15 different studies including 11,026 TBI patients. We consecutively considered a core set of predictors (age, motor score, and pupillary reactivity), an extended set with computed tomography scan characteristics, and a further extension with two laboratory measurements (glucose and hemoglobin). With each of these sets, we predicted 6-month mortality using default settings with five statistical modeling techniques: logistic regression (LR), classification and regression trees, random forests (RFs), support vector machines (SVM), and neural nets. For external validation, a model developed on one of the 15 data sets was applied to each of the 14 remaining sets. This process was repeated 15 times for a total of 630 validations. The area under the receiver operating characteristic curve (AUC) was used to assess the discriminative ability of the models. For the most complex predictor set, the LR models performed best (median validated AUC value, 0.757), followed by RF and support vector machine models (median validated AUC values, 0.735 and 0.732, respectively). With each predictor set, the classification and regression trees models showed poor performance (median validated AUC value, <0.7). The variability in performance across the studies was smallest for the RF- and LR-based models (interquartile range for validated AUC values from 0.07 to 0.10). In the area of predicting mortality from TBI, nonlinear and nonadditive effects are not pronounced enough to make modern prediction methods beneficial. Copyright © 2016 Elsevier Inc. All rights reserved.
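
    The cross-study design (develop on one study, validate on each of the remaining 14, for 15 x 14 = 210 validations per predictor set and technique) can be sketched with scikit-learn as below. Column names are placeholders, and only the logistic-regression arm is shown.

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import roc_auc_score

      def cross_study_aucs(df, features, outcome="mort6m", study_col="study"):
          """df: pandas DataFrame pooling all studies. Develop on each study,
          validate on every other one, and summarize the validated AUCs."""
          aucs = []
          studies = df[study_col].unique()
          for dev in studies:
              train = df[df[study_col] == dev]
              model = LogisticRegression(max_iter=1000)
              model.fit(train[features], train[outcome])
              for val in studies:
                  if val == dev:
                      continue
                  test = df[df[study_col] == val]
                  prob = model.predict_proba(test[features])[:, 1]
                  aucs.append(roc_auc_score(test[outcome], prob))
          return np.median(aucs), np.percentile(aucs, [25, 75])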

  10. Object-oriented simulation model of a parabolic trough solar collector: Static and dynamic validation

    NASA Astrophysics Data System (ADS)

    Ubieta, Eduardo; Hoyo, Itzal del; Valenzuela, Loreto; Lopez-Martín, Rafael; Peña, Víctor de la; López, Susana

    2017-06-01

    A simulation model of a parabolic-trough solar collector developed in the Modelica® language is calibrated and validated. The calibration is performed in order to approximate the behavior of the solar collector model to that of a real one, given the uncertainty in some of the system parameters; i.e., measured data are used during the calibration process. Afterwards, the calibrated model is validated: the results obtained from the model are compared to those obtained during real operation of a collector at the Plataforma Solar de Almeria (PSA).

  11. A ferrofluid based energy harvester: Computational modeling, analysis, and experimental validation

    NASA Astrophysics Data System (ADS)

    Liu, Qi; Alazemi, Saad F.; Daqaq, Mohammed F.; Li, Gang

    2018-03-01

    A computational model is described and implemented in this work to analyze the performance of a ferrofluid based electromagnetic energy harvester. The energy harvester converts ambient vibratory energy into an electromotive force through a sloshing motion of a ferrofluid. The computational model solves the coupled Maxwell's equations and Navier-Stokes equations for the dynamic behavior of the magnetic field and fluid motion. The model is validated against experimental results for eight different configurations of the system. The validated model is then employed to study the underlying mechanisms that determine the electromotive force of the energy harvester. Furthermore, computational analysis is performed to test the effects of several modeling aspects, such as three-dimensional effects, surface tension, and the type of ferrofluid-magnetic field coupling, on the accuracy of the model prediction.

  12. Mechanisms of plasma-assisted catalyzed growth of carbon nanofibres: a theoretical modeling

    NASA Astrophysics Data System (ADS)

    Gupta, R.; Sharma, S. C.; Sharma, R.

    2017-02-01

    A theoretical model is developed to study the nucleation and catalytic growth of carbon nanofibers (CNFs) in a plasma environment. The model includes the charging of CNFs, the kinetics of the plasma species (neutrals, ions and electrons), plasma pretreatment of the catalyst film, and various processes unique to a plasma-exposed catalyst surface, such as adsorption of neutrals, thermal dissociation of neutrals, ion-induced dissociation, interaction between neutral species, stress exerted by the growing graphene layers, and the growth of CNFs. Numerical calculations are carried out for typical glow discharge plasma parameters. It is found that the growth rate of CNFs decreases with the catalyst nanoparticle size. In addition, the effects of hydrogen on the catalyst nanoparticle size, CNF tip diameter, CNF growth rate, and the tilt angle of the graphene layers to the fiber axis are investigated. Moreover, it is also found that the length of CNFs increases with hydrocarbon number density. Our theoretical findings are in good agreement with experimental observations and can be extended to enhance the field emission characteristics of CNFs.

  13. Model-Based Method for Sensor Validation

    NASA Technical Reports Server (NTRS)

    Vatan, Farrokh

    2012-01-01

    Fault detection, diagnosis, and prognosis are essential tasks in the operation of autonomous spacecraft, instruments, and in situ platforms. One of NASA's key mission requirements is robust state estimation. Sensing, using a wide range of sensors and sensor fusion approaches, plays a central role in robust state estimation, and there is a need to diagnose sensor failure as well as component failure. Sensor validation can be considered part of the larger effort of improving reliability and safety. The standard methods for solving the sensor validation problem are based on probabilistic analysis of the system, of which the method based on Bayesian networks is the most popular. Consequently, these methods can only predict the most probable faulty sensors, subject to the initial probabilities defined for the failures. The method developed in this work takes a model-based approach and identifies the faulty sensors (if any) that can be logically inferred from the model of the system and the sensor readings (observations). The method is also more suitable for systems where it is hard, or even impossible, to find the probability functions of the system. It starts with a new mathematical description of the problem and develops a very efficient and systematic algorithm for its solution. The method builds on the concept of analytical redundant relations (ARRs).
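
    A toy illustration of the ARR idea: each residual should be near zero, and the set of violated residuals logically constrains which sensors can be faulty. The three-relation system and the exoneration assumption (a faulty sensor violates every ARR it enters) are mine, not the report's.

      import itertools

      # Hypothetical residuals and the sensors each one depends on.
      ARRS = {"r1": {"s1", "s2"}, "r2": {"s2", "s3"}, "r3": {"s1", "s3"}}

      def fault_candidates(violated):
          """Smallest sensor sets that explain exactly the violated ARRs."""
          sensors = sorted(set().union(*ARRS.values()))
          ok = set(ARRS) - set(violated)
          for k in range(1, len(sensors) + 1):
              hits = [set(c) for c in itertools.combinations(sensors, k)
                      if all(ARRS[r] & set(c) for r in violated)
                      and not any(ARRS[r] & set(c) for r in ok)]
              if hits:
                  return hits
          return []

      print(fault_candidates({"r1", "r3"}))   # -> [{'s1'}]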

  14. Developing a model for hospital inherent safety assessment: Conceptualization and validation.

    PubMed

    Yari, Saeed; Akbari, Hesam; Gholami Fesharaki, Mohammad; Khosravizadeh, Omid; Ghasemi, Mohammad; Barsam, Yalda; Akbari, Hamed

    2018-01-01

    Attention to the safety of hospitals, as the most crucial institutions for providing medical and health services, where facilities, equipment, and human resources are concentrated, is of significant importance. The present research aims at developing a model for assessing hospital safety based on principles of inherent safety design. Face validity (30 experts), content validity (20 experts), construct validity (268 samples), convergent validity, and divergent validity were employed to validate the prepared questionnaire; item analysis, Cronbach's alpha, the ICC test (to measure reliability), and the composite reliability coefficient were used to measure primary reliability. The relationships between variables and factors were confirmed at the 0.05 significance level by confirmatory factor analysis (CFA) and structural equation modeling (SEM) using Smart-PLS. R-square and factor loading values, which were higher than 0.67 and 0.300 respectively, indicated a strong fit. Moderation (0.970), simplification (0.959), substitution (0.943), and minimization (0.5008) carried the greatest weights, in that order, in determining the inherent safety of a hospital. Moderation, simplification, and substitution thus have more weight on inherent safety, while minimization has the least, which could be due to its definition as minimizing the risk.

  15. Modeling Theory of Mind and Cognitive Appraisal with Decision-Theoretic Agents

    DTIC Science & Technology

    2011-04-07

    This report models theory of mind and cognitive appraisal with decision-theoretic agents (David V. Pynadath, Mei Si, and Stacy C. Marsella). Among the key factors modeled is consistency: people expect, prefer, and are driven to maintain consistency, and to avoid cognitive dissonance, between beliefs. The report examines this capacity in appraisal and social emotions, and argues for a uniform process for emotion and cognition.

  16. Load Model Verification, Validation and Calibration Framework by Statistical Analysis on Field Data

    NASA Astrophysics Data System (ADS)

    Jiao, Xiangqing; Liao, Yuan; Nguyen, Thai

    2017-11-01

    Accurate load models are critical for power system analysis and operation. A large amount of research work has been done on load modeling. Most of the existing research focuses on developing load models, while little has been done on developing formal load model verification and validation (V&V) methodologies or procedures. Most existing load model validation is based on qualitative rather than quantitative analysis. In addition, not all aspects of the model V&V problem have been addressed by existing approaches. To complement the existing methods, this paper proposes a novel load model verification and validation framework that can systematically and more comprehensively examine a load model's effectiveness and accuracy. Statistical analysis, instead of visual checks, quantifies the load model's accuracy and provides a confidence level of the developed load model for model users. The analysis results can also be used to calibrate load models. The proposed framework can be used as guidance for utility engineers and researchers to systematically examine load models. The proposed method is demonstrated through analysis of field measurements collected from a utility system.

  17. MODELS FOR SUBMARINE OUTFALL - VALIDATION AND PREDICTION UNCERTAINTIES

    EPA Science Inventory

    This address reports on some efforts to verify and validate dilution models, including those found in Visual Plumes. This is done in the context of problem experience: a range of problems, including different pollutants such as bacteria; scales, including near-field and far-field...

  18. Development and validation of a building design waste reduction model.

    PubMed

    Llatas, C; Osmani, M

    2016-10-01

    Reduction in construction waste is a pressing need in many countries. The design of building elements is considered a pivotal process for achieving waste reduction at source, as it enables an informed prediction of their wastage reduction levels. However, the lack of quantitative methods linking design strategies to waste reduction hinders designing out waste practice in building projects. Therefore, this paper addresses this knowledge gap through the design and validation of a Building Design Waste Reduction Strategies (Waste ReSt) model that aims to investigate the relationships between design variables and their impact on onsite waste reduction. The Waste ReSt model was validated in a real-world case study involving 20 residential buildings in Spain. The validation process comprised three stages. Firstly, design waste causes were analyzed. Secondly, design strategies were applied, leading to several alternative low-waste building elements. Finally, their potential source reduction levels were quantified and discussed within the context of the literature. The Waste ReSt model could serve as an instrumental tool to simulate designing out strategies in building projects. The knowledge provided by the model could help project stakeholders to better understand the correlation between the design process and waste sources and subsequently implement design practices for low-waste buildings. Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. Sewer solids separation by sedimentation--the problem of modeling, validation and transferability.

    PubMed

    Kutzner, R; Brombach, H; Geiger, W F

    2007-01-01

    Sedimentation of sewer solids in tanks, ponds and similar devices is the most relevant process for the treatment of stormwater and combined sewer overflows in urban collecting systems. In the past, much research work was done to develop deterministic models describing this separation process, but these models are still not widely accepted in Germany. Water authorities remain sceptical with regard to model validation and transferability. This paper examines whether this scepticism is justified. A framework proposal for the validation of mathematical models with zero- or one-dimensional spatial resolution for particle separation processes in stormwater and combined sewer overflow treatment is presented. This proposal was applied to reputable publications on sewer solids separation by sedimentation. The result was that none of the investigated models described in the literature passed the validation entirely. There is an urgent need for future research in sewer solids sedimentation and remobilization!

  20. Validation of the Mindful Coping Scale

    ERIC Educational Resources Information Center

    Tharaldsen, Kjersti B.; Bru, Edvin

    2011-01-01

    The aim of this research is to develop and validate a self-report measure of mindfulness and coping, the mindful coping scale (MCS). Dimensions of mindful coping were theoretically deduced from mindfulness theory and coping theory. The MCS was empirically evaluated by use of factor analyses, reliability testing and nomological network validation.…

  1. Parental modelling of eating behaviours: observational validation of the Parental Modelling of Eating Behaviours scale (PARM).

    PubMed

    Palfreyman, Zoe; Haycraft, Emma; Meyer, Caroline

    2015-03-01

    Parents are important role models for their children's eating behaviours. This study aimed to further validate the recently developed Parental Modelling of Eating Behaviours Scale (PARM) by examining the relationships between maternal self-reports on the PARM with the modelling practices exhibited by these mothers during three family mealtime observations. Relationships between observed maternal modelling and maternal reports of children's eating behaviours were also explored. Seventeen mothers with children aged between 2 and 6 years were video recorded at home on three separate occasions whilst eating a meal with their child. Mothers also completed the PARM, the Children's Eating Behaviour Questionnaire and provided demographic information about themselves and their child. Findings provided validation for all three PARM subscales, which were positively associated with their observed counterparts on the observational coding scheme (PARM-O). The results also indicate that habituation to observations did not change the feeding behaviours displayed by mothers. In addition, observed maternal modelling was significantly related to children's food responsiveness (i.e., their interest in and desire for foods), enjoyment of food, and food fussiness. This study makes three important contributions to the literature. It provides construct validation for the PARM measure and provides further observational support for maternal modelling being related to lower levels of food fussiness and higher levels of food enjoyment in their children. These findings also suggest that maternal feeding behaviours remain consistent across repeated observations of family mealtimes, providing validation for previous research which has used single observations. Copyright © 2014 Elsevier Ltd. All rights reserved.

  2. U.S. 75 Dallas, Texas, Model Validation and Calibration Report

    DOT National Transportation Integrated Search

    2010-02-01

    This report presents the model validation and calibration results of the Integrated Corridor Management (ICM) analysis, modeling, and simulation (AMS) for the U.S. 75 Corridor in Dallas, Texas. The purpose of the project was to estimate the benefits ...

  3. An assessment of some theoretical models used for the calculation of the refractive index of InXGa1-xAs

    NASA Astrophysics Data System (ADS)

    Engelbrecht, J. A. A.

    2018-04-01

    Theoretical models used for the determination of the refractive index of InxGa1-xAs are reviewed and compared, and attention is drawn to some problems experienced with some of the models. The models were also extended to the mid-infrared region of the electromagnetic spectrum, and the theoretical results in this region are compared with previously published experimental results.

  4. Development and Validation of a Predictive Model for Functional Outcome After Stroke Rehabilitation: The Maugeri Model.

    PubMed

    Scrutinio, Domenico; Lanzillo, Bernardo; Guida, Pietro; Mastropasqua, Filippo; Monitillo, Vincenzo; Pusineri, Monica; Formica, Roberto; Russo, Giovanna; Guarnaschelli, Caterina; Ferretti, Chiara; Calabrese, Gianluigi

    2017-12-01

    Prediction of outcome after stroke rehabilitation may help clinicians in decision-making and planning rehabilitation care. We developed and validated a predictive tool to estimate the probability of achieving improvement in physical functioning (model 1) and a level of independence requiring no more than supervision (model 2) after stroke rehabilitation. The models were derived from 717 patients admitted for stroke rehabilitation. We used multivariable logistic regression analysis to build each model. Then, each model was prospectively validated in 875 patients. Model 1 included age, time from stroke occurrence to rehabilitation admission, admission motor and cognitive Functional Independence Measure scores, and neglect. Model 2 included age, male gender, time since stroke onset, and admission motor and cognitive Functional Independence Measure score. Both models demonstrated excellent discrimination. In the derivation cohort, the area under the curve was 0.883 (95% confidence interval, 0.858-0.910) for model 1 and 0.913 (95% confidence interval, 0.884-0.942) for model 2. The Hosmer-Lemeshow χ2 was 4.12 (P=0.249) and 1.20 (P=0.754), respectively. In the validation cohort, the area under the curve was 0.866 (95% confidence interval, 0.840-0.892) for model 1 and 0.850 (95% confidence interval, 0.815-0.885) for model 2. The Hosmer-Lemeshow χ2 was 8.86 (P=0.115) and 34.50 (P=0.001), respectively. Both improvement in physical functioning (hazard ratio, 0.43; 0.25-0.71; P=0.001) and a level of independence requiring no more than supervision (hazard ratio, 0.32; 0.14-0.68; P=0.004) were independently associated with improved 4-year survival. A calculator is freely available for download at https://goo.gl/fEAp81. This study provides researchers and clinicians with an easy-to-use, accurate, and validated predictive tool for potential application in rehabilitation research and stroke management. © 2017 American Heart Association, Inc.
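
    The validation statistics reported above (AUC for discrimination, Hosmer-Lemeshow for calibration) can be reproduced along the following lines. This is a generic sketch with placeholder column names, not the Maugeri code.

      import pandas as pd
      from scipy.stats import chi2
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import roc_auc_score

      def hosmer_lemeshow(y, p, g=10):
          """Hosmer-Lemeshow chi-square over g risk groups (df = g - 2)."""
          d = pd.DataFrame({"y": list(y), "p": list(p)})
          d["bin"] = pd.qcut(d["p"], g, duplicates="drop")
          grp = d.groupby("bin", observed=True)
          obs, exp, n = grp["y"].sum(), grp["p"].sum(), grp["p"].count()
          stat = (((obs - exp) ** 2) / (exp * (1.0 - exp / n))).sum()
          return stat, chi2.sf(stat, len(n) - 2)

      def derive_and_validate(dev, val, predictors, outcome):
          """Fit on the derivation cohort; report AUC and HL on validation."""
          m = LogisticRegression(max_iter=1000).fit(dev[predictors], dev[outcome])
          p = m.predict_proba(val[predictors])[:, 1]
          return roc_auc_score(val[outcome], p), hosmer_lemeshow(val[outcome], p)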

  5. Independent external validation of predictive models for urinary dysfunction following external beam radiotherapy of the prostate: Issues in model development and reporting.

    PubMed

    Yahya, Noorazrul; Ebert, Martin A; Bulsara, Max; Kennedy, Angel; Joseph, David J; Denham, James W

    2016-08-01

    Most predictive models are not sufficiently validated for prospective use. We performed independent external validation of published predictive models for urinary dysfunctions following radiotherapy of the prostate. Multivariable models developed to predict atomised and generalised urinary symptoms, both acute and late, were considered for validation using a dataset representing 754 participants from the TROG 03.04-RADAR trial. Endpoints and features were harmonised to match the predictive models. The overall performance, calibration and discrimination were assessed. Fourteen models from four publications were validated. The discrimination of the predictive models in an independent external validation cohort, measured using the area under the receiver operating characteristic (ROC) curve, ranged from 0.473 to 0.695, generally lower than in internal validation. Four models had an ROC AUC above 0.6. Shrinkage was required for all predictive models' coefficients, ranging from -0.309 (prediction probability was inverse to the observed proportion) to 0.823. Predictive models which include baseline symptoms as a feature produced the highest discrimination. Two models produced a predicted probability of 0 and 1 for all patients. Predictive models vary in performance and transferability, illustrating the need for improvements in model development and reporting. Several models showed reasonable potential, but efforts should be increased to improve performance. Baseline symptoms should always be considered as potential features for predictive models. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
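
    Shrinkage of the kind reported above is commonly estimated as the calibration slope: a logistic regression of the observed outcome on the published model's linear predictor in the validation cohort. A minimal sketch, using an essentially unpenalized fit, follows; names are placeholders.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      def calibration_slope(linear_predictor, y):
          """1.0 is ideal; < 1 means the published coefficients need shrinking,
          and a negative slope means predictions run opposite to observations."""
          lp = np.asarray(linear_predictor).reshape(-1, 1)
          return LogisticRegression(C=1e9).fit(lp, y).coef_[0, 0]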

  6. On the Usefulness of Narratives: An Interdisciplinary Review and Theoretical Model.

    PubMed

    Shaffer, Victoria A; Focella, Elizabeth S; Hathaway, Andrew; Scherer, Laura D; Zikmund-Fisher, Brian J

    2018-04-19

    How can we use stories from other people to promote better health experiences, improve judgments about health, and increase the quality of medical decisions without introducing bias, systematically persuading the listeners to change their attitudes, or altering behaviors in nonoptimal ways? More practically, should narratives be used in health education, promotion, or behavior change interventions? In this article, we address these questions by conducting a narrative review of a diverse body of literature on narratives from several disciplines to gain a better understanding of what narratives do, including their role in communication, engagement, recall, persuasion, and health behavior change. We also review broad theories about information processing and persuasion from psychology, and more specific models about narrative messaging found in the health communication and marketing literatures, to provide insight into the processes by which narratives have their effect on health behavior. To address major gaps in our theoretical understanding of how narratives work and what effects they will have on health behavior, we propose the Narrative Immersion Model, whose goal is to identify the parameters that predict the specific impact of a particular narrative (e.g., persuade, inform, or comfort) based on the type of narrative message (e.g., process, experience, or outcome narrative). Further, the Narrative Immersion Model describes the magnitude of the effect as increasing through successive layers of engagement with the narrative: interest, identification, and immersion. Finally, the Narrative Immersion Model identifies characteristics of the narrative intervention that encourage greater immersion within a given narrative. We believe there are important communication gaps in areas of behavioral medicine that could be addressed with narratives; however, more work is needed in order to employ narrative messaging systematically. The Narrative Immersion Model

  7. A New Theoretical Approach to Postsecondary Student Disability: Disability-Diversity (Dis)Connect Model

    ERIC Educational Resources Information Center

    Aquino, Katherine C.

    2016-01-01

    Disability is often viewed as an obstacle to postsecondary inclusion, but not a characteristic of student diversity. Additionally, current theoretical frameworks isolate disability from other student diversity characteristics. In response, a new conceptual framework, the Disability-Diversity (Dis)Connect Model (DDDM), was created to address…

  8. Development of theoretical models of integrated millimeter wave antennas

    NASA Technical Reports Server (NTRS)

    Yngvesson, K. Sigfrid; Schaubert, Daniel H.

    1991-01-01

    Extensive radiation patterns for Linear Tapered Slot Antenna (LTSA) single elements are presented. The directivity of LTSA elements is predicted correctly by taking the cross-polarized pattern into account. A moment method program predicts radiation patterns for air LTSAs in excellent agreement with experimental data. A moment method program was also developed for the LTSA array modeling task. Computations performed with this program are in excellent agreement with published results for dipole and monopole arrays, and with waveguide simulator experiments for more complicated structures. Empirical modeling of LTSA arrays demonstrated that the maximum theoretical element gain can be obtained. Formulations were also developed for calculating the aperture efficiency of LTSA arrays used in reflector systems. It was shown that LTSA arrays used in multibeam systems have a considerable advantage in terms of higher packing density compared with waveguide feeds. A conversion loss of 10 dB was demonstrated at 35 GHz.

  9. Modeling the effects of argument length and validity on inductive and deductive reasoning.

    PubMed

    Rotello, Caren M; Heit, Evan

    2009-09-01

    In an effort to assess models of inductive reasoning and deductive reasoning, the authors, in 3 experiments, examined the effects of argument length and logical validity on evaluation of arguments. In Experiments 1a and 1b, participants were given either induction or deduction instructions for a common set of stimuli. Two distinct effects were observed: Induction judgments were more affected by argument length, and deduction judgments were more affected by validity. In Experiment 2, fluency was manipulated by displaying the materials in a low-contrast font, leading to increased sensitivity to logical validity. Several variants of 1-process and 2-process models of reasoning were assessed against the results. A 1-process model that assumed the same scale of argument strength underlies induction and deduction was not successful. A 2-process model that assumed separate, continuous informational dimensions of apparent deductive validity and associative strength gave the more successful account. (c) 2009 APA, all rights reserved.

  10. Droplet size in flow: Theoretical model and application to polymer blends

    NASA Astrophysics Data System (ADS)

    Fortelný, Ivan; Jůza, Josef

    2017-05-01

    The paper is focused on prediction of the average droplet radius, R, in flowing polymer blends where the droplet size is determined by dynamic equilibrium between droplet breakup and coalescence. Expressions for the droplet breakup frequency in systems with low and high contents of the dispersed phase are derived using available theoretical and experimental results for model blends. Dependences of the coalescence probability, Pc, on system parameters, following from recent theories, are considered, and an approximate equation for Pc in a system with low polydispersity in droplet size is proposed. Equations for R in systems with low and high contents of the dispersed phase are derived. Combination of these equations predicts a realistic dependence of R on the volume fraction of dispersed droplets, φ. The theoretical prediction of the ratio of R to the critical droplet radius at breakup agrees fairly well with experimental values for steadily mixed polymer blends.

  11. On Utilizing Optimal and Information Theoretic Syntactic Modeling for Peptide Classification

    NASA Astrophysics Data System (ADS)

    Aygün, Eser; Oommen, B. John; Cataltepe, Zehra

    Syntactic methods in pattern recognition have been used extensively in bioinformatics, and in particular, in the analysis of gene and protein expressions, and in the recognition and classification of bio-sequences. These methods are almost universally distance-based. This paper concerns the use of an Optimal and Information Theoretic (OIT) probabilistic model [11] to achieve peptide classification using the information residing in their syntactic representations. The latter has traditionally been achieved using the edit distances required in the respective peptide comparisons. We advocate that one can model the differences between compared strings as a mutation model consisting of random Substitutions, Insertions and Deletions (SID) obeying the OIT model. Thus, in this paper, we show that the probability measure obtained from the OIT model can be perceived as a sequence similarity metric, using which a Support Vector Machine (SVM)-based peptide classifier, referred to as OIT_SVM, can be devised.

  12. Validating soil phosphorus routines in the SWAT model

    USDA-ARS?s Scientific Manuscript database

    Phosphorus transfer from agricultural soils to surface waters is an important environmental issue. Commonly used models like SWAT have not always been updated to reflect improved understanding of soil P transformations and transfer to runoff. Our objective was to validate the ability of the P routin...

  13. Criteria of validity for animal models of psychiatric disorders: focus on anxiety disorders and depression

    PubMed Central

    2011-01-01

    Animal models of psychiatric disorders are usually discussed with regard to three criteria first elaborated by Willner: face, predictive and construct validity. Here, we trace the history of these concepts and then try to redraw and refine these criteria, using the framework of the diathesis model of depression that has been proposed by several authors. We thus propose a set of five major criteria (with sub-categories for some of them): homological validity (including species validity and strain validity), pathogenic validity (including ontopathogenic validity and triggering validity), mechanistic validity, face validity (including ethological and biomarker validity) and predictive validity (including induction and remission validity). Homological validity requires that an adequate species and strain be chosen: considering species validity, primates will be considered to have a higher score than drosophila, and considering strains, a high stress reactivity in a strain scores higher than a low stress reactivity in another strain. Pathogenic validity corresponds to the fact that, in order to shape pathological characteristics, the organism has been manipulated both during the developmental period (for example, maternal separation: ontopathogenic validity) and during adulthood (for example, stress: triggering validity). Mechanistic validity corresponds to the fact that the cognitive (for example, cognitive bias) or biological mechanisms (such as dysfunction of the hormonal stress axis regulation) underlying the disorder are identical in both humans and animals. Face validity corresponds to the observable behavioral (ethological validity) or biological (biomarker validity) outcomes: for example anhedonic behavior (ethological validity) or elevated corticosterone (biomarker validity). Finally, predictive validity corresponds to the identity of the relationship between the triggering factor and the outcome (induction validity) and between the effects of the treatments

  14. Theoretical and experimental investigation of architected core materials incorporating negative stiffness elements

    NASA Astrophysics Data System (ADS)

    Chang, Chia-Ming; Keefe, Andrew; Carter, William B.; Henry, Christopher P.; McKnight, Geoff P.

    2014-04-01

    Structural assemblies incorporating negative stiffness elements have been shown to provide both tunable damping properties and simultaneous high stiffness and damping over prescribed displacement regions. In this paper we explore the design space for negative stiffness based assemblies using analytical modeling combined with finite element analysis. A simplified spring model demonstrates the effects of element stiffness, geometry, and preloads on the damping and stiffness performance. Simplified analytical models were validated for realistic structural implementations through finite element analysis. A series of complementary experiments was conducted to compare with modeling and determine the effects of each element on the system response. The measured damping performance follows the theoretical predictions obtained by analytical modeling. We applied these concepts to a novel sandwich core structure that exhibited combined stiffness and damping properties 8 times greater than existing foam core technologies.
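
    For intuition about the simplified spring model mentioned above, a two-spring von Mises truss is the usual minimal element exhibiting a negative-stiffness branch near snap-through. The sketch below uses that form as an assumption, with made-up parameters, rather than the authors' exact assembly.

      import numpy as np

      def truss_force(x, k=1.0, half_span=1.0, h=0.3):
          """Vertical force to push the apex of a two-spring bistable truss
          down by x; dF/dx is negative on the branch around x = h."""
          y = h - x
          s = np.hypot(half_span, y)      # current spring length
          l0 = np.hypot(half_span, h)     # unstretched length
          return -2.0 * k * (s - l0) * y / s

      x = np.linspace(0.0, 0.6, 301)
      f_assembly = truss_force(x) + 5.0 * x   # truss in parallel with a k = 5 spring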

  15. The Use of a Mesoscale Climate Model to Validate the Nocturnal Carbon Flux over a Forested Site

    NASA Astrophysics Data System (ADS)

    Werth, D.; Parker, M.; Kurzeja, R.; Leclerc, M.; Watson, T.

    2007-12-01

    The Savannah River National Laboratory is initiating a comprehensive carbon dioxide monitoring and modeling program in collaboration with the University of Georgia and the Brookhaven National Laboratory. One of the primary goals is to study the dynamics of carbon dioxide in the stable nocturnal boundary layer (NBL) over a forested area of the Savannah River Site in southwest South Carolina. In the NBL, eddy flux correlation is less effective in determining the release of CO2 due to respiration. Theoretically, however, the flux can be inferred by measuring the build-up of CO2 in the stable layer throughout the night. This method of monitoring the flux will be validated and studied in more detail with both observations and the results of a high-resolution regional climate model. The experiment will involve two phases. First, an artificial tracer will be released into the forest boundary layer and observed through an array of sensors and at a flux tower. The event will be simulated with the RAMS climate model run at very high resolution. Ideally, the tracer will remain trapped within the stable layer and accumulate at rates which allow us to infer the release rate, and this should compare well to the actual release rate. If an unknown mechanism allows the tracer to escape, the model simulation would be used to reveal it. In the second phase, carbon fluxes will be measured overnight through accumulation in the overlying layer. The RAMS model will be coupled with the SiB carbon model to simulate the nocturnal cycle of carbon dynamics, and this will be compared to the data collected during the night. As with the tracer study, the NBL method of flux measurement will be validated against the model. The RAMS-SiB coupled model has been run over the SRS at high resolution to simulate the NBL, and results from simulations of both phases of the project will be presented.
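
    The accumulation method amounts to a storage-flux estimate: layer depth times the CO2 build-up rate, converted with the molar density of air. The numbers below are illustrative only.

      MOLAR_DENSITY_AIR = 41.6   # mol m-3 near the surface (approximate)

      layer_depth = 100.0        # assumed stable-layer depth (m)
      buildup = 5.0 / 3600.0     # CO2 build-up: 5 ppm per hour, in ppm s-1

      flux = layer_depth * buildup * MOLAR_DENSITY_AIR   # umol CO2 m-2 s-1
      print(f"inferred respiration flux ~ {flux:.1f} umol m-2 s-1")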

  16. A Methodology for Validation of High Resolution Combat Models

    DTIC Science & Technology

    1988-06-01

    The report addresses theoretical issues in validating high-resolution combat models, including the teleological problem (how a model by its nature formulates an explicit cause-and-effect relationship that excludes others), the epistemological problem, and the uncertainty principle. It also discusses the role of "experts" in establishing the standard for reality, noting that generalization from personal experience is often hampered by its parochial aspects.

  17. Calibration and validation of earthquake catastrophe models. Case study: Impact Forecasting Earthquake Model for Algeria

    NASA Astrophysics Data System (ADS)

    Trendafiloski, G.; Gaspa Rebull, O.; Ewing, C.; Podlaha, A.; Magee, B.

    2012-04-01

    Calibration and validation are crucial steps in the production of catastrophe models for the insurance industry, in order to assure the model's reliability and to quantify its uncertainty. Calibration is needed in all components of model development, including hazard and vulnerability. Validation is required to ensure that the losses calculated by the model match those observed in past events and those which could happen in the future. Impact Forecasting, the catastrophe modelling development centre of excellence within Aon Benfield, has recently launched its earthquake model for Algeria as a part of the earthquake model for the Maghreb region. The earthquake model went through a detailed calibration process including: (1) the seismic intensity attenuation model, by use of macroseismic observations and maps from past earthquakes in Algeria; (2) calculation of country-specific vulnerability modifiers, by use of past damage observations in the country. The Benouar (1994) ground-motion prediction relationship proved the most appropriate for our model. Calculation of the regional vulnerability modifiers for the country led to 10% to 40% larger vulnerability indexes for different building types compared to average European indexes. The country-specific damage models also included aggregate damage models for residential, commercial and industrial properties, considering the description of the building stock given by the World Housing Encyclopaedia and local rebuilding cost factors equal to 10% for damage grade 1, 20% for damage grade 2, 35% for damage grade 3, 75% for damage grade 4 and 100% for damage grade 5. The damage grades comply with the European Macroseismic Scale (EMS-1998). The model was validated by use of "as-if" historical scenario simulations of three past earthquake events in Algeria: the M6.8 2003 Boumerdes, M7.3 1980 El-Asnam and M7.3 1856 Djidjelli earthquakes. The calculated return periods of the losses for client market portfolio align with the
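
    Using the rebuilding-cost factors stated above, the expected loss for an exposure reduces to a dot product of the damage-grade probabilities with those factors. The sketch below is a minimal reading of that aggregate damage model; the grade probabilities are hypothetical.

      import numpy as np

      # Rebuilding-cost factors for EMS-98 damage grades D1..D5, as stated above.
      COST_FACTORS = np.array([0.10, 0.20, 0.35, 0.75, 1.00])

      def expected_loss(replacement_value, grade_probs):
          """grade_probs: P(D1)..P(D5) for a given intensity and building type,
          as produced by the vulnerability model."""
          return replacement_value * float(np.dot(grade_probs, COST_FACTORS))

      loss = expected_loss(1.0e6, np.array([0.30, 0.20, 0.10, 0.04, 0.01]))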

  18. Integrated Medical Model Verification, Validation, and Credibility

    NASA Technical Reports Server (NTRS)

    Walton, Marlei; Kerstman, Eric; Foy, Millennia; Shah, Ronak; Saile, Lynn; Boley, Lynn; Butler, Doug; Myers, Jerry

    2014-01-01

    The Integrated Medical Model (IMM) was designed to forecast relative changes for a specified set of crew health and mission success risk metrics by using a probabilistic (stochastic process) model based on historical data, cohort data, and subject matter expert opinion. A probabilistic approach is taken since exact (deterministic) results would not appropriately reflect the uncertainty in the IMM inputs. Once the IMM was conceptualized, a plan was needed to rigorously assess input information, framework and code, and output results of the IMM, and ensure that end user requests and requirements were considered during all stages of model development and implementation. METHODS: In 2008, the IMM team developed a comprehensive verification and validation (V&V) plan, which specified internal and external review criteria encompassing 1) verification of data and IMM structure to ensure proper implementation of the IMM, 2) several validation techniques to confirm that the simulation capability of the IMM appropriately represents occurrences and consequences of medical conditions during space missions, and 3) credibility processes to develop user confidence in the information derived from the IMM. When the NASA-STD-7009 (7009) was published, the IMM team updated their verification, validation, and credibility (VVC) project plan to meet 7009 requirements and include 7009 tools in reporting VVC status of the IMM. RESULTS: IMM VVC updates are compiled recurrently and include 7009 Compliance and Credibility matrices, IMM V&V Plan status, and a synopsis of any changes or updates to the IMM during the reporting period. Reporting tools have evolved over the lifetime of the IMM project to better communicate VVC status. This has included refining original 7009 methodology with augmentation from the NASA-STD-7009 Guidance Document. End user requests and requirements are being satisfied as evidenced by ISS Program acceptance of IMM risk forecasts, transition to an operational model and

  19. Three phase heat and mass transfer model for unsaturated soil freezing process: Part 2 - model validation

    NASA Astrophysics Data System (ADS)

    Zhang, Yaning; Xu, Fei; Li, Bingxi; Kim, Yong-Song; Zhao, Wenke; Xie, Gongnan; Fu, Zhongbin

    2018-04-01

    This study aims to validate the three-phase heat and mass transfer model developed in the first part (Three phase heat and mass transfer model for unsaturated soil freezing process: Part 1 - model development). Experimental results from previous studies and experiments were used for the validation. The results showed that the correlation coefficients for the simulated and experimental water contents at different soil depths were between 0.83 and 0.92. The correlation coefficients for the simulated and experimental liquid water contents at different soil temperatures were between 0.95 and 0.99. Given these high accuracies, the developed model can be used reliably to predict the water contents at different soil depths and temperatures.

  20. A reduced theoretical model for estimating condensation effects in combustion-heated hypersonic tunnel

    NASA Astrophysics Data System (ADS)

    Lin, L.; Luo, X.; Qin, F.; Yang, J.

    2018-03-01

    As one of the combustion products of hydrocarbon fuels in a combustion-heated wind tunnel, water vapor may condense during the rapid expansion process, which will lead to a complex two-phase flow inside the wind tunnel and even change the design flow conditions at the nozzle exit. The coupling of the phase transition and the compressible flow makes the estimation of the condensation effects in such wind tunnels very difficult and time-consuming. In this work, a reduced theoretical model is developed to approximately compute the nozzle-exit conditions of a flow including real-gas and homogeneous condensation effects. Specifically, the conservation equations of the axisymmetric flow are first approximated in the quasi-one-dimensional way. Then, the complex process is split into two steps, i.e., a real-gas nozzle flow but excluding condensation, resulting in supersaturated nozzle-exit conditions, and a discontinuous jump at the end of the nozzle from the supersaturated state to a saturated state. Compared with two-dimensional numerical simulations implemented with a detailed condensation model, the reduced model predicts the flow parameters with good accuracy except for some deviations caused by the two-dimensional effect. Therefore, this reduced theoretical model can provide a fast, simple but also accurate estimation of the condensation effect in combustion-heated hypersonic tunnels.